Volume 22, Issue 2 (9-2025) | JSDP 2025, 22(2): 127-138


Nasiri M, Daneshpour N. Presenting a new method for multi label classification based on neural network. JSDP 2025; 22 (2) : 8
URL: http://jsdp.rcisp.ac.ir/article-1-1433-en.html
Associate Professor, Computer Engineering Department, Shahid Rajaee Teacher Training University, Tehran, Iran
Abstract:
Classification problems can be divided into two categories: single-label and multi-label. Single-label classification consists of binary and multi-class classification. In binary classification, the task is to predict one of two possible classes, such as distinguishing between spam and non-spam emails. In multi-class classification, the goal is to classify instances into more than two classes, such as identifying different species of flowers based on petal measurements. In contrast to single-label classification, multi-label classification is more complex because each instance can belong to multiple categories simultaneously. In multi-label learning, instead of assigning a single label to each instance, a set of labels is assigned: each sample may have zero, one, or more than one associated label. For example, in a text classification task, a news article about technology and business might be labeled as both "Technology" and "Business".

Several approaches have been developed to handle multi-label classification. One of the simplest is Binary Relevance (BR), which transforms the multi-label problem into multiple independent binary classification tasks, one for each label. Although this approach is easy to implement, it treats each label independently and ignores possible relationships among them. In real-world applications, however, labels are often correlated; for instance, in medical diagnosis, certain diseases frequently appear together. Another approach, Label Powerset (LP), considers label dependencies by treating each unique combination of labels as a separate class. While this method captures relationships between labels, it suffers from scalability issues when dealing with a large number of labels, as the number of possible label combinations increases exponentially.

To address these challenges, the proposed method incorporates constrained k-means clustering to group both labels and features prior to classification. In the first step, clustering groups similar labels together, so that label correlations are preserved. This also helps to mitigate imbalanced classification, where certain labels may be underrepresented in the dataset. Once the labels are clustered, a separate multi-layer neural network is assigned to each cluster: instead of a single large neural network for all labels, multiple smaller networks are trained for the different label clusters. This enhances learning efficiency and improves accuracy by focusing each network on a relevant group of labels. However, using multiple classifiers increases computational cost and training time. To mitigate this, a scatter-add dimension reduction technique is applied: attributes are efficiently assigned to the input of each neural network, so that each classifier receives only the relevant feature subset. Each neural network then predicts the labels within its designated cluster, and finally the predictions from all classifiers are combined to generate the final multi-label output for each instance.

To evaluate the effectiveness of the proposed method, experiments were conducted on various text datasets, and the results were compared with traditional multi-label classification methods, including Binary Relevance and Label Powerset. The evaluation was based on several performance metrics, such as accuracy, precision, and Hamming loss.
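
As a rough illustration of the pipeline described above, the following Python sketch clusters labels by their co-occurrence pattern, trains one small multi-layer network per label cluster on a reduced feature subset, and merges the per-cluster predictions. It is a minimal sketch under stated assumptions, not the authors' implementation: plain scikit-learn KMeans stands in for the constrained k-means step, a simple correlation-based feature selection stands in for the scatter-add dimension reduction, and the synthetic dataset and all hyper-parameters are illustrative.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_multilabel_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Illustrative synthetic data: 600 samples, 40 features, 12 labels.
    X, Y = make_multilabel_classification(n_samples=600, n_features=40,
                                          n_classes=12, n_labels=3, random_state=0)
    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

    # 1) Cluster the labels: each label is represented by its column of the
    #    training label matrix, so labels that tend to co-occur end up together.
    #    (The paper uses a constrained k-means variant; plain KMeans is used here.)
    n_clusters = 3
    label_km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(Y_tr.T)
    clusters = [np.where(label_km.labels_ == c)[0] for c in range(n_clusters)]

    # 2) For each label cluster, keep only the features most correlated with it
    #    (a crude stand-in for the scatter-add feature assignment) and train one
    #    small multi-layer network on that reduced input.
    models, feats = [], []
    for labels in clusters:
        target = Y_tr[:, labels].any(axis=1).astype(float)
        corr = np.abs(np.corrcoef(X_tr.T, target)[:-1, -1])
        idx = np.argsort(corr)[-20:]          # top-20 features for this cluster
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
        clf.fit(X_tr[:, idx], Y_tr[:, labels])
        models.append(clf)
        feats.append(idx)

    # 3) Combine the per-cluster predictions into the final multi-label output.
    Y_pred = np.zeros_like(Y_te)
    for labels, idx, clf in zip(clusters, feats, models):
        Y_pred[:, labels] = clf.predict(X_te[:, idx]).reshape(len(X_te), -1)
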
The results demonstrated that the proposed approach achieved superior performance across multiple datasets, ranking first in several evaluation criteria. Notably, it outperformed existing methods by approximately 1% in accuracy. These findings suggest that clustering-based multi-label classification using constrained k-means clustering and multi-layer neural networks is a promising approach. By leveraging label correlations and reducing dimensionality, the proposed method improves classification performance while addressing issues such as label imbalance and computational inefficiency. Future research may explore further optimization techniques to reduce training time while maintaining high accuracy.
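
For completeness, the metrics mentioned above can be computed with scikit-learn. The snippet below assumes the Y_te and Y_pred arrays from the previous sketch and reports subset accuracy, Hamming loss, and micro-averaged precision; it is an illustrative evaluation step, not the exact protocol used in the paper.

    from sklearn.metrics import accuracy_score, hamming_loss, precision_score

    print("subset accuracy :", accuracy_score(Y_te, Y_pred))   # exact-match ratio
    print("hamming loss    :", hamming_loss(Y_te, Y_pred))     # per-label error rate
    print("micro precision :", precision_score(Y_te, Y_pred, average="micro",
                                               zero_division=0))
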
Article number: 8
Full-Text [PDF 1432 kb]
Type of Study: Research | Subject: Paper
Received: 2024/07/14 | Accepted: 2025/03/15 | Published: 2025/09/13 | ePublished: 2025/09/13


Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

© 2015 All Rights Reserved | Signal and Data Processing