Sensitivity Analysis of Parameter Control in Leukemia Classification Using Variable-Length Particle Swarm Optimization
DOI: https://doi.org/10.31849/digitalzone.v16i2.27473

Keywords: Leukemia Classification, Variable-Length Particle Swarm Optimization, Adaptive Parameter Control, Generalization Consistency, Parameter Sensitivity Analysis

Abstract
Machine learning has the potential to support hematologists in classifying leukemia by identifying abnormal chromosomes and specific gene markers. One effective feature selection technique is Variable-Length Particle Swarm Optimization (VLPSO), whose performance depends heavily on parameter control, specifically the inertia weight (w) and acceleration factors (c), which regulate the search process. In previous VLPSO work, static and time-varying types of parameter control were applied to these factors. Although those settings performed well in VLPSO, training data and test data were not treated separately, leaving a gap in understanding their impact in real-world applications. This study explores how different parameter control strategies (static, time-varying, and adaptive) affect the performance of VLPSO, comparing two adaptive parameter control approaches, Adaptive 1 and Adaptive 2, within the VLPSO framework, each designed to dynamically adjust the control parameters w and c in a different way. Ten-fold cross-validation shows that VLPSO with the Adaptive 1 parameter setting achieves better generalization with small train-test differences, especially with Decision Tree and Naïve Bayes classifiers, though with higher variability. The Adaptive 2 parameter setting offers more consistent results with narrower variability across different settings. Static methods are the least reliable, while time-varying controls show moderate but unstable performance. Adaptive parameter tuning is therefore recommended to improve VLPSO's stability, flexibility, and classification accuracy in biomedical applications. The results provide recommendations for parameter settings using an adaptive approach shown to enhance the performance of VLPSO.
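The three parameter control strategies compared in the abstract can be illustrated with a minimal sketch. This is not the paper's exact Adaptive 1/Adaptive 2 rules (which are not given here); the adaptive feedback rule below is a hypothetical example, and all parameter names and bounds are illustrative defaults.

```python
# Illustrative sketch of three ways to control the PSO inertia weight w.
# The adaptive rule is hypothetical, not the paper's Adaptive 1/2 schemes.

def static_w(w0=0.7):
    """Static control: w stays fixed for the whole run."""
    return w0

def time_varying_w(t, t_max, w_max=0.9, w_min=0.4):
    """Time-varying control: w decreases linearly with iteration t,
    shifting the swarm from exploration toward exploitation."""
    return w_max - (w_max - w_min) * t / t_max

def adaptive_w(improved, w, step=0.05, w_min=0.4, w_max=0.9):
    """Adaptive control (hypothetical feedback rule): shrink w when the
    swarm's best fitness improved this iteration (exploit), grow it
    otherwise (explore), clamped to [w_min, w_max]."""
    w = w - step if improved else w + step
    return min(max(w, w_min), w_max)

if __name__ == "__main__":
    print(static_w())                        # 0.7 at every iteration
    print(time_varying_w(t=50, t_max=100))   # 0.65, halfway through the run
    print(adaptive_w(improved=True, w=0.42)) # 0.4 (clamped at the lower bound)
```

The same pattern applies to the acceleration factors c1 and c2; time-varying schemes typically decrease c1 while increasing c2 over the run.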
License
Copyright (c) 2025 Digital Zone: Jurnal Teknologi Informasi dan Komunikasi

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.