Weighted majority voting ensemble method for student performance classification
Hasnah Nawang1, Che Akmal Che Yahaya1, Ridzwan Mohamed Sumeri2 and Khairul Annuar Abdullah1
2Department of Electrical Technology, Kolej Komuniti Kuala Terengganu, Terengganu, Malaysia
Corresponding Author: Hasnah Nawang
Received: 14-Jan-2025; Revised: 16-Aug-2025; Accepted: 24-Aug-2025
Abstract
The classification of educational data provides valuable insight into student performance trends. An ensemble is a machine learning method that combines the strengths of multiple models to achieve better accuracy and performance than any individual model. Classifier combination is one such approach, in which several classifier models are fused to improve overall predictive performance. A multiple classifier system (MCS) yields more useful insights into student performance classification than single base learners, with the final outcome typically determined through a majority voting (MV) technique. However, whereas MV weights all classifiers equally, weighted MV assigns each base classifier a weight based on its confidence to enhance classification performance. This study introduces a multiple classifier system using weighted majority voting (MCS-WeMV), which evaluates the individual performance of each classifier within the ensemble and adjusts its weight according to classifier confidence before contributing to the final class prediction. In the experimental investigation, a weight was computed for each classifier in a collection of classification algorithms: decision tree (DT), support vector machine (SVM), k-nearest neighbor (KNN), random forest (RF), logistic regression (LR), and naive Bayes (NB). The findings indicate that the proposed MCS-WeMV outperformed three other ensemble algorithms, AdaBoost, Bagging, and Stacking, achieving an accuracy of 88.15% with the combination of RF, SVM, and DT classifiers.
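The weighted-voting scheme described above can be sketched with scikit-learn's `VotingClassifier`, weighting each base learner by its cross-validated accuracy. This is a minimal illustration, not the authors' exact MCS-WeMV formulation: the synthetic dataset, the choice of RF, SVM, and DT as base learners, and the accuracy-as-weight rule are assumptions made for the sketch.

```python
# Hedged sketch of accuracy-weighted majority voting (illustrative only;
# the paper's actual weight computation and student dataset are not used here).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a student-performance dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

base = [
    ("rf", RandomForestClassifier(random_state=42)),
    ("svm", SVC(random_state=42)),
    ("dt", DecisionTreeClassifier(random_state=42)),
]

# Weight each base classifier by its 5-fold cross-validated accuracy
# on the training data (one plausible confidence measure).
weights = [cross_val_score(m, X_tr, y_tr, cv=5).mean() for _, m in base]

# Hard voting with per-classifier weights = weighted majority voting
ensemble = VotingClassifier(estimators=base, voting="hard", weights=weights)
ensemble.fit(X_tr, y_tr)
print(accuracy_score(y_te, ensemble.predict(X_te)))
```

With `voting="hard"`, each classifier's vote for a class is scaled by its weight before the votes are tallied, so more reliable base learners carry more influence on the final prediction.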
Keywords
Ensemble method, Multiple classifiers, Majority voting, Weighted majority voting, Classifier fusion, Combination rule.
References
[1] Sarker S, Paul MK, Thasin ST, Hasan MA. Analyzing students' academic performance using educational data mining. Computers and Education: Artificial Intelligence. 2024; 7:1-16.
[2] Malik A, Onyema EM, Dalal S, Lilhore UK, Anand D, Sharma A, et al. Forecasting students' adaptability in online entrepreneurship education using modified ensemble machine learning model. Array. 2023; 19:1-11.
[3] Pecuchova J, Drlik M. Predicting students at risk of early dropping out from course using ensemble classification methods. Procedia Computer Science. 2023; 225:3223-32.
[4] Meedech P, Iam-on N, Boongoen T. Prediction of student dropout using personal profile and data mining approach. In intelligent and evolutionary systems: the 19th Asia Pacific symposium, IES 2015 (pp. 143-55). Cham: Springer International Publishing.
[5] Zhou ZH. Open-environment machine learning. National Science Review. 2022; 9(8):1-11.
[6] Li Z, Yoon J, Zhang R, Rajabipour F, Srubar III WV, Dabo I, et al. Machine learning in concrete science: applications, challenges, and best practices. NPJ Computational Materials. 2022; 8(1):1-17.
[7] Głowania S, Kozak J, Juszczuk P. New voting schemas for heterogeneous ensemble of classifiers in the problem of football results prediction. Procedia Computer Science. 2022; 207:3393-402.
[8] Żabiński G, Gramacki J, Gramacki A, Miśta-Jakubowska E, Birch T, Disser A. Multi-classifier majority voting analyses in provenance studies on iron artefacts. Journal of Archaeological Science. 2020; 113:1-15.
[9] Vidyashree KP, Rajendra AB, Gururaj HL. A tweet sentiment classification approach using an ensemble classifier. International Journal of Cognitive Computing in Engineering. 2024; 5:170-7.
[10] Biswas AK, Seethalakshmi R, Mariappan P, Bhattacharjee D. An ensemble learning model for predicting the intention to quit among employees using classification algorithms. Decision Analytics Journal. 2023; 9:1-13.
[11] Beckham NR, Akeh LJ, Mitaart GN, Moniaga JV. Determining factors that affect student performance using various machine learning methods. Procedia Computer Science. 2023; 216:597-603.
[12] Pallathadka H, Wenda A, Ramirez-Asís E, Asís-López M, Flores-Albornoz J, Phasinam K. Classification and prediction of student performance data using various machine learning algorithms. Materials Today: Proceedings. 2023; 80:3782-5.
[13] Daphal SD, Koli SM. Enhancing sugarcane disease classification with ensemble deep learning: a comparative study with transfer learning techniques. Heliyon. 2023; 9(8):1-19.
[14] Siddique A, Jan A, Majeed F, Qahmash AI, Quadri NN, Wahab MO. Predicting academic performance using an efficient model based on fusion of classifiers. Applied Sciences. 2021; 11(24):1-19.
[15] Karo IM, Fajari MY, Fadhilah NU, Wardani WY. Benchmarking naïve Bayes and ID3 algorithm for prediction student scholarship. In IOP conference series: materials science and engineering 2022 (pp. 1-7). IOP Publishing.
[16] https://machinelearningmastery.com/ensemble-machine-learning-with-python-7-day-mini-course/. Accessed 15 August 2025.
[17] Ramezan AC, Warner AT, Maxwell EA. Evaluation of sampling and cross-validation tuning strategies for regional-scale machine learning classification. Remote Sensing. 2019; 11(2):1-21.
[18] Osman AH, Aljahdali HM. An effective of ensemble boosting learning method for breast cancer virtual screening using neural network model. IEEE Access. 2020; 8:39165-74.
[19] www.worldscientific.com. Accessed 15 August 2025.
[20] Kazmaier J, Van Vuuren JH. The power of ensemble learning in sentiment analysis. Expert Systems with Applications. 2022; 187:115819.
[21] Vergaray AD, Miranda JC, Cornelio JB, Carranza AR, Sánchez CF. Predicting the depression in university students using stacking ensemble techniques over oversampling method. Informatics in Medicine Unlocked. 2023; 41:1-9.
[22] Carè A, Campi MC, Ramponi FA, Garatti S, Cobbenhagen AR. A study on majority-voting classifiers with guarantees on the probability of error. IFAC-PapersOnLine. 2020; 53(2):1013-8.
[23] Osamor VC, Okezie AF. Enhancing the weighted voting ensemble algorithm for tuberculosis predictive diagnosis. Scientific Reports. 2021; 11(1):1-11.
[24] Mukherjee A, Singhal R, Shroff G. Numin: weighted-majority ensembles for intraday trading. In proceedings of the 5th ACM international conference on AI in finance 2024 (pp. 703-10). ACM.
[25] Meyen S, Sigg DM, Luxburg UV, Franz VH. Group decisions based on confidence weighted majority voting. Cognitive Research: Principles and Implications. 2021; 6(1):1-13.
[26] Smirani LK, Yamani HA, Menzli LJ, Boulahia JA. Using ensemble learning algorithms to predict student failure and enabling customized educational paths. Scientific Programming. 2022; 2022(1):1-15.
[27] Nawang H, Makhtar M, Hamzah WM. A systematic literature review on student performance predictions. International Journal of Advanced Technology and Engineering Exploration. 2021; 8(84):1441-53.
[28] Rabelo AM, Zárate LE. A model for predicting dropout of higher education students. Data Science and Management. 2025; 8(1):72-85.
[29] Mirzaeian R, Nopour R, Asghari VZ, Shafiee M, Shanbehzadeh M, Kazemi-Arpanahi H. Which are best for successful aging prediction? Bagging, boosting, or simple machine learning algorithms? Biomedical Engineering Online. 2023; 22(1):85.
[30] Lazzarini R, Tianfield H, Charissis V. A stacking ensemble of deep learning models for IoT intrusion detection. Knowledge-Based Systems. 2023; 279:110941.
[32] Belciug S, Ivănescu RC, Nascu A, Serbănescu MS, Comănescu C, Iliescu DG. Knowledge-based statistical data analysis for deep learning and voting classifiers merger. Procedia Computer Science. 2023; 225:4206-13.
[33] Kunapuli G. Ensemble methods for machine learning. Simon and Schuster; 2023.
[34] Rojarath A, Songpan W. Probability-weighted voting ensemble learning for classification model. Journal of Advances in Information Technology. 2020; 11(4):217-27.
[35] Ruppert D. The elements of statistical learning: data mining, inference, and prediction. Journal of the American Statistical Association. 2004; 99(466):567.