ReLiS-Net for emotion recognition: an AI-driven approach to advancing mental health diagnostics
Bidyutlata Sahoo1 and Arpita Gupta2
2Associate Professor, Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Hyderabad-500075, Telangana, India
Corresponding Author: Bidyutlata Sahoo
Received: 25-Nov-2024; Revised: 19-Feb-2026; Accepted: 23-Feb-2026
Abstract
The assessment of emotional states via facial expressions has considerable ramifications for the identification and treatment of mental health disorders, including depression. Sophisticated machine learning models can improve the precision and sensitivity with which intricate emotional states are identified from visual data. This research presents the residual-liquid-support-vector network (ReLiS-Net), a model that integrates a residual neural network (ResNet), a liquid neural network (LNN), and a support vector machine (SVM) to effectively capture spatial and temporal variations in facial expressions. On the extended Cohn–Kanade (CK+) dataset, the proposed ReLiS-Net model exhibits exceptional performance, achieving 99.34% accuracy, 99.12% sensitivity, 98.92% specificity and a 99.29% F1-score. The robustness of ReLiS-Net is confirmed in comparison with modern models, demonstrating its greater capacity to analyze the emotional nuances essential for detecting depressive states. This improved analytical approach not only delivers superior predictive performance but also yields insights into emotional tendencies that may signify underlying mental health concerns, thereby facilitating early intervention measures.
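The three components named in the abstract can be sketched schematically: a residual block extracts per-frame spatial features via an identity shortcut, a liquid time-constant update integrates those features across frames, and a linear SVM decision function classifies the final state. This is a minimal conceptual sketch only; all weights, dimensions, and function names below are hypothetical toy values, and the actual ReLiS-Net operates on image tensors with learned parameters rather than hand-set vectors.

```python
import math

def residual_block(x, w):
    # ResNet idea: output = activation(w * x) + x, where the "+ x"
    # is the identity skip connection that eases gradient flow.
    return [math.tanh(wi * xi) + xi for wi, xi in zip(w, x)]

def liquid_step(state, inp, tau=1.0, dt=0.1):
    # One Euler step of a liquid time-constant neuron:
    # dx/dt = -x / tau + tanh(input). The decaying state lets the
    # cell adapt to temporal variation across successive frames.
    return [s + dt * (-s / tau + math.tanh(i)) for s, i in zip(state, inp)]

def svm_decision(x, w, b):
    # Linear SVM decision function: sign(w . x + b).
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

def relis_net_sketch(frames, w_res, w_svm, b_svm):
    # Hypothetical pipeline: per-frame residual features feed a
    # liquid state evolved over the frame sequence, and the SVM
    # classifies the final state (e.g. depressive vs. neutral cue).
    state = [0.0] * len(w_res)
    for frame in frames:
        feats = residual_block(frame, w_res)
        state = liquid_step(state, feats)
    return svm_decision(state, w_svm, b_svm)
```

The sketch illustrates why the hybrid is attractive: the residual stage handles spatial structure frame by frame, while the liquid stage carries information across time, so the SVM sees a summary of the whole expression sequence rather than a single frame.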
Keywords
Facial expression recognition, Depression detection, Residual-liquid-support-vector network (ReLiS-Net), Deep learning and hybrid models, Mental health analytics.
Cite this article
Sahoo B, Gupta A. ReLiS-Net for emotion recognition: an AI-driven approach to advancing mental health diagnostics. International Journal of Advanced Technology and Engineering Exploration. 2026;13(135):263-279. DOI: 10.19101/IJATEE.2024.111102087
