Identification of optimal long short-term memory (LSTM) parameters for pod workloads in a Kubernetes environment
Lussiana ETP1, Ega Rudy Graha2, Yuhilza Hanum3 and Eri Prasetyo Wibowo4
2School of Business, Social & Decision Sciences, Constructor University Bremen, Campus Ring 1, 28759, Bremen, Germany
3Information Technology Master Program, Universitas Gunadarma, Jl. Margonda Raya No. 100, Depok, Indonesia
4Information Technology Doctoral Program, Universitas Gunadarma, Jl. Margonda Raya No. 100, Depok, Indonesia
Corresponding Author: Lussiana ETP
Received: 14-Sep-2024; Revised: 23-Aug-2025; Accepted: 27-Aug-2025
Abstract
Static allocation of cloud computing resources leads to resource wastage when workloads fluctuate. This research analyzes the performance of a long short-term memory (LSTM) model by identifying optimal parameters from pod history data in a dynamically fluctuating Red Hat OpenShift Kubernetes cluster environment. The research process comprised observation of the Kubernetes environment, data collection, data preparation, modeling, testing, and finalization of the LSTM model with the optimal parameters on real cluster data. Data were collected in repeated 4-week cycles, and predictions were validated with the mean squared error (MSE) metric to ensure results reliable enough to support efficiency recommendations. Experiments varying parameters such as the number of epochs, the training-data rate, and the number of hidden layers yielded a stable MSE loss of 0.000115. These results demonstrate that the proposed LSTM model can be effectively implemented with the identified parameters.
Keywords
Cloud computing, Kubernetes, Red Hat OpenShift, Long short-term memory (LSTM), Resource allocation, Mean squared error (MSE).
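As a minimal illustration of the data-preparation and validation pipeline summarized in the abstract, the following stdlib-only Python sketch turns a pod's utilization history into sliding-window samples, performs a chronological hold-out split, and scores predictions with MSE. The window length, split ratio, synthetic CPU series, and the naive persistence predictor (standing in for the trained LSTM) are illustrative assumptions, not the authors' actual configuration.

```python
import math

def make_windows(series, window=12):
    """Sliding windows: each sample is (last `window` readings, next reading)."""
    return [(series[i:i + window], series[i + window])
            for i in range(len(series) - window)]

def train_test_split(samples, train_rate=0.8):
    """Chronological hold-out split (no shuffling, to preserve time order)."""
    cut = int(len(samples) * train_rate)
    return samples[:cut], samples[cut:]

def mse(y_true, y_pred):
    """Mean squared error, the validation metric used in the paper."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

if __name__ == "__main__":
    # Synthetic fluctuating pod CPU utilization (fraction of requested CPU);
    # a hypothetical stand-in for the 4-week cluster history described above.
    history = [0.5 + 0.3 * math.sin(i / 6.0) for i in range(200)]
    samples = make_windows(history, window=12)
    train, test = train_test_split(samples, train_rate=0.8)
    # Naive persistence baseline in place of the trained LSTM predictor.
    preds = [w[-1] for w, _ in test]
    truth = [y for _, y in test]
    print(f"hold-out MSE: {mse(truth, preds):.6f}")
```

In a full implementation, the persistence baseline would be replaced by an LSTM trained on the `train` windows, with epochs, training-data rate, and hidden-layer count tuned as in the experiments reported here.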