Comparative Analysis of Classification in Data Mining for Rain Prediction Using the LSTM and GRU Algorithms
Abstract
Rainfall has a significant impact on many sectors, including agriculture, transportation, and resource management, so accurate rain prediction is essential for effective planning and decision-making. With advances in technology, deep learning methods such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) are increasingly applied to weather prediction. This study compares the performance of these two algorithms in classifying meteorological data for rain prediction. The dataset, obtained from the Australian Bureau of Meteorology via Kaggle, comprises 145,460 records with 23 attributes. After preprocessing and resampling with SMOTE, both models were evaluated using accuracy, precision, recall, and F1-score. The LSTM model achieved an accuracy of 81.46%, while the GRU achieved 81.39%; the GRU's F1-score was higher than the LSTM's, at 60.96% versus 58.95%. These results indicate that both models are competitive and effective for deep-learning-based rain prediction systems.
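The abstract evaluates both models with accuracy, precision, recall, and F1-score. As a minimal sketch of how these metrics are derived from a binary ("rain" / "no rain") confusion matrix — using hypothetical counts, not the study's actual results:

```python
# Sketch of the four evaluation metrics named in the abstract, computed
# from a binary confusion matrix. The counts passed in below are
# hypothetical examples, not values from the paper.

def classification_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Accuracy, precision, recall, and F1 for the positive ('rain') class."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Illustrative counts only: 420 true positives, 180 false positives,
# 160 false negatives, 1240 true negatives.
metrics = classification_metrics(tp=420, fp=180, fn=160, tn=1240)
print({k: round(v, 4) for k, v in metrics.items()})
```

Because rainy days are the minority class in the dataset (hence the SMOTE resampling), accuracy alone can look high while F1 on the "rain" class stays much lower — which matches the gap between the reported ~81% accuracies and ~59–61% F1-scores.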
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.