Abstract
Activation functions play a crucial role in Convolutional Neural Networks (CNNs), particularly in enabling the model to recognize and represent complex patterns in digital images. In image classification tasks, the choice of activation function can significantly affect the accuracy and overall performance of the model. The Rectified Linear Unit (ReLU) is the most commonly used activation function due to its simplicity; however, it discards all information carried by negative input values. To address this limitation, alternatives such as Leaky ReLU and the Gaussian Error Linear Unit (GELU) have been developed, both designed to retain information from negative inputs. This study presents a comparative analysis of three activation functions (ReLU, Leaky ReLU, and GELU) in a CNN model for classifying oil palm fruit ripeness levels. The results show that although all three achieved high training accuracy (ReLU at 100%, Leaky ReLU at 99.93%, and GELU at 99.49%), their performance on the testing data varied significantly. Leaky ReLU outperformed the others, achieving the highest test accuracy of 95.35%, an F1-score of 95.39%, and a Matthews Correlation Coefficient (MCC) of 93.28%. It also exhibited the smallest gap between training and testing accuracy (4.58%), indicating better generalization and a lower risk of overfitting than ReLU and GELU. Moreover, the model using Leaky ReLU classified all three classes more evenly, particularly excelling at identifying the 'ripe' class, which tends to be the most challenging. These findings indicate that Leaky ReLU is the most suitable of the three activation functions for oil palm fruit image classification, as it maintains high accuracy while effectively reducing overfitting. This study contributes to the selection of appropriate activation functions for CNN-based classification systems and opens opportunities for exploring more adaptive activation functions in future research.
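For context, the three activation functions compared in this study have the following standard definitions, where $\alpha$ is the small negative-region slope of Leaky ReLU (commonly 0.01; the value used in this study is not stated in the abstract) and $\Phi$ is the cumulative distribution function of the standard normal distribution:

$$\mathrm{ReLU}(x) = \max(0, x), \qquad \mathrm{LeakyReLU}(x) = \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases}, \qquad \mathrm{GELU}(x) = x\,\Phi(x)$$

ReLU maps every negative input to zero, whereas Leaky ReLU passes a scaled copy $\alpha x$ and GELU a smoothly gated value $x\,\Phi(x)$, which is how both retain information from negative inputs. As a consistency check on the reported figures, the generalization gap for Leaky ReLU equals its training accuracy minus its test accuracy: $99.93\% - 95.35\% = 4.58\%$.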