Neural Network Parameter Optimization to Improve Model Learning Efficiency

Ari Kurniawan Saputra, Robby Yuli Endra, Erlangga Erlangga

Abstract


This study addresses the low stability and poor learning efficiency of Neural Networks caused by random selection of initial parameters such as weights, biases, and the learning rate. It aims to optimize these parameters so that training becomes more stable, faster, and more consistent. The method used is Particle Swarm Optimization (PSO), applied through seven stages: data preprocessing, model architecture design, loss function computation, weight initialization, optimization of the learning rate and initial weights, model training, and performance evaluation. The dataset consists of 10 samples with five input features and one selling-price target. The results show that PSO produced an optimal learning rate of 0.174 and more stable initial weights than the baseline model. Evaluation using a confusion matrix showed improved performance over the baseline, which achieved only 60% accuracy, 0.60 precision, 1.00 recall, and a 0.75 F1-score. Overall, PSO optimization proved to improve learning stability, accelerate convergence, and produce a Neural Network model that is more efficient and more accurate.
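The core idea described in the abstract, using PSO to search for a good learning rate (and, by extension, initial weights) by treating training loss as the fitness function, can be illustrated with a minimal sketch. This is not the authors' implementation: the toy regression data, swarm size, and PSO coefficients below are all illustrative assumptions, and only the learning rate is optimized for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for the paper's 10-sample, 5-feature dataset.
X = rng.normal(size=(10, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=10)

def train_loss(lr, steps=50):
    """Train a linear model by gradient descent at learning rate `lr`
    and return the final mean-squared error (used as PSO fitness)."""
    w = np.zeros(5)
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return float(np.mean((X @ w - y) ** 2))

def pso_learning_rate(n_particles=8, iters=30, inertia=0.7, c1=1.5, c2=1.5):
    """Minimal PSO over a one-dimensional search space (the learning rate)."""
    pos = rng.uniform(0.001, 0.5, n_particles)   # particle positions = candidate lr
    vel = np.zeros(n_particles)
    pbest = pos.copy()                           # each particle's best position
    pbest_f = np.array([train_loss(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)]            # swarm-wide best position
    gbest_f = pbest_f.min()
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        # Standard velocity update: inertia + cognitive + social terms.
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 1e-4, 1.0)
        f = np.array([train_loss(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        if f.min() < gbest_f:
            gbest, gbest_f = pos[np.argmin(f)], f.min()
    return gbest, gbest_f

best_lr, best_loss = pso_learning_rate()
```

Extending the sketch to also optimize initial weights, as the paper does, would mean widening each particle's position vector to include the weight entries alongside the learning rate; the update equations stay the same.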

Keywords


Neural Network; Model Learning; Particle Swarm Optimization.






DOI: http://dx.doi.org/10.36448/expert.v15i2.4618



EXPERT: Jurnal Manajemen Sistem Informasi dan Teknologi

Published by Pusat Studi Teknologi Informasi, Fakultas Ilmu Komputer, Universitas Bandar Lampung
Gedung M Lt.2 Pascasarjana Universitas Bandar Lampung
Jln Zainal Abidin Pagaralam No.89 Gedong Meneng, Rajabasa, Bandar Lampung,
LAMPUNG, INDONESIA

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.