Optimizing Artificial Neural Networks Using Mountain Gazelle Optimizer

Muhammed Abdulhamid Karabiyik, Bahaeddin Turkoglu, Tunc Asuroglu

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

The performance of artificial neural networks (ANNs) heavily depends on the optimization of network parameters, specifically weights and biases, during the training process. Effectively adjusting these parameters is essential to minimize the error between predicted and actual outputs. While traditional training algorithms, such as gradient-based methods, have been widely used, they often face challenges like premature convergence and stagnation in local optima. Training an ANN can, therefore, be viewed as an optimization problem, where the goal is to fine-tune parameters to achieve accurate and efficient performance. In this study, we introduce a novel approach to optimizing neural network parameters using the Mountain Gazelle Optimizer (MGO), a nature-inspired metaheuristic algorithm that mimics the social hierarchy and behavioral patterns of wild mountain gazelles. The MGO algorithm leverages its unique features, including hierarchical social structure and adaptive movement strategies, to effectively navigate the complex parameter space of neural networks. The algorithm's search mechanism integrates four key behavioral strategies: Territorial Solitary Males (TSM) for refining optimal solutions, Maternity Herds (MH) for balancing exploration and exploitation, Bachelor Male Herds (BMH) for global exploration, and Migration to Search for Food (MSF) for introducing randomness to prevent stagnation in local optima. These mechanisms work collaboratively, ensuring a dynamic and balanced optimization process throughout the training phase. To evaluate the effectiveness of the proposed approach, we conducted comprehensive experiments using various classification datasets from the UCI repository. The performance of the MGO-based neural network optimizer was compared with traditional backpropagation and several state-of-the-art optimization algorithms. Experimental results demonstrate that the proposed method exhibits superior performance in terms of convergence speed and the ability to avoid local optima, suggesting that MGO is a promising alternative for optimizing artificial neural networks during training.
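The full MGO update equations appear in the paper itself; as a rough illustration of the overall idea only, the sketch below shows how a one-hidden-layer network's weights and biases can be flattened into a single vector and searched with a simplified population-based loop. The function names (`unpack`, `fitness`, `mgo_train`), the greedy replacement rule, and the way the four gazelle behaviours (TSM, MH, BMH, MSF) are collapsed into generic "move toward the best / move toward a herd mate / random migration" steps are all assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def unpack(vector, n_in, n_hidden, n_out):
    """Reshape a flat parameter vector into weight matrices and bias vectors."""
    i = 0
    W1 = vector[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = vector[i:i + n_hidden]; i += n_hidden
    W2 = vector[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = vector[i:i + n_out]
    return W1, b1, W2, b2

def fitness(vector, X, y, n_in, n_hidden, n_out):
    """Classification error of a one-hidden-layer network encoded by `vector` (lower is better)."""
    W1, b1, W2, b2 = unpack(vector, n_in, n_hidden, n_out)
    hidden = np.tanh(X @ W1 + b1)
    logits = hidden @ W2 + b2
    return np.mean(np.argmax(logits, axis=1) != y)

def mgo_train(X, y, n_hidden=10, pop_size=30, iters=200, seed=0):
    """Simplified, gradient-free population search over the flattened parameter space."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], int(y.max()) + 1
    dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
    pop = rng.uniform(-1, 1, (pop_size, dim))
    scores = np.array([fitness(p, X, y, n_in, n_hidden, n_out) for p in pop])
    for _ in range(iters):
        best = pop[scores.argmin()].copy()  # copy to avoid aliasing with in-place updates
        for i in range(pop_size):
            # Stand-ins for the four MGO behaviours: pull toward the current best
            # (exploitation), pull toward a random herd member (exploration), and
            # a small random migration step to help escape local optima.
            mate = pop[rng.integers(pop_size)]
            candidate = (pop[i]
                         + rng.uniform() * (best - pop[i])
                         + rng.uniform() * (mate - pop[i])
                         + 0.05 * rng.standard_normal(dim))
            c_score = fitness(candidate, X, y, n_in, n_hidden, n_out)
            if c_score < scores[i]:  # greedy replacement of the herd member
                pop[i], scores[i] = candidate, c_score
    return pop[scores.argmin()], scores.min()
```

In this formulation only the fitness function touches the network; the search itself is gradient-free, which is why metaheuristics such as MGO can sidestep the local-optima and premature-convergence issues of gradient-based training noted in the abstract.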
Original language: English
Pages (from-to): 50464-50479
Journal: IEEE Access
Volume: 13
DOIs
Publication status: Published - 27 Mar 2025
MoE publication type: A1 Journal article-refereed

Keywords

  • Artificial Neural Network
  • Optimization
  • Mountain Gazelle Optimizer
