Local minima free neural network learning

Ivan N. Jordanov*, Tahseen A. Rafik

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


A global optimization algorithm for the supervised learning of feedforward neural networks (NNs) is investigated. The network weights are determined by minimizing the traditional backpropagation error function, but the learning algorithm relies on a stochastic optimization technique based on low-discrepancy sequences. This technique searches the parameter space defined by the network weights to identify initial regions of attraction containing candidate local minima, then exploits each region to locate its minimum and so determine the global minimum. The proposed technique is first tested on multimodal mathematical functions and subsequently applied to training moderately sized NNs on simple benchmark problems. Finally, the results are analysed, discussed, and compared with those of other methods.
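The idea in the abstract — sample the weight space with a low-discrepancy sequence, then refine promising points with a local search — can be illustrated with a minimal sketch. This is not the authors' algorithm: as a simplification it refines every Halton sample by plain gradient descent rather than first clustering samples into regions of attraction, and it minimizes the multimodal Rastrigin test function in place of an NN error function. The function names and all parameter values are illustrative assumptions.

```python
import math

def halton(index, base):
    """One coordinate of a low-discrepancy Halton sequence (radical inverse)."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def rastrigin(x):
    """Multimodal test function; global minimum f = 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def rastrigin_grad(x):
    """Analytic gradient of the Rastrigin function."""
    return [2 * xi + 20 * math.pi * math.sin(2 * math.pi * xi) for xi in x]

def global_minimise(n_samples=4000, steps=100, lr=0.002, bound=5.12):
    """Low-discrepancy multistart: seed local descents from Halton points
    covering [-bound, bound]^2 and keep the best local minimum found."""
    best_x, best_f = None, float("inf")
    for i in range(1, n_samples + 1):
        # Halton point in 2-D (bases 2 and 3), scaled to the search box.
        x = [bound * (2 * halton(i, b) - 1) for b in (2, 3)]
        # Local exploitation: fixed-step gradient descent from this seed.
        for _ in range(steps):
            g = rastrigin_grad(x)
            x = [xi - lr * gi for xi, gi in zip(x, g)]
        f = rastrigin(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f
```

Because the Halton points cover the box nearly uniformly, some seeds land in the global basin of attraction around the origin, so the multistart recovers the global minimum rather than one of the many surrounding local minima; the step size is kept below the inverse of the function's maximum curvature so each local descent is stable.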

Original language: English
Title of host publication: 2004 2nd International IEEE Conference 'Intelligent Systems' - Proceedings
Editors: R.R. Yager, V.S. Sgurev, V.S. Jotsov, P.D. Koprinkova-Hristova
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Print): 0780382781
Publication status: Published - 25 Oct 2004
Event: 2004 2nd International IEEE Conference 'Intelligent Systems' - Varna, Bulgaria
Duration: 22 Jun 2004 - 24 Jun 2004


Conference: 2004 2nd International IEEE Conference 'Intelligent Systems'


Keywords:
  • Local minima
  • Low-discrepancy sequences
  • Stochastic global optimisation
  • Supervised learning


