Abstract
A global optimization algorithm for supervised learning of feedforward neural networks (NNs) is investigated. The network weights are determined by minimizing the traditional backpropagation error function. The difference is that the learning algorithm employs a stochastic optimization technique based on low-discrepancy sequences. This technique searches the parameter space defined by the network weights to identify initial regions of attraction containing candidate local minima, then exploits each region to locate its minimum and thereby determine a global minimum. The proposed technique is first tested on multimodal mathematical functions and then applied to training moderately sized NNs on simple benchmark problems. Finally, the results are analysed, discussed, and compared with others.
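The two-phase idea in the abstract (a global scan of the parameter space with a low-discrepancy point set, followed by local exploitation of the most promising regions) can be sketched as below. This is a minimal illustrative sketch, not the paper's algorithm: the clustering of samples into regions of attraction is simplified to "refine the best-valued samples", the Halton sequence stands in for whichever low-discrepancy sequence the paper uses, and the Rastrigin test function, gradient-descent refinement, and all parameter values are assumptions chosen for the example.

```python
import math

def halton(i, base):
    """i-th element (i >= 1) of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def rastrigin(x):
    """Standard multimodal test function; global minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def rastrigin_grad(x):
    return [2 * xi + 20 * math.pi * math.sin(2 * math.pi * xi) for xi in x]

def global_minimize(f, grad, lo, hi, dim=2, n_samples=4000,
                    n_candidates=50, lr=0.002, n_steps=500):
    # Phase 1: cover the search box with a low-discrepancy (Halton) point set
    # and rank the points by objective value.
    primes = [2, 3, 5, 7, 11][:dim]
    pts = [[lo + (hi - lo) * halton(i, b) for b in primes]
           for i in range(1, n_samples + 1)]
    pts.sort(key=f)
    # Phase 2: local gradient descent from each of the best candidates,
    # keeping the lowest minimum found.
    best_x, best_val = None, float('inf')
    for x in pts[:n_candidates]:
        x = x[:]
        for _ in range(n_steps):
            x = [xi - lr * gi for xi, gi in zip(x, grad(x))]
        v = f(x)
        if v < best_val:
            best_x, best_val = x, v
    return best_x, best_val
```

On the 2-D Rastrigin function over [-2, 2]², the dense sampled values cluster in the basins of the lowest local minima, so refining the top candidates recovers the global minimum at the origin; in the paper's setting the objective would instead be the backpropagation error as a function of the network weights.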
Original language | English |
---|---|
Title of host publication | 2004 2nd International IEEE Conference 'Intelligent Systems' - Proceedings |
Editors | R.R. Yager, V.S. Sgurev, V.S. Jotsov, P.D. Koprinkova-Hristova |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 34-39 |
Number of pages | 6 |
ISBN (Print) | 0780382781 |
DOIs | |
Publication status | Published - 25 Oct 2004 |
Event | 2004 2nd International IEEE Conference 'Intelligent Systems' - Proceedings - Varna, Bulgaria Duration: 22 Jun 2004 → 24 Jun 2004 |
Conference
Conference | 2004 2nd International IEEE Conference 'Intelligent Systems' - Proceedings |
---|---|
Country/Territory | Bulgaria |
City | Varna |
Period | 22/06/04 → 24/06/04 |
Keywords
- Local minima
- Low-discrepancy sequences
- Stochastic global optimisation
- Supervised learning