Abstract
This paper investigates the local minima problem in supervised learning of neural networks (NNs) by backpropagation. The proposed training algorithm makes use of a stochastic optimization technique based on so-called low-discrepancy sequences. The learning process is treated as an unconstrained optimization problem: once the parameter space (defined by the NN weights) and the objective function are specified, the proposed method searches for a global optimum. First, regions of attraction are identified as candidates for local minima; second, each region is searched to locate its minimum, and the global minimum is subsequently selected. The algorithm is initially tested on multimodal mathematical functions and then on common benchmark problems for NN training. Finally, the results are discussed and compared with those obtained from backpropagation and other methods.
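The two-phase search outlined in the abstract can be illustrated with a minimal sketch: a low-discrepancy (Sobol) sequence samples the NN weight space, the most promising sample points are taken as candidate regions of attraction, and each candidate is refined by a local search before the global best is kept. The network architecture (a small XOR network), the search box, the sample counts, and the use of scipy's Sobol generator with L-BFGS-B refinement are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize

# XOR benchmark data (a common NN training benchmark).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 2
# input->hidden weights + hidden biases + hidden->output weights + output bias
N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1

def loss(w):
    """Sum-of-squares training error used as the objective function."""
    w1 = w[:2 * N_HIDDEN].reshape(2, N_HIDDEN)
    b1 = w[2 * N_HIDDEN:3 * N_HIDDEN]
    w2 = w[3 * N_HIDDEN:3 * N_HIDDEN + N_HIDDEN]
    b2 = w[-1]
    h = np.tanh(X @ w1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))
    return float(np.sum((out - y) ** 2))

# Phase 1: scan the weight space with a low-discrepancy (Sobol) sequence and
# keep the best points as candidate regions of attraction.
BOUND = 5.0                                   # assumed search box for the weights
sampler = qmc.Sobol(d=N_WEIGHTS, seed=0)
points = qmc.scale(sampler.random_base2(m=10),
                   -BOUND * np.ones(N_WEIGHTS),
                   BOUND * np.ones(N_WEIGHTS))  # 2^10 quasi-random samples
scores = np.array([loss(p) for p in points])
candidates = points[np.argsort(scores)[:5]]   # 5 most promising candidates (assumed)

# Phase 2: refine each candidate with a local search and keep the global best.
best = min((minimize(loss, c, method="L-BFGS-B") for c in candidates),
           key=lambda r: r.fun)
print("best training error found:", best.fun)
```

In this sketch the quasi-random coverage of the weight box plays the role of locating regions of attraction, while the local optimizer stands in for the within-region search; the paper's actual procedure for delimiting and searching regions may differ.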
Original language | English |
---|---|
Title of host publication | Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02 |
Editors | Jagath C. Rajapakse, Xin Yao, Lipo Wang, Kunihiko Fukushima, Soo-Young Lee |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 488-492 |
Number of pages | 5 |
ISBN (Electronic) | 9810475241, 9789810475246 |
DOIs | |
Publication status | Published - 5 Jun 2003 |
Event | 9th International Conference on Neural Information Processing, ICONIP 2002 - Singapore, Singapore. Duration: 18 Nov 2002 → 22 Nov 2002 |
Conference
Conference | 9th International Conference on Neural Information Processing, ICONIP 2002 |
---|---|
Country/Territory | Singapore |
City | Singapore |
Period | 18/11/02 → 22/11/02 |