Neural network learning using low-discrepancy sequence

Ivan Jordanov, Robert Brown

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Backpropagation (BP) is one of the most frequently used practical methods for supervised training of artificial neural networks. During the learning process, BP may get stuck in local minima, producing a suboptimal solution and thus limiting the effectiveness of the training. This work addresses the problem of avoiding local minima and introduces a new learning technique that substitutes the gradient descent algorithm in BP with an optimization method performing a global search in the multi-dimensional parameter (weight) space. For this purpose, a low-discrepancy LPτ sequence is used. The proposed method is discussed and then tested on common benchmark problems.
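The abstract outlines the core idea: instead of following the gradient, candidate weight vectors are drawn from a low-discrepancy LPτ sequence (Sobol's LPτ points), which covers the search box more uniformly than random sampling. The sketch below illustrates that idea under stated assumptions; the 2-2-1 sigmoid network, the XOR benchmark, the search box of [-10, 10] per weight, and the sample count are all illustrative choices, with SciPy's Sobol generator standing in for the LPτ sequence. It is not the paper's actual algorithm.

```python
# Minimal sketch, not the authors' implementation: global search over a
# neural network's weight space driven by a low-discrepancy LPtau (Sobol)
# sequence instead of gradient descent. The 2-2-1 network, XOR task,
# search box [-10, 10]^9, and sample count are illustrative assumptions.
import numpy as np
from scipy.stats import qmc

# XOR benchmark: four input patterns with binary targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def mse(w):
    """Mean squared error of a 2-2-1 sigmoid network parameterised by w (9 values)."""
    W1 = w[:4].reshape(2, 2)   # input -> hidden weights
    b1 = w[4:6]                # hidden biases
    W2 = w[6:8]                # hidden -> output weights
    b2 = w[8]                  # output bias
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

dim = 9                                  # total number of weights and biases
sampler = qmc.Sobol(d=dim, scramble=False)
unit = sampler.random(2 ** 12)           # low-discrepancy points in [0, 1)^9
candidates = qmc.scale(unit, -10, 10)    # map onto the assumed search box

errors = np.array([mse(w) for w in candidates])
best = candidates[errors.argmin()]
print(f"best MSE over {len(candidates)} LPtau points: {errors.min():.4f}")
```

In this sketch the best sampled point could simply be returned, or used to seed a subsequent local refinement; the uniformity of the LPτ points is what makes a fixed sampling budget cover the weight space without the clustering gaps of pseudo-random search.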

Original language: English
Title of host publication: Advanced Topics in Artificial Intelligence
Subtitle of host publication: 12th Australian Joint Conference on Artificial Intelligence, AI 1999, Proceedings
Editors: Norman Foo
Publisher: Springer
Pages: 255-267
Number of pages: 13
ISBN (Electronic): 9783540466956
ISBN (Print): 9783540668220
DOIs
Publication status: Published - 1999
Event: 12th Australian Joint Conference on Artificial Intelligence, AI 1999 - Sydney, Australia
Duration: 6 Dec 1999 - 10 Dec 1999

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Volume: 1747
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 12th Australian Joint Conference on Artificial Intelligence, AI 1999
Country/Territory: Australia
City: Sydney
Period: 6/12/99 - 10/12/99

Keywords

  • Neural networks
  • NN learning
