Leong, K. Y. and Sitiol, Augustina Aegidius and Anbananthen, Kalaiarasi Sonai Muthu (2009) Enhance neural networks training using GA with chaos theory. In: 6th International Symposium on Neural Networks (ISNN 2009), 26-29 May 2009, Wuhan, China.
Full text not available from this repository.
Official URL: http://dx.doi.org/10.1007/978-3-642-01510-6_59
There are numerous algorithms for training artificial neural networks. Besides classical supervised-learning algorithms such as backpropagation, associative memory and radial basis functions, the training task can also be handled by evolutionary computation, since most gradient-descent-related algorithms can be viewed as applications of optimization theory and stochastic search. In this paper, the logistic model of population growth from ecology is integrated into the initialization, selection and crossover operators of genetic algorithms for neural network training. These chaotic operators are very efficient at maintaining population diversity during the evolutionary process of genetic algorithms. A comparison is made on a benchmark comprising several data classification problems for neural networks. Three training variants - Backpropagation (BP), Genetic Algorithms (GA) and Genetic Algorithms with Chaotic Operators (GACO) - are described and compared. The experimental results confirm the dynamic mobility of chaotic algorithms in GACO network training, which can overcome saturation and improve the convergence rate. © 2009 Springer Berlin Heidelberg.
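The abstract mentions integrating the logistic model of population growth into GA operators, but the full text is not available here, so implementation details are unknown. The sketch below is only an illustrative guess at the general idea: using the chaotic logistic map (x_{n+1} = r·x_n·(1 − x_n), with r = 4 in the fully chaotic regime) to initialize a diverse GA population of network weights. All function names, the seed value, population size and weight range are assumptions, not the authors' code.

```python
# Illustrative sketch (NOT the paper's implementation): chaotic
# population initialization for a GA over neural-network weights,
# using the logistic map as a diversity-preserving number source.

def logistic_sequence(x0, n, r=4.0):
    """Generate n values of the logistic map x <- r*x*(1-x), starting at x0.

    With r = 4 and x0 in (0, 1), the trajectory is chaotic and stays in [0, 1].
    """
    values = []
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        values.append(x)
    return values


def chaotic_init_population(pop_size, genome_len, low=-1.0, high=1.0, x0=0.37):
    """Map one chaotic trajectory in [0, 1] onto pop_size weight genomes
    in [low, high]. x0 = 0.37 is an arbitrary assumed seed."""
    seq = logistic_sequence(x0, pop_size * genome_len)
    population = []
    for i in range(pop_size):
        chunk = seq[i * genome_len:(i + 1) * genome_len]
        genome = [low + (high - low) * v for v in chunk]
        population.append(genome)
    return population


pop = chaotic_init_population(pop_size=20, genome_len=10)
print(len(pop), len(pop[0]))  # 20 genomes of 10 weights each
```

Because successive logistic-map iterates spread over the whole unit interval rather than clustering, such an initializer tends to cover the weight range well, which is one plausible reading of the abstract's claim that chaotic operators help maintain population diversity.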
|Item Type:||Conference Paper (UNSPECIFIED)|
|Uncontrolled Keywords:||Chaos theory, Chaotic operator, Evolutionary algorithms, Neural networks training|
|Subjects:||QA75.5-76.95|
|Divisions:||SCHOOL > School of Engineering and Information Technology|
|Deposited By:||IR Admin|
|Deposited On:||28 Mar 2011 17:29|
|Last Modified:||30 Dec 2014 14:22|