Journal of Information Technology & Software Engineering
Open Access

ISSN: 2165-7866

Abstract

Recurrent Neural Networks: Associative Memory and Optimization

K.-L. Du

Due to their feedback connections, recurrent neural networks (RNNs) are dynamic models. Compared to feedforward neural networks (FNNs), RNNs can provide a more compact structure for approximating dynamic systems. For some RNN models, such as the Hopfield model and the Boltzmann machine, the fixed-point property of the underlying dynamic system can be exploited for optimization and associative memory. The Hopfield model is the most important RNN model, and the Boltzmann machine as well as some other stochastic dynamic models have been proposed as its generalizations. These models are especially useful for dealing with combinatorial optimization problems (COPs), which are notoriously NP-complete. In this paper, we provide a state-of-the-art introduction to these RNN models, their learning algorithms, and their analog implementations. Associative memory, COPs, simulated annealing (SA), chaotic neural networks, and multilevel Hopfield models are also important topics treated in this paper.
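To make the fixed-point idea concrete, the following is a minimal sketch (an illustrative assumption, not the paper's exact formulation) of a bipolar Hopfield associative memory in Python with NumPy: weights are built by Hebbian outer-product learning, and asynchronous updates are iterated until the network settles into a fixed point, recalling a stored pattern from a corrupted cue.

    import numpy as np

    # Minimal sketch of a Hopfield associative memory, assuming bipolar (+1/-1)
    # patterns and Hebbian outer-product training; hypothetical illustration only.

    def train_hopfield(patterns):
        """Build the weight matrix by Hebbian (outer-product) learning."""
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0)          # no self-connections
        return W / patterns.shape[0]

    def recall(W, cue, max_sweeps=20):
        """Asynchronously update neurons until a fixed point is reached."""
        state = cue.copy()
        for _ in range(max_sweeps):
            changed = False
            for i in np.random.permutation(len(state)):
                new = 1 if W[i] @ state >= 0 else -1
                if new != state[i]:
                    state[i] = new
                    changed = True
            if not changed:             # fixed point: a stored (or spurious) memory
                break
        return state

    # Store two patterns and recover the first from a one-bit-corrupted cue.
    patterns = np.array([[1, -1, 1, -1, 1, -1],
                         [1, 1, -1, -1, 1, 1]])
    W = train_hopfield(patterns)
    noisy = np.array([1, -1, 1, 1, 1, -1])  # one bit flipped in pattern 0
    print(recall(W, noisy))                 # should converge to [ 1 -1  1 -1  1 -1]

Under asynchronous updates the network energy never increases, so the iteration terminates at a fixed point, which is either a stored pattern or a spurious attractor.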
