Recurrent Neural Networks with Small Weights Implement Definite Memory Machines

Hammer B, Tiňo P (2003)
Neural Computation 15(8): 1897-1929.

Journal Article | Published | English
 
Author(s)
Hammer, Barbara (UniBi); Tiňo, Peter
Abstract / Notes
Recent experimental studies indicate that recurrent neural networks initialized with "small" weights are inherently biased toward definite memory machines (Tiňo, Čerňanský, & Beňušková, 2002a, 2002b). This article establishes a theoretical counterpart: the transition function of a recurrent network with small weights and a squashing activation function is a contraction. We prove that recurrent networks with a contractive transition function can be approximated arbitrarily well on input sequences of unbounded length by a definite memory machine. Conversely, every definite memory machine can be simulated by a recurrent network with a contractive transition function. Hence, initialization with small weights induces an architectural bias into learning with recurrent neural networks. This bias might have benefits from the point of view of statistical learning theory: it emphasizes one possible region of the weight space where generalization ability can be formally proved. It is well known that standard recurrent neural networks are not distribution-independent learnable in the probably approximately correct (PAC) sense if arbitrary precision and inputs are considered. We prove that recurrent networks with a contractive transition function with a fixed contraction parameter fulfill the so-called distribution-independent uniform convergence of empirical distances property and hence, unlike general recurrent networks, are distribution-independent PAC learnable.
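The contraction property at the heart of the abstract can be illustrated numerically. The following is a minimal sketch, not the paper's construction: it assumes a vanilla RNN with tanh squashing, and the recurrent weight matrix is rescaled so its spectral norm is 0.5 ("small weights"). Since tanh is 1-Lipschitz, the state-transition map is then a contraction in the state argument, so two different initial states driven by the same inputs converge, i.e. the network forgets all but a bounded suffix of its input, as a definite memory machine does. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vanilla RNN: s -> tanh(W_rec @ s + W_in @ x + b).
# Scaling W_rec to spectral norm 0.5 makes the map a contraction in s,
# because tanh is 1-Lipschitz.
n_state, n_in = 8, 4
W_rec = rng.normal(size=(n_state, n_state))
W_rec *= 0.5 / np.linalg.norm(W_rec, 2)  # spectral norm -> 0.5 ("small weights")
W_in = rng.normal(size=(n_state, n_in))
b = rng.normal(size=n_state)

def step(s, x):
    return np.tanh(W_rec @ s + W_in @ x + b)

# Drive two different initial states with the same input sequence:
# their distance shrinks by at least a factor of 0.5 per step, so the
# reachable state (and hence the output) depends only on a bounded
# suffix of the input -- definite-memory behaviour.
s1, s2 = rng.normal(size=n_state), rng.normal(size=n_state)
for t in range(30):
    x = rng.normal(size=n_in)
    s1, s2 = step(s1, x), step(s2, x)

print(np.linalg.norm(s1 - s2))
```

After 30 steps the distance between the two state trajectories is below 0.5³⁰ times the initial distance, numerically indistinguishable from zero. With spectral norm above 1 the same experiment generally shows no such convergence, which is the sense in which small-weight initialization biases the architecture.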
Year of Publication
2003
Journal Title
Neural Computation
Volume
15
Issue
8
Page(s)
1897-1929
ISSN
0899-7667
eISSN
1530-888X
Page URI
https://pub.uni-bielefeld.de/record/2982124

Cite

Hammer, B., & Tiňo, P. (2003). Recurrent Neural Networks with Small Weights Implement Definite Memory Machines. Neural Computation, 15(8), 1897-1929. https://doi.org/10.1162/08997660360675080