Multilayer neural networks : learnability, network generation, and network simplification
Ellerbrock TM (1999)
Bielefeld (Germany): Bielefeld University.
Bielefelder E-Dissertation | English
Author
Ellerbrock, Thomas M.
Reviewer / Supervisor
Blanchard, Philippe (Prof. Dr.)
Institution
Abstract / Remark
Chapter 1 of this book gives an impression of the theoretical diversity of the non-trivial theory of multilayer neural networks (multilayer perceptrons). This diversity comprises ideas from approximation theory, measure and probability theory, statistics, the theory of NP-completeness, geometry, topology, and graph theory.
In Chapter 2, a new perspective on learning and generalization in multilayer perceptrons is introduced.
Proposing a definition of 'representativity' for training sets, we prove that multilayer perceptrons are able to realize 'topologically adequate solutions' to classification problems whenever the training set is representative of the problem to be learned. Using concepts from topology and combinatorial geometry, the definition of an 'adequate solution' gives the notion of 'generalization' a precise mathematical meaning. In this way, connectivity properties of the classes to be learned are taken into account.
In contrast to the known results on the approximation capabilities of neural networks, classes are approximated here not by functions but by sets, and this is done directly with respect to the training set.
In Chapter 3, an algorithm is introduced that generates modular multilayer neural networks for classification problems. Computer simulations are presented and discussed with respect to generalization, adequate solutions, learning, and network structure.
In Chapter 4, a new iterative pruning method for a variety of neural network architectures is presented.
Successively, for every neuron in the net, all connections and thresholds that can be removed without changing the neuron's input-output behavior are removed. By organizing the local input space into a directed acyclic graph, only a small portion of the input space (the test inputs) needs to be inspected to decide whether a set of neuron parameters (synaptic connections and thresholds) can be removed. Furthermore, the optimal order for the presentation of the test inputs has been computed.
These theoretical and numerical results are combined with theorems on the optimal order and the optimal number of neuron parameters to be removed in each iteration step.
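For illustration only, the following Python sketch shows the basic idea behind such behavior-preserving pruning for a single threshold neuron: a connection or the threshold is removed only if the neuron's outputs on a set of test inputs remain identical. The function name prune_neuron, the step activation, and the brute-force check over all test inputs are assumptions made for the sketch; the thesis's organization of the local input space into a directed acyclic graph and its results on the optimal removal order are not reproduced here.

```python
import numpy as np

def prune_neuron(weights, threshold, test_inputs,
                 activation=lambda s: (s >= 0).astype(int)):
    """Sketch (not the thesis algorithm): zero out the threshold and any
    incoming connection of one threshold neuron whenever doing so leaves
    its input-output behavior on the given test inputs unchanged."""
    # Behavior that must be preserved on the test inputs.
    reference = activation(test_inputs @ weights - threshold)
    w, t = weights.copy(), threshold

    # Try to drop the threshold first.
    if np.array_equal(activation(test_inputs @ w), reference):
        t = 0.0

    # Then try each synaptic connection in a naive fixed order.
    for i in range(len(w)):
        if w[i] == 0.0:
            continue
        trial = w.copy()
        trial[i] = 0.0
        if np.array_equal(activation(test_inputs @ trial - t), reference):
            w = trial  # connection i is redundant on the test inputs

    return w, t

# Illustrative usage with random binary test inputs.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(16, 4)).astype(float)
pruned_w, pruned_t = prune_neuron(np.array([1.0, 0.0, 2.0, -0.5]), 0.5, X)
print(pruned_w, pruned_t)
```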
Keywords
Mehrschichten-Perzeptron, Multilayer perceptrons, Network structuring, Generalization, Network architecture
Year
1999
Page URI
https://pub.uni-bielefeld.de/record/2302401
Cite
Ellerbrock TM. Multilayer neural networks : learnability, network generation, and network simplification. Bielefeld (Germany): Bielefeld University; 1999.
Ellerbrock, T. M. (1999). Multilayer neural networks : learnability, network generation, and network simplification. Bielefeld (Germany): Bielefeld University.
Ellerbrock, Thomas M. 1999. Multilayer neural networks : learnability, network generation, and network simplification. Bielefeld (Germany): Bielefeld University.
Ellerbrock, T.M., 1999. Multilayer neural networks : learnability, network generation, and network simplification, Bielefeld (Germany): Bielefeld University.
T.M. Ellerbrock, Multilayer neural networks : learnability, network generation, and network simplification, Bielefeld (Germany): Bielefeld University, 1999.
Ellerbrock, T.M.: Multilayer neural networks : learnability, network generation, and network simplification. Bielefeld University, Bielefeld (Germany) (1999).
Ellerbrock, Thomas M. Multilayer neural networks : learnability, network generation, and network simplification. Bielefeld (Germany): Bielefeld University, 1999.
All files available under the following license(s):
Copyright Statement:
This object is protected by copyright and/or related rights. [...]
Full text(s)
Access Level
Open Access
Last Uploaded
2019-09-06T08:57:40Z
MD5 Checksum
083ac2b4dc3ed6360a389765b1636116
Name
Access Level
Open Access
Last Uploaded
2019-09-06T08:57:40Z
MD5 Checksum
8d1359a99543ab45ceef0b72754dea00
Name
Access Level
Open Access
Last Uploaded
2019-09-06T08:57:40Z
MD5 Checksum
ff9030183a6eca4dd4ae98deab4a0b63
PDF automatically generated from the original file
Name
1_ellerbrock_title.pdf
49.74 KB
Access Level
Open Access
Last Uploaded
2023-08-04T13:00:40Z
MD5 Checksum
c3cdbd60420dcbf52113379a78416877
PDF automatically generated from the original file
Name
2_ellerbrock_diss.pdf
2.51 MB
Access Level
Open Access
Last Uploaded
2023-08-04T13:00:43Z
MD5 Checksum
2b989bbc769f374d533693a56742a410