Multilayer neural networks: learnability, network generation, and network simplification

Ellerbrock TM (1999)
Bielefeld (Germany): Bielefeld University.

Bielefeld Dissertation | English
Author
Ellerbrock, Thomas M.
Supervisor
Blanchard, Philippe (Prof. Dr.)
Abstract
Chapter 1 of this book gives an impression of the theoretical diversity of the non-trivial theory of multilayer neural networks (multilayer perceptrons). This diversity comprises ideas from Approximation Theory, Measure and Probability Theory, Statistics, the Theory of NP-Completeness, Geometry, Topology, and Graph Theory.

In Chapter 2 a new perspective on learning and generalization in multilayer perceptrons is introduced. Based on a proposed definition of 'representativity' for training sets, it is proved that multilayer perceptrons can realize 'topologically adequate solutions' to classification problems whenever the training set is representative of the problem to be learned. Using concepts from topology and combinatorial geometry, the definition of 'adequate solution' gives the notion of 'generalization' a precise mathematical meaning; in this way, connectivity properties of the classes to be learned are taken into account. In contrast to known results on the approximation capabilities of neural networks, classes are approximated here not by functions but by sets, and this is done directly with respect to the training set.

In Chapter 3 an algorithm is introduced that generates modular multilayer neural networks for classification problems. Computer simulations are presented and discussed with respect to generalization, adequate solutions, learning, and network structure.

In Chapter 4 a new iterative pruning method for a variety of neural network architectures is given. From each neuron in the net, all connections and thresholds are successively removed that can be removed without changing the input-output behavior of the neuron. By organizing the local input space into a directed acyclic graph, only a small portion of the input space (the test inputs) needs to be used to decide whether a set of neuron parameters (synaptic connections and thresholds) can be removed. Furthermore, the optimal order for the presentation of test inputs is computed. These theoretical and numerical results are combined with theorems on the optimal order and optimal number of neuron parameters to be removed in each iteration step.
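To make the Chapter 4 idea concrete, the following is a minimal Python sketch of behavior-preserving pruning for a single threshold neuron. It is an illustration under simplifying assumptions, not the dissertation's algorithm: it checks each candidate removal against an explicitly given set of test inputs, whereas the dissertation derives a small test-input set (and an optimal presentation order) from a directed acyclic graph over the local input space. All names here (neuron_output, prune_neuron) are hypothetical.

import itertools

def neuron_output(weights, theta, x):
    # Threshold neuron: fires iff the weighted input sum reaches theta.
    return int(sum(w * xi for w, xi in zip(weights, x)) >= theta)

def prune_neuron(weights, theta, test_inputs):
    # Zero out each connection whose removal leaves the neuron's output
    # unchanged on every test input; restore it otherwise. Because every
    # accepted removal preserves the outputs exactly, the reference
    # outputs stay valid across iterations.
    pruned = list(weights)
    reference = [neuron_output(pruned, theta, x) for x in test_inputs]
    for i in range(len(pruned)):
        if pruned[i] == 0:
            continue
        saved, pruned[i] = pruned[i], 0
        if [neuron_output(pruned, theta, x) for x in test_inputs] != reference:
            pruned[i] = saved  # removal changes behavior: keep the connection
    return pruned

# Example: the third connection never influences the decision on {0,1}^3,
# so it can be removed without changing the neuron's input-output behavior.
inputs = list(itertools.product([0, 1], repeat=3))
print(prune_neuron([2.0, 2.0, 0.5], 2.0, inputs))  # -> [2.0, 2.0, 0]

The dissertation's further refinements, the optimal ordering of test inputs and theorems on how many parameters to remove per iteration step, would replace the exhaustive enumeration used in this sketch.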
Year
1999

Cite this

Ellerbrock, T. M. (1999). Multilayer neural networks: learnability, network generation, and network simplification. Bielefeld (Germany): Bielefeld University.
