Efficient approximations of robust soft learning vector quantization for non-vectorial data

Hofmann D, Gisbrecht A, Hammer B (2015)
Neurocomputing 147: 96-106.

Journal Article | Published | English


Abstract
Due to its intuitive learning algorithms and classification behavior, learning vector quantization (LVQ) enjoys wide popularity in diverse application domains. In recent years, the classical heuristic schemes have been accompanied by variants that can be motivated by a statistical framework, such as robust soft LVQ (RSLVQ). In their original form, LVQ and RSLVQ can be applied to vectorial data only, making them unsuitable for complex data sets described in terms of pairwise relations alone. In this contribution, we address kernel RSLVQ, which extends applicability to data described by a general Gram matrix. While leading to state-of-the-art results, this extension has the drawback that models are no longer sparse, and quadratic training complexity is encountered due to the method's dependency on the full Gram matrix. We investigate a speed-up of training by means of low-rank approximations of the Gram matrix, and we investigate how sparse models can be enforced in this context. It turns out that an efficient Nyström approximation can be used if the data are intrinsically low dimensional, a property which can be checked efficiently by sampling the variance of the approximation prior to training. Further, all models admit sparse approximations of quality comparable to the full models using simple geometric approximation schemes only. We demonstrate the behavior of these approximations on a number of benchmarks.
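The low-rank Nyström approximation of a Gram matrix mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the landmark count `m`, the helper name `nystroem_approx`, and the rank-5 synthetic data are assumptions chosen for the demo. When data are intrinsically low dimensional, a small set of landmark columns suffices to reconstruct the full Gram matrix almost exactly.

```python
import numpy as np

def nystroem_approx(K, m, seed=None):
    """Nystroem approximation of a Gram matrix K (illustrative sketch).

    K is approximated as C @ pinv(W) @ C.T, where C holds m sampled
    landmark columns of K and W the corresponding m x m sub-block.
    Storing only the factors avoids the quadratic cost of the full matrix.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # m landmark points
    C = K[:, idx]                                # n x m columns
    W = K[np.ix_(idx, idx)]                      # m x m landmark block
    return C @ np.linalg.pinv(W) @ C.T

# Intrinsically low-dimensional example: a rank-5 Gram matrix
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
K = X @ X.T

K_hat = nystroem_approx(K, m=20, seed=1)
err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
print(f"relative approximation error: {err:.2e}")
```

Since the example matrix has rank 5 and 20 landmarks are drawn, the approximation is essentially exact; for data that are not intrinsically low dimensional, the error stays large, which is what the variance-sampling check proposed in the paper is meant to detect before training.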

Cite this

Hofmann D, Gisbrecht A, Hammer B. Efficient approximations of robust soft learning vector quantization for non-vectorial data. Neurocomputing. 2015;147:96-106.
