Efficient approximations of robust soft learning vector quantization for non-vectorial data
Hofmann D, Gisbrecht A, Hammer B (2015)
Neurocomputing 147: 96-106.
Journal Article | Published | English
Download
No files have been uploaded. Publication record only.
Abstract / Remarks
Due to its intuitive learning algorithms and classification behavior, learning vector quantization (LVQ) enjoys wide popularity in diverse application domains. In recent years, the classical heuristic schemes have been accompanied by variants that can be motivated by a statistical framework, such as robust soft LVQ (RSLVQ). In their original form, LVQ and RSLVQ can be applied to vectorial data only, making them unsuitable for complex data sets described in terms of pairwise relations only. In this contribution, we address kernel RSLVQ, which extends the method's applicability to data described by a general Gram matrix. While leading to state-of-the-art results, this extension has the drawback that models are no longer sparse, and quadratic training complexity is encountered due to the dependency of the method on the full Gram matrix. We investigate the performance of a speed-up of training by means of low-rank approximations of the Gram matrix, and we investigate how sparse models can be enforced in this context. It turns out that an efficient Nyström approximation can be used if the data are intrinsically low dimensional, a property that can be efficiently checked by sampling the variance of the approximation prior to training. Further, all models enable sparse approximations of comparable quality to the full models using simple geometric approximation schemes only. We demonstrate the behavior of these approximations in a couple of benchmarks.
Keywords
RSLVQ;
Classification;
Kernel;
Sparse;
Nyström
Year of Publication
2015
Journal Title
Neurocomputing
Volume
147
Page(s)
96-106
ISSN
0925-2312
eISSN
1872-8286
Page URI
https://pub.uni-bielefeld.de/record/2695196
Cite
Hofmann, D., Gisbrecht, A., & Hammer, B. (2015). Efficient approximations of robust soft learning vector quantization for non-vectorial data. Neurocomputing, 147, 96-106. doi:10.1016/j.neucom.2013.11.044