High Dimensional Matrix Relevance Learning

Schleif F-M, Villmann T, Zhu X (2015)
In: 2014 IEEE International Conference on Data Mining Workshop. Piscataway, NJ: IEEE.

Conference Paper | Published | English

Abstract
In supervised learning, the parameters of a parametric Euclidean or Mahalanobis distance can be learned effectively by so-called Matrix Relevance Learning. This adaptation is useful not only to improve the discrimination capability of the model, but also to identify relevant features, or relevant correlated features, in the input data. Classical Matrix Relevance Learning scales quadratically with the number of input dimensions M and becomes prohibitive if M exceeds a few thousand input features. We address Matrix Relevance Learning for data with a very large number of input dimensions. Such high-dimensional data occur frequently in the life sciences, e.g. for microarray or spectral data. We derive two respective approximation schemes and show exemplarily their implementation in Generalized Matrix Relevance Learning Vector Quantization (GMLVQ) for classification problems. The first approximation scheme is based on Limited Rank Matrix Approximation (LiRaM), a random subspace projection technique which was formerly considered mainly for visualization purposes. The second, novel approximation scheme is based on the Nyström approximation and is exact if the number of eigenvalues equals the rank of the relevance matrix. Using multiple benchmark problems, we demonstrate that the training process yields fast low-rank approximations of the relevance matrices without harming the generalization ability. The approaches can be used to identify discriminative features for high-dimensional data sets.
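
The central quantity in matrix relevance learning is the adaptive distance d(x, w) = (x - w)^T Lambda (x - w) with Lambda = Omega^T Omega. The following is a minimal Python sketch, not the authors' implementation, with hypothetical names and sizes, of how a limited-rank Omega (as in the LiRaM scheme) reduces the per-distance cost from O(M^2) to O(mM):

# Minimal sketch (not the authors' code) of the adaptive GMLVQ distance
# with a limited-rank relevance matrix: Lambda = Omega^T Omega, where
# Omega has shape (m, M) with m << M, so each distance costs O(mM)
# instead of the O(M^2) required by a full relevance matrix.
import numpy as np

def limited_rank_distance(x, w, omega):
    """Squared adaptive distance d(x, w) = (x - w)^T Omega^T Omega (x - w)."""
    diff = x - w              # difference vector in the original M-dim space
    proj = omega @ diff       # projection into the m-dimensional subspace
    return float(proj @ proj) # squared Euclidean norm in the subspace

# Illustrative usage with hypothetical sizes.
rng = np.random.default_rng(0)
M, m = 10_000, 20                                  # input dimension M, target rank m << M
omega = rng.standard_normal((m, M)) / np.sqrt(M)   # random low-rank projection matrix
x = rng.standard_normal(M)                         # data point
w = rng.standard_normal(M)                         # prototype
print(limited_rank_distance(x, w, omega))

With m fixed and small, training only has to adapt the m x M entries of Omega rather than all M x M entries of Lambda, which is what keeps the approach feasible for tens of thousands of input features, as claimed in the abstract.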
Publishing Year
2015
Conference
2014 IEEE International Conference on Data Mining Workshop
Location
Shenzhen, China
Conference Date
2014-12-14 – 2014-12-14
Cite this

Schleif F-M, Villmann T, Zhu X. High Dimensional Matrix Relevance Learning. In: 2014 IEEE International Conference on Data Mining Workshop. Piscataway, NJ: IEEE; 2015. doi:10.1109/icdmw.2014.15