Derivation of Symmetric PCA Learning Rules from a Novel Objective Function

Möller R (2020)
arXiv:2005.11689.

Preprint | English

Abstract / Note
Neural learning rules for principal component / subspace analysis (PCA / PSA) can be derived by maximizing an objective function (summed variance of the projection on the subspace axes) under an orthonormality constraint. For a subspace with a single axis, the optimization produces the principal eigenvector of the data covariance matrix. Hierarchical learning rules with deflation procedures can then be used to extract multiple eigenvectors. However, for a subspace with multiple axes, the optimization leads to PSA learning rules which only converge to axes spanning the principal subspace, but not to the principal eigenvectors. A modified objective function with distinct weight factors had to be introduced to produce PCA learning rules. Optimization of the objective function for multiple axes leads to symmetric learning rules which do not require deflation procedures. For the PCA case, the estimated principal eigenvectors are ordered (w.r.t. the corresponding eigenvalues) depending on the order of the weight factors. Here we introduce an alternative objective function where it is not necessary to introduce fixed weight factors; instead, the alternative objective function uses squared summands. Optimization leads to symmetric PCA learning rules which converge to the principal eigenvectors, but without imposing an order. In place of the diagonal matrices with fixed weight factors, variable diagonal matrices appear in the learning rules. We analyze this alternative approach by determining the fixed points of the constrained optimization. The behavior of the constrained objective function at the fixed points is analyzed, which confirms both the PCA behavior and the fact that no order is imposed. Different ways to derive learning rules from the optimization of the objective function are presented. The role of the terms in the learning rules obtained from these derivations is explored.
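As a rough illustrative sketch (not the paper's exact derivation), the weighted-variance objective mentioned in the abstract, tr(D WᵀCW) maximized under an orthonormality constraint on W, can be approximated numerically by projected gradient ascent; the distinct entries of the fixed diagonal matrix D break the rotational symmetry of the plain PSA objective, so the columns converge to individual principal eigenvectors rather than merely spanning the principal subspace. All names and parameter values below are assumptions chosen for illustration:

```python
import numpy as np

# Hypothetical sketch: gradient ascent on tr(D W^T C W) with an
# orthonormality constraint on W, re-imposed via QR after each step.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
C = A @ A.T                       # symmetric PSD covariance-like matrix

m = 2                             # number of extracted axes
W, _ = np.linalg.qr(rng.standard_normal((5, m)))  # orthonormal init
D = np.diag([2.0, 1.0])           # distinct, fixed weight factors
eta = 0.01                        # learning rate (assumed value)

for _ in range(5000):
    W = W + eta * C @ W @ D       # ascent step on tr(D W^T C W)
    W, _ = np.linalg.qr(W)        # project back onto orthonormal matrices

# Compare with the principal eigenvectors of C
eigvals, eigvecs = np.linalg.eigh(C)
top = eigvecs[:, ::-1][:, :m]     # eigenvectors of the m largest eigenvalues
align = np.abs(np.sum(top * W, axis=0))  # |cosine| per column, sign-invariant
```

With distinct positive weights in D, the columns of W align (up to sign) with the ordered principal eigenvectors; the paper's alternative objective with squared summands replaces this fixed D by a variable diagonal matrix, removing the imposed order.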
Year of Publication
2020
Journal Title
arXiv:2005.11689
Page URI
https://pub.uni-bielefeld.de/record/2943643

Cite

Möller R. Derivation of Symmetric PCA Learning Rules from a Novel Objective Function. arXiv:2005.11689. 2020.
Link(s) to Full Text(s)
Access Level
OA Open Access

Sources

arXiv: 2005.11689
