Learning in non-stationary Environments

Raab C (2022)
Bielefeld: Universität Bielefeld.

Bielefeld E-Dissertation | English
 
Author
Raab, Christoph
Abstract / Remarks
Machine Learning deals with learning a decision function from data. This function assigns the correct output to given input data. The learning process takes place in a laboratory domain, and the learned function is finally applied in a test or application domain. Typically, the data distribution is assumed to be the same in the learning and the application domain, which is denoted as a stationary environment. If this is not the case and the distributions differ between the two domains, the setting is called a non-stationary environment.

The research area of Domain Adaptation offers methods to adapt the input data or an already learned decision function to the data of the test domain in such non-stationary environments. Current solutions search for a suitable representation space by minimizing either a statistical divergence measure or the difference between subspace projectors. The former methods bound the domain differences but have almost intractable computational requirements, while the latter offer no guarantees regarding the domain differences.
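For illustration only, and not a method from this thesis: one widely used statistical divergence measure in Domain Adaptation is the Maximum Mean Discrepancy (MMD), sketched below in Python with a Gaussian kernel. The bandwidth sigma and the sample sizes are arbitrary choices made only for this example.

    # Minimal sketch of the squared MMD between a source and a target sample.
    import numpy as np

    def gaussian_kernel(A, B, sigma=1.0):
        # Pairwise squared Euclidean distances, then Gaussian kernel values.
        sq_dists = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
        return np.exp(-sq_dists / (2 * sigma**2))

    def mmd2(X_s, X_t, sigma=1.0):
        # Biased estimate of the squared MMD between the two empirical distributions.
        k_ss = gaussian_kernel(X_s, X_s, sigma).mean()
        k_tt = gaussian_kernel(X_t, X_t, sigma).mean()
        k_st = gaussian_kernel(X_s, X_t, sigma).mean()
        return k_ss + k_tt - 2 * k_st

    rng = np.random.default_rng(0)
    X_s = rng.normal(0.0, 1.0, size=(200, 5))   # source domain sample
    X_t = rng.normal(0.5, 1.0, size=(200, 5))   # shifted target domain sample
    print(mmd2(X_s, X_t))

Minimizing such a divergence over a learned representation is what makes the better-bounded approaches computationally heavy, since kernel matrices grow quadratically with the sample size.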

In the first part of this thesis, I provide geometric- and subspace-oriented projectors built upon the idea of domain separability. The methods outperform recent solutions while being an order of magnitude faster. Both solutions bound the domain differences by the remaining differences of the spectra, and I show that subspace solutions naturally provide less domain separability.
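As a rough, generic illustration of describing domain differences through spectra, and not the projector methods of the thesis themselves, one can compare the leading singular values of the centered source and target data matrices; the cut-off top_k is an assumption made only for this sketch.

    # Compare the leading parts of the source and target spectra.
    import numpy as np

    def spectral_difference(X_s, X_t, top_k=5):
        # Singular values of the mean-centered data matrices.
        s_s = np.linalg.svd(X_s - X_s.mean(axis=0), compute_uv=False)
        s_t = np.linalg.svd(X_t - X_t.mean(axis=0), compute_uv=False)
        # Euclidean distance between the two leading spectra.
        return np.linalg.norm(s_s[:top_k] - s_t[:top_k])

    rng = np.random.default_rng(1)
    X_s = rng.normal(size=(300, 10))
    X_t = rng.normal(size=(300, 10)) * 1.5      # target with rescaled variance
    print(spectral_difference(X_s, X_t))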

In the second part of this thesis, I motivate a spectral-based moment-matching regularization loss. The approach builds on the characterization of domain differences via spectral differences described above. Applied to neural networks, it allows domains to be matched in the latent or the label space. Furthermore, I implement a relevance weighting that minimizes domain-specific influences, leading to a stable and effective solution.
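A hedged sketch of the general idea, assuming a PyTorch setup with a hypothetical two-layer encoder: a regularization term penalizes differences between the singular-value spectra of source and target batches in latent space and is added to the usual classification loss. The encoder, the classifier, and the weight lambda_reg are illustrative assumptions, not the architecture or loss from the thesis.

    # Spectral moment-matching regularizer in latent space (illustrative only).
    import torch
    import torch.nn as nn

    encoder = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 16))
    classifier = nn.Linear(16, 3)

    def spectral_matching_loss(z_s, z_t):
        # Compare the singular values of the centered latent batches.
        s_s = torch.linalg.svdvals(z_s - z_s.mean(dim=0))
        s_t = torch.linalg.svdvals(z_t - z_t.mean(dim=0))
        k = min(s_s.shape[0], s_t.shape[0])
        return torch.mean((s_s[:k] - s_t[:k]) ** 2)

    x_s = torch.randn(64, 10)                    # labeled source batch
    y_s = torch.randint(0, 3, (64,))             # source labels
    x_t = torch.randn(64, 10) + 0.5              # unlabeled, shifted target batch

    z_s, z_t = encoder(x_s), encoder(x_t)
    lambda_reg = 0.1                             # illustrative trade-off weight
    loss = nn.functional.cross_entropy(classifier(z_s), y_s) \
           + lambda_reg * spectral_matching_loss(z_s, z_t)
    loss.backward()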

While the above methods work well for time-independent data with finite sample sizes, tasks such as weather forecasting require fast processing of a potentially unlimited amount of data arriving in a temporal sequence. If the data distribution changes over time, this sequence, also called a data stream, is additionally affected by Concept Drift. Domain Adaptation methods are not practical in such scenarios because they do not scale to the size of data streams. The research area of Concept Drift Stream Classification addresses these challenges. It provides a variety of algorithms and techniques to fit and adapt a model with linear complexity. Prototype-based learning is already reasonably flexible under this constraint thanks to its online learning approach, but it is not yet prepared for Concept Drift. In the third part of the thesis, I extend probabilistic prototype models with drift detection based on a statistical test as well as active and passive Concept Drift adaptation, providing stable and fast concept adaptation for various drift types.
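A minimal, generic sketch of the ingredients just described, not the probabilistic prototype models developed in the thesis: a nearest-prototype online classifier whose stream of prototype distances is monitored with a two-sample Kolmogorov-Smirnov test over a reference and a current window, and a detected drift temporarily raises the prototype learning rate as a simple stand-in for faster adaptation. Window size, significance level, and learning rates are arbitrary choices for this example.

    # Nearest-prototype online classifier with KS-test drift detection.
    import numpy as np
    from scipy.stats import ks_2samp

    class DriftAwarePrototypes:
        def __init__(self, n_classes, n_features, window=200, alpha=0.01):
            self.protos = np.zeros((n_classes, n_features))
            self.window, self.alpha = window, alpha
            self.reference, self.current = [], []
            self.lr = 0.05                       # slow adaptation by default

        def predict(self, x):
            return int(np.argmin(np.linalg.norm(self.protos - x, axis=1)))

        def partial_fit(self, x, y):
            # Monitor the distance of each sample to its class prototype.
            dist = float(np.linalg.norm(self.protos[y] - x))
            # Online update: move the class prototype toward the sample.
            self.protos[y] += self.lr * (x - self.protos[y])
            # Fill a reference window first, then a current window.
            (self.reference if len(self.reference) < self.window else self.current).append(dist)
            if len(self.current) >= self.window:
                _, p = ks_2samp(self.reference, self.current)
                # Drift detected: adapt faster until the stream settles again.
                self.lr = 0.5 if p < self.alpha else 0.05
                self.reference, self.current = self.current, []

    rng = np.random.default_rng(2)
    model = DriftAwarePrototypes(n_classes=2, n_features=3)
    for t in range(2000):
        shift = 0.0 if t < 1000 else 2.0         # abrupt concept drift at t = 1000
        y = int(rng.integers(2))
        x = rng.normal(loc=y + shift, size=3)
        model.partial_fit(x, y)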
Year
2022
Page(s)
165
Page URI
https://pub.uni-bielefeld.de/record/2961979

Cite

Raab, C. (2022). Learning in non-stationary Environments. Bielefeld: Universität Bielefeld. https://doi.org/10.4119/unibi/2961979
All files are available under the following license(s):
Creative Commons Attribution-ShareAlike 4.0 International Public License (CC BY-SA 4.0):
Full Text(s)
Access Level
Open Access
Last Uploaded
2022-03-24T10:19:35Z
MD5 Checksum
f6639d027a8c6d8f2c06cb9b17763d87

