Memory Models for Incremental Learning Architectures
Losing V (2019)
Bielefeld: Universität Bielefeld.
Bielefeld e-dissertation | English
Author
Losing, Viktor
Abstract / Note
Technological advancement constantly leads to an exponential growth of generated data in essentially every domain, drastically increasing the burden of data storage and maintenance. Most of this data is extracted instantaneously and is available in the form of endless streams that contain the most current information. Machine learning methods constitute one fundamental way of processing such data automatically, as they generate models that capture the processes behind the data. They are omnipresent in our everyday life, with applications including personalized advertising, recommendations, fraud detection, surveillance, credit ratings, high-speed trading and smart-home devices. Batch learning, denoting the offline construction of a static model based on large datasets, is the predominant scheme. However, it is increasingly unfit to deal with the accumulating masses of data in reasonable time, and in particular its static nature cannot handle changing patterns. In contrast, incremental learning constitutes an attractive alternative that is a natural fit for the current demands. Its dynamic adaptation allows continuous processing of data streams without the necessity to store all past data, and results in always up-to-date models that are even able to perform in non-stationary environments. In this thesis, we tackle crucial research questions in the domain of incremental learning by contributing new algorithms or significantly extending existing ones. We consider stationary and non-stationary environments and present multiple real-world applications that showcase the merits of the methods as well as their versatility. The main contributions are the following:
A novel approach that addresses the question of how to extend the model of prototype-based algorithms based on cost minimization.
Local split-time prediction for incremental decision trees, mitigating the trade-off between adaptation speed on the one hand and model complexity and run time on the other.
An extensive survey of the strengths and weaknesses of state-of-the-art methods that provides guidance for choosing a suitable algorithm for a given task.
A new approach to extract valuable information about the type of change within a dataset.
A biologically inspired architecture able to handle different types of drift using dedicated memories that are kept consistent.
The application of the novel methods within three diverse real-world tasks, highlighting their robustness and versatility.
An investigation of personalized online models in the context of two real-world applications.
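The abstract contrasts static batch learning with incremental learning on endless data streams. As a purely illustrative sketch of that general scheme, and not of the specific algorithms contributed in the thesis, the following Python snippet runs a prequential (test-then-train) loop over a toy stream whose concept flips halfway through; the scikit-learn linear model and the synthetic data are assumptions made only for this example.

import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
n_samples, n_features = 2000, 10
classes = np.array([0, 1])

model = SGDClassifier()  # any model supporting partial_fit would do
correct = 0

for t in range(n_samples):
    # Toy non-stationary stream: the labeling rule flips halfway through,
    # i.e. an abrupt concept drift.
    x = rng.normal(size=(1, n_features))
    label = int(x[0, 0] > 0) if t < n_samples // 2 else int(x[0, 0] <= 0)
    y = np.array([label])

    # Prequential evaluation: test on the new example first ...
    if t > 0:
        correct += int(model.predict(x)[0] == y[0])

    # ... then immediately train on it, so the model stays up to date
    # without storing past data.
    model.partial_fit(x, y, classes=classes)

print(f"prequential accuracy: {correct / (n_samples - 1):.3f}")

A drift-aware method, such as the dedicated-memory architecture listed among the contributions, would additionally detect such a change and discard or down-weight outdated knowledge rather than only adapting gradually through further updates.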
Year
2019
Page URI
https://pub.uni-bielefeld.de/record/2936581
Cite
Losing, V. (2019). Memory Models for Incremental Learning Architectures. Bielefeld: Universität Bielefeld. doi:10.4119/unibi/2936581
All files are available under the following license(s):
Creative Commons Attribution-ShareAlike 4.0 International Public License (CC BY-SA 4.0)
Full text(s)
Name
publishedThesis.pdf
6.38 MB
Access Level
Open Access
Last uploaded
2019-09-06T09:19:08Z
MD5 checksum
6642eb2b0dafa66eb31610331de318ad