Three-Dimensional Voxel-Based Neural Style Transfer and Quantification
Friedrich T (2022)
Bielefeld: Universität Bielefeld.
Bielefeld e-dissertation | English
Author
Friedrich, Timo
Reviewer / Supervisor
Hammer, Barbara (UniBi);
Jin, Yaochu (UniBi);
Menzel, Stefan
Department
Abstract / Note
In recent years, Machine Learning, and especially Deep Learning, has begun to conquer another human trait: the ability to perform creative tasks. Neural-network-based systems compose music, create dream-like creatures, generate faces of fictional persons, and even write complete books. Naturally, they also generate visual art. Here, Neural Style Transfer stylizes images and photographs with a style extracted from an arbitrary image, creating novel images of a quality hardly separable from actual human creations. However, Neural Style Transfer had not yet been applied in a 3D context.
In this thesis, we transfer, adapt, extend and evaluate Neural Style Transfer in spatially three-dimensional space. Due to fundamental differences between digital 2D images and 3D shapes, we first answer the question of whether 3D Neural Style Transfer is possible in principle, and we select, visualize and evaluate suited three-dimensional data representations. We also rely on a 2D surrogate-image setup to visualize and solve issues as they occur. As Neural Style Transfer relies strongly on high-quality 3D object classification networks for feature extraction, we compose and evaluate novel networks in combination with greatly increased voxel-shape resolutions. These classification networks not only match state-of-the-art accuracy; they also incorporate many crucial adaptations that enhance their subsequent use during Neural Style Transfer. To substantiate our results and subjective interpretations, we developed and applied a novel style transfer success metric. By adding new regularization terms to the style transfer optimization loop, we created and propose a novel three-dimensional voxel-based Neural Style Transfer pipeline. The first successful, solely 3D-based, high-resolution Neural Style Transfer results prove its applicability and give valuable insights into the meaning of style for three-dimensional objects. These insights led to the development of a second, complementary Eigendecomposition-based style transfer approach applied to triangular meshes, which interactively creates complete stylized product portfolios in real time. Altogether, this thesis provides an extensive and profound basis for future three-dimensional style transfer, and voxel-based Neural Style Transfer in particular.
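The abstract refers to the style transfer optimization loop, which minimizes a style loss over features extracted by a classification network. For context, in classic 2D Neural Style Transfer (Gatys et al.) the standard style representation is the Gram matrix of a feature map; the thesis extends such ideas to voxel grids. Below is a minimal NumPy sketch of the 2D Gram-matrix style loss only — function names are illustrative, and the thesis's actual 3D formulation is not reproduced here:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map with shape (channels, height, width).

    Captures channel-wise feature correlations, which serve as the
    style representation in Gatys-style Neural Style Transfer.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Correlate every channel with every other, normalized by spatial size.
    return f @ f.T / (h * w)

def style_loss(generated_features, style_features):
    """Mean squared difference between the two Gram matrices."""
    g_gen = gram_matrix(generated_features)
    g_sty = gram_matrix(style_features)
    return float(np.mean((g_gen - g_sty) ** 2))

# Illustrative usage with random feature maps (8 channels, 4x4 spatial grid).
rng = np.random.default_rng(0)
gen = rng.standard_normal((8, 4, 4))
sty = rng.standard_normal((8, 4, 4))
loss = style_loss(gen, sty)
```

In an actual optimization loop, the generated image (or, in the 3D case, the voxel grid) would be updated by gradient descent on this loss, typically summed over several network layers and combined with a content loss and regularization terms.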
Year
2022
Page(s)
150
Copyright / Licenses
Page URI
https://pub.uni-bielefeld.de/record/2961467
Cite
Friedrich T. Three-Dimensional Voxel-Based Neural Style Transfer and Quantification. Bielefeld: Universität Bielefeld; 2022.
Friedrich, T. (2022). Three-Dimensional Voxel-Based Neural Style Transfer and Quantification. Bielefeld: Universität Bielefeld. https://doi.org/10.4119/unibi/2961467
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0):
Full text(s)
Name
Access Level
Open Access
Last uploaded
2022-02-28T18:43:27Z
MD5 checksum
e707a53da9f73f815d08c7e16b481ae0