Recursive Tree Grammar Autoencoders
Paaßen B, Koprinska I, Yacef K (2022)
Machine Learning 111: 3393–3423.
Journal Article | English
Author(s)
Paaßen, Benjamin;
Koprinska, Irena;
Yacef, Kalina
Department
Abstract / Notes
Machine learning on trees has mostly focused on trees as input. Much less research has investigated trees as output, which has many applications, such as molecule optimization for drug discovery, or hint generation for intelligent tutoring systems. In this work, we propose a novel autoencoder approach, called recursive tree grammar autoencoder (RTG-AE), which encodes trees via a bottom-up parser and decodes trees via a tree grammar, both learned via recursive neural networks that minimize the variational autoencoder loss. The resulting encoder and decoder can then be utilized in subsequent tasks, such as optimization and time series prediction. RTG-AEs are the first model to combine three features: recursive processing, grammatical knowledge, and deep learning. Our key message is that this unique combination of all three features outperforms models which combine any two of the three. Experimentally, we show that RTG-AE improves the autoencoding error, training time, and optimization score on synthetic as well as real datasets compared to four baselines. We further prove that RTG-AEs parse and generate trees in linear time and are expressive enough to handle all regular tree grammars.
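The bottom-up encoding pass described in the abstract can be illustrated with a minimal sketch. Everything below (the `Tree` class, the toy label "embedding", the averaging combination) is an illustrative assumption for this record, not the authors' implementation: the actual RTG-AE applies trained rule-specific recursive neural networks and optimizes a variational autoencoder loss.

```python
# Minimal sketch of a bottom-up recursive tree encoder, assuming a toy
# fixed combination function in place of the learned networks of RTG-AE.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Tree:
    label: str
    children: List["Tree"] = field(default_factory=list)


def encode(tree: Tree, dim: int = 4) -> list:
    """Encode children first, then combine their codes with the parent
    label (bottom-up parsing order). Each node is visited exactly once,
    mirroring the linear-time property proved in the paper."""
    child_codes = [encode(c, dim) for c in tree.children]
    # Toy deterministic "embedding" of the node label in [0, 1).
    code = [(hash(tree.label) >> (8 * i)) % 97 / 97.0 for i in range(dim)]
    for cc in child_codes:
        # Toy combination step; a real model would use a rule-specific network.
        code = [(x + y) / 2.0 for x, y in zip(code, cc)]
    return code


t = Tree("and", [Tree("x"), Tree("not", [Tree("y")])])
vec = encode(t)
```

A decoder would run in the opposite direction, expanding grammar rules top-down from a latent code until only terminal symbols remain.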
Keywords
Recursive neural networks;
Tree grammars;
Representation learning;
Variational autoencoders
Year of Publication
2022
Journal Title
Machine Learning
Volume
111
Page(s)
3393–3423
Copyright / Licenses
Conference
2022 European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)
Conference Location
Grenoble
Conference Date
2022-09-19 – 2022-09-23
Page URI
https://pub.uni-bielefeld.de/record/2978970
Cite
Paaßen B, Koprinska I, Yacef K. Recursive Tree Grammar Autoencoders. Machine Learning. 2022;111:3393–3423.
Paaßen, B., Koprinska, I., & Yacef, K. (2022). Recursive Tree Grammar Autoencoders. Machine Learning, 111, 3393–3423. https://doi.org/10.1007/s10994-022-06223-7
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC BY 4.0):
Full Text(s)
Name
s10994-022-06223-7.pdf
2.45 MB
Access Level
Open Access
Last Uploaded
2023-05-05T15:37:21Z
MD5 Checksum
5f5b6691c02ea63aed6da083d58b0586