Entrenchment Matters: Investigating Positional and Constructional Sensitivity in Small and Large Language Models
Bunzeck B, Zarrieß S (2023)
In: Proceedings of the 2023 CLASP Conference on Learning with Small Data (LSD). Breitholtz E, Lappin S, Loaiciga S, Ilinykh N, Dobnik S (Eds); Stroudsburg, PA: Association for Computational Linguistics: 25-37.
Conference Paper
| Published | English
Author
Bunzeck, Bastian;
Zarrieß, Sina
Editor
Breitholtz, Ellen;
Lappin, Shalom;
Loaiciga, Sharid;
Ilinykh, Nikolai;
Dobnik, Simon
Abstract / Notes
The success of large language models (LMs) has also prompted a push towards smaller models, but the differences in functionality and encodings between these two types of models are not yet well understood. In this paper, we employ a perturbed masking approach to investigate differences in token influence patterns on the sequence embeddings of larger and smaller RoBERTa models. Specifically, we explore how token properties like position, length or part of speech influence their sequence embeddings. We find that there is a general tendency for sequence-final tokens to exert a higher influence. Among part-of-speech tags, nouns, numerals and punctuation marks are the most influential, with smaller deviations for individual models. These findings also align with usage-based linguistic evidence on the effect of entrenchment. Finally, we show that the relationship between data size and model size influences the variability and brittleness of these effects, hinting towards a need for holistically balanced models.
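The perturbed-masking idea mentioned in the abstract can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: a mean-pooling function stands in for a RoBERTa encoder's sequence embedding, and all names (`token_influence`, `pool`) are assumptions for illustration. The influence of a token is measured as the distance between the pooled sequence embedding and the pooled embedding obtained after replacing that token with a mask vector.

```python
import numpy as np

def token_influence(token_vecs, mask_vec, pool=lambda m: m.mean(axis=0)):
    """Toy perturbed-masking sketch: influence of token i is the Euclidean
    distance between the pooled sequence embedding and the pooled embedding
    with token i replaced by the mask vector."""
    base = pool(token_vecs)
    influences = []
    for i in range(len(token_vecs)):
        perturbed = token_vecs.copy()
        perturbed[i] = mask_vec       # mask out one token at a time
        influences.append(float(np.linalg.norm(base - pool(perturbed))))
    return influences

rng = np.random.default_rng(0)
toks = rng.normal(size=(5, 8))        # 5 toy "token embeddings", dim 8
mask = np.zeros(8)                    # toy stand-in for a [MASK] vector
scores = token_influence(toks, mask)  # one influence score per token
```

In the paper's setting, the pooling function and token vectors would come from a pretrained RoBERTa model rather than random vectors, and the resulting per-token scores would then be aggregated by position, token length, or part-of-speech tag.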
Year of Publication
2023
Conference Proceedings Title
Proceedings of the 2023 CLASP Conference on Learning with Small Data (LSD)
Page(s)
25-37
Copyright / Licenses
Conference
2023 CLASP Conference on Learning with Small Data
Conference Location
Gothenburg, Sweden
Conference Date
2023-09-11 – 2023-09-12
ISBN
979-8-89176-000-4
Page URI
https://pub.uni-bielefeld.de/record/2982902
Cite
Bunzeck B, Zarrieß S. Entrenchment Matters: Investigating Positional and Constructional Sensitivity in Small and Large Language Models. In: Breitholtz E, Lappin S, Loaiciga S, Ilinykh N, Dobnik S, eds. Proceedings of the 2023 CLASP Conference on Learning with Small Data (LSD). Stroudsburg, PA: Association for Computational Linguistics; 2023: 25-37.
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0):
Full Text(s)
Name
2023.clasp-1.3.pdf
369.12 KB
Access Level
Open Access
Last Uploaded
2023-09-13T14:53:32Z
MD5 Checksum
d19d3ba0da23101bdb4a19762da4e891