Digitala Vetenskapliga Arkivet

Word Order Does Matter (And Shuffled Language Models Know It)
University of Copenhagen, Department of Computer Science, Copenhagen, Denmark.
University of Oslo, Department of Informatics, Language Technology Group, Oslo, Norway.
Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Languages, Department of Linguistics and Philology.
University of Copenhagen, Department of Computer Science, Copenhagen, Denmark.
2022 (English). In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers, Association for Computational Linguistics, 2022, p. 6907-6919. Conference paper, Published paper (Refereed).
Abstract [en]

Recent studies have shown that language models pretrained and/or fine-tuned on randomly permuted sentences exhibit competitive performance on GLUE, calling into question the importance of word order information. Somewhat counter-intuitively, some of these studies also report that position embeddings appear to be crucial for models' good performance with shuffled text. We probe these language models for word order information and investigate what position embeddings learned from shuffled text encode, showing that these models retain information pertaining to the original, naturalistic word order. We show this is in part due to a subtlety in how shuffling is implemented in previous work: before rather than after subword segmentation. Surprisingly, we find that even language models trained on text shuffled after subword segmentation retain some semblance of word order information, because of the statistical dependencies between sentence length and unigram probabilities. Finally, we show that beyond GLUE, a variety of language understanding tasks do require word order information, often to an extent that cannot be learned through fine-tuning.
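
To make the segmentation subtlety concrete, the following is a minimal sketch in Python (not the authors' code; it assumes a Hugging Face-style tokenizer exposing a tokenize method) contrasting the two shuffling regimes the abstract distinguishes:

    import random

    def shuffle_before_segmentation(sentence, tokenizer):
        # Shuffle whole words first, then segment: the subwords of each
        # word remain adjacent, so local word-order information survives.
        words = sentence.split()
        random.shuffle(words)
        return tokenizer.tokenize(" ".join(words))

    def shuffle_after_segmentation(sentence, tokenizer):
        # Segment first, then shuffle the subword tokens themselves, so
        # even within-word ordering is destroyed.
        tokens = tokenizer.tokenize(sentence)
        random.shuffle(tokens)
        return tokens

Note that even in the second regime, sentence length and the unigram distribution of subwords are preserved; this is the residual statistical signal about word order that the abstract points to.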

Place, publisher, year, edition, pages
Association for Computational Linguistics, 2022. p. 6907-6919
National Category
Language Technology (Computational Linguistics); Computer Sciences
Identifiers
URN: urn:nbn:se:uu:diva-484790
ISI: 000828702307003
ISBN: 978-1-955917-21-6 (print)
OAI: oai:DiVA.org:uu-484790
DiVA, id: diva2:1696779
Conference
60th Annual Meeting of the Association for Computational Linguistics (ACL), May 22-27, 2022, Dublin, Ireland
Available from: 2022-09-19. Created: 2022-09-19. Last updated: 2024-01-15. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Search in DiVA

By author/editor
Kulmizev, Artur
By organisation
Department of Linguistics and Philology
Language Technology (Computational Linguistics); Computer Sciences
