Digitala Vetenskapliga Arkivet

Attention Mechanisms for Transition-based Dependency Parsing
Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Languages, Department of Linguistics and Philology.
2019 (English). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Transition-based dependency parsing is known to compute the syntactic structure of a sentence efficiently, but it is less accurate at predicting long-distance relations between tokens because it lacks global information about the sentence. Our main contribution is the integration of attention mechanisms that replace the static token selection with a dynamic approach that takes the complete sequence into account. Although our experiments confirm that our approach fundamentally works, our models do not outperform the baseline parser. We further present a line of follow-up experiments to investigate these results. Our main conclusion is that the BiLSTM of the traditional parser is already powerful enough to encode the required global information into each token, which eliminates the need for an attention-driven approach.
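
To illustrate the dynamic token selection described above, the following is a minimal PyTorch sketch of soft attention over the BiLSTM token representations, conditioned on the current parser configuration. It is not taken from the thesis code; the class and parameter names (SoftTokenSelection, token_dim, state_dim) are hypothetical.

import torch
import torch.nn as nn

class SoftTokenSelection(nn.Module):
    """Soft attention over all token representations of a sentence,
    conditioned on a query derived from the current parser state.
    Hypothetical sketch, not the thesis implementation."""

    def __init__(self, token_dim: int, state_dim: int):
        super().__init__()
        self.query_proj = nn.Linear(state_dim, token_dim)

    def forward(self, tokens: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        # tokens: (seq_len, token_dim) BiLSTM outputs for one sentence
        # state:  (state_dim,)         features of the current configuration
        query = self.query_proj(state)           # (token_dim,)
        scores = tokens @ query                  # (seq_len,) similarity scores
        weights = torch.softmax(scores, dim=0)   # attention distribution over tokens
        return weights @ tokens                  # weighted summary vector (token_dim,)

In such a setup, the returned summary vector would stand in for the statically selected stack and buffer tokens that are otherwise fed to the transition classifier.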

Our secondary results indicate that the attention models require a neural network with a higher capacity than the traditional parser in order to extract more latent information from the word embeddings and the LSTM. We further show that positional encodings are not useful for our attention models, although BERT-style positional embeddings slightly improve the results. Finally, we experiment with replacing the LSTM with a Transformer encoder to test the impact of self-attention. The results are disappointing, though we believe this direction deserves further research.
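
The contrast between fixed positional encodings and BERT-style learned positional embeddings mentioned above can be sketched in PyTorch as follows. This is an illustrative sketch only, not the thesis implementation; sinusoidal_encoding and LearnedPositionalEmbedding are hypothetical names, and an even embedding dimension is assumed.

import math
import torch
import torch.nn as nn

def sinusoidal_encoding(seq_len: int, dim: int) -> torch.Tensor:
    """Fixed sinusoidal positional encodings (original Transformer style).
    Assumes dim is even."""
    position = torch.arange(seq_len).unsqueeze(1).float()
    div_term = torch.exp(torch.arange(0, dim, 2).float() * (-math.log(10000.0) / dim))
    pe = torch.zeros(seq_len, dim)
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    return pe

class LearnedPositionalEmbedding(nn.Module):
    """BERT-style positional embeddings: one trainable vector per position."""

    def __init__(self, max_len: int, dim: int):
        super().__init__()
        self.embedding = nn.Embedding(max_len, dim)

    def forward(self, token_vectors: torch.Tensor) -> torch.Tensor:
        # token_vectors: (seq_len, dim)
        positions = torch.arange(token_vectors.size(0), device=token_vectors.device)
        return token_vectors + self.embedding(positions)

The difference is that the sinusoidal variant is fixed and parameter-free, whereas the BERT-style variant learns a separate vector for every position during training.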

For our work, we implement a UUParser-inspired dependency parser from scratch in PyTorch and extend it with, among other things, full GPU support and mini-batch processing. We publish the code under a permissive open source license at https://github.com/jgontrum/parseridge.
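
As a rough illustration of the mini-batch processing mentioned above (not the parseridge code itself; encode_batch and its arguments are hypothetical), variable-length sentences can be padded, packed, and encoded on the GPU with standard PyTorch utilities:

import torch
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

def encode_batch(sentences, embedding, bilstm, device="cuda"):
    """Embed and encode a mini-batch of variable-length sentences on the GPU.
    sentences: list of 1-D LongTensors with word ids, one tensor per sentence."""
    lengths = torch.tensor([len(s) for s in sentences])
    # Pad to a rectangular batch of shape (batch, max_len)
    batch = pad_sequence(sentences, batch_first=True).to(device)
    embedded = embedding(batch)  # (batch, max_len, emb_dim)
    # Pack so the BiLSTM skips padding positions
    packed = pack_padded_sequence(embedded, lengths, batch_first=True, enforce_sorted=False)
    output, _ = bilstm(packed)
    # (batch, max_len, 2 * hidden_dim); padding positions are zero vectors
    output, _ = pad_packed_sequence(output, batch_first=True)
    return output, lengths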

Place, publisher, year, edition, pages
2019.
Keywords [en]
natural language processing, language technology, dependency parsing, transition-based parsing, parsing, attention
National Category
Computer Systems; Language Technology (Computational Linguistics)
Identifiers
URN: urn:nbn:se:uu:diva-395491
OAI: oai:DiVA.org:uu-395491
DiVA, id: diva2:1362881
Educational program
Master Programme in Language Technology
Presentation
2019-10-07, 22:47 (English)
Available from: 2019-10-25. Created: 2019-10-21. Last updated: 2019-10-29. Bibliographically approved.

Open Access in DiVA

Attention Mechanisms for Transition-based Dependency Parsing (451 kB)
File information
File name: FULLTEXT01.pdf
File size: 451 kB
Checksum (SHA-512): 7f21676341036d2b4febabae0cda1254ef01b4dd4d2cc270a672c305d886a7e94db8ce1514cbd07c9e457fa42a5007bb961da2e3ccf5a3fce7dd474eda2eaf50
Type: fulltext
Mimetype: application/pdf
