A Detailed Analysis of Semantic Dependency Parsing with Deep Neural Networks
Linköping University, Department of Computer and Information Science, Human-Centered systems.
2019 (English). Independent thesis, Advanced level (Degree of Master, Two Years), 20 credits / 30 HE credits. Student thesis.
Alternative title: En detaljerad analys av semantisk dependensparsning med djupa neuronnät (Swedish)
Abstract [en]

The use of Long Short-Term Memory (LSTM) networks continues to yield better results in natural language processing tasks. One area which has recently seen significant improvements is semantic dependency parsing, where the current state-of-the-art model uses a multilayer LSTM combined with an attention-based scoring function to predict the dependencies.
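The record contains no code, but the scoring function described above can be illustrated. The following is a minimal sketch, assuming NumPy, of an attention-style bilinear arc scorer: given contextual token vectors from an LSTM encoder (here replaced by random stand-ins), it assigns a score to every (head, dependent) pair. All dimensions and weights are illustrative, not those used in the thesis.

```python
import numpy as np

# Sketch only (not the thesis implementation): H stands in for the
# contextual vectors a multilayer LSTM would produce for a sentence.
rng = np.random.default_rng(0)
seq_len, d = 5, 8
H = rng.normal(size=(seq_len, d))      # one row per token
W = rng.normal(size=(d, d)) * 0.1      # learned bilinear weights

# scores[i, j] = h_i^T W h_j — one score per candidate dependency arc
scores = H @ W @ H.T
print(scores.shape)                    # (5, 5): every head/dependent pair
```

A parser would then pick, for each dependent, the head(s) whose score clears a threshold or is highest, and a separate classifier would label the chosen arcs.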

In this thesis, the state-of-the-art model is first replicated and then extended to include features based on syntactic trees, which were found to be useful in a similar model. In addition, the effect of part-of-speech tags is studied.

The replicated model achieves a labeled F1 score of 93.6 on the in-domain data and 89.2 on the out-of-domain data of the DM dataset, which shows that the model is indeed replicable. Using multiple features extracted from syntactic gold-standard trees of the DELPH-IN Derivation Tree (DT) type increased the labeled scores to 97.1 and 94.1 respectively, while the use of predicted trees of the Stanford Basic (SB) type did not improve the results at all. The usefulness of part-of-speech tags was found to be diminished in the presence of other features.
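The labeled F1 scores above are the harmonic mean of precision and recall over predicted dependency arcs, where a (head, dependent, label) triple counts as correct only if it appears in the gold standard. A tiny illustration (the triples are made up, not from the DM data):

```python
# Hypothetical gold and predicted arc sets for one sentence.
gold = {(2, 1, "ARG1"), (2, 3, "ARG2"), (0, 2, "top")}
pred = {(2, 1, "ARG1"), (2, 3, "ARG1"), (0, 2, "top")}

tp = len(gold & pred)                  # exact triple matches: 2
precision = tp / len(pred)             # 2/3
recall = tp / len(gold)                # 2/3
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))                    # 0.667
```

Note that the mislabeled arc (ARG1 instead of ARG2) costs both precision and recall, which is why labeled scores are always at or below their unlabeled counterparts.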

Place, publisher, year, edition, pages
2019, p. 47
Keywords [en]
Semantic Dependency Parsing, LSTM
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:liu:diva-156831
ISRN: LIU-IDA/LITH-EX-A--19/013--SE
OAI: oai:DiVA.org:liu-156831
DiVA, id: diva2:1315439
Subject / course
Computer science
Presentation
2019-03-26, Linköping, 15:15 (English)
Available from: 2019-05-15. Created: 2019-05-13. Last updated: 2019-05-15. Bibliographically approved.

Open Access in DiVA

fulltext (1081 kB), 54 downloads
File information
File name: FULLTEXT01.pdf
File size: 1081 kB
Checksum (SHA-512): b72ec69fc6ca419bd4cdf45a0978929366b8cedff81cd50f140acd50e9d1d311dc23b11b7fd7a5054e9c3d8c845d8e3a26e889f32144a20fb61f77b1dfb7feed
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Roxbo, Daniel
By organisation
Human-Centered systems
Computer Sciences

