Show simple item record

 
dc.contributor.author Janz, Arkadiusz
dc.contributor.author Piasecki, Maciej
dc.contributor.author Wątorski, Piotr
dc.date.accessioned 2025-12-17T10:23:42Z
dc.date.available 2025-12-17T10:23:42Z
dc.date.issued 2021-01-01
dc.identifier.uri http://hdl.handle.net/11321/981
dc.description Neural language models, including transformer-based models pretrained on very large corpora, have become a common way to represent text in various tasks, including the recognition of textual semantic relations, e.g. those of Cross-document Structure Theory (CST). Pretrained models are usually fine-tuned to downstream tasks, and the obtained vectors are used as input for deep neural classifiers; no linguistic knowledge obtained from resources and tools is utilised. In this paper we compare such universal approaches with a combination of a rich, graph-based, linguistically motivated sentence representation and a typical neural network classifier, applied to the task of recognising CST relations in Polish. The representation describes selected levels of sentence structure, including the description of lexical meanings on the basis of wordnet (plWordNet) synsets and connected SUMO concepts. The obtained results show that, in the case of difficult relations and a medium-sized training corpus, a semantically enriched text representation leads to significantly better results.
dc.language.iso eng
dc.publisher Global Wordnet Association
dc.rights Creative Commons - Attribution 4.0 International (CC BY 4.0)
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.rights.label CC
dc.subject neural language models
dc.subject CST relation
dc.title Neural Language Models vs Wordnet-based Semantically Enriched Representation in CST Relation Recognition
dc.type languageDescription
metashare.ResourceInfo#ContentInfo.detailedType other
metashare.ResourceInfo#ContentInfo.mediaType text
has.files yes
branding CLARIN-PL
contact.person Alicja Derych alicja.derych@pwr.edu.pl Politechnika Wrocławska
files.size 261821
files.count 1


 Files in this item

This item is distributed under Creative Commons and licensed under:
Creative Commons - Attribution 4.0 International (CC BY 4.0), attribution required
Name: Jazn et al, Neural Language Models vs Wordnet-based Semantically Enriched Representation.pdf
Size: 255.68 KB
Format: PDF
Description: article
