81. Syntactic Transformations in Rule-Based Parsing of Support Verb Constructions: Examples from European Portuguese ...
BASE
84. "Kansen cogoyama": Bamanankan mankankalan kelennin dɔ; Featural foot in Bambara; Le « pied caractéristique », une unité phonétique en bambara ("The featural foot, a phonetic unit in Bambara")
In: Journal of African Languages and Linguistics, De Gruyter, 2020, 41(2), pp. 265-300. ISSN 0167-6164; EISSN 1613-3811. DOI: ⟨10.1515/jall-2020-2012⟩. https://halshs.archives-ouvertes.fr/halshs-03195237 ; https://www.degruyter.com/journal/key/JALL/html (2020)
85. Syntactic Priming During Sentence Comprehension: Evidence for the Lexical Boost
In: Journal of Experimental Psychology: Learning, Memory, and Cognition, 2014, Vol. 40, No. 4, pp. 905-918. (2020)
87. Understanding and generating language with abstract meaning representation
88. On understanding character-level models for representing morphology
Abstract:
Morphology is the study of how words are composed of smaller units of meaning (morphemes). It allows humans to create, memorize, and understand words in their language. To process and understand human languages, we expect our computational models to learn morphology as well. Recent advances in neural network models provide us with models that compose word representations from smaller units such as word segments, character n-grams, or characters. These so-called subword unit models do not explicitly model morphology, yet they achieve impressive performance across many multilingual NLP tasks, especially on languages with complex morphological processes. This thesis aims to shed light on the following questions: (1) What do subword unit models learn about morphology? (2) Do we still need prior knowledge about morphology? (3) How do subword unit models interact with morphological typology? First, we systematically compare various subword unit models and study their performance across language typologies. We show that models based on characters are particularly effective because they learn orthographic regularities which are consistent with morphology. To understand which aspects of morphology are not captured by these models, we compare them with an oracle that has access to explicit morphological analysis. We show that in the case of dependency parsing, character-level models are still poor at representing words with ambiguous analyses, and we then demonstrate how explicit modeling of morphology helps in such cases. Finally, we study how character-level models perform in low-resource, cross-lingual NLP scenarios, and whether they can facilitate cross-linguistic transfer of morphology across related languages. While we show that cross-lingual character-level models can improve low-resource NLP performance, our analysis suggests that this is mostly due to structural similarities between languages; we do not yet find strong evidence of cross-linguistic transfer of morphology. This thesis presents a careful, in-depth study and analysis of character-level models and their relation to morphology, providing insights and future research directions for building morphologically aware computational NLP models.
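The subword composition idea described in this abstract can be sketched in a few lines: a word's vector is built from the vectors of its character n-grams, fastText-style. The sketch below is illustrative only; the function names, vector dimension, and random initialization are assumptions, not details from the thesis.

```python
# Minimal sketch of a subword-unit model: a word's representation is
# composed from its character n-grams. All names and sizes are illustrative.
import numpy as np

def char_ngrams(word, n_min=3, n_max=5):
    """Character n-grams with boundary markers, e.g. '<ta', 'alk', 'ks>'."""
    w = f"<{word}>"
    return [w[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

class SubwordEmbedder:
    """Compose a word vector as the mean of its character n-gram vectors."""
    def __init__(self, dim=16, seed=0):
        self.dim = dim
        self.rng = np.random.default_rng(seed)
        self.table = {}  # n-gram -> vector, grown lazily

    def vector(self, word):
        grams = char_ngrams(word)
        for g in grams:
            if g not in self.table:
                self.table[g] = self.rng.normal(size=self.dim)
        return np.mean([self.table[g] for g in grams], axis=0)

emb = SubwordEmbedder()
# Morphologically related words share the n-grams covering their stem,
# so their composed vectors overlap even for unseen word forms.
shared = set(char_ngrams("talks")) & set(char_ngrams("talked"))
v = emb.vector("talks")
```

Because "talks" and "talked" share the n-grams covering the stem (`<ta`, `tal`, `alk`, ...), their composed vectors share components, while an unrelated word like "cat" contributes none of them; this is one concrete form of the orthographic regularity that the abstract credits character-level models with learning.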
Keywords:
character-level models; dependency parsing; morphemes; morphology; natural language processing; NLP
URL: https://doi.org/10.7488/era/49 https://hdl.handle.net/1842/36742
89. RuBQ: A Russian Dataset for Question Answering over Wikidata
In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (2020)
90. Self attended stack pointer networks for learning long term dependencies
91. Neural Models for Integrating Prosody in Spoken Language Understanding
92. The Role of Information Theory in Gap-Filler Dependencies
In: Proceedings of the Society for Computation in Linguistics (2020)
93. MG Parsing as a Model of Gradient Acceptability in Syntactic Islands
In: Proceedings of the Society for Computation in Linguistics (2020)
94. The Role of Linguistic Features in Domain Adaptation: TAG Parsing of Questions
In: Proceedings of the Society for Computation in Linguistics (2020)
95. Annotation syntaxique automatique de la partie orale du ORFÉO ("Automatic syntactic annotation of the spoken part of the ORFÉO corpus")
In: Langages, No. 219, 3, 2020-08-11, pp. 87-102 (2020)
96. Transfer of L1 processing strategies to the interpretation of sentence-level L2 input: A cross-linguistic comparison on the resolution of relative clause attachment ambiguities
In: Eurasian Journal of Applied Linguistics, Vol. 6, Iss. 2, pp. 155-188 (2020)
97. Adapting a FrameNet Semantic Parser for Spoken Language Understanding Using Adversarial Learning
In: Interspeech 2019, Sep 2019, Graz, Austria, pp. 799-803. DOI: ⟨10.21437/Interspeech.2019-2732⟩. https://hal.archives-ouvertes.fr/hal-02298417 (2019)
98. Representation and Parsing of Multiword Expressions: Current trends
In: Yannick Parmentier; Jakub Waszczuk (eds.). Language Science Press, Germany, 2019 (Phraseology and Multiword Expressions, 3). https://hal.archives-ouvertes.fr/hal-01537920 ; http://langsci-press.org (2019)
99. Cross-lingual parsing with polyglot training and multi-treebank learning: a Faroese case study
In: Barry, James; Wagner, Joachim; Foster, Jennifer (2019). The 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019), 3-5 Nov 2019, Hong Kong, China. ISBN 978-1-950737-78-9 (2019)
100. Cross-Lingual Transfer of Natural Language Processing Systems