
Search in the Catalogues and Directories

Hits 1 – 20 of 1,858

1. A Deep Fusion Matching Network Semantic Reasoning Model
In: Applied Sciences; Volume 12; Issue 7; Pages: 3416 (2022)
BASE
2. Meanings Expressed by Primary Schoolchildren When Solving a Partitioning Task
In: Mathematics; Volume 10; Issue 8; Pages: 1339 (2022)
BASE
3. Mockery and Provocation for Fun: Lexical and Semantic Representation in the Russian Language ...
Rebrina, L.N. - : Инфинити, 2022
BASE
4. Formalization of AMR Inference via Hybrid Logic Tableaux ...
Goldner, Eli Tecumseh. - : Brandeis University, 2022
BASE
5. La théorie sens-texte : concepts-clés et applications
Marengo, Sébastien (editor); Tutin, Agnès (author of the foreword). - Paris : L'Harmattan, 2021
BLLDB
UB Frankfurt Linguistik
6. Semantisch-konzeptuelle Vernetzungen im bilingualen mentalen Lexikon : eine psycholinguistische Studie mit deutsch-türkischsprachigen Jugendlichen
Veletić, Sebastian. - Berlin, Germany : J.B. Metzler, 2021
BLLDB
UB Frankfurt Linguistik
7. Contextualization of Web contents through semantic enrichment from linked open data ; Contextualisation des contenus Web par l'enrichissement sémantique à partir de données
Kumar, Amit. - : HAL CCSD, 2021
In: https://tel.archives-ouvertes.fr/tel-03561788 ; Databases [cs.DB]. Normandie Université, 2021. English. ⟨NNT : 2021NORMC243⟩ (2021)
BASE
8. Research compendium for Montero-Melis et al. (2021) "No evidence for embodiment: The motor system is not needed to keep action words in working memory" (Cortex) ...
Montero-Melis, Guillermo. - : Open Science Framework, 2021
BASE
9. Graph-to-Graph Translations To Augment Abstract Meaning Representation Tense And Aspect ...
Bakal, Mollie. - : My University, 2021
BASE
10. Graphs, Computation, and Language ...
Ustalov, Dmitry. - : Zenodo, 2021
BASE
11. Graphs, Computation, and Language ...
Ustalov, Dmitry. - : Zenodo, 2021
BASE
12. APiCS-Ligt: Towards Semantic Enrichment of Interlinear Glossed Text ...
Ionov, Maxim. - : Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2021
BASE
13. Essential Features in a Theory of Context for Enabling Artificial General Intelligence
In: Applied Sciences; Volume 11; Issue 24; Pages: 11991 (2021)
BASE
14. Mapping Directional Mid-Air Unistroke Gestures to Interaction Commands: A User Elicitation and Evaluation Study
In: Symmetry; Volume 13; Issue 10 (2021)
BASE
15. Achieving Semantic Consistency for Multilingual Sentence Representation Using an Explainable Machine Natural Language Parser (MParser)
In: Applied Sciences; Volume 11; Issue 24; Pages: 11699 (2021)
BASE
16. The JeuxDeMots Project (Invited Talk) ...
Lafourcade, Mathieu. - : Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2021
BASE
17. AAA4LLL - Acquisition, Annotation, Augmentation for Lively Language Learning ...
Wloka, Bartholomäus; Winiwarter, Werner. - : Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2021
BASE
18. Graph-to-Graph Translations To Augment Abstract Meaning Representation Tense And Aspect
Bakal, Mollie. - 2021
BASE
19. Hy-NLI : a Hybrid system for state-of-the-art Natural Language Inference
BASE
20. Graph-based broad-coverage semantic parsing
Lyu, Chunchuan. - : The University of Edinburgh, 2021
Abstract: Many broad-coverage meaning representations can be characterized as directed graphs, where nodes represent semantic concepts and directed edges represent semantic relations among the concepts. The task of semantic parsing is to generate such a meaning representation from a sentence. It is quite natural to adopt a graph-based approach to parsing, where nodes are identified conditioning on the individual words, and edges are labeled conditioning on the pairs of nodes. However, there are two issues with applying this simple and interpretable graph-based approach to semantic parsing: first, the anchoring of nodes to words can be implicit and non-injective in several formalisms (Oepen et al., 2019, 2020). This means we do not know which nodes should be generated from which individual word, or how many of them. Consequently, it makes a probabilistic formulation of the training objective problematic; second, graph-based parsers typically predict edge labels independently of each other. Such an independence assumption, while sensible from an algorithmic point of view, could limit the expressiveness of statistical modeling. Consequently, it might fail to capture the true distribution of semantic graphs. In this thesis, instead of a pipeline approach to obtain the anchoring, we propose to model the implicit anchoring as a latent variable in a probabilistic model. We induce such a latent variable jointly with the graph-based parser in end-to-end differentiable training. In particular, we test our method on Abstract Meaning Representation (AMR) parsing (Banarescu et al., 2013). AMR represents sentence meaning with a directed acyclic graph, where the anchoring of nodes to words is implicit and can be many-to-one. Initially, we propose a rule-based system that circumvents the many-to-one anchoring by combining nodes in some pre-specified subgraphs in AMR and treats the alignment as a latent variable. Next, we remove the need for such a rule-based system by treating both graph segmentation and alignment as latent variables. Still, our graph-based parsers are parameterized by neural modules that require gradient-based optimization. Consequently, training graph-based parsers with our discrete latent variables can be challenging. By combining deep variational inference and differentiable sampling, our models can be trained end-to-end. To overcome the limitation of graph-based parsing and capture interdependency in the output, we further adopt iterative refinement. Starting with an output whose parts are independently predicted, we iteratively refine it conditioning on the previous prediction. We test this method on semantic role labeling (Gildea and Jurafsky, 2000). Semantic role labeling is the task of predicting the predicate-argument structure. In particular, semantic roles between the predicate and its arguments need to be labeled, and those semantic roles are interdependent. Overall, our refinement strategy results in an effective model, outperforming strong factorized baseline models.
Keywords: Abstract Meaning Representation parsing; AMR parsing; graph-based parsers; hand-crafted pipelines; semantic parsing; semantic role labeling
URL: https://doi.org/10.7488/era/1390
https://hdl.handle.net/1842/38121
BASE
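The abstract above describes the core factorization of a graph-based semantic parser: node representations are computed from individual word encodings, and every candidate edge label is scored independently from a pair of node vectors. Below is a minimal, purely illustrative sketch of that factorization in PyTorch; the names (FactorizedGraphParser, source_mlp, target_mlp) are assumptions for illustration, not code from the thesis, which additionally models latent anchoring and applies iterative refinement.

# Minimal sketch (not the thesis's implementation) of a factorized graph-based parser:
# node vectors come from individual word encodings, and each candidate edge label is
# scored independently from the (source, target) pair of node vectors.
import torch
import torch.nn as nn


class FactorizedGraphParser(nn.Module):
    def __init__(self, word_dim: int, hidden_dim: int, num_labels: int):
        super().__init__()
        # Separate projections for a node acting as an edge source vs. target.
        self.source_mlp = nn.Linear(word_dim, hidden_dim)
        self.target_mlp = nn.Linear(word_dim, hidden_dim)
        # One bilinear weight matrix per edge label (label 0 reserved for "no edge").
        self.bilinear = nn.Parameter(torch.randn(num_labels, hidden_dim, hidden_dim) * 0.01)

    def forward(self, word_vecs: torch.Tensor) -> torch.Tensor:
        """word_vecs: (n_words, word_dim) contextual word encodings.
        Returns edge label scores of shape (n_words, n_words, num_labels)."""
        src = torch.tanh(self.source_mlp(word_vecs))   # (n, h)
        tgt = torch.tanh(self.target_mlp(word_vecs))   # (n, h)
        # Score every (source, target, label) triple independently of all others.
        return torch.einsum("ih,lhk,jk->ijl", src, self.bilinear, tgt)


if __name__ == "__main__":
    parser = FactorizedGraphParser(word_dim=32, hidden_dim=16, num_labels=5)
    sentence = torch.randn(7, 32)      # stand-in for encoder output over 7 words
    scores = parser(sentence)          # (7, 7, 5)
    edges = scores.argmax(dim=-1)      # greedy, edge-independent decoding
    print(edges.shape)                 # torch.Size([7, 7])

The greedy, edge-independent argmax at the end is exactly the independence assumption the abstract criticizes; the thesis relaxes it by iteratively refining the predicted graph conditioned on the previous prediction.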
