
Search in the Catalogues and Directories

Hits 1 – 15 of 15

1. Multi-tasking Dialogue Comprehension with Discourse Parsing ...
2. Structural Pre-training for Dialogue Comprehension ...
3. Smoothing Dialogue States for Open Conversational Machine Reading ...
4. Dialogue Graph Modeling for Conversational Machine Reading ...
5. Tracing Origins: Coreference-aware Machine Reading Comprehension ...
6. Enhancing Pre-trained Language Model with Lexical Simplification ...
7. SG-Net: Syntax Guided Transformer for Language Representation ...
8. Open Named Entity Modeling from Embedding Distribution ...
Luo, Ying; Zhao, Hai; Zhang, Zhuosheng. arXiv, 2019
9. SG-Net: Syntax-Guided Machine Reading Comprehension ...
10. LIMIT-BERT: Linguistic Informed Multi-Task BERT ...
11. CoNLL 2018 Shared Task System Outputs
Zeman, Daniel; Potthast, Martin; Duthoo, Elie. Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics (UFAL), 2018
12. Effective Subword Segmentation for Text Comprehension ...
13. Modeling Multi-turn Conversation with Deep Utterance Aggregation ...
14. Subword-augmented Embedding for Cloze Reading Comprehension ...
Abstract: Representation learning is the foundation of machine reading comprehension. State-of-the-art models broadly use word- and character-level representations. However, the character is not naturally the minimal linguistic unit, and with a simple concatenation of character and word embeddings, previous models actually give a suboptimal solution. In this paper, we propose to use subwords rather than characters for word embedding enhancement. We also empirically explore different augmentation strategies for subword-augmented embedding to enhance the cloze-style reading comprehension model (the reader). In detail, we present a reader that uses subword-level representations to augment word embeddings, with a short list to handle rare words effectively. A thorough examination is conducted to evaluate the overall performance and generalization ability of the proposed reader. Experimental results show that the proposed approach helps the reader significantly outperform the state-of-the-art baselines ...
In: Proceedings of the 27th International Conference on Computational Linguistics (COLING 2018)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.1806.09103
https://arxiv.org/abs/1806.09103
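Since this is the only record expanded with an abstract, a brief sketch of the technique it describes may help. The following is a hypothetical PyTorch reconstruction, not the authors' released code: the module name, the mean-pooling over subwords, and the frequency-ranked short-list threshold are all assumptions. It illustrates the core idea of concatenating a word-level embedding with an averaged subword-level representation, letting rare words outside the short list fall back to the subword view.

```python
import torch
import torch.nn as nn

class SubwordAugmentedEmbedding(nn.Module):
    """Hypothetical sketch: augment word embeddings with subword embeddings.

    Frequent words (ids inside the short list) keep their word embedding;
    rare words fall back to the averaged subword representation. Both are
    then concatenated with the subword representation.
    """

    def __init__(self, word_vocab, subword_vocab, dim, shortlist_size=30000):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, dim)
        self.subword_emb = nn.Embedding(subword_vocab, dim, padding_idx=0)
        self.shortlist_size = shortlist_size  # word ids assumed frequency-ranked

    def forward(self, word_ids, subword_ids):
        # word_ids: (batch, seq); subword_ids: (batch, seq, max_subwords), 0-padded
        sub = self.subword_emb(subword_ids)                    # (B, S, M, D)
        mask = (subword_ids != 0).unsqueeze(-1).float()        # ignore padding
        sub_repr = (sub * mask).sum(dim=2) / mask.sum(dim=2).clamp(min=1.0)
        word_repr = self.word_emb(word_ids)                    # (B, S, D)
        frequent = (word_ids < self.shortlist_size).unsqueeze(-1).float()
        # frequent words keep their word embedding; rare words reuse subwords
        base = frequent * word_repr + (1.0 - frequent) * sub_repr
        return torch.cat([base, sub_repr], dim=-1)             # (B, S, 2D)
```

Mean-pooling is only the simplest aggregation; the abstract notes that several augmentation strategies were explored empirically, so this should be read as one plausible configuration rather than the paper's exact reader.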
15. One-shot Learning for Question-Answering in Gaokao History Challenge ...
Zhang, Zhuosheng; Zhao, Hai. arXiv, 2018

Hits by source type: Catalogues: 0 | Bibliographies: 0 | Linked Open Data catalogues: 0 | Online resources: 0 | Open access documents: 15