
Search in the Catalogues and Directories

Hits 1 – 20 of 52

1
Semantics-Preserved Distortion for Personal Privacy Protection ...
Peng, Letian; Li, Zuchao; Zhao, Hai. - : arXiv, 2022
BASE
2
Pre-training Universal Language Representation ...
Li, Yian; Zhao, Hai. - : arXiv, 2021
BASE
3
Head-driven Phrase Structure Parsing in O($n^3$) Time Complexity ...
Li, Zuchao; Zhou, Junru; Zhao, Hai. - : arXiv, 2021
BASE
4
Multi-tasking Dialogue Comprehension with Discourse Parsing ...
BASE
5
Structural Pre-training for Dialogue Comprehension ...
BASE
6
Smoothing Dialogue States for Open Conversational Machine Reading ...
BASE
7
Dialogue Graph Modeling for Conversational Machine Reading ...
BASE
8
Dialogue-oriented Pre-training ...
BASE
9
Pre-training Universal Language Representation ...
BASE
10
Unsupervised Neural Machine Translation with Universal Grammar ...
BASE
11
Seeking Common but Distinguishing Difference, A Joint Aspect-based Sentiment Analysis Model ...
BASE
12
Enhancing Language Generation with Effective Checkpoints of Pre-trained Language Model ...
BASE
13
Syntax Role for Neural Semantic Role Labeling ...
BASE
14
Tracing Origins: Coreference-aware Machine Reading Comprehension ...
BASE
15
Syntax-aware Data Augmentation for Neural Machine Translation ...
BASE
16
Cross-lingual Supervision Improves Unsupervised Neural Machine Translation ...
Abstract: Neural machine translation (NMT) is ineffective for zero-resource languages. Recent work exploring unsupervised neural machine translation (UNMT) with only monolingual data has achieved promising results, but large gaps remain between UNMT and NMT with parallel supervision. In this work, we introduce a multilingual unsupervised NMT framework that leverages weakly supervised signals from high-resource language pairs for zero-resource translation directions. More specifically, for an unsupervised language pair such as En-De, we can make full use of the information from a parallel En-Fr dataset to jointly train all the unsupervised translation directions in one model. The framework is based on multilingual models and requires no changes to standard unsupervised NMT. Empirical results demonstrate that it significantly improves translation quality by more than 3 BLEU points on six benchmark unsupervised translation directions.
Comment: NAACL Industry Track
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2004.03137
https://arxiv.org/abs/2004.03137
BASE
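The abstract above describes a single multilingual model trained jointly on a supervised high-resource pair (En-Fr) and the standard unsupervised objectives for a zero-resource pair (En-De). The following is a minimal sketch of how such a combined objective could be assembled, assuming a generic shared encoder-decoder with denoising auto-encoding and online back-translation; ToyModel, add_noise, and every loss term below are illustrative placeholders, not the paper's actual implementation (see https://arxiv.org/abs/2004.03137 for the real method).

    import random

    def add_noise(tokens, drop_prob=0.1):
        """Word-dropout noise for the denoising auto-encoding objective."""
        kept = [t for t in tokens if random.random() > drop_prob]
        return kept or tokens  # never return an empty sentence

    class ToyModel:
        """Stand-in for one shared multilingual encoder-decoder."""
        def nll(self, src, tgt):
            # Placeholder loss: token-level mismatch count.
            return sum(a != b for a, b in zip(src, tgt)) + abs(len(src) - len(tgt))
        def translate(self, tokens, src_lang, tgt_lang):
            # Placeholder translation: identity plus a target-language tag.
            return [f"<{tgt_lang}>"] + tokens

    def joint_step(model, en_fr_pair, en_mono, de_mono):
        """One training step mixing supervised and unsupervised signals."""
        en, fr = en_fr_pair
        # (a) Weak supervision from the high-resource parallel pair En-Fr.
        loss = model.nll(en, fr)
        # (b) Denoising auto-encoding on both monolingual sides.
        loss += model.nll(add_noise(en_mono), en_mono)
        loss += model.nll(add_noise(de_mono), de_mono)
        # (c) Online back-translation for the zero-resource directions En<->De.
        pseudo_de = model.translate(en_mono, "en", "de")
        loss += model.nll(pseudo_de, en_mono)
        pseudo_en = model.translate(de_mono, "de", "en")
        loss += model.nll(pseudo_en, de_mono)
        return loss

    model = ToyModel()
    step_loss = joint_step(model,
                           (["the", "cat"], ["le", "chat"]),
                           ["a", "dog"], ["ein", "hund"])
    print("combined loss:", step_loss)

Because all directions share one model, gradients from the supervised En-Fr term shape the same parameters used for the unsupervised En-De directions, which is the weak cross-lingual supervision the abstract credits for the BLEU gains.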
17
Syntax Role for Neural Semantic Role Labeling ...
Li, Zuchao; Zhao, Hai; He, Shexia. - : arXiv, 2020
BASE
18
BURT: BERT-inspired Universal Representation from Learning Meaningful Segment ...
Li, Yian; Zhao, Hai. - : arXiv, 2020
BASE
19
Learning Universal Representations from Word to Sentence ...
Li, Yian; Zhao, Hai. - : arXiv, 2020
BASE
20
BURT: BERT-inspired Universal Representation from Twin Structure ...
Li, Yian; Zhao, Hai. - : arXiv, 2020
BASE


Hits by source type:
Catalogues: 0
Bibliographies: 1
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 51