
Search in the Catalogues and Directories

Page: 1 2 3
Hits 1 – 20 of 52

1. Semantics-Preserved Distortion for Personal Privacy Protection
   Peng, Letian; Li, Zuchao; Zhao, Hai. arXiv, 2022
2. Pre-training Universal Language Representation
   Li, Yian; Zhao, Hai. arXiv, 2021
3. Head-driven Phrase Structure Parsing in O($n^3$) Time Complexity
   Li, Zuchao; Zhou, Junru; Zhao, Hai. arXiv, 2021
4. Multi-tasking Dialogue Comprehension with Discourse Parsing
5. Structural Pre-training for Dialogue Comprehension
6. Smoothing Dialogue States for Open Conversational Machine Reading
7. Dialogue Graph Modeling for Conversational Machine Reading
8. Dialogue-oriented Pre-training
   Paper: https://www.aclanthology.org/2021.findings-acl.235
   Abstract: Pre-trained language models (PrLMs) have been shown to be powerful in enhancing a broad range of downstream tasks, including various dialogue-related ones. However, PrLMs are usually trained on general plain text with common language model (LM) training objectives, which cannot sufficiently capture dialogue-exclusive features under such a training setting, so there is an immediate need to fill the gap between a specific dialogue task and the LM task. As it is unlikely that huge dialogue corpora can be collected for dialogue-oriented pre-training, this paper proposes three strategies to simulate conversation features on general plain text. The proposed method differs from existing post-training methods in that it may yield a general-purpose PrLM not specialized to any particular task, while keeping the capability of learning dialogue-related features, including speaker awareness, continuity, and consistency. The resulting ...
   Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
   URL: https://dx.doi.org/10.48448/frsn-n538
   https://underline.io/lecture/26326-dialogue-oriented-pre-training
9. Pre-training Universal Language Representation
10. Unsupervised Neural Machine Translation with Universal Grammar
11. Seeking Common but Distinguishing Difference, A Joint Aspect-based Sentiment Analysis Model
12. Enhancing Language Generation with Effective Checkpoints of Pre-trained Language Model
13. Syntax Role for Neural Semantic Role Labeling
14. Tracing Origins: Coreference-aware Machine Reading Comprehension
15. Syntax-aware Data Augmentation for Neural Machine Translation
16. Cross-lingual Supervision Improves Unsupervised Neural Machine Translation
17. Syntax Role for Neural Semantic Role Labeling
    Li, Zuchao; Zhao, Hai; He, Shexia. arXiv, 2020
18. BURT: BERT-inspired Universal Representation from Learning Meaningful Segment
    Li, Yian; Zhao, Hai. arXiv, 2020
19. Learning Universal Representations from Word to Sentence
    Li, Yian; Zhao, Hai. arXiv, 2020
20. BURT: BERT-inspired Universal Representation from Twin Structure
    Li, Yian; Zhao, Hai. arXiv, 2020


Hits by source: Catalogues 0; Bibliographies 1; Linked Open Data catalogues 0; Online resources 0; Open access documents 51
© 2013 - 2024 Lin|gu|is|tik