
Search in the Catalogues and Directories

Hits 1 – 20 of 45 (page 1 of 3)

1. Topic Discovery via Latent Space Clustering of Pretrained Language Model Representations. Meng, Yu; Zhang, Yunyi; Huang, Jiaxin. arXiv, 2022.
2. Semantic pattern discovery in open information extraction. Chauhan, Aabhas. 2022.
3. Text mining at multiple granularity: leveraging subwords, words, phrases, and sentences.
4. Distantly-Supervised Named Entity Recognition with Noise-Robust Learning and Language Model Augmented Self-Training.
5. ChemNER: Fine-Grained Chemistry Named Entity Recognition with Ontology-Guided Distant Supervision.
6. Generation-Augmented Retrieval for Open-Domain Question Answering.
7. Few-Shot Named Entity Recognition: An Empirical Baseline Study.
8. Reader-Guided Passage Reranking for Open-Domain Question Answering.
9. The Future is not One-dimensional: Complex Event Schema Induction by Graph Modeling for Event Prediction.
10. Extract, Denoise and Enforce: Evaluating and Improving Concept Preservation for Text-to-Text Generation.
Abstract: Prior studies on text-to-text generation typically assume that the model can figure out what to attend to in the input and what to include in the output via seq2seq learning, with only the parallel training data and no additional guidance. However, it remains unclear whether current models can preserve important concepts in the source input, as seq2seq learning has no explicit focus on concepts, and commonly used evaluation metrics treat concepts as no more important than other tokens. In this paper, we present a systematic analysis that studies whether current seq2seq models, especially pre-trained language models, are good enough at preserving important input concepts, and to what extent explicitly guiding generation with the concepts as lexical constraints is beneficial. We answer the above questions by conducting extensive analytical experiments on four representative text-to-text generation tasks. Based on the observations, we then propose a simple yet effective framework to automatically ... (EMNLP 2021)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2104.08724
https://arxiv.org/abs/2104.08724
(A toy concept-recall sketch illustrating this idea follows the results list below.)
11. Extract, Denoise and Enforce: Evaluating and Improving Concept Preservation for Text-to-Text Generation.
12. Combating abuse on social media platforms using natural language processing. Seyler, Dominic. 2021.
13. Text Classification Using Label Names Only: A Language Model Self-Training Approach. Meng, Yu; Zhang, Yunyi; Huang, Jiaxin. arXiv, 2020.
14. COVID-19 Literature Knowledge Graph Construction and Drug Repurposing Report Generation. Wang, Qingyun; Li, Manling; Wang, Xuan. arXiv, 2020.
15. Constrained Abstractive Summarization: Preserving Factual Consistency with Constrained Generation. Mao, Yuning; Ren, Xiang; Ji, Heng. arXiv, 2020.
16. Guiding Corpus-based Set Expansion by Auxiliary Sets Generation and Co-Expansion. Huang, Jiaxin; Xie, Yiqing; Meng, Yu. arXiv, 2020.
17. Near-imperceptible Neural Linguistic Steganography via Self-Adjusting Arithmetic Coding. Shen, Jiaming; Ji, Heng; Han, Jiawei. arXiv, 2020.
18. Cold-start universal information extraction. Huang, Lifu. 2020.
19. Cross-lingual entity extraction and linking for 300 languages. Pan, Xiaoman. 2020.
20. Text cube: construction, summarization and mining. Tao, Fangbo. 2020.
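
The abstract of hit 10 argues that token-level evaluation metrics treat key input concepts no differently from any other token, so a model can drop an important concept with little penalty. As a purely illustrative aside (not the paper's framework), the following minimal Python sketch computes a naive concept-recall score for a generated output; the matching rule, concept list, and example strings are assumptions made up for this illustration.

# Minimal sketch, not the paper's actual method: a naive "concept recall"
# metric that counts how many source-side concept phrases reappear verbatim
# in the generated text. All names and example strings below are hypothetical.

def concept_recall(concepts, generated_text):
    """Return the fraction of concept phrases found (case-insensitively) in the output."""
    text = generated_text.lower()
    if not concepts:
        return 1.0
    kept = sum(1 for c in concepts if c.lower() in text)
    return kept / len(concepts)

if __name__ == "__main__":
    source_concepts = ["aspirin", "blood clots", "stroke"]   # hypothetical input concepts
    summary = "Aspirin may lower the risk of stroke."         # hypothetical model output
    print(f"concept recall: {concept_recall(source_concepts, summary):.2f}")  # 0.67
    # "blood clots" was dropped; token-overlap metrics such as ROUGE would barely
    # notice, which is the gap the abstract argues concept-aware evaluation and
    # lexically constrained generation should close.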


Hits by source: Catalogues 2; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 43.