
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 38

1. SyGNS: A Systematic Generalization Testbed Based on Natural Language Semantics ... (BASE)
2. Summarize-then-Answer: Generating Concise Explanations for Multi-hop Reading Comprehension ... (BASE)
3. SHAPE: Shifted Absolute Position Embedding for Transformers ... (BASE)
4. Incorporating Residual and Normalization Layers into Analysis of Masked Language Models ... (BASE)
5. Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution ... (BASE)
6. Exploring Methods for Generating Feedback Comments for Writing Learning ... (BASE)
7. Transformer-based Lexically Constrained Headline Generation ...
Abstract: This paper explores a variant of automatic headline generation in which the generated headline must include a given phrase, such as a company or product name. Previous Transformer-based methods generate a headline including the given phrase by providing the encoder with additional information about that phrase, but they cannot always guarantee that the phrase appears in the generated headline. Inspired by earlier RNN-based methods that generate token sequences backward and forward from the given phrase, we propose a simple Transformer-based method that is guaranteed to include the given phrase in a high-quality generated headline. We also consider a new headline generation strategy that takes advantage of the Transformer's controllable generation order. Our experiments with the Japanese News Corpus demonstrate that our methods, which are guaranteed to include the phrase in the generated headline, achieve ROUGE scores comparable to previous ...
Published at: EMNLP 2021
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2109.07080
DOI: https://dx.doi.org/10.48550/arxiv.2109.07080
Source: BASE
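The backward-and-forward decoding idea described in this abstract can be illustrated with a minimal toy sketch: start from the required phrase and greedily extend the headline leftward and rightward, so the phrase appears in the output by construction. The `left_next`/`right_next` transition tables below are hypothetical stand-ins for the real backward and forward Transformer decoders, not the paper's actual method.

```python
# Toy sketch of lexically constrained generation: the required phrase is the
# starting point, and tokens are greedily attached on both sides, so the
# phrase is included in the output by construction. The two transition
# tables are hypothetical stand-ins for backward/forward language models.

def constrained_headline(phrase, left_next, right_next, max_len=10):
    """Greedily extend `phrase` (a list of tokens) leftward and rightward."""
    tokens = list(phrase)
    # Extend to the left using the "backward model" until it emits <bos>.
    while len(tokens) < max_len:
        prev = left_next.get(tokens[0], "<bos>")
        if prev == "<bos>":
            break
        tokens.insert(0, prev)
    # Extend to the right using the "forward model" until it emits <eos>.
    while len(tokens) < max_len:
        nxt = right_next.get(tokens[-1], "<eos>")
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return " ".join(tokens)

# Hypothetical transition tables (a real system would score continuations
# with a Transformer instead of looking them up).
left_next = {"Acme": "maker", "maker": "chip", "chip": "<bos>"}
right_next = {"Q3": "profit", "profit": "rises", "rises": "<eos>"}

headline = constrained_headline(["Acme", "Q3"], left_next, right_next)
print(headline)  # chip maker Acme Q3 profit rises
```

Whatever the surrounding model chooses, the constraint phrase can never be dropped, which is the guarantee the abstract contrasts with encoder-side soft constraints.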
8. Transformer-based Lexically Constrained Headline Generation ... (BASE)
9. Topicalization in Language Models: A Case Study on Japanese ... (BASE)
10. Lower Perplexity is Not Always Human-Like ... (BASE)
11. Lower Perplexity is Not Always Human-Like ... (BASE)
12. An Empirical Study of Contextual Data Augmentation for Japanese Zero Anaphora Resolution ... (BASE)
13. PheMT: A Phenomenon-wise Dataset for Machine Translation Robustness on User-Generated Contents ...
    Fujii, Ryo; Mita, Masato; Abe, Kaori. - : arXiv, 2020 (BASE)
14. Seeing the world through text: Evaluating image descriptions for commonsense reasoning in machine reading comprehension ... (BASE)
15. Language Models as an Alternative Evaluator of Word Order Hypotheses: A Case Study in Japanese ... (BASE)
16. Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction ... (BASE)
17. Attention is Not Only a Weight: Analyzing Transformers with Vector Norms ... (BASE)
18. Filtering Noisy Dialogue Corpora by Connectivity and Content Relatedness ...
    Akama, Reina; Yokoi, Sho; Suzuki, Jun. - : arXiv, 2020 (BASE)
19. Modeling Event Salience in Narratives via Barthes' Cardinal Functions ... (BASE)
20. Do Neural Models Learn Systematicity of Monotonicity Inference in Natural Language? ... (BASE)


Hit counts per source:
Catalogues: 2, 0, 1, 0, 2, 0, 0
Bibliographies: 3, 0, 0, 0, 0, 0, 0, 0, 0
Linked Open Data catalogues: 0
Online resources: 0, 0, 0, 0
Open access documents: 32, 0, 0, 0, 0
© 2013 - 2024 Lin|gu|is|tik