
Search in the Catalogues and Directories

Hits 1 – 6 of 6

1. Summarize-then-Answer: Generating Concise Explanations for Multi-hop Reading Comprehension ... (BASE)
2. SHAPE: Shifted Absolute Position Embedding for Transformers ... (BASE)
3. Incorporating Residual and Normalization Layers into Analysis of Masked Language Models ... (BASE)
4. Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution ... (BASE)
Abstract: Masked language models (MLMs) have contributed to drastic performance improvements in zero anaphora resolution (ZAR). To further improve this approach, this study makes two proposals: first, a new pretraining task that trains MLMs on anaphoric relations with explicit supervision; second, a new finetuning method that remedies a notorious issue, the pretrain-finetune discrepancy. Experiments on Japanese ZAR demonstrate that the two proposals boost state-of-the-art performance, and a detailed analysis provides new insights into the remaining challenges.
Keywords: Computational Linguistics; Language Models; Machine Learning; Machine Learning and Data Mining; Natural Language Processing
Anthology paper link: https://aclanthology.org/2021.emnlp-main.308/
DOI: https://dx.doi.org/10.48448/chh9-q802
Lecture recording: https://underline.io/lecture/37343-pseudo-zero-pronoun-resolution-improves-zero-anaphora-resolution
5. Exploring Methods for Generating Feedback Comments for Writing Learning ... (BASE)
6. Transformer-based Lexically Constrained Headline Generation ... (BASE)

Facets: Open access documents: 6; Catalogues, Bibliographies, Linked Open Data catalogues, and Online resources: 0 each.