61. Towards Training Stronger Video Vision Transformers for EPIC-KITCHENS-100 Action Recognition ...
62. LICHEE: Improving Language Model Pre-training with Multi-grained Tokenization ...
63. Query-graph with Cross-gating Attention Model for Text-to-Audio Grounding ...
64. Congolese Swahili Machine Translation for Humanitarian Response ...
65. LAWDR: Language-Agnostic Weighted Document Representations from Pre-trained Models ...
66. FST: the FAIR Speech Translation System for the IWSLT21 Multilingual Shared Task ...
67. Pay Better Attention to Attention: Head Selection in Multilingual and Multi-Domain Sequence Modeling ...
68. Code Generation from Natural Language with Less Prior Knowledge and More Monolingual Data ...
69. UserAdapter: Few-Shot User Learning in Sentiment Analysis ...
70. CLICKER: A Computational LInguistics Classification Scheme for Educational Resources ...
71. What Truly Matters? Using Linguistic Cues for Analyzing the #BlackLivesMatter Movement and its Counter Protests: 2013 to 2020 ...
72. LAVT: Language-Aware Vision Transformer for Referring Image Segmentation ...
73. P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks ...
74. CONFIT: Toward Faithful Dialogue Summarization with Linguistically-Informed Contrastive Fine-tuning ...
75. Searching for Legal Documents at Paragraph Level: Automating Label Generation and Use of an Extended Attention Mask for Boosting Neural Models of Semantic Similarity ...
76. Alpha at SemEval-2021 Tasks 6: Transformer Based Propaganda Classification ...
77. The Authors Matter: Understanding and Mitigating Implicit Bias in Deep Text Classification ...
78. Revisiting Negation in Neural Machine Translation ...

Abstract: In this paper, we evaluate the translation of negation both automatically and manually, in English-German (EN-DE) and English-Chinese (EN-ZH). We show that the ability of neural machine translation (NMT) models to translate negation has improved with deeper and more advanced networks, although the performance varies between language pairs and translation directions. The accuracy of manual evaluation in EN-DE, DE-EN, EN-ZH, and ZH-EN is 95.7%, 94.8%, 93.4%, and 91.7%, respectively. In addition, we show that under-translation is the most significant error type in NMT, which contrasts with the more diverse error profile previously observed for statistical machine translation. To better understand the root of the under-translation of negation, we study the model's information flow and training data. While our information flow analysis does not reveal any deficiencies that could be used to detect or fix the under-translation of negation, we find that negation is often rephrased during ...

Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics

URL: https://underline.io/lecture/25803-revisiting-negation-in-neural-machine-translation
DOI: https://dx.doi.org/10.48448/x27r-fa09
80. From Discourse to Narrative: Knowledge Projection for Event Relation Extraction ...