
Search in the Catalogues and Directories

Hits 1 – 20 of 33

1. Finding Structural Knowledge in Multimodal-BERT ...
2. Modeling Coreference Relations in Visual Dialog ...
3. Causal Direction of Data Collection Matters: Implications of Causal and Anticausal Learning for NLP
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
4. Classifying Dyads for Militarized Conflict Analysis
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
5. Efficient Sampling of Dependency Structure
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
6. Searching for More Efficient Dynamic Programs
In: Findings of the Association for Computational Linguistics: EMNLP 2021 (2021)
7. “Let Your Characters Tell Their Story”: A Dataset for Character-Centric Narrative Understanding
In: Findings of the Association for Computational Linguistics: EMNLP 2021 (2021)
8. A Bayesian Framework for Information-Theoretic Probing
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
9. Improving Dialogue State Tracking with Turn-based Loss Function and Sequential Data Augmentation
Abstract: While state-of-the-art Dialogue State Tracking (DST) models show promising results, all of them rely on a traditional cross-entropy loss function during training, which may not be optimal for improving joint goal accuracy. Although several recent approaches augment the training set by copying user utterances and replacing the real slot values with other possible or similar values, they are not effective at improving the performance of existing DST models. To address these challenges, we propose a Turn-based Loss Function (TLF) that penalises incorrect slot-value predictions more heavily in early turns than in later turns, in order to improve joint goal accuracy. We also propose a simple but effective Sequential Data Augmentation (SDA) algorithm that generates more complex user utterances and system responses to train existing DST models more effectively. Experimental results on two standard DST benchmark collections demonstrate that our proposed TLF and SDA techniques significantly improve the effectiveness of the state-of-the-art DST model, yielding an approximately 7-8% relative reduction in error and a new state-of-the-art joint goal accuracy of 59.50 and 54.90 on MultiWOZ 2.1 and MultiWOZ 2.2, respectively.
(A rough illustrative sketch of the turn-weighting idea appears after the hit list.)
URL: http://eprints.gla.ac.uk/256991/
https://aclanthology.org/2021.findings-emnlp.144
http://eprints.gla.ac.uk/256991/2/256991.pdf
10. AWESSOME: An unsupervised sentiment intensity scoring framework using neural word embeddings
Htait, Amal; Azzopardi, Leif. - : Springer, 2021
11. Robust fragment-based framework for cross-lingual sentence retrieval
In: Findings of the Association for Computational Linguistics: EMNLP 2021, pp. 935-944 (2021)
12. Evaluating multilingual text encoders for unsupervised cross-lingual retrieval
13. Come hither or go away? Recognising pre-electoral coalition signals in the news
Rehbein, Ines; Ponzetto, Simone Paolo; Adendorf, Anna. - : Association for Computational Linguistics, 2021
14. LIIR at SemEval-2020 Task 12: A Cross-Lingual Augmentation Approach for Multilingual Offensive Language Identification ...
15. Autoregressive Reasoning over Chains of Facts with Transformers ...
16. Rethinking summarization and storytelling for modern social multimedia
In: Rudinac, Stevan; Chua, Tat-Seng; Diaz-Ferreyra, Nicolas; Friedland, Gerald; Gornostaja, Tatjana; Huet, Benoit; Kaptein, Rianne; Lindén, Krister; Moens, Marie-Francine; Peltonen, Jaakko; Redi, Miriam; Schedl, Markus; Shamma, David A.; Smeaton, Alan F.; Xie, Lexing: Rethinking summarization and storytelling for modern social multimedia. The 24th International Conference on Multimedia Modeling (MMM2018), 5-7 Feb 2018, Bangkok, Thailand. ISBN 978-3-319-73599-3 (2018)
17. Word-Level Loss Extensions for Neural Temporal Relation Classification ...
18. A deep learning approach to bilingual lexicon induction in the biomedical domain ...
Heyman, Geert; Vulić, Ivan; Moens, Marie-Francine. - : Apollo - University of Cambridge Repository, 2018
19. A deep learning approach to bilingual lexicon induction in the biomedical domain. ...
Heyman, Geert; Vulić, Ivan; Moens, Marie-Francine. - : Apollo - University of Cambridge Repository, 2018
20. A deep learning approach to bilingual lexicon induction in the biomedical domain.
Heyman, Geert; Vulić, Ivan; Moens, Marie-Francine. - : Springer Science and Business Media LLC / BMC Bioinformatics, 2018
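
The Turn-based Loss Function described in hit 9 weights slot-value errors more heavily in early dialogue turns than in later ones. The abstract does not give the exact weighting scheme, so the following is only a minimal sketch of that turn-weighting idea in PyTorch; the function name turn_weighted_loss, the exponential decay schedule, and the assumed tensor shapes are illustrative assumptions rather than the paper's formulation.

import torch
import torch.nn.functional as F

def turn_weighted_loss(logits, targets, turn_ids, decay=0.9):
    # logits: (N, num_values) slot-value scores; targets: (N,) gold value indices;
    # turn_ids: (N,) zero-based turn index of each prediction within its dialogue.
    per_example = F.cross_entropy(logits, targets, reduction="none")
    # Hypothetical schedule: weight 1.0 at turn 0, decaying geometrically per turn,
    # so early-turn mistakes contribute more to the loss than late-turn ones.
    weights = decay ** turn_ids.float()
    return (weights * per_example).sum() / weights.sum()

In a DST training loop such a function would simply replace the usual unweighted F.cross_entropy call; a real implementation would need to follow the weighting actually defined in the paper.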

