1. ANLIzing the Adversarial Natural Language Inference Dataset
   In: Proceedings of the Society for Computation in Linguistics (2022)
   Source: BASE

2. Automatic Fact-Checking with Document-level Annotations using BERT and Multiple Instance Learning

3. FastIF: Scalable Influence Functions for Efficient Model Interpretation and Debugging

4. Open Aspect Target Sentiment Classification with Natural Language Prompts

5. Does External Knowledge Help Explainable Natural Language Inference? Automatic Evaluation vs. Human Ratings

6. ContractNLI: A Dataset for Document-level Natural Language Inference for Contracts

7. IndoNLI: A Natural Language Inference Dataset for Indonesian

8. Investigating the Effect of Natural Language Explanations on Out-of-Distribution Generalization in Few-shot NLI

9. Don't Discard All the Biased Instances: Investigating a Core Assumption in Dataset Bias Mitigation Techniques

11. Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics

12. Scheduled Sampling Based on Decoding Steps for Neural Machine Translation

13. Pairwise Supervised Contrastive Learning of Sentence Representations

14. Finding a Balanced Degree of Automation for Summary Evaluation

15. A Multilingual Benchmark for Probing Negation-Awareness with Minimal Pairs

16. Universal Sentence Representation Learning with Conditional Masked Language Model

17. BARThez: a Skilled Pretrained French Sequence-to-Sequence Model
    Anthology paper link: https://aclanthology.org/2021.emnlp-main.740/
    Abstract: Inductive transfer learning has taken the entire NLP field by storm, with models such as BERT and BART setting a new state of the art on countless NLU tasks. However, most of the available models and research have been conducted for English. In this work, we introduce BARThez, the first large-scale pretrained seq2seq model for French. Being based on BART, BARThez is particularly well suited for generative tasks. We evaluate BARThez on five discriminative tasks from the FLUE benchmark and two generative tasks from a novel summarization dataset, OrangeSum, that we created for this research. We show BARThez to be very competitive with state-of-the-art BERT-based French language models such as CamemBERT and FlauBERT. We also continue the pretraining of a multilingual BART on BARThez's corpus, and show our resulting model, mBARThez, to significantly boost BARThez's generative performance.
    Keywords: Computational Linguistics; Language Models; Machine Learning; Machine Learning and Data Mining; Natural Language Inference; Natural Language Processing
    URLs: https://dx.doi.org/10.48448/brqh-1743 ; https://underline.io/lecture/37674-barthez-a-skilled-pretrained-french-sequence-to-sequence-model

18. Nearest Neighbour Few-Shot Learning for Cross-lingual Classification

20. Hy-NLI: a Hybrid system for state-of-the-art Natural Language Inference