
Search in the Catalogues and Directories

Hits 1 – 20 of 20

1
Enhancing Cross-lingual Prompting with Mask Token Augmentation ...
Zhou, Meng; Li, Xin; Jiang, Yue. - : arXiv, 2022
BASE
2
Cross-lingual Aspect-based Sentiment Analysis with Aspect Term Code-Switching ...
BASE
3
Towards Multi-Sense Cross-Lingual Alignment of Contextual Embeddings ...
BASE
4
Knowledge Based Multilingual Language Model ...
Liu, Linlin; Li, Xin; He, Ruidan. - : arXiv, 2021
BASE
5
MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER ...
Zhou, Ran; Li, Xin; He, Ruidan. - : arXiv, 2021
BASE
6
Multilingual AMR Parsing with Noisy Knowledge Distillation ...
BASE
7
GlobalWoZ: Globalizing MultiWoZ to Develop Multilingual Task-Oriented Dialogue Systems ...
BASE
8
Multi-perspective Coherent Reasoning for Helpfulness Prediction of Multimodal Reviews ...
BASE
9
On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation ...
Read paper: https://www.aclanthology.org/2021.acl-long.172
Abstract: Adapter-based tuning has recently arisen as an alternative to fine-tuning. It works by adding light-weight adapter modules to a pretrained language model (PrLM) and updating only the parameters of the adapter modules when learning a downstream task. As such, it adds only a few trainable parameters per new task, allowing a high degree of parameter sharing. Prior studies have shown that adapter-based tuning often achieves results comparable to fine-tuning. However, existing work focuses only on the parameter-efficiency of adapter-based tuning and lacks further investigation of its effectiveness. In this paper, we study the latter. We first show that adapter-based tuning mitigates forgetting better than fine-tuning, since it yields representations that deviate less from those generated by the initial PrLM. We then empirically compare the two tuning methods on several downstream NLP tasks and settings. We demonstrate that 1) ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/hke2-0g48
https://underline.io/lecture/25492-on-the-effectiveness-of-adapter-based-tuning-for-pretrained-language-model-adaptation
BASE
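The abstract of hit 9 above describes the adapter-based tuning setup: light-weight adapter modules are added to a pretrained language model and only the adapter parameters are trained on the downstream task. Below is a minimal PyTorch sketch of that setup; the bottleneck design, the dimensions, and the "adapter" naming convention are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn as nn

class Adapter(nn.Module):
    # Light-weight bottleneck adapter: down-project, nonlinearity,
    # up-project, plus a residual connection.
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        # Near-zero init of the up-projection keeps the adapter close to
        # an identity map at the start of training (a common heuristic).
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

def freeze_prlm_except_adapters(model: nn.Module) -> None:
    # Train only adapter parameters; all pretrained weights stay frozen,
    # so each new task adds only a few trainable parameters.
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name.lower()

Because the pretrained weights are untouched and the adapters start near identity, downstream representations deviate less from those of the initial PrLM, which is the mitigation-of-forgetting effect the abstract reports.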
10
Towards Generative Aspect-Based Sentiment Analysis ...
BASE
11
Argument Pair Extraction via Attention-guided Multi-Layer Multi-Cross Encoding ...
BASE
12
Learning Span-Level Interactions for Aspect Sentiment Triplet Extraction ...
BASE
13
MulDA: A Multilingual Data Augmentation Framework for Low-Resource Cross-Lingual NER ...
BASE
14
Unsupervised Cross-lingual Adaptation for Sequence Tagging and Beyond ...
Li, Xin; Bing, Lidong; Zhang, Wenxuan. - : arXiv, 2020
BASE
15
Dynamic Topic Tracker for KB-to-Text Generation
BASE
16
Transferable End-to-End Aspect-based Sentiment Analysis with Selective Adversarial Learning ...
Li, Zheng; Li, Xin; Wei, Ying. - : arXiv, 2019
BASE
17
A knowledge regularized hierarchical approach for emotion cause analysis
Gui, Lin; Bing, Lidong; Xu, Ruifeng. - : Association for Computational Linguistics, 2019
BASE
18
Neural Rating Regression with Abstractive Tips Generation for Recommendation ...
Li, Piji; Wang, Zihao; Ren, Zhaochun. - : arXiv, 2017
BASE
19
Reader-Aware Multi-Document Summarization via Sparse Coding ...
Li, Piji; Bing, Lidong; Lam, Wai. - : arXiv, 2015
BASE
20
Abstractive Multi-Document Summarization via Phrase Selection and Merging ...
Bing, Lidong; Li, Piji; Liao, Yi. - : arXiv, 2015
BASE

Hit distribution: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 20 (all hits via BASE)
© 2013 – 2024 Lin|gu|is|tik