1. Contextual Semantic-Guided Entity-Centric GCN for Relation Extraction
   In: Mathematics, Volume 10, Issue 8, Pages 1344 (2022)
   Source: BASE
2. Virtual Reality-Integrated Immersion-Based Teaching to English Language Learning Outcome
   In: Front Psychol (2022)
3. DialogSum: A Real-Life Scenario Dialogue Summarization Dataset ...
4. Transfer Learning for Sequence Generation: from Single-source to Multi-source ...
   Abstract: Multi-source sequence generation (MSG) is an important kind of sequence generation task that takes multiple sources as input, covering automatic post-editing, multi-source translation, multi-document summarization, etc. Because MSG tasks suffer from data scarcity, and recent pretrained models have proven effective for low-resource downstream tasks, transferring pretrained sequence-to-sequence models to MSG tasks is essential. Although directly finetuning pretrained models on MSG tasks, concatenating the multiple sources into a single long sequence, is regarded as a simple transfer method, we conjecture that direct finetuning leads to catastrophic forgetting and that relying solely on pretrained self-attention layers to capture cross-source information is not sufficient. Therefore, we propose a two-stage finetuning method to alleviate the pretrain-finetune discrepancy and introduce a novel MSG ...
   Read paper: https://www.aclanthology.org/2021.acl-long.446
   URL: https://dx.doi.org/10.48448/hc3e-sm74
   URL: https://underline.io/lecture/25885-transfer-learning-for-sequence-generation-from-single-source-to-multi-source
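The abstract of entry 4 describes a direct-finetuning baseline for MSG: concatenate the multiple sources into one long input sequence and feed it to a pretrained seq2seq model. A minimal sketch of that concatenation step is below; the separator token and the example texts are assumptions for illustration, not taken from the paper.

```python
# Sketch of the "concatenate multiple sources into a single long sequence"
# baseline described in the abstract. The separator token below is a
# hypothetical choice; real pretrained models define their own special tokens.

SEP = " </s> "  # assumed source separator


def concat_sources(sources: list[str]) -> str:
    """Join multiple source texts into one input sequence for a seq2seq model."""
    return SEP.join(s.strip() for s in sources)


# e.g. automatic post-editing takes the draft translation plus the original:
single_input = concat_sources([
    "Draft translation to post-edit.",
    "Original source sentence.",
])
print(single_input)
```

The paper argues this naive concatenation leaves cross-source interaction entirely to the pretrained self-attention layers, which motivates its two-stage finetuning alternative.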
6. Segment, Mask, and Predict: Augmenting Chinese Word Segmentation with Self-Supervision ...
7. Learning to Selectively Learn for Weakly-supervised Paraphrase Generation ...
8. Statistically significant detection of semantic shifts using contextual word embeddings ...
10. Leveraging Word-Formation Knowledge for Chinese Word Sense Disambiguation ...
11. Tapping into non-English-language science for the conservation of global biodiversity
    In: PLoS Biol (2021)
16. Overcoming Barriers to Cross-cultural Cooperation in AI Ethics and Governance
17. Fostering EFL/ESL Students’ Language Achievement: The Role of Teachers’ Enthusiasm and Classroom Enjoyment
    In: Front Psychol (2021)
18. Overview of AMALGUM – Large Silver Quality Annotations across English Genres
    In: Proceedings of the Society for Computation in Linguistics (2021)
19. Please Mind the Root: Decoding Arborescences for Dependency Parsing
    In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) (2020)
20. Measuring the Similarity of Grammatical Gender Systems by Comparing Partitions
    In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) (2020)