Catalogue search
Refine your search:
Keyword:
Computational Linguistics (7)
Condensed Matter Physics (7)
Deep Learning (7)
Electromagnetism (7)
FOS Physical sciences (7)
Information and Knowledge Engineering (7)
Neural Network (7)
Semantics (7)
Creator / Publisher:
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021 (7)
Zhou, Jie (7)
Meng, Fandong (4)
Chen, Yufeng (2)
Li, Peng (2)
Lin, Yankai (2)
Liu, Zhiyuan (2)
Xu, Jia'an (2)
Han, Xu (1)
Hou, Lei (1)
Type:
Article (7)
BLLDB-Access:
free (7)
subject to license (0)
Hits 1 – 7 of 7
1. Marginal Utility Diminishes: Exploring the Minimum Knowledge for BERT Knowledge Distillation
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Meng, Fandong; Lin, Zheng. - Underline Science Inc., 2021 (BASE)
2. CLEVE: Contrastive Pre-training for Event Extraction
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Han, Xu; Hou, Lei. - Underline Science Inc., 2021 (BASE)
3. Rethinking Stealthiness of Backdoor Attack against NLP Models
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Li, Peng; Lin, Yankai. - Underline Science Inc., 2021 (BASE)
4. Prevent the Language Model from being Overconfident in Neural Machine Translation
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Meng, Fandong; Liu, Yijin. - Underline Science Inc., 2021 (BASE)
5. KACC: A Multi-task Benchmark for Knowledge Abstraction, Concretization and Completion
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Hu, Shengding; Liu, Zhiyuan. - Underline Science Inc., 2021 (BASE)
6. Modeling Bilingual Conversational Characteristics for Neural Chat Translation
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Meng, Fandong; Chen, Yufeng. - Underline Science Inc., 2021 (BASE)
7. Target-oriented Fine-tuning for Zero-Resource Named Entity Recognition
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Meng, Fandong; Chen, Yufeng; Xu, Jia'an; Zhang, Ying; Zhou, Jie. - Underline Science Inc., 2021
Abstract:
Read the paper: https://www.aclanthology.org/2021.findings-acl.140
Zero-resource named entity recognition (NER) severely suffers from data scarcity in a specific domain or language. Most studies on zero-resource NER transfer knowledge from various data by fine-tuning on different auxiliary tasks. However, how to properly select training data and fine-tuning tasks is still an open problem. In this paper, we tackle the problem by transferring knowledge from three aspects, i.e., domain, language and task, and strengthening connections among them. Specifically, we propose four practical guidelines to guide knowledge transfer and task fine-tuning. Based on these guidelines, we design a target-oriented fine-tuning (TOF) framework to exploit various data from three aspects in a unified training manner. Experimental results on six benchmarks show that our method yields consistent improvements over baselines in both cross-domain and cross-lingual scenarios. Particularly, we achieve new state-of-the-art ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL:
https://dx.doi.org/10.48448/04ry-9y67
https://underline.io/lecture/26231-target-oriented-fine-tuning-for-zero-resource-named-entity-recognition
BASE
Sources: all 7 hits come from the BASE open access documents collection; every other connected catalogue, bibliography, and directory returned 0 hits.
© 2013 - 2024 Lin|gu|is|tik