
Search in the Catalogues and Directories

Hits 1 – 16 of 16

1. Charippus heishiding Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
2. Charippus bukittimah Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
3. Charippus kubah Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
4. Charippus minotaurus Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
5. Charippus minotaurus Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
6. Charippus kubah Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
7. Charippus bukittimah Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
8. Charippus heishiding Yu, Maddison & Zhang 2022, sp. nov. ... (BASE)
9. ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding ...
NAACL 2021; Li, Yu-Kun; Xiao, Dongling; Zhang, Han. - : Underline Science Inc., 2021
Abstract: Read the paper at the following link: https://www.aclweb.org/anthology/2021.naacl-main.136/ Coarse-grained linguistic information, such as named entities or phrases, facilitates adequate representation learning in pre-training. Previous works mainly focus on extending the objective of BERT's Masked Language Modeling (MLM) from masking individual tokens to contiguous sequences of n tokens. We argue that such a contiguous masking method neglects to model the intra-dependencies and inter-relations of coarse-grained linguistic information. As an alternative, we propose ERNIE-Gram, an explicit n-gram masking method to enhance the integration of coarse-grained information into pre-training. In ERNIE-Gram, n-grams are masked and predicted directly using explicit n-gram identities rather than contiguous sequences of n tokens. Furthermore, ERNIE-Gram employs a generator model to sample plausible n-gram identities as optional n-gram masks and predict them in both coarse-grained and fine-grained manners to ...
Keyword: Artificial Intelligence; Computer Science and Engineering; Intelligent System; Machine Learning; Natural Language Processing
URL: https://dx.doi.org/10.48448/hqjf-mv13
https://underline.io/lecture/19941-ernie-gram-pre-training-with-explicitly-n-gram-masked-language-modeling-for-natural-language-understanding
(BASE)
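The masking contrast that the abstract describes can be sketched in a few lines. This is a toy illustration only, not the paper's implementation: `NGRAM_VOCAB`, the function names, and the `[MASK]` convention are assumptions made here to show the difference between masking n contiguous tokens (each predicted separately) and masking one explicit n-gram identity (predicted as a single coarse-grained unit).

```python
# Toy set of "explicit n-gram identities" (assumed for illustration;
# ERNIE-Gram derives its n-gram lexicon from the pretraining corpus).
NGRAM_VOCAB = {("natural", "language"), ("machine", "learning")}

def mask_contiguous(tokens, start, n, mask="[MASK]"):
    """Span-style masking: n contiguous tokens become n [MASK] slots,
    each later predicted as a separate fine-grained token."""
    out = list(tokens)
    for i in range(start, start + n):
        out[i] = mask
    return out

def mask_explicit_ngrams(tokens, mask="[MASK]"):
    """Explicit n-gram masking: a known n-gram collapses to ONE [MASK]
    slot whose prediction target is the n-gram identity itself."""
    out, targets, i = [], [], 0
    while i < len(tokens):
        bigram = tuple(tokens[i:i + 2])
        if bigram in NGRAM_VOCAB:
            out.append(mask)        # one slot for the whole n-gram
            targets.append(bigram)  # coarse-grained prediction target
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out, targets

tokens = ["we", "study", "natural", "language", "understanding"]
print(mask_contiguous(tokens, 2, 2))
print(mask_explicit_ngrams(tokens))
```

In the contiguous case the model must reconstruct "natural" and "language" token by token; in the explicit case it predicts the single identity ("natural", "language"), which is the intra-dependency modeling the abstract argues for.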
10. ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding ...
Xiao, Dongling; Li, Yu-Kun; Zhang, Han. - : arXiv, 2020 (BASE)
11. Some Interval Neutrosophic Linguistic Maclaurin Symmetric Mean Operators And Their Application In Multiple Attribute Decision Making ...
Yushui Geng; Xingang Wang; Xuemei Li. - : Zenodo, 2018 (BASE)
12. Some Interval Neutrosophic Linguistic Maclaurin Symmetric Mean Operators And Their Application In Multiple Attribute Decision Making ...
Yushui Geng; Xingang Wang; Xuemei Li. - : Zenodo, 2018 (BASE)
13. Semantic retrieval of trademarks based on conceptual similarity (BASE)
14. Semantic retrieval of trademarks based on conceptual similarity (BASE)
15. Influence of linguistic context and working memory on auditory comprehension in young and older adults with aphasia
Yu, Kun. - : East Carolina University, 2010 (BASE)
16. INFLUENCE OF LINGUISTIC CONTEXT AND WORKING MEMORY ON AUDITORY COMPREHENSION IN YOUNG AND OLDER ADULTS WITH APHASIA (BASE)

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 16
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings