
Search in the Catalogues and Directories

Hits 1 – 14 of 14

1. Cross-lingual Representation Learning for Natural Language Processing
Ahmad, Wasi Uddin. - eScholarship, University of California, 2021
2. CoDesc: A Large Code-Description Parallel Dataset ...
3. BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla ...
4. Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training ...
5. CrossSum: Beyond English-Centric Cross-Lingual Abstractive Text Summarization for 1500+ Language Pairs ...
6. Intent Classification and Slot Filling for Privacy Policies ...
7. Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training ...
8. Syntax-augmented Multilingual BERT for Cross-lingual Transfer ...
9. Syntax-augmented Multilingual BERT for Cross-lingual Transfer ...
10. GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction ...
11. Word and Sentence Embedding Tools to Measure Semantic Similarity of Gene Ontology Terms by Their Definitions
In: Journal of Computational Biology: a journal of computational molecular cell biology, vol. 26, iss. 1 (2019)
12. Word and Sentence Embedding Tools to Measure Semantic Similarity of Gene Ontology Terms by Their Definitions
Duong, Dat; Ahmad, Wasi Uddin; Eskin, Eleazar. - Mary Ann Liebert, Inc., publishers, 2019
13. Word and Sentence Embedding Tools to Measure Semantic Similarity of Gene Ontology Terms by Their Definitions
In: Duong, Dat; Ahmad, Wasi Uddin; Eskin, Eleazar; Chang, Kai-Wei; Li, Jingyi Jessica (2018). Word and Sentence Embedding Tools to Measure Semantic Similarity of Gene Ontology Terms by Their Definitions. Journal of Computational Biology, 26(1), 38-52. doi:10.1089/cmb.2018.0093. UCLA: retrieved from http://www.escholarship.org/uc/item/3j1294zk (2018)
14. Learning Robust, Transferable Sentence Representations for Text Classification ...
Abstract: Although deep recurrent neural networks (RNNs) demonstrate strong performance in text classification, training RNN models is often expensive and requires an extensive collection of annotated data that may not be available. To overcome this data limitation, existing approaches leverage either pre-trained word embeddings or sentence representations to lift the burden of training RNNs from scratch. In this paper, we show that jointly learning sentence representations from multiple text classification tasks and combining them with pre-trained word-level and sentence-level encoders results in robust sentence representations that are useful for transfer learning. Extensive experiments and analyses using a wide range of transfer and linguistic tasks endorse the effectiveness of our approach. (arXiv admin note: substantial text overlap with arXiv:1804.07911)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.1810.00681
https://arxiv.org/abs/1810.00681
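
The abstract above describes jointly learning a sentence representation across several text classification tasks and combining it with pre-trained encoders. Purely as an illustration of that general multi-task pattern, and not the paper's actual architecture, the sketch below wires a shared encoder to one classification head per task; all module names, dimensions, and task names are invented for the example.

# Illustrative multi-task sentence encoder (PyTorch): a shared BiLSTM encoder
# feeds task-specific classification heads, so the sentence representation is
# trained jointly on several text classification tasks.
import torch
import torch.nn as nn

class SharedSentenceEncoder(nn.Module):
    """Embeds tokens and max-pools a BiLSTM over them into one sentence vector."""
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)      # (batch, seq, embed_dim)
        outputs, _ = self.lstm(embedded)          # (batch, seq, 2 * hidden_dim)
        return outputs.max(dim=1).values          # max-pool over time steps

class MultiTaskClassifier(nn.Module):
    """One shared encoder, one linear head per classification task."""
    def __init__(self, vocab_size, task_num_labels, hidden_dim=256):
        super().__init__()
        self.encoder = SharedSentenceEncoder(vocab_size, hidden_dim=hidden_dim)
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n_labels)
            for task, n_labels in task_num_labels.items()
        })

    def forward(self, token_ids, task):
        sentence_vec = self.encoder(token_ids)    # shared across all tasks
        return self.heads[task](sentence_vec)     # task-specific logits

# Usage: alternate batches between tasks so only the heads are task-specific.
model = MultiTaskClassifier(vocab_size=10000,
                            task_num_labels={"sentiment": 2, "topic": 5})
batch = torch.randint(1, 10000, (8, 20))          # fake token ids
logits = model(batch, task="sentiment")           # shape (8, 2)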

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 14