2. Generalized Shortest-Paths Encoders for AMR-to-Text Generation ... (BASE)
3. Unsupervised Bilingual Lexicon Induction Across Writing Systems ... (BASE)
4. Semantic Neural Machine Translation Using AMR (BASE)
   In: Transactions of the Association for Computational Linguistics, Vol 7, pp. 19-31 (2019)
|
8. Addressing the Data Sparsity Issue in Neural AMR Parsing ... (BASE)
|
9. Sense Embedding Learning for Word Sense Induction ... (BASE)

   Abstract: Conventional word sense induction (WSI) methods usually represent each instance with discrete linguistic features or co-occurrence features, and train a model for each polysemous word individually. In this work, we propose to learn sense embeddings for the WSI task. In the training stage, our method induces several sense centroids (embeddings) for each polysemous word. In the testing stage, our method represents each instance as a contextual vector and induces its sense by finding the nearest sense centroid in the embedding space. The advantages of our method are (1) distributed sense vectors are taken as the knowledge representations, which are trained discriminatively and usually perform better than traditional count-based distributional models, and (2) a general model for the whole vocabulary is jointly trained to induce sense centroids under the multitask learning framework. Evaluated on the SemEval-2010 WSI dataset, our method outperforms all participants and most of the recent state-of-the-art ... (6 pages, no figures; in *SEM 2016 ...)

   Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences

   URLs: https://arxiv.org/abs/1606.05409 ; https://dx.doi.org/10.48550/arxiv.1606.05409
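The testing-stage procedure in the abstract above (represent each instance as a contextual vector, then assign the sense of the nearest sense centroid) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the context vector here is a plain average of context word embeddings, and all vectors, sense labels (`bank#1-geo`, `bank#2-fin`), and the cosine-similarity choice are hypothetical assumptions for the example.

```python
# Hedged sketch of nearest-centroid sense induction (testing stage),
# assuming pre-trained word vectors and sense centroids already exist.
import numpy as np

def context_vector(context_words, word_vecs, dim=4):
    """Represent an instance as the average of its context word embeddings."""
    vecs = [word_vecs[w] for w in context_words if w in word_vecs]
    if not vecs:
        return np.zeros(dim)
    return np.mean(vecs, axis=0)

def induce_sense(context_words, sense_centroids, word_vecs):
    """Assign the sense whose centroid is nearest (by cosine similarity)
    to the instance's contextual vector."""
    v = context_vector(context_words, word_vecs)
    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b) / denom if denom else 0.0
    return max(sense_centroids, key=lambda s: cosine(v, sense_centroids[s]))

# Toy example: two hypothetical senses of "bank" (vectors are made up,
# not taken from the paper or any trained model).
word_vecs = {
    "river": np.array([1.0, 0.0, 0.0, 0.0]),
    "water": np.array([0.9, 0.1, 0.0, 0.0]),
    "money": np.array([0.0, 0.0, 1.0, 0.0]),
    "loan":  np.array([0.0, 0.0, 0.9, 0.1]),
}
sense_centroids = {
    "bank#1-geo": np.array([0.95, 0.05, 0.0, 0.0]),
    "bank#2-fin": np.array([0.0, 0.0, 0.95, 0.05]),
}
print(induce_sense(["river", "water"], sense_centroids, word_vecs))  # bank#1-geo
```

In the paper's method the centroids come from the discriminatively trained sense embeddings; here they are fixed toy vectors so the nearest-centroid decision step can be seen in isolation.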
14. Simultaneous Word-Morpheme Alignment for Statistical Machine Translation ... (BASE)
|
15. Simultaneous Word-Morpheme Alignment for Statistical Machine Translation ... (BASE)
|
16. Using latent information for natural language processing tasks (BASE)
|
18. On word alignment models for statistical machine translation (BASE)