2 | Jointly optimizing word representations for lexical and sentential tasks with the C-PHRASE model

Abstract:
Paper presented at: 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, held 26-31 July 2015 in Beijing, China. ; We introduce C-PHRASE, a distributional semantic model that learns word representations by optimizing context prediction for phrases at all levels in a syntactic tree, from single words to full sentences. C-PHRASE outperforms the state-of-the-art C-BOW model on a variety of lexical tasks. Moreover, since C-PHRASE word vectors are induced through a compositional learning objective (modeling the contexts of words combined into phrases), when they are summed, they produce sentence representations that rival those generated by ad-hoc compositional models. ; We thank Gemma Boleda and the anonymous reviewers for useful comments. We acknowledge ERC 2011 Starting Independent Research Grant n. 283554 (COMPOSES).

URL: http://hdl.handle.net/10230/46044 https://doi.org/10.3115/v1/P15-1094

BASE
3 | “Look, some green circles!”: learning to quantify from images
4 | "The red one!": on learning to refer to things based on discriminative properties
6 | Hubness and pollution: delving into cross-space mapping for zero-shot learning
7 | The LAMBADA dataset: word prediction requiring a broad discourse context
8 | Multimodal word meaning induction from minimal exposure to natural text
9 | Emergence of Linguistic Communication from Referential Games with Symbolic and Pixel Input