2 | What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties
In: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Jul 2018, Melbourne, Australia, pp. 2126-2136. https://hal.archives-ouvertes.fr/hal-01898412
BASE
6 |
The LAMBADA dataset: Word prediction requiring a broad discourse context
BASE
11 |
Generation for Grammar Engineering
In: Proceedings of the Seventh International Natural Language Generation Conference (INLG 2012), May 2012, Starved Rock, Illinois, United States, pp. 31-40. https://hal.archives-ouvertes.fr/hal-00768612
BASE
12 |
Generating Grammar Exercises
In: Proceedings of the 7th Workshop on Innovative Use of NLP for Building Educational Applications (NAACL-HLT Workshop 2012), Jun 2012, Montreal, Canada, pp. 147-157. https://hal.archives-ouvertes.fr/hal-00768610
BASE
13 |
Jointly optimizing word representations for lexical and sentential tasks with the C-PHRASE model
Abstract:
Paper presented at the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, held 26-31 July 2015 in Beijing, China. We introduce C-PHRASE, a distributional semantic model that learns word representations by optimizing context prediction for phrases at all levels in a syntactic tree, from single words to full sentences. C-PHRASE outperforms the state-of-the-art C-BOW model on a variety of lexical tasks. Moreover, since C-PHRASE word vectors are induced through a compositional learning objective (modeling the contexts of words combined into phrases), when they are summed, they produce sentence representations that rival those generated by ad-hoc compositional models. We thank Gemma Boleda and the anonymous reviewers for useful comments. We acknowledge ERC 2011 Starting Independent Research Grant n. 283554 (COMPOSES).
URL: http://hdl.handle.net/10230/46044 https://doi.org/10.3115/v1/P15-1094
BASE
16 |
There is no logical negation here, but there are alternatives: modeling conversational negation with distributional semantics
BASE