
Search in the Catalogues and Directories

Hits 1 – 9 of 9

1
What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties
In: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics ; https://hal.archives-ouvertes.fr/hal-01898412 ; ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Jul 2018, Melbourne, Australia. pp.2126-2136 (2018)
BASE
2
XNLI: Evaluating Cross-lingual Sentence Representations ...
BASE
3
What you can cram into a single vector: Probing sentence embeddings for linguistic properties ...
BASE
4
Fader Networks: Manipulating Images by Sliding Attributes
In: 31st Conference on Neural Information Processing Systems (NIPS 2017) ; https://hal.archives-ouvertes.fr/hal-02275215 ; 31st Conference on Neural Information Processing Systems (NIPS 2017), Dec 2017, Long Beach, CA, United States. pp.5969-5978 (2017)
BASE
5
Word Translation Without Parallel Data ...
BASE
6
Massively Multilingual Word Embeddings ...
BASE
7
Polyglot Neural Language Models: A Case Study in Cross-Lingual Phonetic Representation Learning ...
BASE
8
What you can cram into a single $&!#* vector: probing sentence embeddings for linguistic properties
Kruszewski, German; Barrault, Loïc; Baroni, Marco. - : ACL (Association for Computational Linguistics)
BASE
9
Neural architectures for named entity recognition
Kawakami, Kazuya; Ballesteros, Miguel; Lample, Guillaume; Dyer, Chris; Subramanian, Sandeep. - : ACL (Association for Computational Linguistics)
Abstract: Paper presented at the 2016 Conference of the North American Chapter of the Association for Computational Linguistics, held in San Diego (CA, USA), June 12–17, 2016. ; State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available. In this paper, we introduce two new neural architectures: one based on bidirectional LSTMs and conditional random fields, and another that constructs and labels segments using a transition-based approach inspired by shift-reduce parsers. Our models rely on two sources of information about words: character-based word representations learned from the supervised corpus and unsupervised word representations learned from unannotated corpora. Our models obtain state-of-the-art performance in NER in four languages without resorting to any language-specific knowledge or resources such as gazetteers. ; This work was sponsored in part by the Defense Advanced Research Projects Agency (DARPA) Information Innovation Office (I2O) under the Low Resource Languages for Emergent Incidents (LORELEI) program issued by DARPA/I2O under Contract No. HR0011-15-C-0114. Miguel Ballesteros is supported by the European Commission under the contract numbers FP7-ICT-610411 (project MULTISENSOR) and H2020-RIA-645012 (project KRISTINA).
Keyword: Computational linguistics; Natural language processing (Computer science)
URL: http://hdl.handle.net/10230/27725
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 9
© 2013–2024 Lin|gu|is|tik