
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 40

1
Grounded compositional semantics for finding and describing images with sentences. Transactions of the Association for Computational Linguistics
In: http://www.transacl.org/wp-content/uploads/2014/04/52.pdf (2014)
BASE
2
Grounded compositional semantics for finding and describing images with sentences. Transactions of the Association for Computational Linguistics.
In: http://acl2014.org/acl2014/Q14/pdf/Q14-1010.pdf (2014)
BASE
3
Grounded compositional semantics for finding and describing images with sentences. Transactions of the Association for Computational Linguistics.
In: http://acl2014.org/acl2014/Q14/pdf/Q1410.pdf (2014)
BASE
4
Parsing With Compositional Vector Grammars
In: http://aclweb.org/anthology/P/P13/P13-1045.pdf (2013)
BASE
5
End-to-End Text Recognition with Convolutional Neural Networks
In: http://crypto.stanford.edu/%7Edwu4/papers/HonorThesis.pdf (2012)
BASE
6
Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
In: http://aclweb.org/anthology-new/D/D11/D11-1014.pdf (2011)
BASE
7
Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
In: http://nlp.stanford.edu/pubs/SocherPenningtonHuangNgManning_EMNLP2011.pdf (2011)
BASE
8
Semi-supervised recursive autoencoders for predicting sentiment distributions
In: http://aclweb.org/supplementals/D/D11/D11-1014.Attachment.pdf (2011)
BASE
9
Learning continuous phrase representations and syntactic parsing with recursive neural networks
In: http://www.socher.org/uploads/Main/2010SocherManningNg.pdf (2010)
BASE
10
Learning continuous phrase representations and syntactic parsing with recursive neural networks
In: http://wuawua.googlecode.com/files/Learning%20Continuous%20Phrase%20Representations%20and%20Syntactic%20Parsing%20with%20Recursive%20Neural%20Networks.pdf (2010)
BASE
11
Cheap and fast - but is it good? Evaluating non-expert annotations for natural language tasks
In: http://sing.stanford.edu/cs303-sp11/papers/snow_turk.pdf (2008)
BASE
12
Cheap and fast — but is it good? Evaluating non-expert annotations for natural language tasks
In: http://blog.doloreslabs.com/wp-content/uploads/2008/09/amt_emnlp08_final.pdf (2008)
BASE
13
Cheap and Fast – But is it Good? Evaluating Non-Expert Annotations for Natural Language Tasks. EMNLP’08
In: http://www.aclweb.org/anthology-new/D/D08/D08-1027.pdf (2008)
BASE
14
Cheap and fast — but is it good? Evaluating non-expert annotations for natural language tasks
In: http://nlp.stanford.edu/pubs/amt_emnlp08.pdf (2008)
BASE
15
Cheap and fast — But is it good? Evaluating non-expert annotations for natural language tasks
In: http://blog.doloreslabs.com/wp-content/uploads/2008/09/amt_emnlp08_accepted.pdf (2008)
BASE
16
Solving the problem of cascading errors: Approximate Bayesian inference for linguistic annotation pipelines
In: http://nlp.stanford.edu/cmanning/papers/pipeline.pdf (2006)
BASE
17
Solving the problem of cascading errors: Approximate Bayesian inference for linguistic annotation pipelines
In: http://nlp.stanford.edu/cmanning/papers/pipeline.ps (2006)
BASE
18
Solving the problem of cascading errors: Approximate Bayesian inference for linguistic annotation pipelines
In: http://www.aclweb.org/anthology-new/W/W06/W06-1673.pdf (2006)
BASE
19
Robust textual inference via learning and abductive reasoning
In: http://www.robotics.stanford.edu/~ang/papers/aaai05-learnabduction.ps (2005)
BASE
20
Robust textual inference via learning and abductive reasoning
In: http://www.stanford.edu/~rajatr/papers/aaai05-learnabduction.pdf (2005)
BASE


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 40
© 2013 - 2024 Lin|gu|is|tik