
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1
Chinese whispers: Cooperative paraphrase acquisition
In: http://www.lrec-conf.org/proceedings/lrec2012/pdf/772_Paper.pdf (2012)
BASE
2
Building textual entailment specialized data sets: a methodology for isolating linguistic phenomena relevant to inference
In: http://www.lrec-conf.org/proceedings/lrec2010/pdf/478_Paper.pdf (2010)
BASE
3
Overview of ResPubliQA 2009: Question Answering Evaluation over European Legislation
In: http://clef.isti.cnr.it/2009/working_notes/ResPubliQA-overview.pdf (2009)
BASE
4
Overview of the CLEF 2007 multilingual question answering track
In: http://ceur-ws.org/Vol-1173/CLEF2007wn-QACLEF-GiampiccoloEt2007.pdf (2007)
BASE
5
Overview of the CLEF 2007 multilingual question answering track
In: http://www.celct.it/download/qa07_overview_working_notes.pdf (2007)
BASE
6
Overview of the CLEF 2007 multilingual question answering track
In: http://www.linguateca.pt/documentos/CLEF07Proceedings_QA_Overview_LNCS_Final-version.pdf (2007)
BASE
7
Overview of the CLEF 2005 Multilingual Question Answering Track
In: http://www.clef-campaign.org/2005/working_notes/workingnotes2005/vallin05.pdf (2005)
BASE
8
Overview of the CLEF 2005 Multilingual Question Answering Track
In: http://www.science.uva.nl/~mdr/Publications/Files/clef-2005-qa-overview-wn.pdf (2005)
Abstract: The general aim of the third CLEF Multilingual Question Answering Track was to set up a common and replicable evaluation framework to test both monolingual and cross-language Question Answering (QA) systems that process queries and documents in several European languages. Nine target languages and ten source languages were exploited to enact 8 monolingual and 73 cross-language tasks. Twenty-four groups participated in the exercise. Overall results showed a general increase in performance in comparison to last year. The best performing monolingual system irrespective of target language answered 64.5% of the questions correctly (in the monolingual Portuguese task), while the average of the best performances for each target language was 42.6%. The cross-language step instead entailed a considerable drop in performance. In addition to accuracy, the organisers also measured the relation between the correctness of an answer and a system’s stated confidence in it, showing that the best systems did not always provide the most reliable confidence score.
Keywords: Question answering. Categories and Subject Descriptors: H.3 [Information Storage and Retrieval]: H.3.1 Content Analysis and Indexing; H.3.3 Information Search and Retrieval; H.3.4 Systems and Software; I.2 [Artificial Intelligence]: I.2.7 Natural Language Processing. General Terms: Measurement, Performance, Experimentation.
URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.77.2950
http://www.science.uva.nl/~mdr/Publications/Files/clef-2005-qa-overview-wn.pdf
BASE
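(An illustrative sketch of the accuracy and confidence measures mentioned in this abstract follows the result list.)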
9
Overview of the CLEF 2005 Multilingual Question Answering Track
In: http://www.science.uva.nl/~mdr/Publications/Files/clef2005-qa-overview-proceedings.pdf (2005)
BASE
10
The Italian Lexical Sample Task at SENSEVAL-3 (Association for Computational Linguistics)
In: http://www.aclweb.org/anthology-new/W/W04/W04-0805.pdf
BASE
11
Overview of the CLEF 2006 Multilingual Question Answering Track
In: http://www.linguateca.pt/documentos/MagninietalRocha2006.pdf
BASE
12
Overview of the CLEF 2005 Multilingual Question Answering Track (UvA-DARE, Digital Academic Repository)
In: https://pure.uva.nl/ws/files/3890088/38030_clef_2005_qa_overview_wn.pdf
BASE
13
The Multilingual Question Answering Track at CLEF
In: http://www.linguateca.pt/Diana/download/MagninietalLREC2006.pdf
BASE
14
Evaluating Multilingual Question Answering Systems at CLEF
In: http://www.lrec-conf.org/proceedings/lrec2010/pdf/464_Paper.pdf
BASE
15
Overview of the CLEF 2006 Multilingual Question Answering Track
In: http://www.clef-campaign.org/2006/working_notes/workingnotes2006/magniniOCLEF2006.pdf
BASE
16
Overview of the CLEF 2005 Multilingual Question Answering Track (Vallin et al., Springer, 2006)
In: http://www.linguateca.pt/Diana/download/VallinetalSpringer2006.pdf
BASE
17
Overview of the CLEF 2005 Multilingual Question Answering Track (Vallin et al., Springer, 2006)
In: http://www.linguateca.pt/Repositorio/VallinetalSpringer2006.pdf
BASE
18
A Resource for Investigating the Impact of Anaphora and Coreference on Inference
In: http://www.cs.biu.ac.il/%7Emirkins/publications/LREC-2010_Abad-etal.pdf
BASE
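The abstract of hit 8 notes that, besides accuracy, the organisers measured how well a system's stated confidence matched the correctness of its answers. The abstract does not give the formula, so the following Python sketch is only an illustration under assumptions: it computes plain accuracy and a confidence-weighted score (CWS) of the kind used in some earlier QA evaluations, not necessarily the official CLEF 2005 measure; JudgedAnswer, accuracy, and confidence_weighted_score are hypothetical names introduced here.

```python
# Hypothetical sketch of accuracy and a confidence-weighted score (CWS)
# for a judged QA run. Not official CLEF scoring code.

from dataclasses import dataclass


@dataclass
class JudgedAnswer:
    correct: bool      # assessor judgement (True = answer judged Right)
    confidence: float  # system's self-reported confidence in [0, 1]


def accuracy(answers: list[JudgedAnswer]) -> float:
    """Fraction of questions answered correctly."""
    return sum(a.correct for a in answers) / len(answers)


def confidence_weighted_score(answers: list[JudgedAnswer]) -> float:
    """Rank answers by decreasing confidence, then average the precision of
    each ranked prefix; this rewards systems whose most confident answers
    are also the correct ones."""
    ranked = sorted(answers, key=lambda a: a.confidence, reverse=True)
    correct_so_far = 0
    total = 0.0
    for i, a in enumerate(ranked, start=1):
        correct_so_far += a.correct
        total += correct_so_far / i
    return total / len(ranked)


if __name__ == "__main__":
    # Toy run: the most confident answer is correct, so CWS exceeds accuracy.
    run = [
        JudgedAnswer(correct=True, confidence=0.9),
        JudgedAnswer(correct=False, confidence=0.8),
        JudgedAnswer(correct=True, confidence=0.4),
        JudgedAnswer(correct=False, confidence=0.1),
    ]
    print(f"accuracy = {accuracy(run):.3f}")                    # 0.500
    print(f"CWS      = {confidence_weighted_score(run):.3f}")   # 0.667
```

In the toy run, the same answers in a different confidence order would leave accuracy unchanged but lower the CWS, which is the kind of correctness-versus-confidence relation the abstract describes.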

Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 18