Search in the Catalogues and Directories

Hits 1 – 20 of 59

1
M.: Application of Axiomatic Approaches to Crosslanguage Retrieval
In: http://clef.isti.cnr.it/2009/working_notes/rkern-paperCLEF2009.pdf (2010)
2
Overview of VideoCLEF 2009: New perspectives on speech-based multimedia content enrichment
In: http://clef.isti.cnr.it/2009/working_notes/larsonVideoCLEF2009_overview.pdf (2009)
3
Multiple retrieval models and regression models for prior art search
In: http://clef.isti.cnr.it/2009/working_notes/lopez-paperCLEF2009.pdf (2009)
4
Multiple retrieval models and regression models for prior art search
In: http://hal.archives-ouvertes.fr/docs/00/41/18/35/PDF/technote.pdf (2009)
5
German, French, English and Persian Retrieval Experiments at CLEF 2009
In: http://clef.isti.cnr.it/2009/working_notes/tomlinson-paperCLEF2009.pdf (2009)
6
G.: IXA at CLEF 2008 Robust-WSD Task: using Word Sense Disambiguation for (Cross Lingual) Information Retrieval
In: http://ixa.si.ehu.es/Ixa/Argitalpenak/Artikuluak/1226564935/publikoak/otegi-paperCLEF2008.pdf (2009)
7
Overview of the ImageCLEFphoto 2008 Photographic Retrieval Task. 9th Cross-language evaluation forum conference on Evaluating systems for multilingual and multimodal information access (CLEF 2008)
In: http://clef.isti.cnr.it/2008/working_notes/ImageCLEFphoto2008-final.pdf (2008)
8
Back to Basics - Again - for Domain Specific Retrieval
In: http://clef.isti.cnr.it/2008/working_notes/Berkeley_Domain_Specific_08.pdf (2008)
9
German, French, English and Persian Retrieval Experiments at CLEF 2008
In: http://clef.isti.cnr.it/2008/working_notes/tomlinson-paperCLEF2008.pdf (2008)
10
Cross Lingual Question Answering using QRISTAL for CLEF 2007
In: http://www.qristal.fr/pub/Cross Lingual Question Answering using QRISTAL for CLEF 2007.pdf (2007)
11
L.: The University of West Bohemia at CLEF 2006, the CL-SR track
In: http://ceur-ws.org/Vol-1172/CLEF2006wn-CLSR-IrcingEt2006.pdf (2006)
12
de Rijke. Overview of WebCLEF 2006
In: http://ceur-ws.org/Vol-1172/CLEF2006wn-WebCLEF-BalogEt2006a.pdf (2006)
13
F.Llopis. AliQAn and BRILI QA systems at CLEF 2006
In: http://ceur-ws.org/Vol-1172/CLEF2006wn-QACLEF-FerrandezEt2006.pdf (2006)
14
Overview of the CLEF 2005 Multilingual Question Answering Track
In: http://www.clef-campaign.org/2005/working_notes/workingnotes2005/vallin05.pdf (2005)
15
Overview of the CLEF 2005 Multilingual Question Answering Track
In: http://www.science.uva.nl/~mdr/Publications/Files/clef-2005-qa-overview-wn.pdf (2005)
16
Sanchis: TPIRS: A System for Document Indexing Reduction on WebCLEF, Extended abstract in Working Notes of CLEF’05, Vienna
In: http://clef.isti.cnr.it/2005/working_notes/workingnotes2005/pinto05.pdf (2005)
17
Extraction of Definitions for Bulgarian
In: http://ceur-ws.org/Vol-1172/CLEF2006wn-QACLEF-Tanev2006.pdf
18
Evaluating Answer Validation in Spanish Question Answering
In: http://ceur-ws.org/Vol-1174/CLEF2008wn-QACLEF-TellezValero2008.pdf
19
Prior Art Search using International Patent Classification Codes and All-Claims-Queries
In: http://atlas.tk.informatik.tu-darmstadt.de/Publications/2009/CLEF2009-IP_final.pdf.pdf
20
Evaluating Answer Validation in Spanish Question Answering
In: http://ccc.inaoep.mx/~mmontesg/publicaciones/2008/UsingAVinQA-CLEF08.pdf
Abstract: This paper introduces INAOE’s new answer validation method. The method is based on a supervised learning approach that uses a set of attributes capturing lexical-syntactic relations among the question, the answer and the given support text. In addition, the paper describes the evaluation of the proposed method at both the Spanish Answer Validation Exercise (AVE 2008) and the Spanish Question Answering Main Task (QA 2008). The evaluation objectives were twofold: on the one hand, to evaluate the ability of our answer validation method to discriminate correct from incorrect answers, and on the other hand, to measure the impact of including an answer validation module in our QA system. The evaluation results were encouraging; the proposed method achieved a 0.39 F-measure in the detection of correct answers, outperforming the baseline result of the AVE 2008 task by more than 100%. It also enhanced the performance of our QA system, showing a gain in accuracy of 22% for answering factoid questions. Furthermore, when three candidate answers per question were evaluated, the answer validation method increased the MRR of our QA system by 40%, reaching an MRR of 0.28.
Keywords: Question Answering; Answer Validation; Machine Learning; Textual Entailment
Categories and Subject Descriptors: H.3 [Information Storage and Retrieval]: H.3.1 Content Analysis and Indexing; H.3.3 Information Search and Retrieval; H.3.4 Systems and Software; H.3.7 Digital Libraries; H.2.3 [Database Management]: Languages—Query Languages
General Terms: Measurement; Experimentation; Performance
URL: http://ccc.inaoep.mx/~mmontesg/publicaciones/2008/UsingAVinQA-CLEF08.pdf
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.466.4021
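
The abstract above frames answer validation as a supervised classification problem over features that relate the question, a candidate answer, and its support text. The following is a minimal illustrative sketch of that general idea only, not the paper's actual implementation: the feature set (simple lexical-overlap measures) and the choice of classifier (scikit-learn's LogisticRegression) are assumptions made for the example, and the training pairs are toy data.

```python
# Sketch of a supervised answer-validation classifier in the spirit of the
# abstract above. Feature choices are illustrative assumptions, not the
# paper's actual lexical-syntactic feature set.
from sklearn.linear_model import LogisticRegression

def lexical_features(question: str, answer: str, support: str) -> list:
    q, a, s = (set(t.lower() for t in x.split()) for x in (question, answer, support))
    return [
        len(a & s) / max(len(a), 1),               # share of answer terms in the support text
        len(q & s) / max(len(q), 1),               # share of question terms in the support text
        float(answer.lower() in support.lower()),  # answer appears verbatim in the support text
        float(len(a)),                             # answer length in tokens
    ]

# Toy training data: (question, candidate answer, support text, is_correct)
train = [
    ("Who wrote Don Quixote?", "Cervantes",
     "Don Quixote was written by Miguel de Cervantes.", 1),
    ("Who wrote Don Quixote?", "Shakespeare",
     "Don Quixote was written by Miguel de Cervantes.", 0),
]
X = [lexical_features(q, a, s) for q, a, s, _ in train]
y = [label for *_, label in train]

clf = LogisticRegression().fit(X, y)

# Validate a new (question, answer, support) triple: 1 = accept, 0 = reject.
print(clf.predict([lexical_features(
    "Who wrote Hamlet?", "Shakespeare",
    "Hamlet is a tragedy written by William Shakespeare.")]))
```

A real system along these lines would add syntactic attributes (e.g. dependency relations between answer and question terms) and train on the AVE training collections rather than toy examples.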


Sources: Open access documents: 59; Catalogues: 0; Bibliographies: 0; Linked Open Data catalogues: 0; Online resources: 0.