Posted on

Comparative Evaluation of Multilingual Information Access Systems by Peters, Carol; Gonzalo, Julio; Braschler, Martin; Kluck, Michael

By Peters, Carol; Gonzalo, Julio; Braschler, Martin; Kluck, Michael (editors)

The fourth campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to August 2003. Participation in this campaign showed a moderate rise in the number of participants over the previous year, with 42 groups submitting results for one or more of the different tracks (compared with 37 in 2002), but a steep rise in the number of experiments attempted. A distinctive feature of CLEF 2003 was the number of new tracks and tasks that were offered as pilot experiments. The aim was to try out new ideas, to encourage the development of new evaluation methodologies suited to the emerging requirements of both system developers and users with respect to today's digital collections, and to stimulate work on many European languages rather than just those most widely used. CLEF is thus gradually pushing its participants towards the ultimate goal: the development of truly multilingual systems capable of processing collections in diverse media. The campaign culminated in a two-day workshop held in Trondheim, Norway, 21–22 August, immediately following the 7th European Conference on Digital Libraries (ECDL 2003), and attended by more than 70 researchers and system developers. The objective of the workshop was to bring together the groups that had participated in the CLEF 2003 campaign so that they could report on the results of their experiments.



Similar books: office software

The Entrepreneur's Strategy Guide: Ten Keys for Achieving Marketplace Leadership and Operational Excellence

In an environment where the chances of failure are much greater than those of success, what will make your company a winner? Drawing on fifty years of experience, research, and observation in entrepreneurial strategy, Tom Cannon offers a game plan for entrepreneurs. Dividing the book into two basic parts, the market (external environment) and the organization (internal environment), he outlines the ten core functions that every business must master in order to succeed.

Extra resources for Comparative Evaluation of Multilingual Information Access Systems: 4th Workshop of the Cross-Language Evaluation Forum, CLEF 2003, Trondheim, Norway, August 21-22, 2003, Revised Selected Papers

Sample text

Hypertext – Information Retrieval – Multimedia: Synergieeffekte Elektronischer Informationssysteme. Proceedings of HIM '95, Universitätsverlag Konstanz, 9–28
3. : The Philosophy of Information Retrieval Evaluation. In: , and Kluck, M. (eds.): Evaluation of Cross-Language Information Retrieval Systems. 2069, Springer Verlag (2002) 355–370
4. gov/
5. jp/ntcir/
6. Braschler, M.: CLEF 2003 – Overview of Results. This volume.
7. : The Domain-Specific Task of CLEF – Specific Evaluation Strategies in Cross-Language Information Retrieval.

The practice of assessing the results on the basis of the “Narrative” field means that an experiment using only the “Title” and/or “Description” parts of the topic implicitly assumes a particular interpretation of the user's information need, an interpretation that is not explicitly contained in the actual query run in the experiment. The fact that the information in the title and description fields admits additional possible interpretations affects only the absolute values of the evaluation measures, which are in any case inherently difficult to interpret.
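To illustrate how the chosen interpretation of a topic shifts the absolute value of an evaluation measure, here is a minimal sketch computing average precision for the same ranked list under a narrow and a broader relevance interpretation. The document ids and relevance sets are invented for illustration and are not CLEF data:

```python
def average_precision(ranking, relevant):
    """Average precision of a ranked list, given a set of relevant doc ids."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank  # precision at each relevant document
    return total / len(relevant) if relevant else 0.0

# One fixed system ranking for a single topic (hypothetical ids).
ranking = ["d1", "d2", "d3", "d4", "d5"]

# Narrow reading of the topic: only d1 and d4 count as relevant.
narrow = {"d1", "d4"}
# Broader reading: d3 is also considered relevant.
broad = {"d1", "d3", "d4"}

ap_narrow = average_precision(ranking, narrow)   # 0.75
ap_broad = average_precision(ranking, broad)     # ~0.806
```

The ranking is identical in both cases; only the interpretation of the information need changes, yet the absolute score moves. This is why such scores are hard to interpret in isolation, while comparisons between systems judged under the same interpretation remain meaningful.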

Edu/pub/smart/
12. : Content-Based Information Retrieval from Large Text and Audio Databases. 6 Evaluation Issues, pages 22–29, Kluwer Academic Publishers, 1997.

Abstract. The reliability of the topics within the Cross-Language Evaluation Forum (CLEF) needs to be validated constantly to justify the efforts for experiments within CLEF and to demonstrate the reliability of the results as far as possible. The analysis presented in this paper is concerned with several aspects. Continuing and expanding a study from 2002, we investigate the difficulty of topics and the correlation between the retrieval quality for topics and the occurrence of proper names.
