
NMEIR 2008 : ECIR 2008 Workshop on novel methodologies for evaluation in information retrieval


Link: http://ecir2008.dcs.gla.ac.uk/i
 
When Mar 30, 2008 - Apr 3, 2008
Where Glasgow, UK
Submission Deadline Feb 4, 2008
Notification Due Feb 20, 2008
Categories    information retrieval
 

Call For Papers


Workshop on novel methodologies for evaluation in information retrieval
At ECIR 2008


OBJECTIVES
Information retrieval is an empirical science; the field cannot move forward without means of evaluating the innovations devised by researchers. However, the methodologies conceived in the early years of IR and still used in today's evaluation campaigns are starting to show their age, and new research is emerging on how to overcome the twin challenges of scale and diversity.

Scale
The methodologies used to build test collections in the modern evaluation campaigns were originally conceived for collections of tens of thousands of documents. They were found to scale well, but potential flaws are starting to emerge as test collections grow beyond tens of millions of documents. Continued research in this area is crucial if IR research is to keep pace with the evaluation of large-scale search.

Diversity
With the rise of the large Web search engines, some believed that all search problems could be solved with a single engine retrieving from one vast data store. However, it is increasingly clear that retrieval is evolving not towards a monolithic solution but towards a wide range of solutions tailored to different classes of information and different groups of users or organisations. Each tailored system requires a different mixture of component technologies combined in distinct ways, and each solution requires evaluation.

This workshop calls for research papers (max. 8 pages) on topics that address evaluation in information retrieval. Topics include but are not limited to:
- test collection building for diverse needs
- new metrics and methodologies
- evaluation of multilingual IR and/or multimedia IR systems
- novel evaluation of related areas, such as QA or summarization
- evaluation of commercial systems
- novel forms of user-centered evaluation

Papers will be peer reviewed by members of the workshop Programme Committee. A preliminary list of the PC members is:

Paul Clough, University of Sheffield
Franciska de Jong, University of Twente
Thomas Deselaers, RWTH Aachen University
Norbert Fuhr, University of Duisburg
Gareth Jones, Dublin City University
Jussi Karlgren, Swedish Institute of Computer Science
Bernardo Magnini, ITC-irst
Paul McNamee, Johns Hopkins University
Henning Müller, University & University Hospitals of Geneva
Stephen Robertson, Microsoft Research
Tetsuya Sakai, National Institute of Informatics


SUBMISSION
Papers should be submitted as PDFs in ACM SIG Proceedings format:

http://www.acm.org/sigs/publications/proceedings-templates

Submit final versions of papers to m.sanderson@shef.ac.uk



IMPORTANT DATES
Submission deadline: Monday 4 February 2008
Notifications: by 20 February 2008
Final copy: by 3 March 2008


WORKSHOP ORGANISERS AND CONTACT DETAILS:
The Workshop Chair is Mark Sanderson. Co-organisers are Martin Braschler, Nicola Ferro and Julio Gonzalo.

The workshop will be sponsored by Treble-CLEF, a Coordination Action under FP7 which promotes R&D, evaluation and technology transfer in the multilingual information access domain.
