
NMEIR 2008 : ECIR 2008 Workshop on Novel Methodologies for Evaluation in Information Retrieval


When Mar 30, 2008 - Apr 3, 2008
Where Glasgow, UK
Submission Deadline Feb 4, 2008
Notification Due Feb 20, 2008
Categories    information retrieval

Call For Papers


Workshop on Novel Methodologies for Evaluation in Information Retrieval
At ECIR 2008

Information retrieval is an empirical science; the field cannot move forward unless there are means of evaluating the innovations devised by researchers. However, the methodologies conceived in the early years of IR and still used in today's campaigns are starting to show their age, and new research is emerging on how to overcome the twin challenges of scale and diversity.

The methodologies used to build test collections in the modern evaluation campaigns were originally conceived to work with collections of tens of thousands of documents. These methodologies were found to scale well, but potential flaws are starting to emerge as test collections grow beyond tens of millions of documents. Continued research in this area is crucial if IR research is to keep evaluating large-scale search.

With the rise of the large Web search engines, some believed that all search problems could be solved with a single engine retrieving from one vast data store. However, it is increasingly clear that the evolution of retrieval is not towards a monolithic solution, but towards a wide range of solutions tailored for different classes of information and different groups of users or organizations. Each tailored system requires a different mixture of component technologies combined in distinct ways, and each solution requires evaluation.

This workshop calls for research papers (max 8 pages) on topics that address evaluation in information retrieval. Topics include but are not limited to:
- test collection building for diverse needs
- new metrics and methodologies
- evaluation of multilingual IR and/or multimedia IR systems
- novel evaluation of related areas, such as QA or summarization
- evaluation of commercial systems
- novel forms of user-centered evaluation
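As a point of reference for the "new metrics and methodologies" topic, here is a minimal sketch (not part of the call itself) of non-interpolated average precision, one of the classic test-collection metrics that work in this area seeks to extend or replace:

```python
def average_precision(ranking, relevant):
    """Non-interpolated AP of a ranked result list against a set of
    relevant document ids (a standard test-collection metric)."""
    hits = 0
    precision_sum = 0.0
    for rank, doc_id in enumerate(ranking, start=1):
        if doc_id in relevant:
            hits += 1
            precision_sum += hits / rank  # precision at this rank
    # Normalise by the total number of relevant documents.
    return precision_sum / len(relevant) if relevant else 0.0

# Example: relevant documents retrieved at ranks 1 and 3.
print(average_precision(["d1", "d5", "d3", "d7"], {"d1", "d3"}))
# → (1/1 + 2/3) / 2 ≈ 0.833
```

Averaging this score over a set of topics gives mean average precision (MAP), the headline measure in many of the evaluation campaigns the workshop discusses.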

Papers will be peer reviewed by members of the workshop Programme Committee. A preliminary list of PC members:

Paul Clough University of Sheffield
Franciska de Jong University of Twente
Thomas Deselaers RWTH Aachen University
Norbert Fuhr University of Duisburg
Gareth Jones Dublin City University
Jussi Karlgren Swedish Institute of Computer Science
Bernardo Magnini ITC-irst
Paul McNamee Johns Hopkins University
Henning Müller University & University Hospitals of Geneva
Stephen Robertson Microsoft Research
Tetsuya Sakai National Institute of Informatics

Papers should be submitted as PDFs in ACM SIG Proceedings format.

Submit final versions of papers to

Submission date Monday 4 February
Notifications by 20 February
Final copy by 3 March.

The Workshop Chair is Mark Sanderson. Co-organisers are Martin Braschler, Nicola Ferro and Julio Gonzalo.

The workshop will be sponsored by Treble-CLEF, a Coordination Action under FP7 that will promote R&D, evaluation and technology transfer in the multilingual information access domain.
