CLEF 2015: Conference and Labs of the Evaluation Forum


Conference Series: Cross-Language Evaluation Forum
When: Sep 8, 2015 - Sep 11, 2015
Where: Toulouse, France
Submission Deadline: Apr 12, 2015
Notification Due: Jun 15, 2015
Final Version Due: Jun 30, 2015
Categories: information retrieval, NLP

Call For Papers

CLEF 2015: Conference and Labs of the Evaluation Forum

Experimental IR meets Multilinguality, Multimodality and Interaction

September 8-11, 2015, Toulouse, France

Submission Deadlines:

- Sunday April 12, 2015 (long papers)
- Sunday May 17, 2015 (short papers)


The CLEF Conference addresses all aspects of Information Access in any modality and language. The conference is paired with a series of workshops presenting the results of lab-based comparative evaluations. CLEF 2015 is the 6th year of the CLEF Conference and the 16th year of the CLEF initiative as a forum for IR evaluation. The CLEF conference has a clear focus on experimental IR as carried out at the evaluation forums (CLEF Labs, TREC, NTCIR, FIRE, MediaEval, RomIP, SemEval, TAC, …), with special attention to the challenges of multimodality, multilinguality, and interactive search. We invite submissions presenting significant new insights demonstrated on the resulting IR test collections, analyses of IR test collections and evaluation measures, as well as concrete proposals to push the boundaries of the Cranfield/TREC/CLEF paradigm.

All submissions to the CLEF main conference will be reviewed on the basis of relevance, originality, importance, and clarity. CLEF welcomes papers that describe rigorous hypothesis testing, regardless of whether the results are positive or negative. Methods should be described in enough detail for others to reproduce them, and the logic of the research design should be clearly laid out in the paper. The conference proceedings will be published in the Springer Lecture Notes in Computer Science (LNCS) series.


Relevant topics for the CLEF 2015 Conference include but are not limited to:

- Information Access in any language or modality: Information retrieval, image retrieval, question answering, search interfaces and design, infrastructures, etc.

- Analytics for Information Retrieval: theoretical and practical results in analytics specifically targeted at the analysis of information access data.

- Evaluation Initiatives: Conclusions, lessons learned, impact, and future outlook of any evaluation initiative after it has completed its cycle.

- Evaluation: methodologies, metrics, statistical and analytical tools, component based, user groups and use cases, ground-truth creation, impact of multilingual/multicultural/multimodal differences, etc.

- Technology Transfer: Economic impact/sustainability of information access approaches, deployment and exploitation of systems, use cases, etc.

- Interactive Information Retrieval Evaluation: the interactive evaluation of IR systems using user-centered methods, evaluation of novel search interfaces, novel interactive evaluation methods, simulation of interaction, etc.

- Specific Application Domains: Information access and its evaluation in application domains such as cultural heritage, digital libraries, social media, expert search, health information, legal documents, patents, news, books, plants, etc.


Authors are invited to submit original papers electronically, using the LNCS proceedings format. Submissions must not have been published previously and must not be under consideration elsewhere.

Two types of papers are solicited:

- Long papers: 12 pages max. Intended to report complete research work.
- Short papers: 6 pages max. Position papers, new evaluation proposals, developments and applications, etc.

Papers will be peer-reviewed by at least three members of the program committee. Selection will be based on originality, clarity, and technical quality. Papers should be submitted in PDF format.

Important dates:
- Submission of Long Papers: April 12, 2015
- Submission of Short Papers: May 17, 2015
- Notification of Acceptance: June 15, 2015
- Camera Ready Copy due: June 30, 2015
- Conference: September 8-11, 2015


General chairs
- Josiane Mothe, IRIT, University of Toulouse, France
- Jacques Savoy, University of Neuchâtel, Switzerland

Program chairs
- Jaap Kamps, University of Amsterdam, The Netherlands
- Karen Pinel-Sauvagnat, IRIT, University of Toulouse, France

Labs chairs
- Gareth Jones, Dublin City University (DCU), Ireland
- Eric SanJuan, University of Avignon, France

Proceedings chairs
- Linda Cappellato, University of Padua, Italy
- Nicola Ferro, University of Padua, Italy
