
BARS 2013 : ACM SIGIR Workshop on Benchmarking Adaptive Retrieval and Recommender Systems


Link: http://www.bars-workshop.org
 
When Aug 1, 2013 - Aug 1, 2013
Where Dublin, Ireland
Submission Deadline Jun 14, 2013
Categories    data mining   information retrieval
 

Call For Papers

In recent years, immense progress has been made in the development of recommendation, retrieval, and personalisation techniques. The evaluation of these systems is still largely based on traditional information retrieval and statistical metrics, e.g. precision, recall, and/or RMSE, often without taking the system's use case and situation into consideration. However, the rapid evolution of recommender and adaptive IR systems, in both their goals and their application domains, fosters the need for new evaluation methodologies and environments.
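
For context, the following is a minimal sketch of the traditional off-line metrics mentioned above (precision, recall, RMSE); the function names and example data are illustrative and not part of the workshop material.

    # Illustrative sketch of traditional off-line metrics for a recommender.
    # All names and example data here are hypothetical.
    from math import sqrt

    def precision_recall(recommended, relevant):
        """Precision and recall of a recommended item list against the relevant items."""
        hits = len(set(recommended) & set(relevant))
        precision = hits / len(recommended) if recommended else 0.0
        recall = hits / len(relevant) if relevant else 0.0
        return precision, recall

    def rmse(predicted, actual):
        """Root-mean-square error between predicted and observed ratings."""
        errors = [(p - a) ** 2 for p, a in zip(predicted, actual)]
        return sqrt(sum(errors) / len(errors))

    # Hypothetical example: top-5 recommendations vs. the user's relevant items,
    # and predicted vs. observed ratings on held-out interactions.
    print(precision_recall(["a", "b", "c", "d", "e"], ["b", "d", "f"]))  # (0.4, ~0.67)
    print(rmse([3.5, 4.0, 2.0], [4.0, 3.5, 2.5]))                        # 0.5

Such metrics measure accuracy in isolation, which is exactly the limitation the workshop aims to address.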

The workshop will be followed by a special issue of ACM TIST on Recommender System Benchmarking. Authors of high-quality papers from the workshop will be encouraged to submit extended versions of their work to the journal.
Submission topics

We invite the submission of papers reporting relevant research in the area of benchmarking and evaluation of recommender and adaptive IR systems, addressing the following topics:

New metrics and methods for quality estimation of recommender and adaptive IR systems
Novel frameworks for the user-centric evaluation of adaptive systems
Validation of off-line methods with online studies
Comparison of evaluation metrics and methods
Comparison of recommender and IR approaches across multiple systems and domains
Measuring technical constraints vs. accuracy
New datasets for the evaluation of recommender and adaptive IR systems
Benchmarking frameworks
Multiple-objective benchmarking

Submissions

We invite the submission of papers reporting original research, studies, advances, or experiences in this area. Two submission types are accepted: long papers of up to 8 pages and short papers of up to 4 pages, both in the standard ACM SIG proceedings format. Paper submissions and reviews will be handled electronically.

Each paper will be evaluated by at least three reviewers from the Program Committee. Papers will be evaluated for originality, significance of contribution, soundness, clarity, and overall quality. The interest of each contribution will be assessed in terms of its technical and scientific findings, its contribution to the knowledge and understanding of the problem, its methodological advancements, or its applicative value.

Submission instructions can be found on the Submissions page.
Important dates

Paper submission: June 14, 2013
Notification: June 28, 2013
Camera ready: July 13, 2013
Workshop: August 1, 2013
