
CSE 2010 : SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation


When Jul 23, 2010 - Jul 23, 2010
Where Geneva
Submission Deadline Jun 15, 2010
Notification Due Jun 30, 2010
Final Version Due Jul 7, 2010
Categories    information retrieval   artificial intelligence   NLP

Call For Papers


The SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE 2010)
solicits submissions on topics including, but not limited to, the
following areas:

* Novel applications of crowdsourcing for evaluating search systems
(see examples below)

* Novel theoretical, experimental, and/or methodological
developments advancing state-of-the-art knowledge of crowdsourcing for
search evaluation

* Tutorials on how the different forms of crowdsourcing might be
best suited to or best executed in evaluating different search tasks

* New software packages which simplify or otherwise improve general
support for crowdsourcing, or particular support for crowdsourced search

* Reflective or forward-looking vision on use of crowdsourcing in
search evaluation as informed by prior and/or ongoing studies

* How crowdsourcing technology or process can be adapted to
encourage and facilitate more participation from outside the USA

The workshop especially calls for innovative solutions in the area of
search evaluation involving significant use of a crowdsourcing platform
such as Amazon's Mechanical Turk, Crowdflower, LiveWork, etc. Novel
applications of crowdsourcing are of particular interest. This includes
but is not restricted to the following tasks:

* cross-vertical search (video, image, blog, etc.) evaluation

* local search evaluation

* mobile search evaluation

* realtime/news search evaluation

* entity search evaluation

* discovering representative groups of rare queries, documents, and
events in the long-tail of search

* detecting/evaluating query alterations

For example, does the inherent geographic dispersal of crowdsourcing
enable better assessment of a query's local intent, its local-specific
facets, or diversity of returned results? Could crowdsourcing be
employed in near real-time to better assess query intent for breaking
news and relevant information?
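For readers unfamiliar with crowdsourced search evaluation, a minimal sketch of the basic pattern such techniques build on: each query-document pair is judged redundantly by several workers, and the labels are aggregated (here by simple majority vote). The queries, documents, and labels below are hypothetical illustrations, not part of this call.

```python
from collections import Counter

def majority_label(labels):
    """Aggregate redundant worker labels for one query-document pair
    by majority vote (the simplest aggregation scheme)."""
    counts = Counter(labels)
    label, _ = counts.most_common(1)[0]
    return label

# Hypothetical crowd judgments: (query, document) -> worker labels
judgments = {
    ("geneva hotels", "doc1"): ["relevant", "relevant", "not relevant"],
    ("geneva hotels", "doc2"): ["not relevant", "not relevant", "relevant"],
}

# One aggregated relevance label per query-document pair
aggregated = {pair: majority_label(labels) for pair, labels in judgments.items()}
print(aggregated)
```

Much of the research this workshop invites goes beyond this baseline, e.g. weighting workers by estimated reliability or filtering out spam judgments.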

Most Innovative Awards --- Sponsored by Microsoft Bing

As a further incentive for participation, authors of the most novel and
innovative crowdsourcing-based search evaluation techniques (e.g. using
Amazon's Mechanical Turk, LiveWork, Crowdflower, etc.) will be
recognized with "Most Innovative Awards" as judged by the workshop
organizers. Selection will be based on the creativity, originality, and
potential impact of the described proposal, and we expect the winners to
describe risky, ground-breaking, and unexpected ideas. The provision of
awards is thanks to generous support from Microsoft Bing, and the number
and nature of the awards will depend on the quality of the submissions
and overall availability of funds. All valid submissions to the workshop
will be considered for the awards.

Submission Instructions

Submissions should report new (unpublished) research results or ongoing
research. Long paper submissions (up to 8 pages) will primarily target
oral presentations. Short paper submissions can be up to 4 pages long
and will primarily target poster presentations. Papers should be
formatted in the double-column ACM SIG proceedings format. Papers
must be submitted as PDF files. Submissions should not be anonymized.

Important Dates

Submissions due: June 15, 2010
Notification of acceptance: June 30, 2010
Camera-ready submission: July 7, 2010
Workshop date: July 23, 2010

Organizers

Vitor Carvalho, Microsoft Bing

Matthew Lease, University of Texas at Austin

Emine Yilmaz, Microsoft Research

Program Committee

Eugene Agichtein, Emory University

Ben Carterette, University of Delaware

Charlie Clarke, University of Waterloo

Gareth Jones, Dublin City University

Michael Kaisser, University of Edinburgh

Jaap Kamps, University of Amsterdam

Gabriella Kazai, Microsoft Research

Winter Mason, Yahoo! Research

Stefano Mizzaro, University of Udine

Gheorghe Muresan, Microsoft Bing

Iadh Ounis, University of Glasgow

Mark Sanderson, University of Sheffield

Mark Smucker, University of Waterloo

Siddharth Suri, Yahoo! Research

Fang Xu, Saarland University


Email the organizers at
