
CSE 2010: SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation


Link: http://www.ischool.utexas.edu/~cse2010/call.htm
 
When: Jul 23, 2010
Where: Geneva
Submission Deadline: Jun 15, 2010
Notification Due: Jun 30, 2010
Final Version Due: Jul 7, 2010
Categories: information retrieval, artificial intelligence, NLP
 

Call For Papers

SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation


================
The SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE2010)
solicits submissions on topics including but not limited to the
following areas:

* Novel applications of crowdsourcing for evaluating search systems
(see examples below)

* Novel theoretical, experimental, and/or methodological
developments advancing state-of-the-art knowledge of crowdsourcing for
search evaluation

* Tutorials on how the different forms of crowdsourcing might be
best suited to or best executed in evaluating different search tasks

* New software packages which simplify or otherwise improve general
support for crowdsourcing, or particular support for crowdsourced search
evaluation

* Reflective or forward-looking vision on use of crowdsourcing in
search evaluation as informed by prior and/or ongoing studies

* How crowdsourcing technology or process can be adapted to
encourage and facilitate more participation from outside the USA

The workshop especially calls for innovative solutions in the area of
search evaluation that make significant use of a crowdsourcing platform
such as Amazon's Mechanical Turk, CrowdFlower, or LiveWork. Novel
applications of crowdsourcing are of particular interest. These include
but are not restricted to the following tasks:

* cross-vertical search (video, image, blog, etc.) evaluation

* local search evaluation

* mobile search evaluation

* realtime/news search evaluation

* entity search evaluation

* discovering representative groups of rare queries, documents, and
events in the long-tail of search

* detecting/evaluating query alterations

For example, does the inherent geographic dispersal of crowdsourcing
enable better assessment of a query's local intent, its locale-specific
facets, or the diversity of returned results? Could crowdsourcing be
employed in near real-time to better assess query intent for breaking
news and relevant information?

Most Innovative Awards --- Sponsored by Microsoft Bing

As a further incentive to participate, authors of the most novel and
innovative crowdsourcing-based search evaluation techniques (e.g., using
Amazon's Mechanical Turk, LiveWork, or CrowdFlower) will be recognized
with "Most Innovative Awards" as judged by the workshop organizers.
Selection will be based on the creativity, originality, and
potential impact of the described proposal, and we expect the winners to
describe risky, ground-breaking, and unexpected ideas. The provision of
awards is thanks to generous support from Microsoft Bing, and the number
and nature of the awards will depend on the quality of the submissions
and overall availability of funds. All valid submissions to the workshop
will be considered for the awards.

Submission Instructions

Submissions should report new (unpublished) research results or ongoing
research. Long paper submissions (up to 8 pages) will primarily target
oral presentations. Short paper submissions (up to 4 pages) will
primarily target poster presentations. Papers should be formatted in the
double-column ACM SIG proceedings format
(http://www.acm.org/sigs/publications/proceedings-templates). Papers
must be submitted as PDF files. Submissions should not be anonymized.

Important Dates

Submissions due: June 15, 2010
Notification of acceptance: June 30, 2010
Camera-ready submission: July 7, 2010
Workshop date: July 23, 2010

Organizers



Vitor Carvalho, Microsoft Bing

Matthew Lease, University of Texas at Austin

Emine Yilmaz, Microsoft Research



Program Committee



Eugene Agichtein, Emory University

Ben Carterette, University of Delaware

Charlie Clarke, University of Waterloo

Gareth Jones, Dublin City University

Michael Kaisser, University of Edinburgh

Jaap Kamps, University of Amsterdam

Gabriella Kazai, Microsoft Research

Winter Mason, Yahoo! Research

Stefano Mizzaro, University of Udine

Gheorghe Muresan, Microsoft Bing

Iadh Ounis, University of Glasgow

Mark Sanderson, University of Sheffield

Mark Smucker, University of Waterloo

Siddharth Suri, Yahoo! Research

Fang Xu, Saarland University


Questions?

Email the organizers at cse2010@ischool.utexas.edu
