CSE 2010: SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation
Link: http://www.ischool.utexas.edu/~cse2010/call.htm

Call For Papers
SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation
================

The SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE 2010) solicits submissions on topics including but not limited to the following areas:

* Novel applications of crowdsourcing for evaluating search systems (see examples below)
* Novel theoretical, experimental, and/or methodological developments advancing the state of the art in crowdsourcing for search evaluation
* Tutorials on how different forms of crowdsourcing might be best suited to, or best executed in, evaluating different search tasks
* New software packages that simplify or otherwise improve general support for crowdsourcing, or particular support for crowdsourced search evaluation
* Reflective or forward-looking visions on the use of crowdsourcing in search evaluation, as informed by prior and/or ongoing studies
* How crowdsourcing technology or processes can be adapted to encourage and facilitate more participation from outside the USA

The workshop especially calls for innovative solutions in the area of search evaluation involving significant use of a crowdsourcing platform such as Amazon's Mechanical Turk, Crowdflower, LiveWork, etc. Novel applications of crowdsourcing are of particular interest. These include, but are not restricted to, the following tasks:

* Cross-vertical search (video, image, blog, etc.) evaluation
* Local search evaluation
* Mobile search evaluation
* Realtime/news search evaluation
* Entity search evaluation
* Discovering representative groups of rare queries, documents, and events in the long tail of search
* Detecting/evaluating query alterations

For example, does the inherent geographic dispersal of crowdsourcing enable better assessment of a query's local intent, its locale-specific facets, or the diversity of returned results? Could crowdsourcing be employed in near real time to better assess query intent for breaking news and relevant information?
Most Innovative Awards --- Sponsored by Microsoft Bing

As a further incentive to participate, authors of the most novel and innovative crowdsourcing-based search evaluation techniques (e.g., using Amazon's Mechanical Turk, Livework, Crowdflower, etc.) will be recognized with "Most Innovative Awards," as judged by the workshop organizers. Selection will be based on the creativity, originality, and potential impact of the described proposal, and we expect the winners to describe risky, ground-breaking, and unexpected ideas. The awards are made possible by generous support from Microsoft Bing, and their number and nature will depend on the quality of the submissions and the overall availability of funds. All valid submissions to the workshop will be considered for the awards.

Submission Instructions

Submissions should report new (unpublished) research results or ongoing research. Long paper submissions (up to 8 pages) will primarily target oral presentations. Short paper submissions (up to 4 pages) will primarily target poster presentations. Papers should be formatted in the double-column ACM SIG proceedings format (http://www.acm.org/sigs/publications/proceedings-templates). Papers must be submitted as PDF files. Submissions should not be anonymized.

Important Dates

Submissions due: June 15, 2010
Notification of acceptance: June 30, 2010
Camera-ready submission: July 7, 2010
Workshop date: July 23, 2010

Organizers

Vitor Carvalho, Microsoft Bing
Matthew Lease, University of Texas at Austin
Emine Yilmaz, Microsoft Research

Program Committee

Eugene Agichtein, Emory University
Ben Carterette, University of Delaware
Charlie Clarke, University of Waterloo
Gareth Jones, Dublin City University
Michael Kaisser, University of Edinburgh
Jaap Kamps, University of Amsterdam
Gabriella Kazai, Microsoft Research
Winter Mason, Yahoo! Research
Stefano Mizzaro, University of Udine
Gheorghe Muresan, Microsoft Bing
Iadh Ounis, University of Glasgow
Mark Sanderson, University of Sheffield
Mark Smucker, University of Waterloo
Siddharth Suri, Yahoo! Research
Fang Xu, Saarland University

Questions? Email the organizers at cse2010@ischool.utexas.edu