PAN 2010: Evaluation Campaign on Plagiarism Detection and Wikipedia Vandalism Detection
Link: http://pan.webis.de

Call For Papers
-------------------------------------------------------------------------------
Call for Participation: Evaluation Campaign on Plagiarism Detection and
Wikipedia Vandalism Detection
-------------------------------------------------------------------------------
held in conjunction with the CLEF'10 conference in Padua, Italy,
September 20-23

http://pan.webis.de

-------------------------------------------------------------------------------
About the Campaign:
-------------------------------------------------------------------------------

Plagiarism detection in text documents is a challenging retrieval task:
today's detection systems are faced with intricate situations, such as
obfuscated plagiarism or plagiarism within and across languages. Moreover,
the source of a plagiarism case may be hidden in a large collection of
documents, or it may not be available at all. Informally, the respective
CLEF-Lab task can be described as follows:

1. Plagiarism Detection. Given a set of suspicious documents and a set of
   source documents, the task is to find all plagiarized sections in the
   suspicious documents and, if available, the corresponding source sections.

Following the success of the 2009 campaign, and based on our experience, we
will provide a revised evaluation corpus consisting of artificial and
simulated plagiarism.

Vandalism has always been one of Wikipedia's biggest problems. However,
vandalism detection is still done mostly manually by volunteers, and research
on automatic vandalism detection is in its infancy. Hence, solutions are to
be developed which aid Wikipedians in their efforts. Informally, the
respective CLEF-Lab task can be described as follows:

2. Wikipedia Vandalism Detection. Given a set of edits on Wikipedia articles,
   the task is to identify all edits which are vandalism, i.e., all edits
   whose editors had bad intentions.

Participants are invited to submit results for one or both of the tasks.

-------------------------------------------------------------------------------
Important Dates:
-------------------------------------------------------------------------------

open              Registration
Mar 01, 2010      Training corpora release
                  (Preliminary training corpora are already available!)
May 03, 2010      Test corpora release
Jun 01, 2010      Result submission deadline
Jun 15, 2010      Notification of performance
Jul 15, 2010      Paper submission deadline
Aug 02, 2010      Notification of reviews
Sep 01, 2010      Final paper deadline
Sep 20-23, 2010   Evaluation lab at CLEF conference

-------------------------------------------------------------------------------
Campaign Organization:
-------------------------------------------------------------------------------

Webis @ Bauhaus-Universität Weimar
http://www.webis.de

NLEL @ Universidad Politécnica de Valencia
http://www.dsic.upv.es/grupos/nle

University of the Aegean
http://www.icsd.aegean.gr/lecturers/stamatatos

Bar-Ilan University
http://u.cs.biu.ac.il/~koppel

-------------------------------------------------------------------------------
Contact:
-------------------------------------------------------------------------------

E-mail: pan@webis.de
Campaign Web page: http://pan.webis.de