posted by user: lucapulina

QBFEVAL 2017 : QBFEVAL'17 - Competitive Evaluation of QBF Solvers

Link: http://www.qbflib.org/qbfeval17.php
 
When: Aug 28, 2017 - Sep 1, 2017
Where: Melbourne, Australia
Submission Deadline: May 22, 2017
 

Call For Papers

******************************************************************
QBFEVAL'17 - Competitive Evaluation of QBF Solvers
Call for Participation

A joint event with SAT 2017 - The 20th International Conference on
Theory and Applications of Satisfiability Testing,
28 August - 1 September, Melbourne, Australia (2017)

******************************************************************


QBFEVAL'17 will be the 2017 competitive evaluation of QBF solvers, and
the twelfth in the series of evaluations of QBF solvers and instances.
QBFEVAL'17 will award solvers that stand out as particularly effective
on specific categories of QBF instances. The evaluation will run on
the computing infrastructure made available by StarExec.
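For readers new to the area: the solvers under evaluation decide whether a quantified Boolean formula (QBF) is true. A minimal brute-force sketch of that decision problem is below (illustrative only and written by us, not an evaluation entry; real solvers use far more sophisticated techniques such as QDPLL with clause/cube learning):

```python
def eval_qbf(prefix, clauses, assignment=None):
    """Naively evaluate a prenex-CNF QBF.

    prefix:  list of (quantifier, variable) pairs, outermost first,
             with quantifier 'a' (forall) or 'e' (exists)
    clauses: list of clauses, each a list of DIMACS-style signed ints
    Returns True iff the formula is true.
    """
    if assignment is None:
        assignment = {}
    if not prefix:
        # All variables assigned: check that every clause has a true literal.
        return all(
            any(assignment[abs(lit)] == (lit > 0) for lit in clause)
            for clause in clauses
        )
    (q, v), rest = prefix[0], prefix[1:]
    # Branch on both truth values of the outermost variable.
    results = (
        eval_qbf(rest, clauses, {**assignment, v: val})
        for val in (False, True)
    )
    return all(results) if q == 'a' else any(results)

# forall x1 exists x2 . (x1 or x2) and (not x1 or not x2)
# True: choose x2 = not x1.
print(eval_qbf([('a', 1), ('e', 2)], [[1, 2], [-1, -2]]))  # True
```

Swapping the quantifiers (exists x1 forall x2) makes the same matrix false, which illustrates why quantifier order matters in QBF.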

We warmly encourage developers of QBF solvers to submit their work,
even at early stages of development, as long as it fulfills some very
simple requirements. We also welcome the submission of QBF formulas to
be used for the evaluation. Researchers thinking about using QBF-based
techniques in their area (e.g., formal verification, planning,
knowledge reasoning) are invited to contribute to the evaluation by
submitting QBF instances of their research problems (see the
requirements for instances). The results of the evaluation will be a
good indicator of the current feasibility of QBF-based approaches and
a stimulus for people working on QBF solvers to further enhance their
tools.
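QBFLIB benchmarks are customarily exchanged in the prenex-CNF QDIMACS format, an extension of DIMACS CNF with quantifier lines; consult the rules at the link below for the formats actually accepted in the evaluation. A tiny illustrative instance (encoding forall x1 exists x2 . (x1 or x2) and (not x1 or not x2)):

```
c toy QDIMACS instance, for illustration only
p cnf 2 2
a 1 0
e 2 0
1 2 0
-1 -2 0
```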

Details about solver and benchmark submission, tracks, and the related
rules are available at http://www.qbflib.org/qbfeval17.php

For questions, comments, and any other issues regarding QBFEVAL'17,
please get in touch with qbf17@qbflib.org.



Important Dates

Registration opens: April 1st, 2017
Registration closes: May 22nd, 2017
Solvers and Benchmarks due: May 30th 2017
Final results: presented at SAT'17


Organizing committee

Organization
Luca Pulina, University of Sassari
Martina Seidl, Johannes Kepler Universität Linz

Judges
Olaf Beyersdorff, University of Leeds
Daniel Le Berre, Université d'Artois
Martin Suda, Technische Universität Wien
Christoph Wintersteiger, Microsoft Research Limited
