
QBFEVAL 2017 : QBFEVAL'17 - Competitive Evaluation of QBF Solvers


When Aug 28, 2017 - Sep 1, 2017
Where Melbourne
Submission Deadline May 22, 2017

Call For Papers

QBFEVAL'17 - Competitive Evaluation of QBF Solvers
Call for Participation

A joint event with SAT 2017 - The 20th International Conference on
Theory and Applications of Satisfiability Testing,
28 August - 1 September, Melbourne, Australia (2017)


QBFEVAL'17 will be the 2017 competitive evaluation of QBF solvers, and
the twelfth evaluation of QBF solvers and instances to date. QBFEVAL'17
will award solvers that stand out as particularly effective on
specific categories of QBF instances. The evaluation will run on
the computing infrastructure made available by StarExec.

We warmly encourage developers of QBF solvers to submit their work,
even at early stages of development, as long as it fulfills some very
simple requirements. We also welcome the submission of QBF formulas to
be used for the evaluation. Researchers thinking about using QBF-based
techniques in their area (e.g., formal verification, planning,
knowledge reasoning) are invited to contribute to the evaluation by
submitting QBF instances of their research problems (see the
requirements for instances). The results of the evaluation will be a
good indicator of the current feasibility of QBF-based approaches and
a stimulus for people working on QBF solvers to further enhance their
tools.
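To make concrete what a QBF instance expresses, the sketch below brute-forces the truth value of a small quantified Boolean formula in prenex CNF. It is purely illustrative of QBF semantics, not of any participating solver; the clause encoding with signed integers mirrors DIMACS-style literals and is an assumption for the example.

```python
def eval_qbf(prefix, clauses, assignment=None):
    """Brute-force evaluation of a prenex-CNF QBF.

    prefix:  list of (quantifier, variable) pairs, where the
             quantifier is 'a' (forall) or 'e' (exists),
             e.g. [('a', 1), ('e', 2)].
    clauses: CNF matrix as a list of clauses; each clause is a
             list of signed integers (DIMACS-style literals).
    """
    assignment = assignment or {}
    if not prefix:
        # All variables bound: every clause must contain a true literal.
        return all(
            any(assignment[abs(lit)] == (lit > 0) for lit in clause)
            for clause in clauses
        )
    (q, var), rest = prefix[0], prefix[1:]
    branches = (
        eval_qbf(rest, clauses, {**assignment, var: val})
        for val in (False, True)
    )
    # Exists needs one satisfying branch; forall needs both.
    return any(branches) if q == 'e' else all(branches)

# forall x1 exists x2 . (x1 or x2) and (not x1 or not x2) -- true
print(eval_qbf([('a', 1), ('e', 2)], [[1, 2], [-1, -2]]))  # True
```

Real instances submitted to the evaluation are of course far beyond such exhaustive expansion, which is exactly what makes dedicated QBF solvers necessary.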

Details about solver and benchmark submission, tracks, and the
related rules are available at

For questions, comments, and any other issues regarding QBFEVAL'17,
please get in touch with

Important Dates

Registration open: April 1st 2017
Registration close: May 22nd 2017
Solvers and Benchmarks due: May 30th 2017
Final results: presented at SAT'17

Organizing committee

Luca Pulina, University of Sassari
Martina Seidl, Johannes Kepler Universität Linz

Olaf Beyersdorff, University of Leeds
Daniel Le Berre, Université d'Artois
Martin Suda, Technische Universität Wien
Christoph Wintersteiger, Microsoft Research Limited
