ISSTA: International Symposium on Software Testing and Analysis



Event        When                    Where                                       Deadline
ISSTA 2019   Jul 15 - Jul 19, 2019   Beijing, China                              Jan 28, 2019
ISSTA 2018   Jul 16 - Jul 22, 2018   Amsterdam, Netherlands                      Jan 29, 2018
ISSTA 2017   Jul 9 - Jul 13, 2017    Santa Barbara, California, USA              Feb 3, 2017
ISSTA 2016   Jul 18 - Jul 20, 2016   Saarland University, Saarbrücken, Germany   Jan 29, 2016
ISSTA 2015   Jul 12 - Jul 17, 2015   Baltimore, Maryland, USA                    Jan 23, 2015
ISSTA 2014   Jul 21 - Jul 26, 2014   Hilton San Jose, Bay Area, CA, USA          Jan 24, 2014
ISSTA 2013   Jul 15 - Jul 20, 2013   Lugano, Switzerland                         Jan 25, 2013
ISSTA 2012   Jul 16 - Jul 20, 2012   Minneapolis, MN, USA                        Feb 3, 2012
ISSTA 2011   Jul 17 - Jul 21, 2011   Toronto, ON, Canada                         Feb 4, 2011
ISSTA 2010   Jul 12 - Jul 16, 2010   Trento, Italy                               Feb 5, 2010
ISSTA 2009   Jul 19 - Jul 23, 2009   Chicago, IL, USA                            Jan 30, 2009
ISSTA 2008   Jul 20 - Jul 24, 2008   Seattle, WA, USA                            Jan 31, 2008
 
 

Present CFP: 2019

Call for Submissions
Technical Papers

Authors are invited to submit research papers describing original contributions in testing or analysis of computer software. Papers describing original theoretical or empirical research, new techniques, in-depth case studies, or infrastructure for testing and analysis methods and tools are welcome.


Experience Papers

Authors are invited to submit experience papers describing significant experience in applying software testing and analysis methods or tools. Such papers should carefully identify and discuss important lessons learned, so that other researchers and/or practitioners can benefit from the experience. Of special interest are experience papers that report on industrial applications of software testing and analysis methods or tools.


Reproducibility Studies

ISSTA would like to encourage researchers to reproduce results from previous papers. A reproducibility study must go beyond simply re-implementing an algorithm and/or re-running the artifacts provided with the original paper. It should, at the very least, apply the approach to new, significantly broadened inputs. Reproducibility studies are particularly encouraged to target techniques that were previously evaluated only on proprietary subject programs or inputs. A reproducibility study should clearly report both the results that the authors were able to reproduce and the aspects of the work that turned out to be irreproducible. In the latter case, authors are encouraged to communicate or collaborate with the original paper's authors to determine the cause of any observed discrepancies and, if possible, to address them (e.g., through minor implementation changes). We explicitly encourage authors not to focus on a single paper or artifact, but instead to perform a comparative experiment covering multiple related approaches.

Reproducibility studies should follow the ACM guidelines on reproducibility (different team, different experimental setup): the measurement can be obtained with stated precision by a different team, using a different measuring system, in a different location, on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artifacts which they develop completely independently.

Consequently, it is insufficient to focus on repeatability (i.e., the same experiment) alone. Reproducibility studies will be evaluated according to the following criteria:
Depth and breadth of experiments
Clarity of writing
Appropriateness of conclusions
Amount of useful, actionable insights
Availability of artifacts
In particular, we expect reproducibility studies to clearly identify the artifacts the study is built on, and to submit those artifacts to artifact evaluation (see below). Artifacts evaluated positively will be eligible to obtain the highly prestigious "Results Replicated" or "Results Reproduced" badges.

https://conf.researchr.org/track/issta-2019/issta-2019-Technical-Papers#Call-for-Submission

Submission Guidelines


Submissions must be original and should not have been published previously or be under consideration for publication while being evaluated for this symposium. Authors are required to adhere to the ACM Policy and Procedures on Plagiarism and the ACM Policy on Prior Publication and Simultaneous Submissions. More details are available at the Submission Policies page.

Research and Experience Papers as well as Reproducibility Studies should be at most 10 pages in length, with at most 2 additional pages for references. All papers must be prepared in ACM Conference Format.
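
For orientation, a minimal paper skeleton in the ACM Conference Format might look like the sketch below. The class options shown (sigconf, review, anonymous) are standard acmart options, but this is only a sketch; consult the acmart documentation and the conference instructions for the authoritative settings.

    \documentclass[sigconf,review,anonymous]{acmart}
    % sigconf   -- ACM conference (two-column) layout
    % review    -- adds line numbers for reviewers
    % anonymous -- hides author names/affiliations for double-blind review

    \begin{document}

    \title{Paper Title}
    \author{Anonymous Author(s)}          % suppressed in the compiled PDF
    \affiliation{\institution{Anonymous}} % by the 'anonymous' option

    \begin{abstract}
    One-paragraph abstract.
    \end{abstract}

    \maketitle

    \section{Introduction}
    Body text: at most 10 pages, plus up to 2 pages of references.

    \bibliographystyle{ACM-Reference-Format}
    \bibliography{references}

    \end{document}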

Submit your papers via the HotCRP ISSTA 2019 submission website.

Double-blind Reviewing


ISSTA 2019 will conduct double-blind reviewing. Submissions should not reveal the identity of the authors in any way. Authors should leave author names and affiliations out of the body of their submission. They should also ensure that any citations of their own related work are written in the third person, that is, "the prior work of XYZ" as opposed to "our prior work". More details are available at the Double-Blind Reviewing page. Authors with further questions on double-blind reviewing are encouraged to contact the Program Chair by email.
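
As an illustration, a self-citation in an anonymized submission might be phrased as follows (the author name and BibTeX key here are hypothetical):

    % Anonymized: refer to your own prior work in the third person.
    The prior work of Smith et al.~\cite{smith2018} introduced a
    mutation-based test generator; we extend it in this paper.

    % Avoid first-person self-citations in the submitted version:
    % In our prior work~\cite{smith2018}, we introduced a
    % mutation-based test generator.
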
Supplementary Material


Authors are free to provide supplementary material if that material supports the claims in the paper. Such material may include proofs, experimental results, and/or data sets. This material should be uploaded at the same time as the submission. Any supplementary material must also be anonymized. Reviewers are not required to examine the supplementary material but may refer to it if they would like to find further evidence supporting the claims in the paper.

Reviews and Responses


Reviewing will happen in two phases.
In Phase 1, each paper will receive three reviews, followed by an author response. Depending on the response, papers with negative reviews may be rejected early at this point.
All other papers proceed to Phase 2, in which they may receive additional reviews where necessary; authors can respond to these in a second author-response phase.
 

Related Resources

ISSTA 2018   International Symposium on Software Testing and Analysis
QRS 2019   The 19th IEEE International Conference on Software Quality, Reliability, and Security
RQD 2019   25th ISSAT International Conference on Reliability and Quality in Design
AITest 2019   The First IEEE International Conference on Artificial Intelligence Testing
ITEQS 2019   3rd International Workshop on Testing Extra-Functional Properties and Quality Characteristics of Software Systems
ADON 2018   International Workshop on Anomaly Detection ON the Cloud and the Internet of Things
CAIP 2019   Computer Analysis of Images and Patterns
ICDAR 2019   International Conference on Document Analysis and Recognition
DSA 2019   The Frontiers in Intelligent Data and Signal Analysis DSA 2019
ENASE 2019   14th International Conference on Evaluation of Novel Approaches to Software Engineering