ISSTA: International Symposium on Software Testing and Analysis


| Event | When | Where | Deadline |
|-------|------|-------|----------|
| ISSTA 2022 | Jul 18, 2022 - Jul 22, 2022 | Daejeon, South Korea | Jan 28, 2022 |
| ISSTA 2021 | Jul 12, 2021 - Jul 16, 2021 | Aarhus, Denmark | Jan 29, 2021 |
| ISSTA 2020 | Jul 18, 2020 - Jul 22, 2020 | Los Angeles, CA, USA | Jan 27, 2020 |
| ISSTA 2019 | Jul 15, 2019 - Jul 19, 2019 | Beijing, China | Jan 28, 2019 |
| ISSTA 2018 | Jul 16, 2018 - Jul 22, 2018 | Amsterdam, Netherlands | Jan 29, 2018 |
| ISSTA 2017 | Jul 9, 2017 - Jul 13, 2017 | Santa Barbara, CA, USA | Feb 3, 2017 |
| ISSTA 2015 | Jul 12, 2015 - Jul 17, 2015 | Baltimore, MD, USA | Jan 23, 2015 |
| ISSTA 2014 | Jul 21, 2014 - Jul 26, 2014 | Hilton San Jose, Bay Area, CA, USA | Jan 24, 2014 |
| ISSTA 2013 | Jul 15, 2013 - Jul 20, 2013 | Lugano, Switzerland | Jan 25, 2013 |
| ISSTA 2012 | Jul 16, 2012 - Jul 20, 2012 | Minneapolis, MN, USA | Feb 3, 2012 |
| ISSTA 2011 | Jul 17, 2011 - Jul 21, 2011 | Toronto, ON, Canada | Feb 4, 2011 |
| ISSTA 2010 | Jul 12, 2010 - Jul 16, 2010 | Trento, Italy | Feb 5, 2010 |
| ISSTA 2009 | Jul 19, 2009 - Jul 23, 2009 | Chicago, IL, USA | Jan 30, 2009 |
| ISSTA 2008 | Jul 20, 2008 - Jul 24, 2008 | Seattle, WA, USA | Jan 31, 2008 |
 
 

Present CFP: 2022

# Call for Papers

DRAFT: Please re-check by December 2021 for any updates!

## Technical Papers

Authors are invited to submit research papers describing original contributions in the testing or analysis of computer software. Papers describing original theoretical or empirical research, new techniques, methods for emerging systems, in-depth case studies, testing and analysis infrastructure, or tools are welcome.

## Experience Papers

Authors are invited to submit experience papers describing a significant experience in applying software testing and analysis methods or tools. Papers should carefully identify and discuss important lessons learned, so that other researchers and/or practitioners can benefit from the experience. Of special interest are experience papers that report on industrial applications of software testing and analysis methods or tools.

## Replicability Studies

ISSTA would like to encourage researchers to replicate results from previous papers. A replicability study must go beyond simply re-implementing an algorithm and/or re-running the artifacts provided by the original paper. It should, at the very least, apply the approach to new, significantly broadened inputs. In particular, replicability studies are encouraged to target techniques that were previously evaluated only on proprietary subject programs or inputs.

A replicability study should clearly report on results that the authors were able to replicate, as well as on aspects of the work that were not replicable. In the latter case, authors are encouraged to communicate or collaborate with the original paper's authors to determine the cause of any observed discrepancies and, if possible, address them (e.g., through minor implementation changes). We explicitly encourage authors not to focus on a single paper/artifact only, but instead to perform a comparative experiment of multiple related approaches.

Replicability studies should follow the ACM guidelines on replicability (different team, different experimental setup): the measurement can be obtained with stated precision by a different team, with a different measuring system, in a different location, on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artifacts which they develop completely independently. It is therefore insufficient to focus on reproducibility (i.e., different team, same experimental setup) alone.

Replicability studies will be evaluated according to the following standards:

- Depth and breadth of experiments
- Clarity of writing
- Appropriateness of conclusions
- Amount of useful, actionable insights
- Availability of artifacts

We expect replicability studies to clearly identify the artifacts the study builds on, and to submit those artifacts to artifact evaluation (see below). Artifacts evaluated positively will be eligible to obtain the prestigious Results Reproduced badge.
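To make these expectations concrete, the sketch below (not part of the official call; all names, scores, and subjects are hypothetical) illustrates the kind of comparative setup a replicability study implies: an independently developed reimplementation is run alongside the original artifact on a broadened subject set, and the results are reported with a stated tolerance.

```python
"""Minimal sketch of a comparative replication harness.

Hypothetical throughout: `run_original` stands in for invoking the artifact
shipped with the original paper, and `run_reimplementation` for an
independently developed version of the same technique, as the ACM
replicability guidelines (different team, different setup) require.
"""

from dataclasses import dataclass


@dataclass
class Result:
    subject: str
    score: float  # e.g., fault-detection rate or achieved coverage


def run_original(subject: str) -> Result:
    # Placeholder: a real study would run the original artifact here.
    return Result(subject, score=0.80)


def run_reimplementation(subject: str) -> Result:
    # Placeholder: a real study would run the independent reimplementation here.
    return Result(subject, score=0.78)


def compare(subjects, tolerance=0.05):
    """Split subjects into those that replicate within `tolerance` and those that do not."""
    replicated, discrepancies = [], []
    for s in subjects:
        orig = run_original(s)
        repl = run_reimplementation(s)
        if abs(orig.score - repl.score) <= tolerance:
            replicated.append(s)
        else:
            discrepancies.append((s, orig.score, repl.score))
    return replicated, discrepancies


# A broadened evaluation: the original benchmarks plus new subjects the
# original paper did not cover (e.g., open-source counterparts of
# proprietary inputs).
subjects = ["orig-benchmark-1", "orig-benchmark-2", "new-subject-1", "new-subject-2"]
ok, diff = compare(subjects)
print(f"replicated: {ok}")
print(f"discrepancies (subject, original, reimplementation): {diff}")
```

Reporting both lists, rather than only the successes, matches the call's requirement to document what replicated and what did not.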

For more information see: https://conf.researchr.org/track/issta-2022/issta-2022-technical-papers#Call-for-Papers
 

Related Resources

ATVA 2022   The 20th International Symposium on Automated Technology for Verification and Analysis
ICSEA 2022   The Seventeenth International Conference on Software Engineering Advances
SOFTPA 2022   International Conference on Emerging Practices in Software Process & Architecture
ICPR 2022   26th International Conference on Pattern Recognition
ASE 2022   37th IEEE/ACM International Conference on Automated Software Engineering
ICST 2022   15th IEEE International Conference on Software Testing, Verification and Validation
ESEC/FSE 2022   The ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering
SEKE 2022   The 34th International Conference on Software Engineering and Knowledge Engineering
Computer SI on SE4RAI 2023   IEEE Computer - Special Issue on Software Engineering for Responsible AI
DAS 2022   DAS 2022: 15th IAPR International Workshop on Document Analysis Systems