
ISSTA 2022 : International Symposium on Software Testing and Analysis



 
Link: https://conf.researchr.org/home/issta-2022
 
When Jul 18, 2022 - Jul 22, 2022
Where Daejeon, South Korea
Submission Deadline Jan 28, 2022
Notification Due Apr 11, 2022
 


# Call for Papers

DRAFT: Please re-check by December 2021 for any updates!

## Technical Papers

Authors are invited to submit research papers describing original contributions to the testing or analysis of computer software. Papers describing original theoretical or empirical research, new techniques, methods for emerging systems, in-depth case studies, infrastructure for testing and analysis, or tools are welcome.

## Experience Papers

Authors are invited to submit experience papers describing significant experience in applying software testing and analysis methods or tools. Experience papers should carefully identify and discuss important lessons learned, so that other researchers and/or practitioners can benefit from the experience. Of special interest are experience papers that report on industrial applications of software testing and analysis methods or tools.

## Replicability Studies

ISSTA would like to encourage researchers to replicate results from previous papers. A replicability study must go beyond simply re-implementing an algorithm and/or re-running the artifacts provided by the original paper. At the very least, it should apply the approach to new, significantly broadened inputs. Replicability studies are particularly encouraged to target techniques that were previously evaluated only on proprietary subject programs or inputs.

A replicability study should clearly report the results that the authors were able to replicate as well as the aspects of the work that were not replicable. In the latter case, authors are encouraged to communicate or collaborate with the original paper’s authors to determine the cause of any observed discrepancies and, if possible, address them (e.g., through minor implementation changes). We explicitly encourage authors not to focus on a single paper/artifact, but instead to perform a comparative experiment across multiple related approaches.

In particular, replicability studies should follow the ACM guidelines on replicability (different team, different experimental setup): the measurement can be obtained with stated precision by a different team, with a different measuring system, in a different location, on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artifacts which they develop completely independently. It is therefore insufficient to focus on reproducibility (i.e., different team, same experimental setup) alone.

Replicability studies will be evaluated according to the following standards:

- Depth and breadth of experiments
- Clarity of writing
- Appropriateness of conclusions
- Amount of useful, actionable insights
- Availability of artifacts

We expect replicability studies to clearly point out the artifacts the study is built on, and to submit those artifacts to artifact evaluation (see below). Artifacts evaluated positively will be eligible to obtain the prestigious Results Reproduced badge.

For more information see: https://conf.researchr.org/track/issta-2022/issta-2022-technical-papers#Call-for-Papers

Related Resources

ISSTA 2023   The ACM SIGSOFT International Symposium on Software Testing and Analysis (Second Round)
ISSTA 2023   The ACM SIGSOFT International Symposium on Software Testing and Analysis (First Round)
InSTA 2023   10th International Workshop on Software Test Architecture
ITEQS 2023   7th International Workshop on Testing Extra-Functional Properties and Quality Characteristics of Software Systems
AST 2023   4th ACM/IEEE International Conference on Automation of Software Test (AST)
VST 2023   6th Workshop on Validation, Analysis and Evolution of Software Tests
SEKE 2023   The 35th International Conference on Software Engineering and Knowledge Engineering
IDA 2023   The 21st International Symposium on Intelligent Data Analysis
ICST 2023   16th IEEE International Conference on Software Testing, Verification and Validation
SOMET 2023   The 22nd International Conference on Intelligent Software Methodologies, Tools, and Techniques