
ISSTA 2018 : International Symposium on Software Testing and Analysis



 
Link: http://conf.researchr.org/home/issta-2018
 
When: Jul 16, 2018 - Jul 22, 2018
Where: Amsterdam, The Netherlands
Submission Deadline: Jan 29, 2018
Notification Due: May 2, 2018
Final Version Due: Jun 8, 2018
 

Call For Papers

The ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA) is the leading research symposium on software testing and analysis, bringing together academics, industrial researchers, and practitioners to exchange new ideas, problems, and experience on how to analyze and test software systems. ISSTA ’18 will be co-located with the European Conference on Object-Oriented Programming (ECOOP ’18) and with Curry On, a conference focused on programming languages and emerging challenges in industry.

Research Papers

Authors are invited to submit research papers describing original contributions in testing or analysis of computer software. Papers describing original theoretical or empirical research, new techniques, in-depth case studies, or infrastructure for testing and analysis methods or tools are welcome.

Experience Papers

Authors are invited to submit experience papers describing a significant experience in applying software testing and analysis methods or tools. Papers should carefully identify and discuss important lessons learned, so that other researchers and/or practitioners can benefit from the experience. Of special interest are experience papers that report on industrial applications of software testing and analysis methods or tools.

Reproducibility Studies (New!)

ISSTA would like to encourage researchers to reproduce results from previous papers, which is why ISSTA 2018 will introduce a new paper category called Reproducibility Studies. A reproducibility study must go beyond simply re-implementing an algorithm and/or re-running the artifacts provided by the original paper. It should at the very least apply the approach to new, significantly broadened inputs. In particular, reproducibility studies are encouraged to target techniques that were previously evaluated only on proprietary subject programs or inputs.

A reproducibility study should clearly report on results that the authors were able to reproduce, as well as on aspects of the work that were irreproducible. In the latter case, authors are encouraged to communicate or collaborate with the original paper’s authors to determine the cause of any observed discrepancies and, if possible, address them (e.g., through minor implementation changes). We explicitly encourage authors not to focus on a single paper/artifact only, but instead to perform a comparative experiment covering multiple related approaches.

In particular, reproducibility studies should follow the ACM guidelines on reproducibility (different team, different experimental setup): The measurement can be obtained with stated precision by a different team, a different measuring system, in a different location on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artifacts which they develop completely independently.

This means that it is also insufficient to focus on repeatability (i.e., same experiment) alone. Reproducibility Studies will be evaluated according to the following standards:

Depth and breadth of experiments
Clarity of writing
Appropriateness of conclusions
Amount of useful, actionable insights
Availability of artifacts
In particular, we expect reproducibility studies to clearly point out the artifacts the study is built on, and to submit those artifacts to artifact evaluation (see below). Artifacts evaluated positively will be eligible to obtain the highly prestigious badges Results Replicated or Results Reproduced.

Submission Guidelines

Submissions must be original and should not have been published previously or be under consideration for publication while being evaluated for this symposium. Authors are required to adhere to the ACM Policy and Procedures on Plagiarism and the ACM Policy on Prior Publication and Simultaneous Submissions.

Research and Experience Papers as well as Reproducibility Studies should be at most 10 pages in length, with at most 2 additional pages for references. All papers must be prepared in ACM Conference Format.

Double-blind reviewing

ISSTA 2018 will conduct double-blind reviewing. Submissions should not reveal the identity of the authors in any way. Authors should leave out author names and affiliations from the body of their submission. They should also ensure that any citations to related work by themselves are written in third person, that is, “the prior work of XYZ” as opposed to “our prior work”. Authors with further questions on double-blind reviewing are encouraged to contact the Program Chair by email.
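
For authors preparing their submission, the following minimal LaTeX skeleton is a sketch assuming the standard acmart class used for the ACM Conference Format (the title, author fields, and the refs bibliography file are placeholders; the conference site remains the authoritative source for template requirements):

\documentclass[sigconf,review,anonymous]{acmart}
% 'sigconf'   selects the ACM conference format
% 'review'    adds line numbers to aid reviewers
% 'anonymous' suppresses author names and affiliations in the PDF,
%             as required for double-blind reviewing

\begin{document}

\title{Paper Title (placeholder)}
\author{Author Name}                     % hidden while 'anonymous' is set
\affiliation{\institution{Institution}}  % hidden while 'anonymous' is set
\maketitle

% Body text goes here: at most 10 pages, with at most
% 2 additional pages for references.

\bibliographystyle{ACM-Reference-Format}
\bibliography{refs}                      % placeholder .bib file

\end{document}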

Submit your papers via the HotCRP ISSTA 2018 submission website.

Reviews and Responses

Reviewing will happen in two phases. In Phase 1, each paper will receive three reviews, followed by an author response. Depending on the response, papers with negative reviews may be rejected early at this point. All other papers proceed to Phase 2, in which they may receive additional reviews where necessary; authors can respond to these in a second author-response phase.

Important Dates
When                        Track              What
Mon 29 Jan 2018             Technical Papers   Paper Submission
Mon 19 - Wed 21 Mar 2018    Technical Papers   Phase 1 Author Response
Fri 30 Mar 2018             Technical Papers   Early-Reject Author Notification
Tue 17 - Thu 19 Apr 2018    Technical Papers   Phase 2 Author Response
Wed 2 May 2018              Technical Papers   Final Author Notification
Mon 7 May 2018              Artifacts          Artifact Submission
Fri 1 Jun 2018              Artifacts          Artifact Author Notification
Fri 8 Jun 2018              Artifacts          Camera-Ready
Fri 8 Jun 2018              Technical Papers   Camera-Ready
