
A-TEST 2023 : 14th Workshop on Automating Test Case Design, Selection and Evaluation


When Sep 15, 2023 - Sep 15, 2023
Where Kirchberg, Luxembourg
Abstract Registration Due Jul 14, 2023
Submission Deadline Jul 21, 2023
Notification Due Aug 11, 2023
Final Version Due Aug 18, 2023
Categories    software testing   test automation

Call For Papers

A-TEST 2023
14th Workshop on Automating Test Case Design, Selection and Evaluation

September 15, 2023
Kirchberg, Luxembourg
Co-located with ASE 2023

Abstract submission deadline: July 14, 2023 (non-mandatory)
Submission deadline: July 21, 2023
Author notification: August 11, 2023
Camera-ready submission: August 18, 2023 (hard)
(All dates are 23:59:59 AoE)

Testing is currently the most important and most widely used quality assurance technique in the software industry. However, the complexity of software, and hence the effort required to develop and test it, keeps increasing.

Even though many test automation tools are currently available to aid test planning and control as well as test case execution and monitoring, all these tools share a similarly passive philosophy towards test case design, selection of test data, and test evaluation. They leave these crucial, time-consuming and demanding activities to the human tester. This is not without reason; test case design and test evaluation through oracles are difficult to automate with the techniques available in current industrial practice. The domain of possible inputs (potential test cases), even for a trivial method, program, model, user interface or service, is typically too large to be exhaustively explored.
Consequently, one of the major challenges associated with test case design is the selection of test cases that are effective at finding flaws without requiring an excessive number of tests to be carried out. Automation of the entire test process requires new thinking that goes beyond test design or specific test execution tools. These are the problems that this workshop aims to address.

For the past thirteen years, the Workshop on Automating Test Case Design, Selection and Evaluation (A-TEST) has provided a venue for researchers and industry members alike to exchange and discuss trending views, ideas, state of the art, work in progress, and scientific results on automated testing.

The 14th edition of A-TEST will be co-located with and organised at ASE 2023.
A-TEST 2023 is planned to take place in person in Kirchberg, Luxembourg.

Topics of interest include, but are not limited to:

- Techniques and tools for automating test case design, generation, and selection, e.g., model-based approaches, mutation approaches, metamorphic approaches, combinatorial-based approaches, search-based approaches, symbolic-based approaches, chaos testing, machine learning testing.
- New trends in the use of machine learning (ML) and artificial intelligence (AI) to improve test automation, and new approaches to automate the testing of AI/ML-driven systems.
- Test case and test process optimization.
- Test case evolution, repair, and reuse.
- Test case evaluation and metrics.
- Test case design, selection, and evaluation in emerging domains like natural user interfaces, cloud, edge, and IoT-based systems, cyber-physical systems, social networks, games, and extended reality.
- Case studies that have evaluated an existing technique or tool on real systems (not only toy problems), to show its benefits as compared to other approaches.
- Experience/industry reports.
- Education on software testing.

Authors are invited to submit papers to the workshop, and present and discuss
them at the event on topics related to automated software testing. Paper
submissions can be of the following types:

- Full papers (max. 8 pages) describing original, complete, and validated research - either empirical or theoretical - in A-TEST related techniques, tools, or industrial case studies.
- Work-in-progress papers (max. 4 pages) that describe novel, interesting, and high-potential work in progress, but not necessarily reaching full completion (e.g., not completely validated).
- Tool papers (max. 4 pages) presenting a testing tool in a way that could be pitched to industry as the start of a successful technology transfer.
- Technology transfer papers (max. 4 pages) describing industry-academia co-operation.
- Position papers (max. 2 pages) that analyse trends and raise issues of importance. Position papers are intended to generate discussion and debate during the workshop.

All submissions must be in English and in PDF format. At the time of submission, all papers must conform to the ASE 2023 Format and Submission Guidelines. A-TEST 2023 will employ a single-blind review process.

Contributions must be submitted electronically through EasyChair.

Each submission will be reviewed by at least three members of the program committee. Full papers will be evaluated on the basis of originality, importance of contribution, soundness, evaluation, quality of presentation, and appropriate comparison to related work. Work-in-progress and position papers will be reviewed with respect to relevance and their ability to start up fruitful discussions. Tool and technology transfer papers will be evaluated based on improvement on the state-of-the-practice and clarity of lessons learned.

Submitted papers must not have been published elsewhere and must not be under review or submitted for review elsewhere during the duration of consideration. To prevent double submissions, the chairs may compare the submissions with related conferences that have overlapping review periods. The double submission restriction applies only to refereed journals and conferences, not to unrefereed pre-publication archive servers. Submissions that do not comply with the foregoing instructions will be desk rejected without being reviewed.

All accepted contributions will appear in the ACM Digital Library, providing a lasting archived record of the workshop proceedings. At least one author of each accepted paper must register and present the paper in person at A-TEST 2023 in order for the paper to be published in the proceedings.

João Pascoal Faria, General Chair (University of Porto, Portugal)
Anna Rita Fasolino, Program Co-Chair (University of Naples Federico II, Italy)
Freek Verbeek, Program Co-Chair (Open University, The Netherlands)
Kevin Moran, Student Competition Chair (George Mason University, USA)
Bruno Lima, Publicity and Web Chair (University of Porto, Portugal)

All questions about submissions should be emailed to the workshop organisers.
