WRSAT 2020 - WoRkshop on System and Acceptance Testing

WRSAT is the 1st WoRkshop on System and Acceptance Testing, and it will be held in conjunction with the 13th IEEE International Conference on Software Testing, Verification and Validation (ICST) 2020.

System testing is conducted to assess the compliance of the complete, integrated, and deployed software system with its specified functional and nonfunctional requirements. Acceptance testing is conducted with respect to user needs and requirements to determine whether a system satisfies its acceptance criteria, enabling users, customers, or other authorized parties to decide whether to accept the system. The main goal of these testing processes is to generate test cases from requirement specifications and to execute them on the complete software system running in its hardware/software environment. A further goal is to assess whether the system under test satisfies specific design, system, and hardware specifications.

Several issues affect system and acceptance testing in the context of modern software development. It is well known that, due to ever-shorter time to market, system and acceptance testing processes are often executed with limited budget and time. In industrial settings, software companies need to demonstrate the quality of their products to third-party authorities and must execute these processes in accordance with specific standards such as IEC 61508, ISO/IEC/IEEE 29119, and ISO 26262. In Agile and DevOps processes, system and acceptance testing should be executed within each iteration, which requires considerable effort. Moreover, the pervasive diffusion of IoT systems, Cloud, Microservices, and Blockchains across application domains is changing the traditional approaches to system and acceptance testing and calls for a different set of tools and techniques.

WRSAT provides a meeting place where researchers, practitioners, and end users can present novel testing methodologies and technologies in the field of system and acceptance testing, discuss new problems they are facing and possible solutions, and present empirical results from industrial case studies or controlled experiments. These novel solutions may target different types of modern software systems, such as context-aware software systems, distributed software systems, IoT and Industry 4.0 systems, advanced driver-assistance and autonomous driving systems, web and mobile applications with rich and responsive GUIs, cyber-physical systems, blockchains, microservices-based systems, artificial intelligence and big data analytics systems, development frameworks, and operating systems. At the same time, WRSAT aims to present novel approaches to system and acceptance testing that are applicable in various kinds of software processes and architectures, including Agile, DevOps, Microservices, and Cloud systems. The problems related to the design, implementation, and execution of test cases for system and acceptance testing depend on the complexity of the overall system, the dependency of the software on the contexts that surround it, the heterogeneous technological nature of modern systems, and so on. These problems become more severe each time a novel technology bursts into the software market and new methodologies and techniques must be introduced, or well-known techniques and tools must be adapted.

Specific problems concern the design and implementation of test cases:

  • How can test cases be designed starting from requirements, models, or other artefacts?
  • How can end-users be effectively involved in the design of acceptance test cases?
  • How can test cases take into account the variability of the context in which the system runs?
  • How can test cases take into account the many different usage scenarios of the system?

Other concerns affect the dynamic execution of the test cases:

  • How can test cases be implemented and executed on the system under test?
  • How can test cases be executed on real or emulated testing environments?
  • How can the testing environment be set up for the execution and monitoring of the test cases?

WRSAT is meant to be cross technological and does not focus on a specific type of software system or a given testing technique. It addresses topics that can be relevant for different stakeholders involved in the considered testing processes, such as researchers, practitioners, programmers, end users, and project managers.

Topics of interest include, but are not limited to, the following:

  • Requirements based testing techniques and tools for system, or acceptance, testing.
  • Data Driven testing techniques and tools for system, or acceptance, testing.
  • Model Driven and Model Based testing techniques and tools for system, or acceptance, testing.
  • Artificial Intelligence based testing techniques for system, or acceptance, testing.
  • Scenario based testing techniques and tools for system, or acceptance, testing.
  • Context driven testing techniques and tools for system, or acceptance, testing.
  • System, or acceptance, testing processes automation.
  • Techniques, tools, and frameworks supporting the design, implementation, and deployment of real or emulated execution environments for system, or acceptance, testing processes.
  • Functional and nonfunctional (performance, security, reliability, usability, portability, etc.) system, or acceptance, testing of modern software systems, such as: context aware software systems, IoT and Industry 4.0 systems, advanced driver-assistance systems and autonomous driving systems, new generations of web and mobile applications, cyber physical software systems, blockchain, microservices based software systems, cloud based software systems, artificial intelligence and big data analytics software systems, etc.
  • Functional and nonfunctional (performance, security, reliability, usability, portability, etc.) system, or acceptance, testing of legacy software systems, operating systems, development frameworks, etc.
  • Functional and nonfunctional (performance, security, reliability, usability, portability, etc.) system, or acceptance, testing of tools used in testing processes, i.e., coverage tools, process management tools, dynamic testing tools and frameworks, etc.
  • Empirical studies or practical experiences showing preliminary results about the execution of industrial case studies or controlled experiments in the field of system, or acceptance, testing.
  • Empirical studies or practical experiences showing preliminary results about novel system, or acceptance, testing approaches in Agile software processes.

How to submit

Three types of papers can be submitted: 

  • Full research contributions: 8 pages in the two-column IEEE conference publication format.
  • Short papers describing important directions for our community: 4 pages in the two-column IEEE conference publication format.
  • Tool demo papers: 2 pages in the two-column IEEE conference publication format.

Papers must be submitted through EasyChair. The submissions should adhere to the IEEE template for conference proceedings.

Review Process

Each paper will receive at least two reviews from members of the program committee, assigned through a standard bidding process. The reviews will be followed by an online discussion, after which the organizers will make the final decisions on paper acceptance based on the referee reviews and the conclusions of the discussion.

Submissions will be reviewed on the basis of:

  • Their relevance to the topics of system and acceptance testing.
  • The novelty of the proposed technique or tool.
  • The maturity of the proposed technique or tool.
  • The presence of experiments and lessons learned.
  • The quality of the presentation.

Accepted papers will be published in the IEEE ICST 2020 Workshop Proceedings.

Workshop organizers

Domenico Amalfitano - University of Naples Federico II, Italy

Anna Rita Fasolino - University of Naples Federico II, Italy

Program Committee 

(being updated as invitations have just been sent)

  • Nataniel Borges Jr. - Saarland University, Germany
  • Santiago Matalonga - University of the West of Scotland, UK
  • Bao Nguyen - Google, USA
  • Maurizio Leotta - Università di Genova, Italy
  • Vânia Neves - UFF, Brazil
  • Shingo Takada - Keio University, Japan
  • Pekka Aho - Open Universiteit, the Netherlands
  • Alessio Gambi - Passau University, Germany
  • Henrique Madeira - University of Coimbra, Portugal
  • Takashi Ishio - Nara Institute of Science and Technology, Japan
  • Xiao Qu - ABB Corporate Research, USA
  • Ana Paiva - University of Porto, Portugal
  • Vinicius Durelli - Federal University of São João del Rei, Brazil
  • Adnan Causevic - Mälardalen University, Sweden
  • Xia Li - The University of Texas at Dallas, USA
  • Guowei Yang - Texas State University, USA
  • Pasqualina Potena - RISE Research Institutes of Sweden AB, Sweden

Important Dates

Abstract Submission: December 28th, extended to January 14th, 2020 (not mandatory)

Paper Submission: January 7th, extended to January 19th, 2020 (strict deadline)

Notification of Acceptance: January 27th

Camera Ready: February 4th

Workshop: March 27th