Towards Feature-based ML-enabled Behaviour Location

Research output: Contribution in Book/Catalog/Report/Conference proceeding › Conference contribution


Abstract

Mapping behaviours to the features they relate to is a prerequisite for variability-intensive systems (VIS) reverse engineering. Manually providing this mapping in full is labour-intensive. In black-box scenarios, only execution traces are available (e.g., process mining). In our previous work, we successfully experimented with variant-based mapping using supervised machine learning (ML) to identify the variants responsible for producing a given execution trace, and demonstrated that recurrent neural networks (RNNs) work well (above 80% accuracy) when trained on datasets in which execution traces are labelled with variants. However, this mapping (i) may not scale to large VIS because of the combinatorial explosion of variants and (ii) makes the internal ML representation hard to understand. In this short paper, we discuss the design of a novel approach: feature-based mapping learning.
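
The following is a minimal sketch of the contrast the abstract draws, assuming a PyTorch LSTM over event sequences: variant-based mapping treats behaviour location as multi-class classification with one class per variant, while feature-based mapping predicts each feature independently (multi-label), sidestepping the combinatorial explosion of variants. The event vocabulary size, feature count, architecture, and all identifiers are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: feature-based mapping learning as multi-label
# classification of execution traces. All sizes and names are assumed
# for illustration; the paper does not prescribe this architecture.
import torch
import torch.nn as nn

VOCAB_SIZE = 50    # number of distinct trace events (assumed)
NUM_FEATURES = 8   # number of features in the VIS (assumed)


class FeatureMapper(nn.Module):
    """Maps an execution trace (sequence of event ids) to feature presence."""

    def __init__(self, vocab_size, num_features, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # One logit per feature -> multi-label output. A variant-based
        # mapper would instead need one class per variant, which grows
        # combinatorially with the number of features.
        self.head = nn.Linear(hidden_dim, num_features)

    def forward(self, traces):           # traces: (batch, trace_len) int ids
        embedded = self.embed(traces)
        _, (hidden, _) = self.rnn(embedded)
        return self.head(hidden[-1])     # (batch, num_features) logits


model = FeatureMapper(VOCAB_SIZE, NUM_FEATURES)
loss_fn = nn.BCEWithLogitsLoss()         # independent per-feature loss

# Toy batch: 4 traces of 10 events each, labelled with per-feature 0/1 flags.
traces = torch.randint(0, VOCAB_SIZE, (4, 10))
feature_labels = torch.randint(0, 2, (4, NUM_FEATURES)).float()

logits = model(traces)
loss = loss_fn(logits, feature_labels)
loss.backward()

# Each trace yields a predicted feature set rather than a single variant id.
predicted_features = torch.sigmoid(logits) > 0.5
```

The design point of the sketch is the output head: sizing it to the number of features rather than the number of variants is what keeps the label space linear, and it also makes each output unit interpretable as one feature, addressing the opacity issue the abstract raises.
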
Original language: English
Title of host publication: Proceedings of the 18th International Working Conference on Variability Modelling of Software-Intensive Systems (VaMoS 2024)
Place of Publication: Bern, Switzerland
Publisher: ACM Press
Number of pages: 3
DOIs
Publication status: Published - 7 Feb 2024
Event: 18th International Working Conference on Variability Modelling of Software-Intensive Systems (VaMoS 2024) - Bern, Switzerland
Duration: 7 Feb 2024 - 9 Feb 2024
Conference number: 18
https://vamos2024.inf.unibe.ch

Publication series

Name: Proceedings of the 18th International Working Conference on Variability Modelling of Software-Intensive Systems

Conference

Conference: 18th International Working Conference on Variability Modelling of Software-Intensive Systems (VaMoS 2024)
Country/Territory: Switzerland
City: Bern
Period: 7/02/24 - 9/02/24
Internet address: https://vamos2024.inf.unibe.ch
