Optimized Feature Selection towards Functional
and Non-functional Requirements
in Software Product Lines
Xiaoli Lian
School of Computer Science and Engineering
Beihang University
Beijing, China
e-mail: lianxiaoli@buaa.edu.cn
Li Zhang
School of Computer Science and Engineering
Beihang University
Beijing, China
e-mail: lily@buaa.edu.cn
Abstract—Feature selection is an extensively studied problem in software product lines. Besides the basic functional requirements (FRs), non-functional requirements (NFRs) are also critical during feature selection. Some NFRs carry numerical constraints, while others do not; without clear criteria, the latter are always expected to be optimized as far as possible. However, most existing selection methods ignore the combination of constrained and unconstrained NFRs with FRs. Meanwhile, the complex constraints and dependencies among features remain a perpetual challenge for feature selection. To this end, this paper proposes a multi-objective optimization algorithm, IVEA, which optimizes the selection of features for both NFRs and FRs while respecting the relations among features. In particular, we first propose a two-dimensional fitness function: one dimension optimizes the NFRs without quantitative constraints, while the other ensures that the selected features satisfy the FRs and conform to the relations among features. Second, we propose a violation-dominance principle, which guides the optimization under the FRs and the relations among features. We conducted comprehensive experiments on two feature models of different sizes to compare IVEA with state-of-the-art multi-objective optimization algorithms, including IBEA_HD, IBEA_ε+, NSGA-II and SPEA2. The results show that IVEA significantly outperforms these baselines on NFR optimization. Meanwhile, our algorithm needs less time to generate a solution that meets the FRs and the constraints on NFRs while fully conforming to the feature model.
Keywords—Software Product Line; Feature Models; Feature Selection; Multi-objective Optimization; Non-functional Requirements Optimization.
I. INTRODUCTION
Product Line Engineering (PLE) has received significant attention, and organizations such as Boeing, Nokia, and the Bosch Group have claimed that it is a way to produce software better, faster, and cheaper. As an important part of Feature-Oriented Domain Analysis (FODA) [1], the feature model is widely used to express the commonalities and variability of all products of a product line in terms of features and their relations. A “feature” is defined as a “prominent or distinctive user-visible aspect, quality, or characteristic of a software system or systems” [1]. Feature selection is an essential step in deriving an individual product that realizes specific functional requirements (FRs) while also satisfying certain non-functional requirements (NFRs). Here, we use NFRs in the narrow sense of [26]: the quality attributes that the software product must have.
Unfortunately, selecting an appropriate set of features for a product that meets all requirements is difficult.
• The first barrier is the complex dependency and constraint relations among features. Take the example of customizing a phone: once the cheap basic screen has been selected, it is impossible to add the GPS function, which requires the color screen. So if a phone with GPS is demanded, the type of screen must be reselected. This kind of iteration makes feature selection time-consuming, even for a small feature model.
• Another major challenge is meeting the NFRs. Some NFRs have numerical constraints. For example, cost < $500 may be required when configuring a mobile phone, and such a constraint can be used to filter selections definitively. Other NFRs lack clear quantitative criteria for judging their satisfaction, and thus always need to be optimized as far as possible. Moreover, most NFRs compete with each other and have to be traded off during feature selection.
• The FRs have to be satisfied too. Real feature models often contain massive numbers of features; for instance, the Linux kernel has 6,320 features [15]. Stakeholders usually have explicit feature demands corresponding to their FRs, and it is tedious for them to say yes or no to each feature, especially to the great number of unrelated ones. Therefore, it is important to support the selection of a partial set of features.
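The requires/excludes iteration described in the first bullet can be sketched as a simple violation check on a candidate configuration. The constraint tables and feature names below are hypothetical, chosen only to mirror the phone example; this is an illustrative model, not the IVEA algorithm itself:

```python
# Hypothetical cross-tree constraints for the phone example:
# GPS requires the color screen; the basic and color screens exclude each other.
REQUIRES = {"gps": {"color_screen"}}           # feature -> features it requires
EXCLUDES = {"basic_screen": {"color_screen"}}  # feature -> features it conflicts with


def violations(selection):
    """Count how many requires/excludes constraints a selection violates."""
    count = 0
    for feat in selection:
        # every required feature must also be selected
        count += sum(1 for req in REQUIRES.get(feat, ()) if req not in selection)
        # no conflicting feature may be co-selected
        count += sum(1 for exc in EXCLUDES.get(feat, ()) if exc in selection)
    return count


print(violations({"basic_screen", "gps"}))  # 1: GPS lacks the color screen
print(violations({"color_screen", "gps"}))  # 0: valid after reselecting the screen
```

A selection with a nonzero count forces exactly the kind of reselection loop described above; a violation count like this is also the sort of quantity a search-based method can minimize rather than re-checking configurations by hand.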
Various methods have been proposed to automate feature selection. Some approaches cast feature selection with quantitatively constrained NFRs as a Constraint Satisfaction Problem (CSP) [4], [9], [14]. The main problem with these techniques is their high computational cost. To handle multiple NFRs, [8], [9], [13] aggregated them into a single comprehensive optimization objective. However, it is difficult to find an appropriate coefficient for every NFR. Sayyad et al. [10] were the first to attempt to optimize multiple quantitatively constrained NFRs with Multi-objective