1.2. Objective reduction
Typically, an increase in the number of objectives makes analysis more difficult for both search and decision-making. Some researchers have suggested reducing objective redundancy in order to lower the dimensionality of the problem to two or fewer objectives. In 2005, the basic concept of dimensionality reduction (Deb & Saxena, 2006; Saxena & Deb, 2008) was introduced, based on Principal Component Analysis (PCA). Saxena and Deb (2008) later proposed a new reduction method based on a constrained optimisation algorithm. In 2006, Zitzler et al. raised a number of key analytical issues: whether each objective is necessary, whether increasing the number of objectives increases the difficulty of the algorithm, and so on. In response, minimum objective subset (MOSS) algorithms for removing redundant objectives were introduced (Brockhoff & Zitzler, 2006a, 2006b, 2007; Kalyanmoy, Thiele, Laumanns, & Zitzler, 2001; Saxena & Deb, 2007). These algorithms can be divided into two types: those that compute a minimum objective subset subject to a given error δ (δ-MOSS) and those that compute an objective subset of size k with minimum error (k-EMOSS). In 2008, Jaimes, Coello, and Chakraborty (2008) suggested another reduction algorithm based on feature selection techniques (FST). This algorithm identifies redundant objectives through correlation matrix analysis. Similar to Zitzler's algorithm, this objective reduction algorithm is divided into two categories; it has a clear advantage in terms of time complexity.
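To make the correlation-based idea concrete, the following is a minimal sketch (not the implementation of Jaimes et al.) of dropping near-redundant objectives via a correlation matrix; the function name, the sampled objective matrix `F`, and the cutoff `threshold` are illustrative assumptions.

```python
import numpy as np

def reduce_by_correlation(F, threshold=0.95):
    """Greedily drop one objective from each highly correlated pair.

    F: (n_solutions, n_objectives) array of objective values sampled
    from a non-dominated set. `threshold` is a hypothetical cutoff.
    Returns the indices of the objectives that are kept.
    """
    kept = list(range(F.shape[1]))
    corr = np.corrcoef(F, rowvar=False)  # pairwise correlation of objectives
    changed = True
    while changed:
        changed = False
        for i in kept:
            for j in kept:
                if i < j and corr[i, j] > threshold:
                    kept.remove(j)       # j is (nearly) redundant given i
                    changed = True
                    break
            if changed:
                break
    return kept

# Example: objective 2 is a near-copy of objective 0, so it is dropped.
rng = np.random.default_rng(0)
F = rng.random((100, 3))
F[:, 2] = F[:, 0] + 0.01 * rng.random(100)
print(reduce_by_correlation(F))          # [0, 1]
```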
Algorithms for objective reduction continue to attract attention, in part because such reduction is not an exhaustive process and may involve user interaction. Although objective functions can be classified as redundant or non-redundant, this classification is often difficult in practice, since users may provide incorrect or incomplete information about those functions. Even when all of the removed objectives are redundant, it is difficult to estimate the extent of their influence on the reduced problem. Also of interest to researchers is the performance of the various objective reduction algorithms, which has proven difficult to measure (Adra & Fleming, 2011; Bader & Zitzler, 2011; Cheng, Yang, & Ping, 2012; Hadka & Reed, 2012; Jaimes, Coello, & Barrientos, 2006; Justesen & Ursem, 2010; Saxena, Duro, Tiwari, Deb, & Zhang, 2013; Wang, Purshouse, & Fleming, 2013).
Several methods exist for evaluating EMO algorithms in many-objective optimisation, such as those of Jaimes and Coello Coello (2009) and Li et al. (2014). Jaimes and Coello base their comparison on the Tchebycheff distance of the approximation set to the 'knee' of the Pareto front. Additionally, the convergence induced by the preference relations is studied by analysing the generational distance observed at each generation of the search. Li et al. propose a diversity comparison indicator (DCI) to assess the diversity of Pareto front approximations in many-objective optimisation. DCI evaluates the relative quality of different Pareto front approximations rather than providing an absolute measure of distribution for a single approximation. However, each of these methods evaluates only a single aspect of performance.
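As an illustration of the Tchebycheff-distance comparison described above, the sketch below computes the weighted Tchebycheff distance from each member of an approximation set to a reference point; the uniform weights, the example sets, and the assumed knee point are illustrative, not values from the cited study.

```python
import numpy as np

def tchebycheff_distance(approximation, reference, weights=None):
    """Weighted Tchebycheff distance from each point of an approximation
    set to a reference point (e.g. the knee of the Pareto front).
    Uniform weights are assumed if none are given."""
    A = np.asarray(approximation, dtype=float)
    r = np.asarray(reference, dtype=float)
    w = np.ones(r.shape) if weights is None else np.asarray(weights)
    return np.max(w * np.abs(A - r), axis=1)

# Compare two hypothetical approximation sets by their best (smallest)
# distance to an assumed knee point of a bi-objective front.
knee = np.array([0.5, 0.5])
set_a = np.array([[0.4, 0.7], [0.6, 0.5]])
set_b = np.array([[0.2, 0.9], [0.9, 0.2]])
print(tchebycheff_distance(set_a, knee).min())   # 0.1 -- closer to the knee
print(tchebycheff_distance(set_b, knee).min())   # 0.4
```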
Indeed, a crucial issue for objective reduction algorithms is the lack of an evaluation methodology. Although some researchers (Jaimes et al., 2008) have used the inverted generational distance as an evaluation metric, this method does not accurately reflect algorithm performance. In this paper, we propose a new method for reducing objectives that specifically improves algorithm performance. The feasibility and accuracy of this method are demonstrated through experimental results.
2. Related definitions
Definition 1: Multi-objective optimisation problem (MOP) (Justesen & Ursem, 2010).
Formally, a multi-objective optimisation problem (MOP) is defined as:
$$\min_{x \in S} \left\{ f_1(x), f_2(x), \cdots, f_M(x) \right\} \tag{1}$$
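To ground Definition 1, here is a minimal sketch of evaluating an MOP of the form (1), where $S$ is the feasible region and $M$ the number of objectives, together with the standard Pareto dominance check used to compare objective vectors; the two toy objectives are an assumption for illustration, not a problem from the paper.

```python
import numpy as np

def evaluate(x):
    """A toy instance of Eq. (1) with M = 2 objectives (an assumption
    for illustration; any vector-valued f works the same way)."""
    f1 = x[0] ** 2
    f2 = (x[0] - 2.0) ** 2
    return np.array([f1, f2])

def dominates(fa, fb):
    """True if objective vector fa Pareto-dominates fb under
    minimisation: no worse in every objective, better in at least one."""
    return bool(np.all(fa <= fb) and np.any(fa < fb))

fa, fb = evaluate(np.array([1.0])), evaluate(np.array([3.0]))
print(fa, fb, dominates(fa, fb))   # [1. 1.] [9. 1.] True
```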