An improved harmony search algorithm with dynamic control parameters for continuous optimization problems
Biao Zhang^1, Huihui Yan^2, Junhua Duan^3, J. J. Liang^4
1. College of Computer Science, Liaocheng University, Liaocheng, 252059
E-mail: zhangbiao1218@gmail.com
2. College of Computer Science, Liaocheng University, Liaocheng, 252059
E-mail: yanhui525@gmail.com
3. College of Computer Science, Liaocheng University, Liaocheng, 252059
E-mail: duanjunhua@lcu.edu.cn
4. School of Electrical Engineering, Zhengzhou University, Zhengzhou, 450001
E-mail: liangjing@zzu.edu.cn
Abstract: This paper presents an improved harmony search algorithm for solving continuous optimization problems. In the proposed algorithm, an elimination principle is developed for choosing from the harmony memory, so that harmonies with better fitness have more opportunities to be selected when generating new harmonies. Two key control parameters, the pitch adjustment rate (PAR) and the bandwidth distance (bw), are dynamically adjusted to favor exploration in the early stages and exploitation in the final stages of the search process, according to the different search spaces of the optimization problems. Numerical results on 12 benchmark problems show that the proposed algorithm is more effective than existing HS variants in finding better solutions.
Key Words: Harmony search, Continuous optimization, Meta-heuristics, Dynamic parameter, Evolutionary algorithms
1 INTRODUCTION
The harmony search (HS) algorithm, developed by Geem et al. [1] in 2001, is a relatively new population-based meta-heuristic optimization algorithm. It imitates the music improvisation process, in which musicians adjust the pitches of their instruments in search of a perfect state of harmony. In the HS algorithm, a solution vector is analogous to a harmony in music.
The HS algorithm is characterized by few mathematical requirements, easy implementation, and fast convergence, so it has attracted many researchers from various fields, especially those working on optimization problems [2]. Although the HS algorithm is good at identifying promising regions of the search space within a reasonable time, it is less efficient at performing local search in numerical optimization applications [6]. In addition, like other meta-heuristics, its performance is sensitive to parameter settings [12]. Thus, several modified variants have recently been developed to enhance its solution accuracy. For example, Mahdavi et al. [3] presented an improved HS algorithm (IHS) that introduces a strategy to dynamically adjust the control parameters. Later, Omran and Mahdavi [4] proposed a global best HS algorithm (GHS), which takes advantage of the global best solution to enhance the performance of the classical HS algorithm. Cheng et al. [6] developed another improved HS algorithm, called the MHS algorithm, based on the idea that better harmonies have a higher selection probability and that a number of new harmonies are generated in each iteration. Their experiments all demonstrated that these algorithms perform better than the classical HS.
This research is partially supported by the National Science Foundation of China under Grant 61174187.
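The exact selection procedures of MHS and of the elimination principle proposed in this paper are given later; as a generic illustration of the underlying idea of fitness-biased selection, a binary tournament over the harmony memory can be sketched as follows. The tournament form and the value k = 2 are illustrative assumptions, not the procedure from [6].

```python
import random

def tournament_pick(harmonies, fitnesses, rng, k=2):
    """Pick one harmony from the memory; the contender with the lower
    fitness (minimization) wins, so better harmonies are chosen more often."""
    contenders = rng.sample(range(len(harmonies)), k)
    best = min(contenders, key=fitnesses.__getitem__)
    return harmonies[best]
```

With this rule the worst harmony in the memory can never win a tournament against a distinct contender, while the best one wins every tournament it enters.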
In this article, a new harmony search algorithm (IDHS) for solving continuous optimization problems is presented. The proposed method differs from the classical HS in two aspects. First, inspired by the MHS algorithm [6], the candidate harmony is chosen from the harmony memory by an elimination procedure, so that harmonies with better fitness have more opportunities to be used in generating new harmonies. Second, two key control parameters, the pitch adjustment rate (PAR) and the bandwidth distance (bw), are adjusted dynamically with respect to the evolution of the search process and the different search spaces of the optimization problems. Experimental results and comparisons show that the proposed IDHS algorithm generally outperforms the existing HS, IHS, GHS, NGHS, and SGHS algorithms when applied to 12 benchmark global optimization problems.
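The IDHS update rules for PAR and bw are given later in the paper; as a rough illustration of dynamically adjusted control parameters, the IHS-style schedules of Mahdavi et al. [3] — PAR increasing linearly and bw decreasing exponentially over the run — can be sketched as follows. The default parameter values here are illustrative assumptions, not the authors' settings.

```python
import math

def par_schedule(gn, ni, par_min=0.35, par_max=0.99):
    """Linearly increase PAR from par_min to par_max over ni generations
    (IHS-style schedule), shifting from exploration toward exploitation."""
    return par_min + (par_max - par_min) * gn / ni

def bw_schedule(gn, ni, bw_min=1e-4, bw_max=1.0):
    """Exponentially decrease bw from bw_max to bw_min over ni generations
    (IHS-style schedule), shrinking the pitch-adjustment step size."""
    return bw_max * math.exp(math.log(bw_min / bw_max) * gn / ni)
```

Early in the search a small PAR and large bw encourage wide moves; late in the search a large PAR and small bw refine solutions locally.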
Optimization is the process of selecting the best element from a set of available alternatives, subject to constraints (if any). This is done by minimizing or maximizing the objective (cost) function of the problem. Without loss of generality, a bound-constrained optimization problem can be formulated as an n-dimensional minimization problem as follows:
min f(x)
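To make the bound-constrained setting concrete, the classical HS improvisation loop for such a minimization problem can be sketched as follows. This is the basic HS of [1], not the proposed IDHS; the parameter values are illustrative assumptions, and bw is scaled by the width of each variable's bounds for convenience.

```python
import random

def harmony_search(f, lower, upper, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iters=2000, seed=0):
    """Minimal classical HS for min f(x) with x_i in [lower_i, upper_i]."""
    rng = random.Random(seed)
    n = len(lower)
    # Initialize the harmony memory (HM) with random solutions.
    hm = [[rng.uniform(lower[i], upper[i]) for i in range(n)]
          for _ in range(hms)]
    fit = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for i in range(n):
            if rng.random() < hmcr:              # memory consideration
                xi = hm[rng.randrange(hms)][i]
                if rng.random() < par:           # pitch adjustment
                    xi += bw * (upper[i] - lower[i]) * rng.uniform(-1, 1)
            else:                                # random selection
                xi = rng.uniform(lower[i], upper[i])
            new.append(min(max(xi, lower[i]), upper[i]))  # clip to bounds
        fnew = f(new)
        worst = max(range(hms), key=fit.__getitem__)
        if fnew < fit[worst]:                    # replace the worst harmony
            hm[worst], fit[worst] = new, fnew
    best = min(range(hms), key=fit.__getitem__)
    return hm[best], fit[best]
```

For example, running the sketch on the 2-dimensional sphere function f(x) = x_1^2 + x_2^2 over [-5, 5]^2 drives the best fitness close to zero.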