Portfolio Optimization based on Risk Measures and Ensemble Empirical Mode Decomposition

Chengli Zheng, Yinhong Yao

This work is supported by the Humanities and Social Science Planning Fund of the Ministry of Education (Grant No. 16YJAZH078), the National Natural Science Foundation of China (Grant No. 71171095), and the Self-determined Research Funds of CCNU from the MOE (Grant No. CCNU15A02021).
Chengli Zheng is with the School of Economics and Business Administration, Hua Zhong Normal University, Wuhan, China (e-mail: zhengchengli168@163.com).
Yinhong Yao is with the Institutes of Science and Development, Chinese Academy of Sciences, Beijing, China (phone: +86 15527038609; e-mail: yyh0418@126.com).

Abstract—This study proposes a novel way to improve investors' total return rate in portfolio optimization by de-noising the data using Ensemble Empirical Mode Decomposition (EEMD). First, the authors briefly introduce risk measure theory and the EEMD methodology. Then, it is demonstrated empirically that de-noising with EEMD does have a beneficial effect on the portfolio, and that the cumulative return rate of the portfolio is highest when the objective function is CVaR and the data are de-noised with 3 Intrinsic Mode Functions (IMFs). This indicates that de-noising the data using EEMD has a much more significant impact on the portfolio when the objective function has weaker risk discrimination, and vice versa.
Index Terms—portfolio optimization, risk measures, Ensemble Empirical Mode Decomposition (EEMD), hypothesis test
I. INTRODUCTION
Portfolio optimization is a necessary way for investors to maximize their total return rate. As the most effective approach to quantifying risk, risk measure theory has received widespread attention. The traditional way to improve the cumulative return rate of portfolio optimization is to improve the properties of the risk measure; in this paper, we instead propose a novel methodology from the perspective of de-noising securities' data series using Ensemble Empirical Mode Decomposition (EEMD).
On the traditional side, the quantitative analysis of modern financial portfolio theory dates back to the Portfolio Theory of Markowitz [1]: a portfolio is chosen to minimize risk for a given expected return, or to maximize expected return for a given level of risk. This theory introduced quantitative analysis to the financial field and laid the foundation of modern finance. However, a high standard deviation does not truly mean a high level of risk; non-monotonicity is one of its defects.
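For concreteness, the mean-variance problem behind this idea can be stated in a standard form (the notation here is ours, chosen for illustration, and is not taken from the remainder of this paper):

\[
\min_{w}\; w^{\top}\Sigma w \quad \text{s.t.}\quad w^{\top}\mu \ge r_{0},\quad w^{\top}\mathbf{1} = 1,\quad w \ge 0,
\]

where $w$ is the vector of portfolio weights, $\Sigma$ the covariance matrix of asset returns, $\mu$ the vector of expected returns, and $r_{0}$ the required expected return; the risk term $w^{\top}\Sigma w$ is the portfolio variance, whose square root is the standard deviation criticized above.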
VaR (Value at Risk) was proposed by Philippe Jorion at the end of the 1980s [2] and reflects both the uncertainty and the
loss. However, VaR considers only the quantile of the distribution, without regard to what happens to the left or to the right of that quantile, and it is concerned only with the probability of a loss, not with its size [3].
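In one common convention (quantile and sign conventions differ across the literature, and the paper's own definition may differ in detail), VaR at confidence level $\alpha$ for a loss variable $L$ with distribution function $F_{L}$ is the $\alpha$-quantile of the loss distribution:

\[
\mathrm{VaR}_{\alpha}(L) = \inf\{\, l \in \mathbb{R} : F_{L}(l) \ge \alpha \,\},
\]

so that a loss exceeding $\mathrm{VaR}_{\alpha}(L)$ occurs with probability at most $1-\alpha$; nothing in this definition depends on how severe such an exceedance actually is, which is precisely the criticism above.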
In the late 20th century, Artzner et al. proposed the concept of the coherent risk measure on an axiomatic foundation: a coherent risk measure must satisfy monotonicity, translation invariance, positive homogeneity, and sub-additivity [4]. Fӧllmer and Schied then extended the coherent risk measure to the convex risk measure [5-7], and Frittelli and Gianin [8] defined the convex risk measure through the axioms of monotonicity, translation invariance and convexity.
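Written out for a risk measure $\rho$ acting on payoffs $X, Y$ (the payoff-side convention of Artzner et al.; the signs flip if losses are used instead), the coherence axioms read:

Monotonicity: $X \le Y \Rightarrow \rho(X) \ge \rho(Y)$;
Translation invariance: $\rho(X + c) = \rho(X) - c$ for every cash amount $c$;
Positive homogeneity: $\rho(\lambda X) = \lambda\,\rho(X)$ for every $\lambda \ge 0$;
Sub-additivity: $\rho(X + Y) \le \rho(X) + \rho(Y)$.

A convex risk measure keeps monotonicity and translation invariance but replaces the last two axioms with convexity: $\rho(\lambda X + (1-\lambda)Y) \le \lambda\rho(X) + (1-\lambda)\rho(Y)$ for $\lambda \in [0,1]$.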
CVaR (Conditional VaR) is one of the best choices among coherent risk measures [9]; Kusuoka proved that CVaR is the smallest law-invariant coherent risk measure that dominates VaR [10]. Wang and Ma proved that VaR is consistent with first-order stochastic dominance and that CVaR is consistent with second-order stochastic dominance [11]. However, CVaR takes into consideration only the tail of the distribution.
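To make the contrast concrete, a minimal empirical estimator of both measures from a historical loss sample can be sketched as follows (the function, variable names, and the simulated data are ours, for illustration only, and are not the estimators used later in this paper):

import numpy as np

def var_cvar(losses, alpha=0.95):
    """Historical VaR and CVaR of a loss sample at confidence level alpha.
    VaR is the empirical alpha-quantile of the losses; CVaR averages the
    losses at or beyond that quantile, so it also reflects tail severity."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

# toy usage: losses are the negatives of simulated daily returns
rng = np.random.default_rng(0)
losses = -rng.normal(0.0005, 0.01, size=1000)
print(var_cvar(losses, alpha=0.95))

Two loss samples with the same alpha-quantile but very different tails yield the same VaR and clearly different CVaR values, which is the sense in which CVaR "sees" the tail while VaR does not.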
Subsequently, risk measures that pay more attention to the left tail of the distribution were proposed. Krokhmal proposed HMCR (Higher Moment Coherent Risk measure) based on CVaR [12], and Chen and Wang gave proofs and derivations of the properties of the p-norm measure (i.e., HMCR) [13]. Zheng and Yao proved that HMCR with p = n is consistent with (n+1)-th order stochastic dominance from the perspective of the Kusuoka representation [14]. Zheng and Chen proposed the iso-entropic risk measure based on relative entropy, obtained within the theoretical framework of coherent risk measures, and proved that it is consistent with stochastic dominance of almost all orders and has the highest power of risk discrimination compared with VaR and CVaR [15-16].
However, noise can distort the real information in the data, which reduces the efficiency of portfolio optimization. From the de-noising perspective, Huang et al. introduced the EMD (Empirical Mode Decomposition) method, an empirical, intuitive, direct and self-adaptive data-processing method proposed especially for nonlinear and non-stationary data [17]. The core of EMD is to decompose the target data into a small number of independent and nearly periodic Intrinsic Mode Functions (IMFs) and one residue. EEMD (Ensemble EMD) is an improved version proposed by Wu and Huang, which adds a series of finite (not infinitesimal) amplitude white noise realizations to overcome the mode-mixing problem of EMD; it is a truly noise-assisted data analysis method [18].
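A minimal sketch of how EEMD-based de-noising can be carried out in practice is given below, assuming the open-source PyEMD package (pip package EMD-signal); the function name, parameter choices, and the rule of dropping the first few high-frequency IMFs are our own illustrative assumptions rather than the exact procedure of this paper:

import numpy as np
from PyEMD import EEMD  # assumed third-party package providing EEMD decomposition

def eemd_denoise(series, n_drop=3, trials=100):
    """De-noise a price or return series with EEMD: decompose it into IMFs
    (ordered from high to low frequency, with the residual trend last),
    drop the first n_drop high-frequency IMFs, and rebuild the series
    from the remaining components."""
    eemd = EEMD(trials=trials)              # number of noise-added ensemble trials
    imfs = eemd.eemd(np.asarray(series, dtype=float))
    return imfs[n_drop:].sum(axis=0)        # reconstruction without the noisiest IMFs

Dropping three IMFs here corresponds to the "3 IMFs" setting mentioned in the abstract.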
EEMD has been applied in many areas, such as biomedical engineering, structural health monitoring, earthquake engineering, etc. In the social sciences, Zhang et