Physica A 406 (2014) 163–168
On the axiomatic requirement of range to measure uncertainty
Xinyang Deng a, Yong Deng a,b,∗
a School of Computer and Information Science, Southwest University, Chongqing, 400715, China
b School of Engineering, Vanderbilt University, Nashville, TN, 37235, USA
Highlights
• This manuscript studies the uncertainty measure in Dempster–Shafer theory.
• The irrationality of the axiomatic requirement of range has been pointed out.
• The correct range of uncertainty is [0, log₂ 2^|X|] rather than [0, log₂ |X|].
Article info
Article history:
Received 30 December 2013
Received in revised form 18 March 2014
Available online 22 March 2014
Keywords:
Uncertainty measure
Entropy
Dempster–Shafer theory
Belief function
Abstract
How to measure uncertainty is still an open issue. Probability theory is a primary tool for expressing aleatoric uncertainty, and Shannon's information entropy is an effective measure of the uncertainty in probability theory. Dempster–Shafer theory, an extension of probability theory, can express aleatoric and epistemic uncertainty simultaneously. For such uncertainties in Dempster–Shafer theory, previous studies require a justifiable uncertainty measure to satisfy five axiomatic requirements. In this paper, we show that one of these axiomatic requirements, the requirement of range, is questionable. According to the concept of entropy, the correct range of uncertainty should be [0, log₂ 2^|X|] rather than [0, log₂ |X|].
© 2014 Elsevier B.V. All rights reserved.
1. Introduction
Uncertainty is ubiquitous in nature, and how to measure it has attracted much attention [1–3]. Various categorizations exist to accommodate different kinds of uncertainty. An existing classification scheme [4] divides uncertainties into five types: (i) aleatoric uncertainty, which mainly comes from random or stochastic processes; (ii) epistemic uncertainty, which is due to a lack of knowledge; (iii) irreducible uncertainty, a natural variability that cannot be reduced but only quantified; (iv) reducible uncertainty, which arises from a lack of specific information or knowledge and can be reduced by acquiring more information; (v) inference uncertainty. Numerous uncertainty theories have been developed, such as probability theory [5], fuzzy set theory [6], possibility theory [7], Dempster–Shafer theory [8,9], and random intervals [10].
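The two bounds contrasted in the abstract can be made concrete numerically: a uniform distribution over n alternatives maximizes Shannon entropy at log₂ n, so counting the |X| singletons available to a probability distribution versus the 2^|X| subsets available to a mass function in Dempster–Shafer theory yields log₂ |X| and log₂ 2^|X| = |X|, respectively. The following minimal sketch illustrates this (the function name is ours, not from the paper, and for simplicity the count includes all 2^|X| subsets as in the paper's stated bound):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

X = ['a', 'b', 'c']           # a frame of discernment with |X| = 3
n_singletons = len(X)         # probability theory: |X| outcomes
n_subsets = 2 ** len(X)       # Dempster-Shafer theory: 2^|X| subsets of X

# Uniform distributions attain the maximum entropy log2(n).
h_prob = shannon_entropy([1 / n_singletons] * n_singletons)
h_ds = shannon_entropy([1 / n_subsets] * n_subsets)

print(h_prob)   # log2 |X| = log2 3, approximately 1.585
print(h_ds)     # log2 2^|X| = |X| = 3.0
```

The point of the comparison is that the carrier of a mass function is exponentially larger than that of a probability distribution, which is exactly why the paper argues the upper bound of the range should be log₂ 2^|X| rather than log₂ |X|.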
Since it was first proposed by Clausius in 1865 for thermodynamics [11], the concept of entropy has emerged in a large number of fields and has become a measure of disorder and uncertainty [12–21]. Information entropy [22], derived from the Boltzmann–Gibbs (BG) entropy [23] in thermodynamics and statistical mechanics, has been an indicator to measure
∗ Corresponding author at: School of Computer and Information Science, Southwest University, Chongqing, 400715, China.
E-mail addresses: ydeng@swu.edu.cn, prof.deng@hotmail.com (Y. Deng).
http://dx.doi.org/10.1016/j.physa.2014.03.060
0378-4371/© 2014 Elsevier B.V. All rights reserved.