Using Binocular Feature Combination for Blind
Quality Assessment of Stereoscopic Images
Feng Shao, Kemeng Li, Weisi Lin, Senior Member, IEEE, Gangyi Jiang, Member, IEEE, and Mei Yu
Abstract—The quality assessment of 3D images is more challenging than that of their 2D counterparts, and little investigation has been dedicated to blind quality assessment of stereoscopic images. In this letter, we propose a novel blind quality assessment method for stereoscopic images based on binocular feature combination. The prominent contribution of this work is that we simplify the process of binocular quality prediction into monocular feature encoding and binocular feature combination. Experimental results on two publicly available 3D image quality assessment databases demonstrate the promising performance of the proposed method.
Index Terms—Binocular feature combination, blind image
quality assessment, stereoscopic image, support vector regression.
I. INTRODUCTION
IMAGE quality assessment (IQA) is a fundamental and challenging research topic in image processing [1], [2]. In the three-dimensional (3D) processing chain, the demand for 3D IQA (3D-IQA) is particularly urgent because quality degradation of 3D images has a stronger effect on the human visual system (HVS). Taking a stereoscopic image as an example, 3D-IQA generally performs (subsets of) the following elements, sketched in code after the list:
1) Input: left and right images.
2) Process: stereoscopic/binocular features modeling.
3) Output: stereoscopic image quality pooling.
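As a rough illustration only, this pipeline can be sketched as below; `assess_stereo_pair`, `extract_binocular_features`, and `pool_quality` are hypothetical placeholder names, not components of any published metric:

```python
import numpy as np

def assess_stereo_pair(left, right, extract_binocular_features, pool_quality):
    """Generic 3D-IQA skeleton: binocular feature modeling, then quality pooling.

    The two callables are hypothetical stand-ins for the metric-specific
    stages listed above; `left` and `right` are the two views as arrays.
    """
    left = np.asarray(left, dtype=np.float64)    # 1) input: left view
    right = np.asarray(right, dtype=np.float64)  #    input: right view
    features = extract_binocular_features(left, right)  # 2) feature modeling
    return pool_quality(features)                       # 3) quality pooling
```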
The main goal of the 3D-IQA is to characterize the binocular
features so that they can reflect the actual perceived quality and
the assessment outcome is acted as feedback to optimize the
3D image production. This process may be known or absolutely
unknown (regarded as a black box) to the predictor. Therefore, it
is important to understand how and to what extent the perceptual
factors will affect the 3D quality. In other words, the existing
3D-IQA algorithms may benefit from the deep understanding
of the perceptual properties.
Manuscript received January 19, 2015; revised March 14, 2015; accepted
March 15, 2015. Date of publication March 18, 2015; date of current version
March 23, 2015. This work was supported in part by the Natural Science
Foundation of China under Grants 61271021, 61271270, and U130125, and
by the K. C. Wong Magna Fund at Ningbo University. The associate editor coordinating the review of this manuscript and approving it for publication was Prof. Oscar Au.
F. Shao, K. Li, G. Jiang, and M. Yu are with the Faculty of Information Science and Engineering, Ningbo University, Ningbo 315211, China (e-mail: shaofeng@nbu.edu.cn; jianggangyi@nbu.edu.cn; yumei@nbu.edu.cn).
W. Lin is with the Centre for Multimedia and Network Technology, School of
Computer Engineering, Nanyang Technological University, Singapore 639798
(e-mail: wslin@ntu.edu.sg).
Color versions of one or more of the figures in this paper are available online
at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/LSP.2015.2413946
Existing 2D-IQA metrics can be directly applied to the left and right images to predict the quality of a stereoscopic image, but doing so ignores the relationship between the two views.
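A minimal sketch of this per-view baseline, using PSNR as a stand-in 2D metric and plain averaging as the pooling (an assumption for illustration; any 2D-IQA metric could be substituted):

```python
import numpy as np

def psnr(ref, dist, peak=255.0):
    """Standard full-reference PSNR, used here as the stand-in 2D metric."""
    mse = np.mean((ref.astype(np.float64) - dist.astype(np.float64)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def naive_stereo_score(ref_left, dist_left, ref_right, dist_right):
    """Score each view independently and average; any interaction between
    the two views is ignored entirely, which is the weakness noted above."""
    return 0.5 * (psnr(ref_left, dist_left) + psnr(ref_right, dist_right))
```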
Some existing full-reference (FR) 3D-IQA metrics focus on modeling 3D perceptual properties of the HVS. Bensalma et al. [3] devised a Binocular Energy Quality Metric (BEQM) by modeling the complex cells responsible for the construction of binocular energy. Chen et al. [4] constructed a "cyclopean" view from the stereoscopic image pair by modeling the influence of binocular rivalry. Lin et al. [5] incorporated binocular integration behaviors into existing 2D models to enhance their ability to evaluate 3D images. Our previous work incorporated binocular perception and binocular combination properties into an FR 3D-IQA framework [6]. Only a few works are dedicated to no-reference (NR) 3D-IQA. Ryu et al. [7] computed perceptual blurriness and blockiness scores of the left and right images independently, and combined them into an overall quality index by modeling binocular quality perception in the context of blurriness and blockiness. Chen et al. [8] extracted both 2D and 3D features via natural scene statistics from stereoscopic images and the disparity map, and adopted support vector regression (SVR) to learn a regression function that predicts the quality of a test stereoscopic image pair. Gu et al. [9] proposed a blind stereoscopic IQA metric built on a nonlinear additive model, an ocular dominance model, and saliency-based parallax compensation.
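The regression stage of such learning-based metrics can be sketched with scikit-learn's SVR; the feature dimensionality, hyperparameters, and the random data below are illustrative assumptions, not the settings of [8]:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Stand-in training data: one row of quality-aware features per image pair
# (in practice, NSS-based 2D/3D features) and the matching subjective scores
# (e.g., DMOS) from an annotated 3D-IQA database.
X_train = rng.normal(size=(200, 36))
y_train = rng.uniform(0.0, 100.0, size=200)

# An RBF-kernel SVR is the common choice in NSS-based BIQA; C and gamma are
# normally tuned by cross-validation rather than fixed as here.
model = SVR(kernel="rbf", C=100.0, gamma=0.01).fit(X_train, y_train)

# Predict the quality of an unseen stereoscopic pair from its features.
X_test = rng.normal(size=(1, 36))
predicted_quality = model.predict(X_test)[0]
```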
The main challenges for machine learning (ML)-based 3D-IQA metrics are feature description and modeling, i.e., capturing binocular vision and depth perception well enough to improve prediction performance. Existing blind image quality assessment (BIQA) methods, such as the Distortion Identification-based Image Verity and INtegrity Evaluation (DIIVINE) [10], the Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE) [11], and the BLind Image Integrity Notator using DCT Statistics-II (BLIINDS-II) [12], extract features based on natural scene statistics (NSS), or learn a set of centroids as a codebook to compute the quality levels of image patches by quality-aware clustering (QAC) [13]. However, these methods may not be effective for evaluating the perceived quality of stereoscopic images, because the features relevant to binocular vision are fundamentally different from those of 2D images.
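As a concrete point of reference, the spatial NSS features of BRISQUE [11] start from mean-subtracted contrast-normalized (MSCN) coefficients; a minimal sketch follows, with the Gaussian window width and stabilizing constant chosen as is common in public implementations (an assumption, not a prescription):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mscn_coefficients(image, sigma=7.0 / 6.0, c=1.0):
    """Mean-subtracted contrast-normalized (MSCN) coefficients as in BRISQUE.

    The local mean and standard deviation are estimated with a Gaussian
    window; `c` keeps the division stable in flat regions.
    """
    img = np.asarray(image, dtype=np.float64)
    mu = gaussian_filter(img, sigma)
    var = gaussian_filter(img * img, sigma) - mu * mu
    sigma_local = np.sqrt(np.maximum(var, 0.0))  # clamp tiny negatives from rounding
    return (img - mu) / (sigma_local + c)
```

BRISQUE then fits parametric distributions to these coefficients and to products of neighboring coefficients to obtain its feature vector.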
Currently, some perceptual models, such as the gain-control model [14] and the binocular energy model [15], serve as standard binocular combination models for stereopsis.
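For illustration, a widely used energy-weighted form of such a combination (the rivalry-weighted cyclopean synthesis of [4]; the notation here is ours) is

```latex
\mathrm{CI}(x,y) = W_L(x,y)\, I_L(x,y) + W_R(x,y)\, I_R(x+d,y),
\qquad
W_L = \frac{E_L}{E_L + E_R}, \quad W_R = \frac{E_R}{E_L + E_R},
```

where $I_L$ and $I_R$ are the two views, $d$ is the horizontal disparity, and $E_L$, $E_R$ are local contrast energies (e.g., magnitudes of Gabor filter responses), so the locally dominant eye receives the larger weight.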
In this study, we aim to develop a blind quality assessment method for stereoscopic images based on binocular feature combination. It is accepted that binocular rivalry is a fundamental phenomenon occurring in distorted stereoscopic images, but modeling it directly may not be accurate enough for assessing the perceptual quality of 3D images.