and TIRS, and they provide images in nine and two frequency bands, respectively. The OLI sensor provides multispectral images which contain almost all of the ETM+ bands, but with improved SNR and spatial resolution. This sensor takes images in the visible and IR ranges, whereas the TIRS sensor, which is considered a thermal sensor, takes images in only two IR bands. Some of the OLI sensor's bands are presented in Table 1. Landsat-8 collects spatial data with medium resolution (30 meters). The TIRS thermal sensor has a 100-meter resolution in both of its frequency bands, which is considered a weak accuracy. Landsat-8 images are freely available and their format is Geo-TIFF, a geo-referenced file format for lossless storage; therefore, they do not require geometric corrections. The combination of the 2nd, 3rd and 4th bands makes a visible image with 30-meter resolution, while the 8th band, which covers the widest spectrum of visible light (about two thirds of the spectrum), has 15-meter resolution (each pixel covers 225 m² on the ground), so it has a good resolution, as seen in Table 1. This band is named panchromatic and is considered the highest-resolution band of the OLI sensor [3].
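As a rough practical illustration of the band layout just described, the following Python sketch reads OLI bands 2-4 and the panchromatic band 8 from a Landsat-8 scene and checks the 30 m and 15 m pixel sizes (and the 225 m² ground area per PAN pixel). It is only a sketch: the rasterio package and the per-band file naming below are assumptions, and the scene identifier is hypothetical.

```python
# Sketch: inspect Landsat-8 OLI band GeoTIFFs and stack a 30 m visible composite.
# Assumes the rasterio package and the usual "<scene_id>_B<n>.TIF" per-band files;
# the scene_id below is hypothetical.
import numpy as np
import rasterio

scene_id = "LC08_L1TP_166038_20200101_20200113_01_T1"  # hypothetical scene

def read_band(n):
    """Read one OLI band; return the pixel array and the pixel size in meters."""
    with rasterio.open(f"{scene_id}_B{n}.TIF") as src:
        return src.read(1), src.res[0]

blue,  res_blue = read_band(2)   # 0.450 - 0.515 um, 30 m
green, _        = read_band(3)   # 0.525 - 0.600 um, 30 m
red,   _        = read_band(4)   # 0.630 - 0.680 um, 30 m
pan,   res_pan  = read_band(8)   # 0.500 - 0.680 um, 15 m

print(res_blue, res_pan)         # expected: 30.0 15.0
print(res_pan * res_pan)         # ground area per PAN pixel: 225.0 m^2

# Combine bands 4, 3, 2 into a visible (RGB) image at 30 m resolution.
visible = np.dstack([red, green, blue]).astype(np.float32)
```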
In what follows, we briefly overview some fusion approaches; however, since our concentration is on another process (i.e., interpolation [4] [5] [6]), this overview will be short. Optical/passive remote sensing satellites provide multispectral (MS) and panchromatic (PAN) images which differ in spatial, spectral, radiometric and temporal resolution/accuracy. The multispectral images have high spectral information and low spatial information, whereas the PAN image has lower spectral and higher spatial information. Fusion of the low (spatial) resolution multispectral images and the high (spatial) resolution PAN images has long been an active research problem. The synthetic MS image with high spatial and spectral resolution can provide increased interpretation capabilities and more reliable results [7] [8]. During the last two decades, various fusion approaches have been developed. These methods are mainly divided into four categories: projection-substitution-based methods, relative spectral contribution methods, ARSIS-based fusion methods and model-based methods [8] [9]. In [9], the differences among these schemes have been explained explicitly; a minimal sketch of one simple scheme from the first category is given below.

The rest of this paper is organized as follows. The second section expresses the foundations of the proposed method, the third section overviews a specific interpolation algorithm entitled linear minimum mean square error estimation (LMMSE), the fourth section evaluates the proposed idea for better image fusion with the Landsat-8 sensor, and the final section concludes all the topics discussed in this paper.
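For concreteness, the following Python sketch shows one very simple member of the projection-substitution family, a Brovey-style band-ratio substitution. It is a generic illustration only, under the assumption that the MS bands have already been resampled to the PAN grid (e.g., by the interpolation discussed in the next sections); it is not the fusion scheme proposed in this paper.

```python
# Sketch of a Brovey-style projection-substitution (component-substitution) fusion.
# Assumes red, green, blue are MS bands already upsampled to the PAN grid and
# given as float NumPy arrays with the same shape as pan.
import numpy as np

def brovey_fusion(red, green, blue, pan, eps=1e-6):
    """Inject PAN spatial detail by rescaling each MS band with the ratio of
    the PAN image to the MS intensity component."""
    intensity = (red + green + blue) / 3.0
    ratio = pan / (intensity + eps)   # eps avoids division by zero
    return red * ratio, green * ratio, blue * ratio
```

Applied to the upsampled 30 m bands and the 15 m PAN band of Table 1, such a scheme yields a pan-sharpened visible composite at 15 m; more elaborate methods in the four categories above differ mainly in how the spatial detail is extracted and injected.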
Table 1. Spectrum of OLI.

Band Number    Band Name             Wavelength Range (micrometer)    Spatial Resolution (meter)
2              Blue (B)              0.450 - 0.515                    30
3              Green (G)             0.525 - 0.600                    30
4              Red (R)               0.630 - 0.680                    30
8              Panchromatic (PAN)    0.500 - 0.680                    15