Bilateral Filtering for Gray and Color Images

C. Tomasi*
Computer Science Department
Stanford University
Stanford, CA 94305
tomasi@cs.stanford.edu

R. Manduchi
Interactive Media Group
Apple Computer, Inc.
Cupertino, CA 95014
manduchi@apple.com

*Supported by NSF grant IRI-9506064 and DoD grants DAAH04-94-G-0284 and DAAH04-96-1-0007, and by a gift from the Charles Lee Powell Foundation.
Abstract

Bilateral filtering smooths images while preserving edges, by means of a nonlinear combination of nearby image values. The method is noniterative, local, and simple. It combines gray levels or colors based on both their geometric closeness and their photometric similarity, and prefers near values to distant values in both domain and range. In contrast with filters that operate on the three bands of a color image separately, a bilateral filter can enforce the perceptual metric underlying the CIE-Lab color space, and smooth colors and preserve edges in a way that is tuned to human perception. Also, in contrast with standard filtering, bilateral filtering produces no phantom colors along edges in color images, and reduces phantom colors where they appear in the original image.
1 Introduction
Filtering is perhaps the most fundamental operation of image processing and computer vision. In the broadest sense of the term "filtering," the value of the filtered image at a given location is a function of the values of the input image in a small neighborhood of the same location. In particular, Gaussian low-pass filtering computes a weighted average of pixel values in the neighborhood, in which the weights decrease with distance from the neighborhood center. Although formal and quantitative explanations of this weight fall-off can be given [11], the intuition is that images typically vary slowly over space, so near pixels are likely to have similar values, and it is therefore appropriate to average them together. The noise values that corrupt these nearby pixels are mutually less correlated than the signal values, so noise is averaged away while signal is preserved.
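As a concrete, if naive, illustration of such domain filtering, the Python sketch below averages each pixel's neighborhood of a single-channel image with Gaussian weights that fall off with spatial distance; the function name, the sigma_d parameter, and the 3-sigma window radius are our own illustrative choices, not part of the paper.

    import numpy as np

    def gaussian_domain_filter(image, sigma_d=2.0, radius=None):
        """Plain Gaussian low-pass ("domain") filtering of a 2-D gray image:
        each output pixel is a weighted average of its neighborhood, with
        weights that decrease with geometric distance from the center."""
        if radius is None:
            radius = int(3 * sigma_d)
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        # Closeness weights depend only on spatial distance from the center.
        c = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_d ** 2))
        c /= c.sum()  # normalize so the weights sum to one

        padded = np.pad(np.asarray(image, dtype=float), radius, mode='edge')
        rows, cols = image.shape
        out = np.empty((rows, cols), dtype=float)
        for i in range(rows):
            for j in range(cols):
                window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                out[i, j] = np.sum(c * window)
        return out

Larger values of sigma_d spread the weights over a wider neighborhood and smooth more aggressively, including across edges, which is precisely the problem discussed next.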
The assumption of slow spatial variations fails at edges, which are consequently blurred by low-pass filtering. Many efforts have been devoted to reducing this undesired effect [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 17].
How can we prevent averaging across edges, while still averaging within smooth regions? Anisotropic diffusion [12, 14] is a popular answer: local image variation is measured at every point, and pixel values are averaged from neighborhoods whose size and shape depend on local variation. Diffusion methods average over extended regions by solving partial differential equations, and are therefore inherently iterative. Iteration may raise issues of stability and, depending on the computational architecture, efficiency. Other approaches are reviewed in section 6.
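To make the iterative character of such methods concrete, here is a rough sketch in the spirit of Perona-Malik diffusion [12]; the exponential conductance function and the parameters kappa, lam, and n_iter are illustrative assumptions, not the authors' scheme.

    import numpy as np

    def anisotropic_diffusion(image, n_iter=20, kappa=20.0, lam=0.2):
        """Perona-Malik-style diffusion: at every point the conductance g is
        small where local variation (the gradient) is large, so edges are
        smoothed less than flat regions. Smoothing is obtained only by
        iterating this discretized PDE many times."""
        u = np.asarray(image, dtype=float).copy()

        def g(d):
            # Conductance: close to 1 in smooth areas, close to 0 across edges.
            return np.exp(-(d / kappa) ** 2)

        for _ in range(n_iter):
            # Differences toward the four neighbors (np.roll wraps around the
            # image border; acceptable for a sketch).
            north = np.roll(u, 1, axis=0) - u
            south = np.roll(u, -1, axis=0) - u
            west = np.roll(u, 1, axis=1) - u
            east = np.roll(u, -1, axis=1) - u
            u += lam * (g(north) * north + g(south) * south +
                        g(west) * west + g(east) * east)
        return u

The point of the sketch is only that meaningful smoothing emerges after many passes over the image, in contrast with the single-pass scheme proposed here.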
In this paper, we propose a scheme for edge-preserving smoothing that is noniterative, local, and simple. Although we claim no correlation with neurophysiological observations, we point out that our scheme could be implemented by a single layer of neuron-like devices that perform their operation once per image.
Furthermore, our scheme allows explicit enforcement of any desired notion of photometric distance. This is particularly important for filtering color images. If the three bands of color images are filtered separately from one another, colors are corrupted close to image edges. In fact, different bands have different levels of contrast, and they are smoothed differently. Separate smoothing perturbs the balance of colors, and unexpected color combinations appear. Bilateral filters, on the other hand, can operate on the three bands at once, and can be told explicitly, so to speak, which colors are similar and which are not. Only perceptually similar colors are then averaged together, and the artifacts mentioned above disappear.
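As a hedged sketch of what "being told which colors are similar" could look like, the snippet below scores the photometric similarity of two pixels by their Euclidean distance in CIE-Lab, converted with scikit-image; the Gaussian form of the weight and the sigma_r parameter are our own assumptions, not the paper's definition.

    import numpy as np
    from skimage.color import rgb2lab

    def photometric_similarity(lab_image, p, q, sigma_r=10.0):
        """Similarity weight between pixel locations p and q, based on their
        Euclidean distance in the (roughly perceptually uniform) CIE-Lab
        space rather than on each color band separately."""
        d = np.linalg.norm(lab_image[p] - lab_image[q])  # perceptual color difference
        return np.exp(-0.5 * (d / sigma_r) ** 2)

    # Usage sketch: convert the RGB image (floats in [0, 1]) to Lab once,
    # then compare any two pixel locations.
    #   lab = rgb2lab(rgb)
    #   w = photometric_similarity(lab, (10, 12), (10, 13))

Because the distance is taken between full Lab triples rather than band by band, two colors count as similar only when they are perceptually close, which is the property the bilateral filter exploits.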
The idea underlying bilateral filtering is to do in the range of an image what traditional filters do in its domain. Two pixels can be close to one another, that is, occupy nearby spatial locations, or they can be similar to one another, that is, have nearby values, possibly in a perceptually meaningful fashion. Closeness refers to vicinity in the domain, similarity to vicinity in the range. Traditional filtering is domain filtering, and enforces closeness by weighing pixel values with coefficients that fall off with distance. Similarly, we define range filtering, which averages image