Bilateral Filtering for Gray and Color Images

C. Tomasi
Computer Science Department
Stanford University
Stanford, CA 94305
tomasi@cs.stanford.edu

R. Manduchi
Interactive Media Group
Apple Computer, Inc.
Cupertino, CA 95014
manduchi@apple.com

Proceedings of the 1998 IEEE International Conference on Computer Vision, Bombay, India

Abstract
Bilateral filtering smooths images while preserving
edges, by means of a nonlinear combination of nearby
image values. The method is noniterative, local, and sim-
ple. It combines gray levels or colors based on both their
geometric closeness and their photometric similarity, and
prefers near values to distant values in both domain and
range. In contrast with filters that operate on the three
bands of a color image separately, a bilateral filter can en-
force the perceptual metric underlying the CIE-Lab color
space, and smooth colors and preserve edges in a way
that is tuned to human perception. Also, in contrast with
standard filtering, bilateral filtering produces no phantom
colors along edges in color images, and reduces phantom
colors where they appear in the original image.
1 Introduction
Filtering is perhaps the most fundamental operation of
image processing and computer vision. In the broadest
sense of the term “filtering,” the value of the filtered image
at a given location is a function of the values of the input
image in a small neighborhood of the same location. In
particular, Gaussian low-pass filtering computes a weighted
average of pixel values in the neighborhood, in which the
weights decrease with distance from the neighborhood cen-
ter. Although formal and quantitative explanations of this
weight fall-off can be given [11], the intuition is that images
typically vary slowly over space, so nearby pixels are likely
to have similar values, and it is therefore appropriate to
average them together. The noise values that corrupt these
nearby pixels are mutually less correlated than the signal
values, so noise is averaged away while signal is preserved.
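As a concrete illustration (our own sketch, not code from the paper), Gaussian domain filtering of a grayscale image can be written as a normalized weighted sum over shifted copies of the image; the function name and parameters here are illustrative:

```python
import numpy as np

def gaussian_domain_filter(image, sigma_d, radius):
    """Smooth a 2-D grayscale image by averaging each neighborhood
    with weights that fall off with spatial distance only."""
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    kernel = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_d**2))
    kernel /= kernel.sum()  # normalize: flat regions pass through unchanged
    padded = np.pad(image, radius, mode="edge")
    out = np.zeros(image.shape, dtype=float)
    h, w = image.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            weight = kernel[dy + radius, dx + radius]
            out += weight * padded[radius + dy:radius + dy + h,
                                   radius + dx:radius + dx + w]
    return out
```

Because the weights depend only on distance from the center, this filter averages across intensity edges exactly as the text describes, which is what motivates the edge-preserving alternative below.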
The assumption of slow spatial variations fails at edges,
which are consequently blurred by low-pass filtering. Many
efforts have been devoted to reducing this undesired effect
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 17]. (This
work was supported by NSF grant IRI-9506064 and DoD grants
DAAH04-94-G-0284 and DAAH04-96-1-0007, and by a gift from
the Charles Lee Powell Foundation.) How can
we prevent averaging across edges, while still averaging
within smooth regions? Anisotropic diffusion [12, 14] is a
popular answer: local image variation is measured at every
point, and pixel values are averaged from neighborhoods
whose size and shape depend on local variation. Diffusion
methods average over extended regions by solving partial
differential equations, and are therefore inherently iterative.
Iteration may raise issues of stability and, depending on the
computational architecture, efficiency. Other approaches
are reviewed in section 6.
In this paper, we propose a noniterative scheme for edge-
preserving smoothing that is local and simple. Although
we claim no correlation with neurophysiological observa-
tions, we point out that our scheme could be implemented
by a single layer of neuron-like devices that perform their
operation once per image.
Furthermore, our scheme allows explicit enforcement
of any desired notion of photometric distance. This is
particularly important for filtering color images. If the
three bands of color images are filtered separately from
one another, colors are corrupted close to image edges. In
fact, different bands have different levels of contrast, and
they are smoothed differently. Separate smoothing perturbs
the balance of colors, and unexpected color combinations
appear. Bilateral filters, on the other hand, can operate on
the three bands at once, and can be told explicitly, so to
speak, which colors are similar and which are not. Only
perceptually similar colors are then averaged together, and
the artifacts mentioned above disappear.
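One way to make "perceptually similar" concrete (a sketch under our own assumptions, with illustrative names, not the paper's code) is to measure similarity between pixels already expressed as CIE-Lab triples, where Euclidean distance approximates perceptual distance, and turn that distance into a Gaussian weight:

```python
import numpy as np

def lab_similarity(lab_a, lab_b, sigma_r):
    """Photometric similarity weight between two CIE-Lab triples:
    a Gaussian of their Euclidean (roughly perceptual) distance.
    sigma_r controls how different two colors may be and still
    be averaged together."""
    a = np.asarray(lab_a, dtype=float)
    b = np.asarray(lab_b, dtype=float)
    d2 = float(np.sum((a - b) ** 2))  # squared Lab distance
    return np.exp(-d2 / (2.0 * sigma_r ** 2))
```

Identical colors receive weight 1, while colors far apart in Lab space receive weights near 0 and are effectively excluded from the average, which is how averaging only perceptually similar colors can be enforced.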
The idea underlying bilateral filtering is to do in the
range of an image what traditional filters do in its domain.
Two pixels can be close to one another, that is, occupy
nearby spatial locations, or they can be similar to one an-
other, that is, have nearby values, possibly in a perceptually
meaningful fashion. Closeness refers to vicinity in the do-
main, similarity to vicinity in the range. Traditional filter-
ing is domain filtering, and enforces closeness by weighing
pixel values with coefficients that fall off with distance.
Similarly, we define range filtering, which averages image