Real-Time Fur over Arbitrary Surfaces
Jerome Lengyel
Microsoft Research
http://research.microsoft.com/~jedl
Emil Praun
Princeton University
http://www.cs.princeton.edu/~emilp
Adam Finkelstein
Princeton University
http://www.cs.princeton.edu/~af
Hugues Hoppe
Microsoft Research
http://research.microsoft.com/~hoppe
Abstract
We introduce a method for real-time rendering of fur on surfaces
of arbitrary topology. As a pre-process, we simulate virtual hair
with a particle system, and sample it into a volume texture. Next,
we parameterize the texture over a surface of arbitrary topology
using “lapped textures” — an approach for applying a sample
texture to a surface by repeatedly pasting patches of the texture
until the surface is covered. The use of lapped textures permits
specifying a global direction field for the fur over the surface. At
runtime, the patches of volume textures are rendered as a series of
concentric shells of semi-transparent medium. To improve the
visual quality of the fur near silhouettes, we place “fins” normal
to the surface and render these using conventional 2D texture
maps sampled from the volume texture in the direction of hair
growth. The method generates convincing imagery of fur at
interactive rates for models of moderate complexity. Furthermore,
the scheme allows real-time modification of viewing and
lighting conditions, as well as local control over hair color, length,
and direction.
Additional Keywords: hair rendering, lapped textures, volume textures.
1. Introduction
A distinguishing characteristic of mammals is that they have hair.
For many computer graphics applications, a sense of immersion
requires the presence of realistic virtual creatures. Unfortunately,
generating convincing imagery of people and animals remains a
challenging problem, in part because the hair often looks artificial.
This paper presents a method for rendering realistic fur over
surfaces of arbitrary topology at interactive frame rates. In this
paper, we use the terms fur and hair interchangeably.
Perhaps the most effective imagery of fur is due to Kajiya and
Kay [4], who ray-traced a model with explicit geometric detail
represented as volume textures. In the computer animation industry,
fine geometric modeling of fur led to convincing artificial
mammals, such as the lemurs in Disney’s Dinosaur [1]. Another
example is the dog fur in 101 Dalmatians [2], which was represented
using a stochastic (rather than geometric) model.
However, rendering hair is computationally expensive, and the
majority of the methods to date are too slow for interactive use
(see Thalmann et al. [10] for a survey). In an interactive setting,
Van Gelder and Wilhelms [11] showed that various parameters of
fur can be manipulated in real time. However, their renderer is
limited to drawing polylines, which limits the number of hairs that
can be drawn per frame as well as the realism of the result.
Recent work by Meyer and Neyret [6][7] showed that rendering
volume textures at interactive rates is possible by exploiting
graphics card hardware. Lengyel [5] subsequently developed a
similar method optimized for the specific case of fur. The method
renders a furry surface as a series of concentric, semi-transparent,
textured shells containing samples of the hair volume. By exploiting
conventional texture mapping, the method allows interactive
rendering of furry models represented by thousands of polygons.
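The shell idea can be sketched in a few lines. The following is a minimal illustration, not the paper's actual implementation; the function name `shell_vertices` and its parameters are our own. Each shell is the base mesh offset along its vertex normals by a fraction of the fur length; a renderer would draw the shells back to front with alpha blending, binding the corresponding slice of the hair volume texture to each.

```python
# Minimal sketch of concentric-shell fur (hypothetical names, not the
# paper's code): shell i is the base mesh pushed outward along its
# vertex normals by (i / num_shells) * fur_length.

def shell_vertices(positions, normals, fur_length, num_shells):
    """Return vertex positions for each concentric shell, innermost first."""
    shells = []
    for i in range(num_shells + 1):          # shell 0 is the skin itself
        t = i / num_shells                   # normalized height in [0, 1]
        shell = [
            (px + t * fur_length * nx,
             py + t * fur_length * ny,
             pz + t * fur_length * nz)
            for (px, py, pz), (nx, ny, nz) in zip(positions, normals)
        ]
        shells.append(shell)
    return shells

# A renderer would draw shell 0 opaque, then shells 1..n with blending
# enabled and the i-th slice of the hair volume texture bound.
quad = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
up = [(0.0, 0.0, 1.0)] * 4
shells = shell_vertices(quad, up, fur_length=0.1, num_shells=8)
```

Because each shell is just the base mesh re-rendered with a different texture, the technique maps directly onto conventional texture-mapping hardware.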
Our approach is based on the shell method and thus achieves
interactive frame rates. However, in this project, we address three
limitations of previous work. First, the shell method requires a
global parameterization of the surface. While this is easy to achieve
for special cases (such as a torus or disc), it prevents application of
this method to a surface of arbitrary topology, without first cutting
the surface into pieces and rendering them separately. Second, the
method requires a significant amount of texture memory, because
each shell needs a separate texture covering the whole surface and
these textures must be large enough to resolve individual hairs.
Finally, the shell method provides an effective approximation to
volume textures only when the viewing direction is approximately
normal to the surface. Near silhouettes where shells are seen at
grazing angles, the hair appears to be overly transparent, and gaps
become evident between the shells.
In our work, we address the first two limitations of the shell
method – parameterization and texture memory size – by using
the lapped textures of Praun et al. [8]. Lapped textures cover a
surface of arbitrary topology by repeatedly pasting small patches
of example texture over the surface (Figure 1). Because the
surface is covered by a collection of patches, we only need local
parameterizations within the individual patches, as opposed to a
global parameterization. Furthermore, the many patches instantiated
over the surface can all share the same texture, thus greatly
reducing texture memory overhead.
[Figure 1: Lapped texture patches repeatedly pasted to cover the surface.]
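The memory savings are easy to quantify with a back-of-the-envelope comparison (the resolutions and shell count below are our own illustrative numbers, not figures from the paper): with a global parameterization, every shell needs its own texture covering the whole surface finely enough to resolve individual hairs, whereas with lapped textures all patch instances share one small sample texture per shell.

```python
# Illustrative texture-memory comparison (hypothetical numbers).

def shell_texture_bytes(res, num_shells, bytes_per_texel=4):
    """Total memory for num_shells RGBA textures of res x res texels."""
    return res * res * bytes_per_texel * num_shells

# Global parameterization: per-shell textures over the entire surface.
global_cost = shell_texture_bytes(res=2048, num_shells=16)

# Lapped textures: every patch instance reuses one small sample texture,
# so only the small patch needs per-shell storage.
lapped_cost = shell_texture_bytes(res=128, num_shells=16)

print(global_cost // lapped_cost)  # prints 256
```

In this example the lapped scheme uses 256 times less texture memory, since the cost scales with the area of the small sample texture rather than the whole surface.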
The third limitation – silhouettes – is not specific to the shell
method. In interactive settings such as games, designers use low-
polygon-count models to maintain high frame rates. Detailed
textures help to embellish these coarse models, except at the
silhouette where tangent discontinuities detract from the visual
quality. Furthermore, in the specific case of fur, the visual cues at
the silhouette play a critical role in perceiving the characteristics
of the fur. Polygonal artifacts may be alleviated by using higher
resolution models, or by clipping to a high-resolution 2D contour
as described by Sander et al. [9]. To address the silhouette problem,
we introduce a scheme for rendering textured fins normal to
the surface near silhouettes. The fin textures are created using the
same volumetric model for hair as the shell textures, but sampled
in a different direction more appropriate for oblique viewing.
Alternatively, an artist may create a fin texture directly.
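One way to decide how strongly to draw a fin is to ramp its opacity up as the surface normal becomes perpendicular to the view direction, i.e. near the silhouette. The heuristic below is our own formulation for illustration; the exact blend the system uses may differ, and the `fade` threshold is a made-up tuning parameter.

```python
# Sketch of a silhouette-dependent fin opacity (hypothetical heuristic):
# a fin is most useful where |n . v| is near zero, so its alpha rises
# from 0 (surface facing the viewer) to 1 (surface seen edge-on).

def fin_alpha(normal, view, fade=0.3):
    """Opacity for a fin: 1 at the silhouette, fading to 0 away from it."""
    ndotv = abs(sum(n * v for n, v in zip(normal, view)))  # |n . v|
    return max(0.0, min(1.0, (fade - ndotv) / fade))

print(fin_alpha((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # head-on: prints 0.0
print(fin_alpha((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # edge-on: prints 1.0
```

Fading the fins in smoothly avoids popping as edges cross the silhouette during animation.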
We offer interactive control over local and global properties of the
fur, such as its direction, length, or color. For example, the user
may globally adjust the angle of the hair with respect to the
surface or may locally adjust the hair direction using a “combing
tool”.
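A combing operation of this kind can be sketched as projecting the user's stroke direction into the tangent plane of the surface to obtain a new local hair direction. This is our own minimal formulation for illustration, not the paper's editing code.

```python
# Sketch of a "combing tool" update (hypothetical): remove the component
# of the stroke along the surface normal, leaving a unit tangent vector
# that becomes the new local hair direction.

def comb(stroke, normal):
    """Project stroke onto the plane perpendicular to normal; normalize."""
    d = sum(s * n for s, n in zip(stroke, normal))    # stroke . normal
    t = [s - d * n for s, n in zip(stroke, normal)]   # tangential part
    length = sum(c * c for c in t) ** 0.5
    if length < 1e-8:
        return None  # stroke parallel to the normal; direction undefined
    return tuple(c / length for c in t)

print(comb((1.0, 0.0, 0.5), (0.0, 0.0, 1.0)))  # prints (1.0, 0.0, 0.0)
```

Applying this per vertex under the brush yields a smoothly edited direction field over the surface.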
The principal contributions of this work are: (1) integrating the
shell method with lapped textures to allow for arbitrary topology
and to reduce texture requirements; (2) improving the visual
quality of silhouettes by rendering fins; and (3) demonstrating
interactive local and global control over hair properties.