Light Stage 2.0 Design
Virtual characters - digitally generated humans that speak, move, and think - are
a core component of the Experiential Learning System. Making these characters look
realistic, as well as lighting them convincingly, is a central goal of the ICT's
graphics laboratory. Current graphics techniques do not model how light scatters
within skin, how it bounces between parts of the face, and how it originates from
all parts of the environment, yielding plastic-looking characters that do not fully
blend with their surroundings. Using data captured with the light stage, we can
produce renderings that reproduce all of these effects: global illumination,
subsurface scattering, and self-shadowing.
Light Stage 2.0 is a device for capturing realistic computer models of human faces.
Thirty-two sequential strobe lights arranged on a rotating arc illuminate a person's
face from all possible directions while the person's appearance is recorded by
synchronized high-speed video cameras. The person is lit from hundreds of directions
in just a few seconds, directly recording how the person's face transforms incident
light into radiant light. Once this data is captured, the face can be rendered
under arbitrary illumination by recombining the color channels of the original images.
Our interactive demonstration allows you to try this yourself - lighting a
person's face with a variety of captured lighting environments, or with a set
of lights you position yourself.
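Because light transport is linear, this relighting amounts to scaling each captured image by the color of its light direction in the target environment and summing the results. The sketch below illustrates the idea with NumPy; the function name, array layout, and sample data are hypothetical, not the actual light stage software.

```python
import numpy as np

def relight(basis_images, light_colors):
    """Relight a face from light-stage basis images (illustrative sketch).

    basis_images: (n_lights, H, W, 3) array, one photograph per strobe
        direction (hypothetical layout).
    light_colors: (n_lights, 3) array, RGB intensity of the target
        lighting environment sampled in each strobe direction.

    Light transport is linear, so the relit image is the sum of each
    basis image scaled per-channel by its light's color.
    """
    return np.einsum('nhwc,nc->hwc', basis_images, light_colors)

# Toy example: two basis "images", relit by a warm key and a dim blue fill.
basis = np.ones((2, 4, 4, 3))
lights = np.array([[1.0, 0.8, 0.6],
                   [0.1, 0.1, 0.3]])
image = relight(basis, lights)  # every pixel becomes [1.1, 0.9, 0.9]
```

The same routine handles both captured environments (sample an environment map at each strobe direction) and user-positioned lights (set one row of `light_colors` per light).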
Light Stage 2.0 is an evolved version of the
first light stage presented by ICT
researchers at the SIGGRAPH 2000 conference. The first version consisted of a
single incandescent light spun in a spiral about the subject, requiring the
subject to remain still for over a minute. Light Stage 2.0's array of lights
and high-speed cameras reduce the acquisition time to just a few seconds. This
increase in speed lets a person be captured in a variety of natural
expressions, from which we can extrapolate the appearance of fully animated characters.
Light Stage 2.0 forms a platform for basic research in human facial reflectance,
virtual actor technology, and live-action compositing of real actors into
computer-generated environments. Upcoming experiments will involve using
directionally polarized light, observing reflectance in greater detail across
the spectrum, and using arrays of cameras to measure the reflectance field
more completely. New software techniques are under development that will allow
real-time rendering of animated characters with correct lighting. Future versions
of the light stage will allow the virtual re-lighting of live human performances,
producing a valuable tool for digital filmmaking as well as for creating rich
immersive environments.
At the 2004 Eurographics Symposium on Rendering we presented Animatable Facial Reflectance Fields, a technique for producing animated faces using Light Stage 2.0 data.