A new reconstruction method based on fringe projection
of three-dimensional measuring system
Jinhui Huang, Qingyang Wu
Shenzhen Key Laboratory of Micro-Nano Photonic Information Technology, College of Electronic Science and Technology, Shenzhen University,
Shenzhen 518060, China
Article history:
Received 13 March 2013
Received in revised form
26 May 2013
Accepted 2 July 2013
Available online 24 July 2013
Keywords:
Three-dimensional measurement
Fringe projection
Phase shifting
Whole-field calibration
Reconstruction
Abstract
This paper presents a new reconstruction method for a three-dimensional measuring system, featuring a simple calibration process, whole-field calibration and fast reconstruction. First, we project a sinusoidal fringe pattern onto the surface of the calibration target within the calibrated field, while the camera captures the fringe images on the planar target. Then, we extract the image coordinates and absolute phase values of the feature points of the planar target, and obtain the camera parameters and world coordinates from the image coordinates. Next, we use fitting polynomials to model the spatial relationships among image coordinates, absolute phases and world coordinates of the planar target, and solve the parameters of the fitting equations. Finally, we obtain the absolute phase values of the object's surface by phase-shifting and gray-coding techniques, substitute them into the fitting equations, and reconstruct the 3D shape of the object's surface. The experimental results show that this reconstruction method is simple and highly accurate. Most importantly, the calibration target can be placed arbitrarily within the entire calibrated field during the calibration procedure.
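The fitting step summarized above can be sketched as follows. This is a minimal illustration only: the polynomial basis in the image coordinates (u, v) and absolute phase Φ is an assumption, since the exact terms used by the authors are not given in this excerpt.

```python
import numpy as np

def basis(u, v, phi):
    # Low-order polynomial terms in image coordinates (u, v) and
    # absolute phase phi; the authors' exact basis is an assumption here.
    return np.column_stack([np.ones_like(u), u, v, phi, u * phi, v * phi])

def fit_mapping(u, v, phi, world):
    # Least-squares fit of one world coordinate (e.g. Z) against the
    # basis, using calibration data gathered from the planar target.
    A = basis(u, v, phi)
    coeffs, *_ = np.linalg.lstsq(A, world, rcond=None)
    return coeffs

def apply_mapping(coeffs, u, v, phi):
    # Reconstruct the world coordinate for measured pixels and phases.
    return basis(u, v, phi) @ coeffs
```

In this sketch, one set of coefficients is fitted per world coordinate; during measurement, the measured absolute phase at each pixel is substituted into the fitted equations to recover the 3D point.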
© 2013 Elsevier Ltd. All rights reserved.
1. Introduction
The method based on fringe projection belongs to active triangulation technology. With the advantages of non-contact operation, high speed, high stability and high accuracy, it has been widely applied in the protection of cultural relics, art sculpture, virtual reality and biomedical science [1,2]. In this method, a periodic sinusoidal fringe pattern is projected onto the surface of the measured object, and a deformed fringe pattern is formed.
To describe the spatial distribution of the fringes, phase-shifting and gray-coding techniques are used to obtain the absolute phases of the fringe pattern. After the system is calibrated, image coordinates are converted to real-world coordinates and the absolute phase map is converted to an absolute height map. With this information, the 3D surface of the object can be generated. However, only one surface contour of the object can be reconstructed when the fringe is projected from a single direction at a time [3].
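The phase recovery described above can be illustrated with a standard four-step phase-shifting computation combined with gray-code unwrapping. This is a sketch of the general technique, not the authors' exact implementation:

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    # Four frames with phase shifts of 0, pi/2, pi, 3*pi/2:
    #   I_k = A + B * cos(phi + (k - 1) * pi / 2)
    # The wrapped phase follows from the standard four-step formula.
    return np.arctan2(I4 - I2, I1 - I3)

def absolute_phase(phi_wrapped, order):
    # Gray-code decoding yields the integer fringe order, which
    # removes the 2*pi ambiguity of the wrapped phase.
    return phi_wrapped + 2.0 * np.pi * order
```

The wrapped phase is defined only modulo 2π; the gray-code pattern identifies which fringe period each pixel lies in, so the two techniques together give the absolute phase used for reconstruction.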
Shape reconstruction is an important step in a 3D measuring system, as it affects the precision of the measurement results. Traditional reconstruction methods based on fringe projection fall into two kinds. The first kind is based on the phase–height mapping relationship [4]. First, a reference plane is set up so that the phase difference can be obtained. Then, an encoded fringe pattern is projected onto the surface of the calibration target. To build the relationship between phase and height, the calibration target is moved by a high-precision translation stage through a certain distance normal to the optical axis [5], and the Z-coordinate difference between the planar target and the reference plane is calculated. With phase-shifting and gray-coding techniques, the phase difference between the planar target and the reference plane can be obtained [1,6,7]. The Z-coordinate difference and the phase difference are then substituted into the phase–height mapping relation presented by Zhou and Su [4], and the calibration parameters are solved from the mapping equations. Finally, the height of the object is solved from the phase difference between the object's surface and the reference plane, and the coordinates (X, Y) are solved from the image coordinates and the height of the object; the 3D shape is thus reconstructed. This reconstruction method needs a large, high-precision calibration target, and its accuracy is affected by the translation stage. Moreover, because there is an angle between the directions of the camera and the projector, the overlap between the camera's imaging range and the projection range differs at each position, so the phase difference between the planar target and the reference plane cannot be obtained over the whole region [8].
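As an illustration of this first kind of method (the exact form used in [4] is not reproduced in this excerpt), a widely used phase–height mapping relation can be written as

\[
\frac{1}{h(x,y)} = a(x,y) + \frac{b(x,y)}{\Delta\varphi(x,y)} + \frac{c(x,y)}{\Delta\varphi^{2}(x,y)},
\]

where \(h\) is the height relative to the reference plane, \(\Delta\varphi\) is the measured phase difference, and \(a\), \(b\), \(c\) are per-pixel calibration coefficients obtained by fitting the data from several translation-stage positions of the planar target.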
The second kind is based on binocular vision theory [9], in which the camera and the projector must be calibrated separately. The projector can be regarded as a reverse imaging process of the camera. A spatial point on the planar target is projected onto the image planes of the projector and the camera, forming a pair of feature points. There is a certain relationship between the spatial point and
http://dx.doi.org/10.1016/j.optlaseng.2013.07.002
Corresponding author. Tel.: +86 75 526 535060.
E-mail address: wuqy@szu.edu.cn (Q. Wu).
Optics and Lasers in Engineering 52 (2014) 115–122