Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France - May 1992
3D Relative Position and Orientation Estimation Using Kalman Filter for Robot Control
Jiang Wang and William J. Wilson
Dept. of Electrical and Computer Engineering
University of Waterloo
Waterloo, Ontario N2L 3G1
Abstract
A vision based position sensing system which provides the three-dimensional (3D) relative position and orientation (pose) of an arbitrary moving object with respect to a camera for real-time tracking control is studied in this paper. Kalman filtering is applied to the vision measurements to solve the photogrammetric equations implicitly and to provide significant temporal filtering of the resulting motion parameters, yielding optimal pose estimates. Both computer simulation and real-time experimental results are presented to verify the effectiveness of the Kalman filter approach under large vision measurement noise.
1 Introduction
One of the central problems in vision guided robot tracking control is the determination of the relative position and orientation (pose) of a randomly moving object with respect to the robot end-effector mounted camera in three-dimensional (3D) space. Many approaches have been proposed to estimate the 3D relative position and orientation using a single image view. The common approaches are based on one of two methods. The first method uses task specific image preprocessing to extract image feature location measurements, which are combined with known object CAD descriptions to estimate the pose parameters for effective dynamic tracking control. The second method uses general scene analysis to identify objects and to determine the pose parameters. This second method requires considerable preprocessing, so it is difficult to perform these computations at a rate suitable for real-time dynamic control. Photogrammetric techniques[10] and 2D to 3D line or point correspondence techniques[3][2][7] are generally used in the first method; however, they need accurate image measurements.
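For reference, the photogrammetric techniques mentioned above rest on the perspective projection equations, given here as a generic sketch; the focal length $f$ and camera-frame coordinates $(X, Y, Z)$ are standard notation, not taken from this paper:

```latex
% Perspective projection of an object point (X, Y, Z), expressed in the
% camera frame, onto image-plane coordinates (x, y) for focal length f:
x = f \, \frac{X}{Z}, \qquad y = f \, \frac{Y}{Z}
% Photogrammetric pose estimation inverts these nonlinear equations for
% the relative pose, given several such feature-point measurements.
```

Because each image point constrains the pose only through these nonlinear ratios, small errors in the measured $(x, y)$ propagate strongly into the recovered pose, which is why these methods demand accurate image measurements.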
In the manufacturing environment, low-cost camera systems are commonly used for many tasks. This kind of vision measurement includes significant noise due to the camera's characteristics, image signal spatial quantization and amplitude discretization, lens distortion, sensor pixel level errors, etc. Even for high-cost camera systems, these error sources cannot be totally avoided. Noisy images can result in poor individual pose estimates. To reduce the effect of image measurement noise on the pose estimates, a time based filter can be applied to filter out some of the noise when a sequence of images is used for tracking control. Several authors have applied filter theory to estimate pose parameters from a sequence of noisy images. Chang et al.[4] presented an
approach to estimate 3D motion parameters for target tracking using a Kalman filter, but their method was based on the assumption that the target is distant and can be considered as a single feature point. Therefore, their motion parameters do not include the orientation parameters, which are difficult to estimate. Broida and Chellappa[1] have presented a dynamic model to estimate 3D motion parameters using a Kalman filter. However, they confined their discussion to planar object motion, and the 3D problem was far from solved.
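The Kalman filter recursions these authors build on can be illustrated with a minimal scalar sketch. This is a generic textbook example under simplifying assumptions (a single constant state, known noise variances `q` and `r`), not any of the cited authors' actual formulations:

```python
def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter for a constant state.

    q: process noise variance, r: measurement noise variance.
    Generic illustrative sketch, not this paper's formulation.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state model is "constant"; process noise inflates variance.
        p = p + q
        # Update: the Kalman gain k blends the prediction with measurement z.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates, p

# Hypothetical noisy observations of a true constant value of 1.0.
zs = [1.2, 0.8, 1.1, 0.9, 1.05, 0.95]
est, p_final = kalman_1d(zs, q=0.0, r=0.04)
```

The gain k weights each new measurement against the running prediction, and the shrinking error variance p is precisely the temporal filtering effect exploited for pose estimation from image sequences.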
Dickmanns et al.[5] have applied Kalman filter theory to perform state estimation for object motion using image sequence processing for the guidance of autonomous vehicles. Wilson[9] also presented an approach to estimate 3D motion parameters for tracking control using a Kalman filter. This approach has been demonstrated to work well for planar object motion in 2D tracking control[6][8]. This paper presents the extension of this approach to estimating 3D motion parameters for 3D tracking control. Both computer simulation and real-
0-8186-2720-4/92 $3.00 ©1992 IEEE