Extrinsic Calibration of a 3D Laser Scanner and an Omnidirectional Camera

Gaurav Pandey ∗, James McBride ∗∗, Silvio Savarese ∗, Ryan Eustice ∗

∗ University of Michigan, Ann Arbor, MI 48109 USA
(e-mail: pgaurav,eustice,silvio@umich.edu)
∗∗ Ford Motor Company Research and Innovation Center, Dearborn,
MI 48124 USA (e-mail: jmcbride@ford.com)
Abstract: We propose an approach for the extrinsic calibration of a 3D laser scanner with an
omnidirectional camera system. The utility of an accurate calibration is that it allows precise
co-registration between the camera imagery and the 3D point cloud. This association can be
used to enhance various state-of-the-art algorithms in computer vision and robotics. The
extrinsic calibration technique used here is similar to the calibration of a 2D laser range finder
and a single camera proposed by Zhang (2004), but has been extended to the case of a 3D laser
scanner and an omnidirectional camera system. The procedure requires a planar checkerboard
pattern to be observed simultaneously from the laser scanner and the camera system from a
minimum of three views. The normal of the planar surface and the 3D points lying on the
surface constrain the relative position and orientation of the laser scanner and the
omnidirectional camera system. These constraints form a non-linear optimization problem that
is solved for the extrinsic calibration parameters and the covariance associated with the
estimated parameters. Results are presented for a real-world data set collected by a vehicle
equipped with a 3D laser scanner and an omnidirectional camera system.
Keywords: Sensor Calibration, 3D Laser Scanner, Omnidirectional Camera.
1. INTRODUCTION
One of the basic tasks of mobile robotics is to automat-
ically create a 3D map of the environment. However, to
create realistic 3D maps, we need to acquire visual infor-
mation (e.g. color, texture) from the environment and this
information has to be precisely mapped onto the range
information. To accomplish this task, the camera and 3D
laser range finder must be extrinsically calibrated, i.e.,
the rigid body transformation between the two reference
systems must be estimated. On platforms where a camera
provides intensity information in the form of an image
and the laser supplies depth information in the form of a set
of 3D points, extrinsic calibration allows reprojection of
the 3D points from the laser coordinate frame to the 2D
coordinate frame of the image.
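As a concrete illustration of this reprojection step, the sketch below (notation and function names are ours, not the paper's) transforms laser points into the camera frame and projects them onto the image plane. For simplicity it uses a pinhole intrinsic matrix `K` in place of the omnidirectional camera's projection model:

```python
import numpy as np

def project_laser_to_image(points_laser, R, t, K):
    """Reproject 3D laser points into a camera image (illustrative sketch).

    points_laser : (N, 3) array of points in the laser frame
    R, t         : extrinsic rotation (3x3) and translation (3,) taking
                   laser-frame points into the camera frame
    K            : (3, 3) pinhole intrinsic matrix (stands in for the
                   omnidirectional camera's projection function)
    Returns an (M, 2) array of pixel coordinates for the points that
    lie in front of the camera.
    """
    # Rigid-body transform: laser frame -> camera frame
    pts_cam = points_laser @ R.T + t
    # Keep only points with positive depth (in front of the camera)
    pts_cam = pts_cam[pts_cam[:, 2] > 0]
    # Perspective projection and dehomogenization
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]
```

An accurate extrinsic estimate (R, t) is exactly what makes this mapping align the projected points with the corresponding image pixels.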
Most previous works on extrinsic laser-camera calibration
concern calibration of perspective cameras to 2D laser
scanners (Zhang (2004)). Mei and Rives (2006) have de-
scribed the calibration of a 2D laser range finder and
an omnidirectional camera. They showed results for both
visible lasers (where the laser is also observed in the camera
image) and invisible lasers. Unnikrishnan and Hebert (2005) extended
Zhang’s (Zhang (2004)) method to calibrate a 3D laser
scanner with a perspective camera. Recently, Aliakbarpour
et al. (2009) proposed a novel approach for calibration
of a 3D laser scanner and a stereo camera, which uses an
inertial measurement unit (IMU) to decrease the number
of points needed for a robust calibration.

⋆ This work is supported through a grant from Ford Motor
Company via the Ford-UofM Alliance (Award #N008265).
In contrast to previous works, here we consider the ex-
trinsic calibration of an omnidirectional camera with a
3D laser range finder. The problem of extrinsic calibration
of a 3D scanner and an omnidirectional camera was first
addressed by Scaramuzza et al. (2007), who proposed a
technique that requires manual selection of point
correspondences from a scene viewed by the two sensors.
In this work, we describe a method of extrinsic calibration
of an omnidirectional camera and a high resolution 3D
laser scanner (with invisible lasers) that does not require
any explicit point correspondence.
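The plane-based constraint summarized in the abstract can be written, in one common formulation (notation is ours and is a sketch rather than the paper's exact objective), as a point-to-plane least-squares problem:

```latex
% For view $i$, let $n_i$ be the target-plane normal and $d_i$ its
% distance from the camera origin (both recovered from the observed
% checkerboard), and let $p_{ij}$ be the $j$-th laser point lying on
% the plane. The extrinsics $(R, t)$ are then estimated by minimizing
% the point-to-plane residuals over all $N$ views:
\begin{equation}
(\hat{R}, \hat{t}) = \operatorname*{arg\,min}_{R,\,t}
\sum_{i=1}^{N} \sum_{j=1}^{M_i}
\left( n_i^{\top} \left( R\, p_{ij} + t \right) - d_i \right)^{2}
\end{equation}
```

With a minimum of three views in general position, the plane normals span 3D space and the problem is fully constrained.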
The outline of the paper is as follows: Section 2.1
describes a procedure for automatically refining the
intrinsic range correction of the Velodyne laser scanner.
Section 2.2 describes the omnidirectional
camera system used. Section 2.3 describes the proposed
extrinsic laser-camera calibration method and in Section 3
we present some calibration results. In Section 4 we discuss
the implications of the laser-camera mapping presented in
this paper.
2. METHODOLOGY
Extrinsic calibration requires co-observable features in
both the camera and laser data; moreover, these features
should be easy to extract from both sensor modalities.
In our calibration procedure we employ a checkerboard
pattern mounted on a planar surface, which we will refer
to as the target plane hereafter. Our selection