Published in IET Intelligent Transport Systems
Received on 27th March 2008
Revised on 4th July 2008
doi: 10.1049/iet-its:20080017
ISSN 1751-956X
Wide-angle camera technology for automotive applications: a review
C. Hughes¹  M. Glavin¹  E. Jones¹  P. Denny²
¹Connaught Automotive Research Group, Department of Electronic Engineering, National University of Ireland, Galway, Ireland
²Valeo Vision Systems, I.D.A. Business Park, Dunmore Road, Tuam, County Galway, Ireland
E-mail: ciaran.hughes@nuigalway.ie
Abstract: The development of electronic vision systems for the automotive market is a strongly growing field, driven in particular by customer demand to increase the safety of vehicles both for drivers and for other road users, including vulnerable road users (VRUs), such as pedestrians. Customer demand is matched by legislative developments in a number of key automotive markets; for example, Europe, Japan and the US are in the process of introducing legislation to aid in the prevention of fatalities to VRUs, with emphasis on the use of vision systems. The authors discuss some of the factors that motivate the use of wide-angle and fish-eye camera technologies in vehicles. The authors describe the benefits of using wide-angle lens camera systems to display areas of a vehicle's surroundings that the driver would otherwise be unaware of (i.e. a vehicle's blind-zones). However, although wide-angle optics provide greater fields of view, they also introduce undesirable effects, such as radial distortion, tangential distortion and uneven illumination. These distortions have the potential to make objects difficult for the vehicle driver to recognise and thus can increase the risk of an accident. The authors describe some of the methods that can be employed to remove these unwanted effects and digitally convert the distorted image to the ideal and intuitive rectilinear pin-hole model.
1 Introduction

Safety in vehicles is a growing concern for most modernised countries. For example, the European New Car Assessment Program (Euro NCAP) [1], which was established in 1997, provides objective information on the safety of drivers and passengers in cars in crash situations. In a study commissioned by Euro NCAP, 94% of respondents list safety in vehicles as a major concern [2]. There are similar organisations in both Japan [3] and the US [4] (known as JNCAP and USNCAP, respectively).
However, more recently, interest in the protection of vulnerable road users (VRUs), for example pedestrians and cyclists, has increased. Back-over collisions caused by the inability of drivers to detect VRUs present in a vehicle's blind-zones are a major cause of VRU fatalities globally. The term 'blind-zone' is used in preference to the other commonly used term 'blind-spot', as it is the term used in some of the jurisdictions described in this paper, and more accurately describes the areas that cannot be seen by the driver of a vehicle. It is somewhat ironic to note that, as vehicle manufacturers place increasing emphasis on improving the Euro NCAP ratings of their vehicles by strengthening and increasing the size of the vehicles' A-pillars (the vertical or near-vertical shafts of material that support the vehicle roof on either side of the wind-screen), they are also increasing the size of the blind-zones caused by those A-pillars. Increased A-pillar size is often used to improve a vehicle's 'secondary safety', which can be defined as 'all structural and design features that reduce the consequences of accidents as far as possible' [5]. However, this comes at a potential loss of direct vision, an element of 'primary safety', which can be defined as 'the vehicle engineering aspects which as far as possible reduce the risk of an accident occurring' [6].
To match the consumer desire for increased safety in vehicles, the European Union has introduced legislation that requires large goods vehicles (LGVs) to have large portions of their blind-zones made visible to the driver. The term 'LGV' is used in this paper, instead of the term 'heavy goods vehicle (HGV)', as it is the terminology used in most European Union documentation, because the word 'heavy' does not have a direct translation in all European languages. Japan is also in the process of introducing similar legislation [7]. While European and Japanese legislation focuses primarily on LGV safety, proposed legislation in the US is targeted at privately owned vehicles, with the aim of preventing back-over collisions.
In Section 2, we describe radial distortion, which is the primary distortion introduced into an image by wide-angle cameras. We review several models of radial distortion and describe some of the problems in camera calibration and distortion correction. This is followed, in Section 3, by a description of some of the other geometric distortion factors in wide-angle cameras, such as the centre of distortion (COD), tangential distortion and uneven illumination, and certain methods that may be employed for their mitigation are discussed.
1.1 Vehicle blind-zones
A vehicle's blind-zones are the areas around the vehicle that cannot be seen directly by the driver by looking forwards or by using any of the vehicle's standard rear-view mirrors (internal and external) from the normal sitting position. Fig. 1 shows the potential blind-zones around a vehicle. The sizes of these areas are determined by the size and design of the vehicle and mirrors, and will vary according to car model and manufacturer.
Consumers Union [8] has examined the rearward blind-zone on many non-commercial light-duty passenger vehicles [from small passenger cars to large sports utility vehicles (SUVs)]. The zone was measured by determining how far behind the vehicle a 28 inch (0.71 m) traffic cone needed to be before a person, seated in the driver's seat, could see its top while looking through the rear window. For a 5 foot 8 inch (1.73 m) tall driver, the distance measured was up to 44 feet (13.4 m) for a commercially available four-wheel drive vehicle registered in 2006. In the same vehicle, the blind-zone distance for a 5 foot 1 inch (1.55 m) driver extends to 69 feet (21 m).
The blind-zone of an LGV is naturally much larger than that of a light-duty vehicle. Ehlgen and Paidla [9] calculated the forward blind-zones of a given LGV, as shown in Fig. 2. Furthermore, the rearward blind-zones of LGVs tend to be very large; we have calculated that several LGVs have a rearward blind-zone that can extend up to 65 m behind the vehicle on the ground plane.
1.2 Blind-zone collision statistics

1.2.1 Europe: In Europe, official statistics for VRU deaths caused by the victims not being visible to the driver of a vehicle are not readily available, as there is no single repository for such information. However, the European Commission's CARE road accident database [10] records 3961 pedestrian fatalities within urban areas in 2005. It is reasonable to assume that a significant portion of these deaths were caused by the VRU not being visible to the driver of the vehicle.

This assumption is supported by several statistics. The European Commission Directorate-General for Energy and Transport estimates that the lack of visibility in the blind-zone towards the rear of a vehicle directly causes 500 deaths a year in the EU [11]. The Commission of the European Communities estimates that every year approximately 400 European road users lose their lives in collisions with LGVs because the driver did not see them when turning right [12].
The UK Health and Safety Executive reports that in the 12 months between 2005 and 2006, 18 people were killed and 620 sustained major injuries because of the intentional motion of a vehicle (either forwards or backwards) in the workplace [13].

Figure 1: Five blind-zone areas (shown in grey) in a standard left-hand drive car. Sizes of the blind-zones are dependent on the design of the car and the viewing angle of the mirrors.

Figure 2: Blind-zones around the front of a given (left-hand drive) LGV [9].
1.2.2 US: Statistics for the US are equally disjointed, with no official statistics directly available for VRU injuries caused by vehicle blind-zones. However, the Kids and Cars Organisation in the US [14] claims that 941 children were killed in non-traffic collisions in the US between 2002 and 2006. Non-traffic collisions are collisions involving vehicles while they are not in a traffic situation (for example, while on private residential property). They further claim that 49.5% (or 466 children) of the fatalities were because of the vehicle reversing while children were present in the vehicle's rearward blind-zone.

In a study between July 2000 and June 2001, the Centers for Disease Control and Prevention (CDC) [15] reported that there were an estimated 9160 non-fatal injuries to children in non-traffic automotive collisions, with approximately 20% (or 1832 children) of these injuries caused by the vehicle moving backwards. Between 2001 and 2003, the CDC reported that an estimated 7475 children (2492 per year) were treated for vehicle back-over injuries [16]. Again, although the blind-zone is not directly implicated in these injuries, it is reasonable to assume that a significant proportion of these injuries were because of the children being present in the vehicle's blind-zone.

Wang and Knipling [17] estimated that lane change/merge crashes in 1991 accounted for approximately 244 000 police-reported crashes with 224 associated fatalities. Furthermore, the authors reported that the principal causal factor in such crashes is that the driver 'did not see the other vehicle'.
1.3 Legislation relating to vehicle blind-zones
Because of the increasing awareness of the vulnerability of
pedestrians, legislation has been introduced, or is in the
process of being introduced, in several jurisdictions around
the world. In this section, the legislative or potential
requirements for the visibility of blind-zones in the EU,
Japan and the US are described.
1.3.1 Europe: In the EU, legislation in the form of Directive 2003/97/EC [18] was introduced in 2003. Although the initial requirements of this directive aim to reduce collisions caused by the blind-zones of LGVs and improve road safety for new vehicles circulating from 2006/2007 onwards, the legislation does not cover the existing fleet of lorries in the EU. However, as it has been estimated that existing fleets will not be fully replaced until 2023, Directive 2007/38/EC [19] was introduced in 2007. This legislation requires the retrofitting of the required indirect vision systems (IVSs) to all existing fleets within 24 months of enactment of the bill (i.e. by July 2009).

The shaded areas in Fig. 3 show the areas of a left-hand drive LGV's environment that must be visible to the driver via the use of IVSs, as required by these directives. Although the figure shows a left-hand drive vehicle, for right-hand drive vehicles within the jurisdiction of the legislation, the areas of required coverage are reversed.

Examples of IVSs include mirrors additional to the standard rear-view mirrors (internal and external), as well as camera–monitor devices. However, practical problems arise with the use of additional mirrors, as the extra mirrors can themselves introduce additional blind-zones by obstructing direct forward vision, as well as having cost and styling implications.

There is a clear provision for the use of camera–monitor devices for the coverage of vehicle blind-zones in this directive. In fact, the use of camera–monitor devices over mirrors is often not only desirable, but also necessary in certain situations. For example, it is practically impossible to cover the area at the rear of an LGV with mirrors alone, and so camera–monitor systems are the only practical solution.
1.3.2 Japan: In Japan, legislation has also been proposed, which would require medium and large vehicles to be equipped with devices that allow the driver to detect objects in the vehicle's blind-zones, either directly or indirectly using mirrors or camera–monitor devices (Fig. 4) [20]. For the purpose of the proposed legislation, a cylinder 1 m high with a diameter of 0.3 m placed anywhere within the coverage areas must be at least partially visible to the driver of the LGV directly, by mirror or by camera. However, in this legislation, it is proposed that objects within blind-zones caused by A-pillars and external mirrors need not be visible to the driver of the vehicle [7].
1.3.3 US: In the US, proposed legislation, in the form of S.694 (The Cameron Gulbransen Kids and Cars Safety Act of 2007) [21], is designed to protect against children being injured or killed in non-traffic incidents, such as when the vehicle is reversing. Relating to the potential use of cameras, the S.694 bill requires a 'field of view (FOV) to enable the driver of a motor vehicle to view areas behind the motor vehicle to reduce death and injury resulting from backing incidents, particularly incidents involving small children and disabled persons'. Unlike the EU and Japanese legislation, however, the US legislation fails to describe in any technical detail the methods by which the objectives of the bill are to be implemented.

Figure 3: Area required by Directive 2003/97/EC to be visible to drivers of (left-hand drive) LGVs via the use of IVSs (not including areas covered by the standard and wide-angle wing-mirrors).
1.4 Wide-angle cameras to cover vehicle blind-zones

As shown in previous sections, the blind-zones on some vehicles can be quite large, particularly for vehicles such as SUVs and LGVs. The aim of this paper is to demonstrate that a good quality, undistorted image of a vehicle's blind-zones can be displayed to the driver of the vehicle using a wide-angle camera system. As shown in Fig. 5a, standard lens camera systems (e.g. 45° FOV lenses) are unable to fully cover the blind-zone of some SUVs. Considering that camera–monitor systems generally display a 'reference' point (i.e. part of the body of the vehicle) on screen, a standard lens camera with an FOV of 45° can only cover, perhaps, 1 m of the SUV blind-zone. Fig. 5b illustrates how the use of a wide-angle lens camera system (e.g. >100° FOV lenses) enables the entire SUV rearward blind-zone to be covered.
Fig. 6 shows a sample placement of two wide-angle cameras mounted on an LGV. Camera 1 is a 135° wide-angle camera, located approximately half-way down the length of the LGV, and 3 m off the ground plane. The optical axis of camera 1 is tilted at 15° from the side of the LGV trailer. Camera 2 is a 135° wide-angle camera, located in the middle of the front cabin at about 2 m off the ground plane. The optical axis of camera 2 is tilted at 20° from the front face of the cabin. With both cameras corrected for distortion, Fig. 7 shows the areas in the vicinity of the vehicle that can be displayed to the driver. Such a camera system would cover all the blind-zones of the LGV shown in Fig. 2, and would meet the requirements of both EU Directive 2003/97/EC (Fig. 3) and the proposed Japanese legislation (Fig. 4).
Certain areas around LGVs require cameras with very wide-angle lenses (e.g. fish-eye lenses) to display the entire blind-zone to the driver. However, problems arise because wide-angle lens cameras deviate from the rectilinear pin-hole camera model, owing to geometric distortion effects caused by the lens elements. Fish-eye cameras deviate substantially from the pin-hole model, introducing high levels of nonlinear geometric distortion. Because of this distorted representation of the real-world scene on screen, there is a risk that obstacles and VRUs will not be recognised by the driver. Additionally, the distortion may cause the driver to misjudge the distance to objects, because of the nonlinearity of the view presented.
Figure 4: Proposed Japanese legislation: area in which a cylinder (1 m high, 0.3 m diameter) must be at least partially visible to the driver of the (right-hand drive) LGV [20]. (a) LGV > 7.5 t; (b) LGV < 7.5 t.

Figure 5: Coverage area for standard and wide-angle lens cameras mounted on the rear of a typical SUV. (a) Example coverage area for standard lens camera (e.g. 45° FOV); (b) example coverage area for wide-angle lens camera (e.g. 100° FOV).
Figure 6: Example of a potential wide-angle camera placement on an LGV that would meet the requirements of EU Directive 2003/97/EC and the proposed Japanese legislation.
Thus, camera calibration and distortion correction are
important tasks for automotive camera applications. Not
only do they make images captured by the camera more
visually intuitive to the human observer, they are often also
necessary for computer vision tasks that require the
extraction of geometric information from a given scene.
The following sections describe some of the effects of wide-angle and fish-eye lens distortion.
2 Radial lens distortion and its correction

Radial lens distortion causes points on the image plane in the wide-angle/fish-eye camera to be displaced in a nonlinear fashion from their ideal position in the rectilinear pin-hole camera model, along a radial axis from the centre of distortion in the image plane (Fig. 8). The visual effect of this displacement in fish-eye optics is that the image will have a higher resolution in the foveal areas, with the resolution decreasing nonlinearly towards the peripheral areas of the image.

For normal and narrow FOV cameras, the effects of radial distortion can be considered negligible for most applications. However, in wide-angle and fish-eye lenses, radial distortion can cause severe problems, not only visually but also for further processing in applications such as object detection, recognition and classification. Additionally, the radial distortion introduced by these lenses does not preserve the rectilinearity of an object in its transformation from real-world coordinates to the image plane. Straight lines in the real world can usually be approximated as circular sections in the distorted image plane [22–24], as evidenced in Fig. 9.
The models described in this section are relationships between the distorted radial distance, r_d, and the corresponding undistorted radial distance, r_u. Both are measured from the COD (described in Section 3.1). Models of radial distortion can be grouped into two main categories: polynomial and non-polynomial models.
2.1 Polynomial models of radial distortion

The use of polynomials to model radial distortion in lenses is well established [25–33]. From an embedded point of view, polynomials are desirable as they do not require the implementation of computationally demanding numerical algorithms, in contrast to the log- and tan-based functions that are required for non-polynomial models (however, through the use of look-up tables, this advantage is lessened).
With the exception of the division model, the models described in this section are functions of the undistorted radial distance, that is, r_d is a function of r_u. It is usually necessary to convert a distorted image to an undistorted image, and thus obtaining r_u as a function of r_d is desirable. Problems arise with polynomial models because there is no general analytical method to invert them, that is, there is no general method to convert a forward model to an inverse model for use in radial distortion correction. However, back-mapping (described in Section 2.3.2) provides a means by which the forward model can be used to convert a distorted image to an undistorted rectilinear image.

Figure 7: Areas of coverage if the video from the cameras mounted as in Fig. 6 were corrected for geometric distortion.

Figure 8: Displacement of points in radial distortion, where P is the image point, r_d is the distorted radius and r_u is the undistorted radius.

Figure 9: Photographic example of fish-eye distortion.
2.1.1 Standard polynomial model: The standard model for radial distortion is an odd-ordered polynomial, as described by Slama in [25] and, subsequently, used in [26, 29, 32, 34]

r_d = r_u + \sum_{n=1}^{\infty} k_n r_u^{2n+1} = r_u + k_1 r_u^3 + k_2 r_u^5 + \cdots + k_n r_u^{2n+1} + \cdots   (1)

where r_d is the distorted radius, r_u the undistorted radius and k_n the coefficients of the polynomial. It is generally considered that (1) cannot accurately describe the level of distortion introduced by wide-angle lenses [35, 36]. However, for standard lenses, it is usually considered that a fifth-order polynomial is sufficient for radial distortion correction [32]

r_d = r_u + k_1 r_u^3 + k_2 r_u^5   (2)

Although there is no exact analytical method of inverting (1), an inverse to the fifth-order version (2) can be approximated as described by Mallon and Whelan [32]

r_u = r_d - r_d \frac{k_1 r_d^2 + k_2 r_d^4 + k_1^2 r_d^4 + k_2^2 r_d^8 + 2 k_1 k_2 r_d^6}{1 + 4 k_1 r_d^2 + 6 k_2 r_d^4}   (3)
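As an illustration, the forward model (2) and the approximate inverse (3) translate directly into code. The following is a minimal sketch, with k_1 and k_2 set to arbitrary illustrative values rather than coefficients calibrated for any real lens.

```python
import numpy as np

def distort_radius(r_u, k1, k2):
    """Fifth-order forward model (2): distorted radius as a function of r_u."""
    return r_u + k1 * r_u**3 + k2 * r_u**5

def undistort_radius_approx(r_d, k1, k2):
    """Mallon and Whelan's approximate inverse (3) of the fifth-order model."""
    num = (k1 * r_d**2 + k2 * r_d**4 + k1**2 * r_d**4
           + k2**2 * r_d**8 + 2 * k1 * k2 * r_d**6)
    den = 1 + 4 * k1 * r_d**2 + 6 * k2 * r_d**4
    return r_d - r_d * num / den

# Arbitrary illustrative coefficients, not from a calibrated camera
k1, k2 = -0.30, 0.08
r_u = np.linspace(0.0, 1.0, 5)                    # normalised undistorted radii
r_d = distort_radius(r_u, k1, k2)                 # forward mapping
residual = undistort_radius_approx(r_d, k1, k2) - r_u
print(np.max(np.abs(residual)))                   # small but nonzero: (3) is approximate
```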
2.1.2 Division model for radial distortion: Fitzgibbon introduces the so-called 'division model' in [31]

r_u = \frac{r_d}{1 + \sum_{n=1}^{\infty} k_n r_d^{2n}} = \frac{r_d}{1 + k_1 r_d^2 + k_2 r_d^4 + \cdots + k_n r_d^{2n} + \cdots}   (4)

It should be noted that this is inherently an inverted model, that is, it models the undistorted radial distance of a point as a function of the distorted radial distance of that point. Although (4) is similar in form to (1), it is important to note that it is not an approximation to an inversion of the standard polynomial model. Rather, both are approximations to the camera's true distortion curve. The division model suffers from problems similar to those of (1), that is, there is no general method for inversion, and it cannot accurately describe the distortion introduced by wide-angle/fish-eye lenses.

The division model is often used when circle-fitting is employed to calibrate a specific lens, as this allows distortion estimation to be reformulated as circle-fitting, for which many algorithms are available [23, 24].
2.1.3 Polynomial fish-eye transform (PFET): It has been noted already that (1) and (4) can be used to describe distortion in standard, non-fish-eye lenses, but that these models are insufficient to accurately describe the level of distortion introduced by wide-angle/fish-eye lenses. A polynomial that uses both odd and even coefficients can be used to accurately model the radial distortion introduced by a fish-eye lens [30, 33, 37, 38]. Basu and Licardie [30, 37] refer to this as the polynomial fish-eye transform (PFET)

r_d = \sum_{n=0}^{\infty} k_n r_u^n = k_0 + k_1 r_u + k_2 r_u^2 + \cdots + k_n r_u^n + \cdots   (5)

The benefit of using coefficients beyond the fifth order is generally considered negligible with the PFET [30, 37].
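Because the PFET is an ordinary polynomial in r_u, its coefficients can be estimated by straightforward least squares once a set of matched radii is available. The sketch below fits a fifth-order PFET to synthetic correspondences standing in for real calibration data; the 'true' curve used to generate them is invented for illustration only.

```python
import numpy as np

# Synthetic stand-in for calibration data: undistorted radii (from a known
# calibration diagram) matched to measured distorted radii in the image.
r_u = np.linspace(0.0, 1.0, 50)
r_d = np.arctan(2.0 * r_u) / 2.0      # invented fish-eye-like curve

# Least-squares fit of a fifth-order PFET (5); np.polyfit returns the
# highest-order coefficient first.
coeffs = np.polyfit(r_u, r_d, 5)
pfet = np.poly1d(coeffs)

print(np.max(np.abs(pfet(r_u) - r_d)))   # maximum fitting residual
```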
2.2 Non-polynomial models of radial distortion

In this section, we introduce several fish-eye distortion models that are not based on a polynomial approximation of the fish-eye lens distortion curve. One of the more notable advantages of non-polynomial models over polynomial models is that they are, in general, readily inverted using analytical methods. This is particularly useful where distortion models are used in the correction of wide-angle camera systems.
2.2.1 Perspective model: The principle of central projection states that for a rectilinear pin-hole camera, the following is true

r_u = f \tan(\theta)   (6)

where f is the apparent focal length of the fish-eye camera and θ is the angle of a light ray from the optical axis, in radians. The principle of equidistant projection states that for a lensed camera, the following is true

r_d = f \theta   (7)

In [39], the 'perspective model' of radial distortion is derived from these two principles

r_d = f \tan^{-1}\left(\frac{r_u}{f}\right)   (8)

This is the conversion between the rectilinear space described by (6) and the distorted space described by (7). The apparent focal length f is not necessarily the same as the actual focal length of the fish-eye camera system, since fish-eye optics often include several different groups of lenses that affect the actual physical focal length of the fish-eye camera.
The inverse of this model is

r_u = f \tan\left(\frac{r_d}{f}\right)   (9)
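Since both directions of the perspective model are closed-form, a sketch needs only (8) and (9); the focal length below is an assumed value in pixels, not one taken from a real camera.

```python
import numpy as np

def perspective_distort(r_u, f):
    """Perspective model (8): rectilinear radius to fish-eye radius."""
    return f * np.arctan(r_u / f)

def perspective_undistort(r_d, f):
    """Analytical inverse (9); valid while r_d / f stays below pi/2."""
    return f * np.tan(r_d / f)

f = 300.0                                   # assumed apparent focal length (pixels)
r_u = np.array([0.0, 100.0, 300.0, 600.0])
r_d = perspective_distort(r_u, f)
print(np.allclose(perspective_undistort(r_d, f), r_u))   # True: exact round trip
```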
2.2.2 Fish-eye transform (FET): Basu and Licardie proposed the FET in [30, 37]. This model is based on the observation that the fish-eye image has a higher resolution in the foveal areas and a lower resolution towards the peripheral areas

r_d = s \ln(1 + \lambda r_u)   (10)

where s is a simple scalar and λ controls the amount of distortion across the image. The inverse of this model is given by

r_u = \frac{e^{r_d / s} - 1}{\lambda}   (11)
2.2.3 FOV model: Devernay and Faugeras describe the FOV model in [40], based on a simple optical model of a fish-eye lens

r_d = \frac{1}{\omega} \tan^{-1}\left(2 r_u \tan\frac{\omega}{2}\right)   (12)

where ω is the apparent angular FOV of the fish-eye camera. As with f in the perspective model, ω may not correspond to the actual camera-system FOV, since the complex fish-eye optics involved may not exactly follow this model. Additionally, they point out that this model may not always be sufficient to model the complex distortion of fish-eye lenses. In these cases, Devernay and Faugeras state that (1) can be used with k_1 = 0 before applying (12). The inverse of this model is

r_u = \frac{\tan(r_d \omega)}{2 \tan(\omega / 2)}   (13)
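The FET and FOV models share this property of exact analytical inversion. As a minimal sketch, the FOV model pair (12) and (13) can be written as follows, with ω set to an assumed 135° to match the LGV cameras discussed in Section 1.4.

```python
import numpy as np

def fov_distort(r_u, omega):
    """FOV model (12); omega is the apparent angular FOV in radians."""
    return np.arctan(2.0 * r_u * np.tan(omega / 2.0)) / omega

def fov_undistort(r_d, omega):
    """Analytical inverse (13) of the FOV model."""
    return np.tan(r_d * omega) / (2.0 * np.tan(omega / 2.0))

omega = np.deg2rad(135.0)             # assumed apparent FOV
r_u = np.linspace(0.0, 2.0, 5)        # normalised rectilinear radii
r_d = fov_distort(r_u, omega)
print(np.allclose(fov_undistort(r_d, omega), r_u))   # True: exact round trip
```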
2.3 Radial distortion correction

Radial distortion correction is the process by which points in the distorted fish-eye image are transformed to points in the undistorted image. It is possible for radial distortion to be optically corrected using an appropriate combination of compensating lenses. However, according to Bogner in [41], the amount of distortion that can be corrected by lenses is physically limited by the refractive, reflective and transmissive characteristics of the materials from which they are made. The best wide-angle optics produce acceptable rectilinear images at FOVs up to 110°. Therefore, for wide-angle lenses with FOVs greater than 110°, it is necessary to perform some form of post-processing to convert the image to the rectilinear model. Indeed, it is often preferable to use post-processing to correct the distortion of cameras with FOVs of up to 110°, as the optics required to convert from the radially distorted image to the rectilinear image can be expensive.
2.3.1 Calibration: In general, the correction of radial distortion involves a calibration procedure to determine the parameters of one of the fish-eye models described in the previous sections, to fit the distortion of a particular fish-eye lens camera. The distortion can then be corrected by inverting the model and transforming each pixel in the image according to that model (the exception is the division model described by (4), which is already an inverse model). This results in an 'undistorted' image.

The majority of calibration procedures require the use of a calibration diagram with known geometry in 3D space. Features from the calibration diagram, such as corners, dots, lines, circles or any other features that are easily extracted from the image, are used to calibrate the camera. This is known as photogrammetric calibration, and there are numerous published methods using calibration diagrams [26, 30, 36–38, 42–44]. Alternatively, a self-calibration method can be employed, whereby the calibration system has no a priori knowledge of the scene. With this method, the necessary information is extracted from an arbitrary scene via point correspondences in multiple view geometry, circle-fitting or other suitable techniques [22, 29, 31, 45, 46].
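As a simple illustration of the photogrammetric approach, the single parameter of the perspective model (8) can be fitted by least squares to matched radii extracted from a calibration diagram. The data below are synthetic stand-ins generated from an invented ground-truth focal length; a real procedure would measure them from captured images.

```python
import numpy as np
from scipy.optimize import curve_fit

def perspective_model(r_u, f):
    """Forward perspective model (8), the model being calibrated here."""
    return f * np.arctan(r_u / f)

# Synthetic correspondences: ideal radii of calibration-diagram features
# against their measured radii in the distorted image (plus noise).
rng = np.random.default_rng(0)
r_u_pts = np.linspace(10.0, 500.0, 20)
r_d_pts = 280.0 * np.arctan(r_u_pts / 280.0) + rng.normal(0.0, 0.5, 20)

# Least-squares estimate of the apparent focal length f.
(f_est,), _ = curve_fit(perspective_model, r_u_pts, r_d_pts, p0=[300.0])
print(f_est)    # close to the invented ground truth of 280
```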
2.3.2 Vacant pixels: Because of the essential 'stretching' effect of distortion correction (undistortion), the resultant image will contain many vacant pixels that will not have been mapped (i.e. will not have been filled with a re-mapped pixel value) during the undistortion procedure, as shown by the dark pixels in Fig. 10. Interpolation methods can be used to overcome this, as implemented in [39, 42, 47].
As an alternative to simple forward stretching and interpolation, a technique known as 'back-mapping' (or 'inverse-mapping') may be used [33]. Instead of mapping every pixel in the distorted space to the undistorted space, back-mapping does the inverse: for each pixel location in undistorted space, it uses the forward transform model to determine that pixel's location in distorted space. This overcomes the problem of vacant pixels because every pixel in the undistorted image will be assigned a value, as shown in Fig. 11.

Figure 10: A corrected version of Fig. 9. The dark lines in this image are the pattern created by the vacant pixels in an undistorted image.
A problem with back-mapping arises because each pixel in distorted space may be spread over several pixels in the undistorted space during the radial distortion correction and back-mapping procedure. To help overcome this problem, bilinear interpolation can be built into the back-mapping algorithm by allowing the location of each pixel in distorted space to be estimated with non-integer accuracy using a weighted average of the surrounding pixels.
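A minimal back-mapping sketch is given below, assuming a grayscale image, the perspective model (8) as the forward transform and a known centre of distortion; a production implementation would precompute the mapping into a look-up table.

```python
import numpy as np

def backmap_undistort(distorted, f, out_shape, cod):
    """Back-mapping with bilinear interpolation (sketch): for every pixel of
    the undistorted output, the forward model locates the corresponding
    sub-pixel position in the distorted input."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    dx, dy = xs - cod[0], ys - cod[1]
    r_u = np.hypot(dx, dy)

    # Forward perspective model (8), applied as a radial scaling factor
    scale = np.where(r_u > 0, f * np.arctan(r_u / f) / np.maximum(r_u, 1e-9), 1.0)
    xd, yd = cod[0] + dx * scale, cod[1] + dy * scale

    # Bilinear interpolation over the four surrounding distorted pixels
    x0 = np.clip(np.floor(xd).astype(int), 0, distorted.shape[1] - 2)
    y0 = np.clip(np.floor(yd).astype(int), 0, distorted.shape[0] - 2)
    wx, wy = xd - x0, yd - y0
    img = distorted.astype(np.float64)
    out = (img[y0, x0] * (1 - wx) * (1 - wy)
           + img[y0, x0 + 1] * wx * (1 - wy)
           + img[y0 + 1, x0] * (1 - wx) * wy
           + img[y0 + 1, x0 + 1] * wx * wy)
    return out.astype(distorted.dtype)
```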
3 Other geometric considerations
3.1 Centre of distortion

Radial lens distortion is the displacement of image points along a radial axis from a single point on the image plane. This point is known as the COD, and it does not necessarily align with the centre of the image sensor. Therefore, in order to fully model and correct radial lens distortion, it is necessary to accurately determine the position of the COD on the image plane. In physical terms, the COD can often be assumed to be the point at which the optical axis of the camera lens system intersects the image plane. Inaccurate estimation of the COD will introduce additional radial distortion (after radial distortion correction has been applied), as well as a degree of tangential distortion, described in Section 3.2 (Fig. 12). The estimation of the COD is only relevant in wide-angle camera systems that display radial lens distortion. According to Ruiz et al. [48], the location of the COD in cameras with small to moderate FOV, in which no distortion is readily apparent, is of minor importance. A range of methods to estimate the COD are described in [30, 33, 34, 37, 38].
3.2 Tangential distortion

According to Mallon and Whelan [32], tangential distortion causes a geometric shift of the image both along, and tangential to, the radial direction through the COD, as demonstrated in Fig. 13.

There are two primary causes of tangential distortion: inaccurate COD estimation and thin prism distortion. According to Weng et al. [27], thin prism distortion arises from imperfections in lens design and manufacturing, as well as camera assembly, and causes a degree of both radial and tangential distortion. Slama [25] and Shah and Aggarwal [36] both give mathematical models to deal with tangential distortion. Stein [49] demonstrates that low levels of tangential distortion can be compensated for by using COD estimation. Most other published research makes the assumption that other causes of tangential distortion can be considered negligible [24, 26, 32, 40, 50].
Figure 11: Example of an undistorted image using back-mapping with bilinear interpolation to overcome the problem of vacant pixels.

Figure 12: Example of the shift of the centre of distortion away from the centre of the image. The dotted lines show the distortion with the COD at the image centre.

Figure 13: Effect of tangential distortion.

3.3 Uneven illumination

In cameras with wide FOV, such as those with fish-eye and wide-angle lenses, there is often a noticeable nonlinear loss of illuminance towards the periphery of an image because of the structure of the camera lens system. This is noticeable in Fig. 9, where the extremities of the image are considerably darker than the central area. One common cause of this uneven illumination is that the camera pupil appears elliptical when viewed from an angle off the optical axis. A commonly used model for this cause of uneven illumination is the 'cosine fourth' (or cos^4) model. Under this model, the illumination of a given scene point is reduced by cos^4 θ, where θ is the off-axis angle of the scene point [51]

\frac{E_\theta}{E_0} = \cos^4 \theta   (14)

where E_0 is the illuminance for a scene point on the optical axis and E_θ is the illuminance for a scene point off the optical axis.
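For a sense of scale, the model in (14) is a one-line computation; the 135° FOV below is an assumed value matching the cameras discussed earlier.

```python
import numpy as np

def cos4_falloff(theta):
    """Relative illuminance E_theta / E_0 predicted by the cos^4 model (14)."""
    return np.cos(theta) ** 4

theta_edge = np.deg2rad(135.0 / 2.0)   # off-axis angle at the edge of a 135 deg FOV
print(cos4_falloff(theta_edge))        # about 0.02, i.e. ~98% illuminance loss
```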
However, Welford [51] states that this simple model does not fully describe the uneven illumination in a camera system. Aggarwal et al. [52] and Welford [51] describe another position-dependent cause of the loss of illuminance, which is attributed to vignetting. It is based on the fact that, although an on-axis beam emanating from a distant point fully illuminates the camera aperture, an off-axis beam emanating from a distant point is obstructed by the lens elements and unable to fully fill the aperture.
However, Aggarwal et al. [52] observed that there are still disparities between the uneven illumination effects observed for given cameras and the uneven illumination that would be predicted by the standard cos^4 and vignetting models. They state that these models do not take into consideration an effect they refer to as pupil aberration, which is the nonlinear refraction of rays that results in a significant non-uniform light distribution across the aperture. Other possible causes include light being obstructed by the field-stop, lens-rim or other components of the camera.
There have been several other model-based approaches to correcting uneven illumination. For example, Asada et al. [53] developed a variable cone model to correct for uneven illumination, and Yu et al. [54] proposed the use of a hyperbolic function to describe the uneven illumination for each scanline in the image.
A non-model-based approach was proposed by Leong et al. in [55]. They observed that uneven illumination is an additive low-frequency signal in the image. They convolve a given image with a Gaussian kernel filter, the aim being to smooth the image until it is devoid of all features but retains the underlying uneven illumination pattern. This pattern is then used to correct the uneven illumination in a given image. The problem with this method is that it is potentially difficult to determine the cut-off frequency of the Gaussian filter because of the complex nature of the uneven illumination. Equally important is the fact that, if this method is employed, any external cause of uneven illumination will also be removed, which is undesirable (only the uneven illumination due to the properties of the camera and lens system should be removed).
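A sketch of this smoothing-based approach is given below, treating the illumination pattern as additive as Leong et al. describe; the kernel width sigma is an assumed value, and choosing it is precisely the difficulty noted above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_illumination_gaussian(image, sigma=50.0):
    """Estimate the low-frequency illumination pattern by heavy Gaussian
    smoothing, then subtract it while preserving the mean image level."""
    img = image.astype(np.float64)
    illumination = gaussian_filter(img, sigma)     # featureless low-pass estimate
    corrected = img - illumination + illumination.mean()
    return np.clip(corrected, 0, 255).astype(np.uint8)
```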
A relatively simple method to correct for the uneven illumination introduced by the camera is described in [56]. If a Lambertian surface (i.e. a surface that reflects light equally in all directions) is imaged, an intensity profile can be obtained that describes the uneven illumination for a given camera. A uniformly illuminated white surface can be used as an approximation to a Lambertian surface. The maximum intensity response (or pixel value) is found, and a correction factor is determined for each pixel location using the following
P_{lut}(i, j) = \frac{P_{ref,max}}{P_{ref}(i, j)}   (15)

where P_lut(i, j) is the correction factor for a given pixel location to be stored in a look-up table, P_ref(i, j) is the intensity response for the same pixel location and P_ref,max is the maximum value of P_ref(i, j). Any image taken with the same camera can be corrected by simply multiplying each pixel value by the corresponding LUT value

im_{corr}(i, j) = im_{orig}(i, j) \, P_{lut}(i, j)   (16)

Figure 14: Intensity profile of the camera used to capture Fig. 9. (a) Image of evenly illuminated white surface; (b) the corresponding correction factor LUT.
Fig. 14 shows an image of an evenly illuminated white surface, and the corresponding correction factor LUT. Fig. 15 shows an example of a corrected image using this LUT.
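The LUT construction of (15) and the per-pixel correction of (16) are equally direct; the sketch below uses a synthetic reference frame as a stand-in for a real capture of a uniformly illuminated white surface.

```python
import numpy as np

def build_illumination_lut(reference):
    """Build the correction LUT of (15) from an image of a uniformly
    illuminated white (approximately Lambertian) surface."""
    ref = reference.astype(np.float64)
    return ref.max() / np.maximum(ref, 1.0)   # guard against zero-valued pixels

def correct_image(image, lut):
    """Apply (16): multiply each pixel by its stored correction factor."""
    out = image.astype(np.float64) * lut
    return np.clip(out, 0, 255).astype(np.uint8)

# Synthetic stand-ins for the reference capture and a frame to correct
rng = np.random.default_rng(0)
reference = (255 * rng.uniform(0.5, 1.0, (480, 640))).astype(np.uint8)
lut = build_illumination_lut(reference)
frame = np.full((480, 640), 128, dtype=np.uint8)
corrected = correct_image(frame, lut)
```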
4 Summary

In this paper, we discussed the rationale within the automotive industry for the on-vehicle use of camera systems. Specifically, with increasing numbers of vehicles on the world's roads, statistics show that a significant percentage of traffic fatalities are caused by drivers who are not aware of VRUs in their vehicle's blind-zones.

Although customer demand for products that give the driver information about a vehicle's blind-zones is already high, pending and existing legislation is, de facto, making the installation of such cameras on vehicles a necessity, particularly for large SUVs and LGVs. Moreover, although small vehicles are exempt from current European and Japanese legislation, the US legislation specifically targets smaller private vehicles.

Ultimately, camera systems, in conjunction with other available technology, will be necessary equipment for improved visibility around vehicles, particularly SUVs and LGVs. Wide-angle/fish-eye camera systems are currently the best candidates, because of their ability to display even the largest blind-zones of a vehicle.
Considering this, we described several types of visual distortion resulting from the use of wide-angle cameras (particularly fish-eye cameras), and reviewed several methods to correct for these effects. Radial distortion is by far the most evident geometric effect of using fish-eye cameras, although other distortions should be considered, such as inaccurate COD estimation and uneven illumination.

If all the necessary distortions are removed from a fish-eye image, the result is an image that, for many applications, accurately approximates the desired rectilinear model. Fig. 16 shows an example of the correction of all distortions described in this paper. Comparison of this figure with the original distorted image in Fig. 9 illustrates the combined effectiveness of the methods described.
Additionally, there is the potential for advanced processing if camera systems are used in place of mirrors, such as VRU detection, which can actively warn the driver of the vehicle of potential collisions. Future work could include the examination of existing and potential products that employ this type of detection. Additionally, although this paper did not cover it, the fusion of camera systems with other sensory devices (e.g. passive/active infra-red, sonar and radar) is an area that deserves a full review.

Figure 15: Fig. 9 with the uneven illumination removed. (a) Original image; (b) uneven illumination removed.

Figure 16: Fig. 9 with all distortions removed.
5 Acknowledgments
This research is funded by Enterprise Ireland and Valeo Vision
Systems (formerly Connaught Electronics Ltd.) under the
Enterprise Ireland Innovation Partnerships Scheme.
6 References

[1] European New Car Assessment Program (Euro NCAP): http://www.euroncap.com/, accessed March 2008

[2] HOBBS A.: 'Euro NCAP/MORI survey on consumer buying interests (speech and presentation)'. Proc. Euro NCAP Conf.: Creating a Market for Safety – 10 Years of Euro NCAP, Brussels, Belgium, November 2005

[3] Japanese National Agency for Automotive Safety and Victims Aid: 'New Car Assessment (JNCAP)'. http://www.nasva.go.jp/mamoru/indexe.html, accessed March 2008

[4] US National Highway Traffic Safety Administration: 'USNCAP'. http://www.safercar.gov/, accessed March 2008

[5] Department for Transport Research Database: 'Secondary safety'. http://www.dft.gov.uk/rmd/subprogramme.asp?intProgrammeID=74&intSubProgrammeID=132, accessed June 2008

[6] Department for Transport Research Database: 'Primary and e-safety'. http://www.dft.gov.uk/rmd/subprogramme.asp?intProgrammeID=74&intSubProgrammeID=134, accessed June 2008

[7] UNECE Working Party on General Safety Provisions (GRSG): 'GRSG-83-Inf15e: outline of draft amendment to ECE Regulation No. 46 (draft requirements for driver's field of vision of immediate frontward and sideward)', 2002

[8] Consumers Union: 'Consumer reports – blind-zone measurements'. http://www.consumerreports.org/, accessed March 2008

[9] EHLGEN T., PAIDLA T.: 'Maneuvering aid for large vehicle using omnidirectional cameras'. Proc. IEEE Workshop on Applications of Computer Vision, Austin, Texas, US, February 2007

[10] European Commission: 'Community database on accidents on the roads in Europe (CARE)'. http://ec.europa.eu/, accessed November 2007

[11] European Commission Directorate-General for Energy and Transport: 'Halving the number of road accident victims in the EU by 2010: a shared responsibility' (European Road Safety Action Programme), 2004

[12] Commission of the European Communities: 'Commission staff working document: accompanying document to the proposal for a directive of the European Parliament and of the Council on the retrofitting of mirrors to heavy goods vehicles registered in the Community. Full impact assessment COM(2006)570', October 2006

[13] UK Health and Safety Executive: 'Statistics: number of workplace transport injuries'. http://www.hse.gov.uk/, accessed March 2008

[14] Kids and Cars Organisation: http://kidsandcars.org/, accessed March 2008

[15] MCLOUGHLIN E., MIDDLEBROOKS J., ANNEST J., HOLMGREEN P., DELLINGER A.: 'Injuries and deaths among children left unattended in or around motor vehicles – US, July 2000 – June 2001', Morb. Mortal. Wkly. Rep., 2002, 51, (26), pp. 570–572

[16] PATEL R., DELLINGER A., ANNEST J.: 'Nonfatal motor-vehicle-related backover injuries among children – United States, 2001–2003', Morb. Mortal. Wkly. Rep., 2005, 54, (6), pp. 144–146

[17] WANG J., KNIPLING R.: 'Lane change/merge crashes: problem size assessment and statistical description'. Report No. DOT HS 808-075, United States Department of Transportation, 1994

[18] European Parliament and Council: 'Directive 2003/97/EC of 10 November 2003 on the approximation of the laws of the Member States relating to the type-approval of devices for indirect vision and of vehicles equipped with these devices, amending Directive 70/156/EEC and repealing Directive 71/127/EEC', November 2003

[19] European Parliament and Council: 'Directive 2007/38/EC of 11 July 2007 on the retrofitting of mirrors to heavy goods vehicles registered in the Community', July 2007

[20] UNECE Working Party on General Safety Provisions (GRSG): 'GRSG-89-26: proposal for step-2 revision of Regulation No. 46', 2005

[21] American Library of Congress: 'S.694: Cameron Gulbransen Kids and Cars Safety Act of 2007 (reported in Senate)'. http://thomas.loc.gov/cgibin/query/D?c110:2:./temp/c110G9XilI::, accessed June 2008

[22] BRAUER-BURCHARDT C., VOSS K.: 'A new algorithm to correct fish-eye- and strong wide-angle-lens-distortion from single images'. Proc. IEEE Int. Conf. Image Processing, Thessaloniki, Greece, October 2001, vol. 1, pp. 225–228

[23] BARRETO J.P., DANIILIDIS K.: 'Fundamental matrix for cameras with radial distortion'. Proc. IEEE Int. Conf. Computer Vision, Beijing, China, October 2005, vol. 1, pp. 625–632
[24] STRAND R., HAYMAN E.: 'Correcting radial distortion by circle fitting'. Proc. BMVA British Machine Vision Conf., Oxford, UK, September 2005

[25] SLAMA C.C. (ED.): 'Manual of photogrammetry' (American Society of Photogrammetry, 1980, 4th edn.)

[26] TSAI R.: 'A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses', IEEE J. Robot. Autom., 1987, 3, (4), pp. 323–344

[27] WENG J., COHEN P., HERNIOU M.: 'Camera calibration with distortion models and accuracy evaluation', IEEE Trans. Pattern Anal. Mach. Intell., 1992, 14, (10), pp. 965–980

[28] NOMURA Y., SAGARA M., NARUSE H., IDE A.: 'Simple calibration algorithm for high-distortion lens camera', IEEE Trans. Pattern Anal. Mach. Intell., 1992, 14, (11), pp. 1095–1099

[29] DEVERNAY F., FAUGERAS O.D.: 'Automatic calibration and removal of distortion from scenes of structured environments'. Proc. SPIE Investigative and Trial Image Processing Conf., San Diego, California, US, 1995, vol. 2567, pp. 62–72

[30] BASU A., LICARDIE S.: 'Alternative models for fish-eye lenses', Pattern Recognit. Lett., 1995, 16, (4), pp. 433–441

[31] FITZGIBBON A.W.: 'Simultaneous linear estimation of multiple view geometry and lens distortion'. Proc. IEEE Computer Society Conf. Computer Vision and Pattern Recognition, Kauai, Hawaii, US, December 2001, vol. 1, pp. 125–132

[32] MALLON J., WHELAN P.F.: 'Precise radial un-distortion of images'. Proc. IEEE Int. Conf. Pattern Recognition, Surrey, UK, August 2004, vol. 1, pp. 18–21

[33] ASARI K.V.: 'Design of an efficient VLSI architecture for non-linear spatial warping of wide-angle camera images', J. Syst. Archit., 2004, 50, (12), pp. 743–755

[34] AHMED M., FARAG A.: 'Nonmetric calibration of camera lens distortion: differential methods and robust estimation', IEEE Trans. Image Process., 2005, 14, (8), pp. 1215–1230

[35] THIRTHALA S., POLLEFEYS M.: 'The radial trifocal tensor: a tool for calibrating the radial distortion of wide-angle cameras'. Proc. IEEE Computer Society Conf. Computer Vision and Pattern Recognition, San Diego, California, US, June 2005, vol. 1, pp. 321–328

[36] SHAH S., AGGARWAL J.K.: 'Intrinsic parameter calibration procedure for a (high-distortion) fish-eye lens camera with distortion model and accuracy estimation', Pattern Recognit., 1996, 29, (11), pp. 1775–1788

[37] BASU A., LICARDIE S.: 'Modeling fish-eye lenses'. Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Yokohama, Japan, July 1993, vol. 3, pp. 1822–1828

[38] SHAH S., AGGARWAL J.K.: 'A simple calibration procedure for fish-eye (high distortion) lens camera'. Proc. IEEE Int. Conf. Robotics and Automation, New Orleans, Louisiana, US, April 1994, vol. 4, pp. 3422–3427

[39] ISHII C., SUDO Y., HASHIMOTO H.: 'An image conversion algorithm from fish eye image to perspective image for human eyes'. Proc. IEEE/ASME Int. Conf. Advanced Intelligent Mechatronics, Port Island, Kobe, Japan, July 2003, vol. 2, pp. 1009–1014

[40] DEVERNAY F., FAUGERAS O.: 'Straight lines have to be straight: automatic calibration and removal of distortion from scenes of structured environments', Int. J. Mach. Vis. Appl., 2001, 13, (1), pp. 14–24

[41] BOGNER S.L.: 'An introduction to panospheric imaging'. Proc. IEEE Int. Conf. Systems, Man and Cybernetics: Intelligent Systems for the 21st Century, Vancouver, British Columbia, Canada, October 1995, vol. 4, pp. 3099–3106

[42] FERNANDES J.C.A., FERREIRA M.J.O., NEVES J.A.B.C., COUTO C.A.C.: 'Fast correction of lens distortion for image applications'. Proc. IEEE Int. Symp. Industrial Electronics, Guimarães, Portugal, July 1997, vol. 2, pp. 708–712

[43] BAKSTEIN H., PAJDLA T.: 'Panoramic mosaicing with a 180° field of view lens'. Proc. IEEE Workshop on Omnidirectional Vision, Hilton Head Island, South Carolina, US, June 2002, pp. 60–67

[44] ZHANG Z.: 'Flexible camera calibration by viewing a plane from unknown orientations'. Proc. IEEE Int. Conf. Computer Vision, Kerkyra, Greece, September 1999, vol. 1, pp. 666–673

[45] BRAUER-BURCHARDT C., VOSS K.: 'Automatic lens distortion calibration using single views'. Mustererkennung, DAGM-Symposium, Kiel, Germany, September 2000, pp. 187–194

[46] STEIN G.P.: 'Lens distortion calibration using point correspondences'. Proc. IEEE Computer Society Conf. Computer Vision and Pattern Recognition, San Juan, Puerto Rico, June 1997, pp. 602–608

[47] YU W., CHUNG Y.: 'An embedded camera lens distortion correction method for mobile computing applications'. Proc. IEEE Int. Conf. Consumer Electronics, Los Angeles, California, US, June 2003, pp. 400–401
[48] RUIZ A., LOPEZ-DE TERUEL P.E., GARCIA-MATEOS G.: 'A note on principal point estimability'. Proc. IEEE Int. Conf. Pattern Recognition, Quebec, Canada, August 2002, vol. 2, pp. 304–307

[49] STEIN G.P.: 'Internal camera calibration using rotation and geometric shapes'. MSc thesis, Massachusetts Institute of Technology, 1993

[50] HELFERTY J.P., ZHANG C., MCLENNAN G., HIGGINS W.E.: 'Videoendoscopic distortion correction and its application to virtual guidance of endoscopy', IEEE Trans. Med. Imaging, 2001, 20, (7), pp. 605–617

[51] WELFORD W.T.: 'Useful optics (Chicago lectures in physics)' (University of Chicago Press, 1991)

[52] AGGARWAL M., HUA H., AHUJA N.: 'On cosine-fourth and vignetting effects in real lenses'. Proc. IEEE Int. Conf. Computer Vision, Vancouver, British Columbia, Canada, July 2001, vol. 1, pp. 472–479

[53] ASADA N., AMANO A., BABA M.: 'Photometric calibration of zoom lens systems'. Proc. IEEE Int. Conf. Pattern Recognition, Vienna, Austria, August 1996, vol. 1, pp. 186–190

[54] YU W., CHUNG Y., SOH J.: 'Vignetting distortion correction method for high quality digital imaging'. Proc. IEEE Int. Conf. Pattern Recognition, Surrey, UK, August 2004, vol. 3, pp. 666–669

[55] LEONG F.J.W.M., BRADY M., MCGEE J.O.: 'Correction of uneven illumination (vignetting) in digital microscopy images', J. Clin. Pathol., 2003, 56, pp. 619–621

[56] WYSZECKI G., STILES W.S.: 'Color science: concepts and methods, quantitative data and formulae' (John Wiley and Sons Inc., 2000, 2nd edn.)