Singular Point Detection
1 Introduction
The rest of the paper is organized as follows. Section 1.1 discusses the prior work done in this area. Section 2 presents an overview of the proposed approach. The concept of linear phase portraits and its application to fingerprint images is discussed in detail in Section 3. The results of the experiments are presented in Section 4. Finally, the conclusions are discussed in Section 5.
Figure 1. Illustration of fingerprint features. Global features: (b) orientation image, singular points core (c) and delta (d). Local minutiae features: bifurcation (e), ridge dot (f), ridge ending (g) and ridge island (h).
1.1 Prior Work
There have been several approaches for the detection of singular points in the literature. By far the most popular method is the one proposed by Kawagoe and Tojo [4], which is based on the computation of the Poincaré index. The Poincaré index for any given point in a vector field is computed by summing the orientation differences between successive elements along a closed path around that point. It can be shown that around closed curves the Poincaré index assumes only one of the discrete values 0°, ±180°, ±360°. The singular regions can therefore be easily detected, since the singularities corresponding to the loop, whorl and delta assume Poincaré index values of 180°, 360° and -180° respectively. Recently, an interesting refinement of the Poincaré index was proposed by Bazen and Gerez [1]. In this method, the line integration around each point is avoided by using Green's theorem, which states that a line integral around any point in a vector field can be replaced by a surface integral of the curl of the vector field [12]. However, methods based on the Poincaré index are very sensitive to noise and lead to the detection of false singularities in low-quality images.
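As a concrete illustration of the Poincaré index computation described above, the following minimal sketch (an assumption in Python, not code from any of the cited works) sums the wrapped orientation differences along a closed 8-neighbour path around a block of a block-wise orientation image given in radians; the function name and the particular path are illustrative.

```python
import numpy as np

def poincare_index(orientation, i, j):
    """Poincare index (in degrees) at block (i, j) of an orientation image.

    Sums the wrapped orientation differences along a closed 8-neighbour
    path around the block; values near 180, 360 and -180 indicate loop,
    whorl and delta candidates respectively.
    """
    # Closed counter-clockwise path around (i, j); first element repeated at the end
    path = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]
    angles = [orientation[i + di, j + dj] for di, dj in path]

    index = 0.0
    for a, b in zip(angles[:-1], angles[1:]):
        d = b - a
        # Ridge orientations are defined modulo pi; wrap the difference to (-pi/2, pi/2]
        if d > np.pi / 2:
            d -= np.pi
        elif d <= -np.pi / 2:
            d += np.pi
        index += d
    return np.degrees(index)
```

In low-quality regions the wrapped differences are dominated by noise, which is the source of the false detections mentioned above.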
Srinivasan and Murthy [10] proposed another approach based on the examination of the local characteristics of the orientation image. They examine the dominant direction of the orientation in four quadrants around each point and use several rules to detect and classify the singular points. This method is very heuristic in its approach. A more elegant approach was suggested by Cappelli et al. [2]. They proposed an irregularity operator that indicates the local deviation of the orientation around each point. The operator is applied to each pixel in the image to yield an irregularity map. Singularities can be extracted by examining the dark regions within the map. However, the extraction requires considerable processing of the irregularity map. They also proposed a novel partitioning-based approach, where the ridge flow is segmented into regions of consistent orientation by an iterative clustering algorithm. Singular points are determined by considering the locations where the region borders intersect.
2 Overview
Figure 2 shows the overview of the proposed approach. The orientation image describing the ridge flow is obtained using well-known methods outlined in [6]. While it is difficult to model the entire orientation image through any linear method, it can be modeled as a combination of piece-wise linear flows. Each piece-wise linear flow is obtained by moving a sliding window of size W×W across the orientation image; in our case W=7. At each position of the window, we derive a parametric representation of the local flow using linear phase portraits. This representation provides us not only with the possible location of the singularity but also with its nature. More details are presented in Section 3. As each piece-wise linear flow is analyzed, evidence is accumulated about the possible location and type of the singularities, in a spirit similar to the generalized Hough transform [3]. The two-dimensional accumulator array can be treated as an intensity image, where each position indicates the likelihood of being a singular point. This image is processed using grey-scale morphology and connected component analysis to obtain a best estimate of the type and position of each of the singular points. The proposed approach of analyzing the orientation locally has several advantages over global fingerprint models:
1. The analysis assumes the orientation image to be piece-wise linear, and can therefore handle any arbitrary flow pattern.
2. The parameter estimation is done using a least-squares minimization approach, rendering the process very robust to noise.
3. The analysis provides an over-complete description that can be used to reconstruct a smooth estimate of the ridge flow.
3 Linear Phase Portraits
In this section, we present an outline of linear phase portraits that is sufficient for the analysis of fingerprint images, and also explain their application to singular point detection. A more complete discussion of linear phase portraits is presented in [6,7,9].
Phase portraits are based on the geometric theory of differential equations and are used to derive a qualitative description of vector flows, and of fluid flows in particular. Consider a system of linear differential equations in two dimensions described by
$$\begin{bmatrix} \dot{x} \\ \dot{y} \end{bmatrix} = \begin{bmatrix} \dfrac{dx}{dt} \\[4pt] \dfrac{dy}{dt} \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} b_1 \\ b_2 \end{bmatrix} \tag{1}$$
The two functions x(t) and y(t) represent a curve in the (x, y) plane. The differential equation in Eqn. (1) specifies how the curve evolves with t. The geometrical representation of the qualitative behavior of the curve is known as the phase portrait. It can be shown that for a linear system of equations only a finite number of qualitative phase portraits are possible [6], with the other instances being related by an equivalence relation. The nature of the flow can be determined through its local properties, defined by
$$\mathrm{def}(X) = \sqrt{(a_{11} - a_{22})^2 + (a_{21} + a_{12})^2} \tag{5}$$
$$\Delta = \mathrm{def}(X)^2 - \mathrm{curl}(X)^2 \tag{6}$$
Table 1. Possible configurations of phase portraits and their required conditions. Of all the possible configurations, only the spiral and the saddle types of phase portraits are relevant to fingerprint images, since they correspond directly to the loop/whorl and delta types of singularities.
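Since Table 1 is only summarised here, the sketch below uses the standard eigenvalue-based classification of a 2×2 linear system as a stand-in: a saddle has real eigenvalues of opposite sign, while a spiral has complex eigenvalues with a non-zero real part. This is an assumption based on standard phase portrait theory; the paper's Table 1 presumably expresses equivalent conditions in terms of the quantities in Eqns. (5)-(6).

```python
import numpy as np

def classify_phase_portrait(A, tol=1e-9):
    """Classify the 2x2 matrix A of a linear phase portrait.

    Standard linear-system classification (assumed equivalent to Table 1):
      - 'saddle' : real eigenvalues of opposite sign -> delta
      - 'spiral' : complex eigenvalues, Re != 0      -> loop/whorl
    """
    eigvals = np.linalg.eigvals(A)
    if np.all(np.abs(eigvals.imag) < tol):
        if eigvals.real[0] * eigvals.real[1] < 0:
            return 'saddle'
        return 'node'      # both real eigenvalues of the same sign
    if np.all(np.abs(eigvals.real) > tol):
        return 'spiral'
    return 'center'        # purely imaginary eigenvalues
```

Only windows classified as 'spiral' or 'saddle' contribute evidence in the voting stage described below.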
The orientation image is divided into a system of piece-wise linear flows using a sliding window centered at each pixel in the orientation image. At each position, the corresponding matrix A and vector B are determined. These represent the parameters of the system of linear differential equations that is capable of reproducing the local orientation. These parameters can be used to determine the location of the singular point, as outlined in the next section.
In order to derive the parameters A and B for each local flow, we adopt a least-squares minimization approach that results in a synthesized flow with minimum deviation from the observed flow. In particular, we try to minimize the distance measure specified by
$$D = \sum_{(i,j) \in W} \sin^2\!\big(\theta - \hat{\theta}\big) \tag{7}$$
$$\hat{\theta}(x, y) = \arctan\frac{\dot{y}}{\dot{x}} = \arctan\frac{a_{21}x + a_{22}y + b_2}{a_{11}x + a_{12}y + b_1} \tag{8}$$
Here $\hat{\theta}(x, y)$ represents the orientation from the synthesized flow and $\theta(x, y)$ represents the orientation in the observed flow. The distance metric D is invariant to the ±180° ambiguity between $\hat{\theta}(x, y)$ and $\theta(x, y)$. Other distance metrics, such as the acute angle between the lines with orientations $\hat{\theta}(x, y)$ and $\theta(x, y)$ given by Eqn. (9), are also invariant to this phase ambiguity and may be used without affecting the nature of the solution. The parameters (a11, a12, a21, a22, b1, b2) are evaluated using the Levenberg-Marquardt non-linear least-squares minimization approach. More details of the implementation are provided in [6].
$$D = \sum_{(x, y)} \frac{\tan\theta(x, y) - \tan\hat{\theta}(x, y)}{1 + \tan\theta(x, y)\,\tan\hat{\theta}(x, y)} \tag{9}$$
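A minimal sketch of this parameter estimation is given below, assuming SciPy's Levenberg-Marquardt solver and the residual $\sin(\theta - \hat{\theta})$ from Eqn. (7); the window coordinate convention (offsets centred at zero) and the initial guess are illustrative choices, not values prescribed by the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_phase_portrait(theta_win):
    """Fit (a11, a12, a21, a22, b1, b2) of a linear phase portrait to a
    W x W window of observed orientations (radians) by minimising
    D = sum sin^2(theta - theta_hat), as in Eqns. (7)-(8)."""
    W = theta_win.shape[0]
    ys, xs = np.mgrid[0:W, 0:W] - W // 2          # window offsets centred at 0
    x, y, theta = xs.ravel(), ys.ravel(), theta_win.ravel()

    def residuals(p):
        a11, a12, a21, a22, b1, b2 = p
        theta_hat = np.arctan2(a21 * x + a22 * y + b2,
                               a11 * x + a12 * y + b1)
        # The square of sin(theta - theta_hat) is invariant to the
        # 180-degree ambiguity of ridge orientations
        return np.sin(theta - theta_hat)

    p0 = np.array([0.0, -1.0, 1.0, 0.0, 0.0, 0.0])   # illustrative start: pure rotation
    sol = least_squares(residuals, p0, method='lm')   # Levenberg-Marquardt
    a11, a12, a21, a22, b1, b2 = sol.x
    return np.array([[a11, a12], [a21, a22]]), np.array([b1, b2])
```

Because `least_squares` minimises the sum of squared residuals, the quantity being minimised is exactly D of Eqn. (7).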
The least-squares optimization approach renders the estimation fairly robust to noise, a common source of errors in the Poincaré index based approaches. The resilience of the method is shown in Figure 3, where the flow is reconstructed in the presence of Gaussian noise in the orientation image.
Figure 3. (a) Original orientation flow (b) Flow in the presence of noise (c) Reconstructed flow using linear phase portrait analysis.
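As a usage example under the same assumptions, one can perturb a synthetic whorl-like orientation window with Gaussian noise and verify that the fitted portrait (using fit_phase_portrait from the sketch above) still reproduces the clean flow; the noise level and window size are arbitrary.

```python
import numpy as np

# Synthetic spiral-like orientation window (ridges roughly tangential to circles)
W = 7
ys, xs = np.mgrid[0:W, 0:W] - W // 2
theta = np.arctan2(xs, -ys)                                    # tangential direction field
theta_noisy = theta + np.random.normal(0, 0.2, theta.shape)    # add Gaussian noise

A, b = fit_phase_portrait(theta_noisy)                         # from the sketch above
theta_hat = np.arctan2(A[1, 0] * xs + A[1, 1] * ys + b[1],
                       A[0, 0] * xs + A[0, 1] * ys + b[0])
err = np.abs(np.sin(theta - theta_hat))                        # residual vs. the clean field
print("mean |sin(error)|:", err.mean())
```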
Given the linear phase portrait description, the singularities represent critical points
where the curve [x(t),y(t)] is stationary. This is specified by the condition,
$$\dot{X} = AX + B = 0 \tag{10}$$
The position of the singular point is given by
$$X_c = -A^{-1}B \tag{11}$$
However, all types of singularities satisfy this condition. In order to estimate the positions of the core and the delta separately, we make use of the classification provided in Table 1. As we analyze each section of the orientation image, an estimate of the possible location of each singularity is obtained based on Eqn. (11). We use this information to update an accumulator that represents the likelihood of each cell being a singularity. Separate accumulators are maintained for core (spiral) and delta (saddle) singularities. This is similar to the voting approach used in the generalized Hough transform. However, unlike the Hough transform, we also increment the votes of the neighboring cells in a weighted fashion to make the estimate more robust to noise.
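A minimal sketch of the voting step is given below, assuming A and B come from the fit sketched earlier and the same window coordinate convention; the Gaussian weighting, its width and the guard against nearly singular A are illustrative choices rather than the paper's exact parameters.

```python
import numpy as np

def cast_vote(accumulator, window_center, A, b, sigma=2.0):
    """Vote for the critical point X_c = -A^{-1} B (Eqn. 11) of one local
    linear flow, spreading the vote over a small Gaussian neighbourhood."""
    if abs(np.linalg.det(A)) < 1e-6:
        return                                 # nearly singular A: no reliable critical point
    xc_local = -np.linalg.solve(A, b)          # critical point in window coordinates (x, y)
    ci, cj = window_center                     # (row, col) of the window centre in the image
    # Window coordinates are (x, y) = (column, row) offsets from the centre
    pi, pj = ci + xc_local[1], cj + xc_local[0]
    h, w = accumulator.shape
    for di in range(-3, 4):
        for dj in range(-3, 4):
            i, j = int(round(pi)) + di, int(round(pj)) + dj
            if 0 <= i < h and 0 <= j < w:
                accumulator[i, j] += np.exp(-(di**2 + dj**2) / (2 * sigma**2))
```

A call such as `cast_vote(spiral_map, (i, j), A, b)` would be made only when the window is classified as a spiral, and `cast_vote(saddle_map, ...)` only when it is classified as a saddle.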
The resulting position maps I(x, y) can be treated as grey-scale images with each pixel intensity proportional to the likelihood of the presence of a singularity. In order to estimate the positions of the singular points, the following algorithm is applied individually to both the spiral and the saddle map resulting from the phase portrait analysis.
1. The intensity image is binarized using a relative threshold T1 and an absolute threshold T2. Here T1 is a threshold on the relative likelihood of a pixel being a singular point, and T2 is the absolute threshold indicating the minimum vote required by a pixel for it to be considered as a candidate position. The absolute threshold is required to prevent the detection of false singular points in instances where they are not present in the image at all. This case arises when the fingerprint is placed on the sensor such that the core or delta(s) do not appear in the image. However, due to noise, some linear neighborhoods may be incorrectly classified as candidates for core or delta and may lead to a false detection. The absolute threshold helps us eliminate such spurious candidates. The binary image M(x, y) is obtained using the rule
$$M(x, y) = \begin{cases} 1 & \text{if } I(x, y) > \max(T_1, T_2) \\ 0 & \text{otherwise} \end{cases}, \qquad T_1 = \max\{I(x, y)\},\ T_2 = 10 \tag{12}$$
2. Possible locations where a singularity may lie are obtained by performing connected component analysis of the thresholded image. Each component is positioned roughly centered on a singularity. Components with too few or too many pixels are rejected as outliers. The resulting map consists of a set of candidate locations.
3. The positions of the singular points are determined as the centroids of the candidate locations: for each connected component C, the centroid of its pixels is taken as the position of the corresponding singular point, as in the sketch following this list.
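The sketch below implements the three steps above with SciPy's connected component utilities; the use of a non-strict comparison (so that the map maximum itself survives T1 = max I) and the size bounds for the outlier rejection are assumptions, since the paper does not state those values.

```python
import numpy as np
from scipy import ndimage

def extract_singular_points(I, T2=10, min_size=1, max_size=200):
    """Binarise a likelihood map I (spiral or saddle accumulator) as in
    Eqn. (12), group candidate pixels into connected components and
    return one centroid per surviving component."""
    T1 = I.max()
    M = I >= max(T1, T2)                       # Eqn. (12), with >= so the maximum passes
    labels, n = ndimage.label(M)               # connected component analysis
    points = []
    for k in range(1, n + 1):
        size = int((labels == k).sum())
        if size < min_size or size > max_size:
            continue                            # reject outlier components
        points.append(ndimage.center_of_mass(M, labels, k))   # (row, col) centroid
    return points
```

Running this on the spiral map yields the candidate core positions, and on the saddle map the candidate delta positions.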
4 Experimental Evaluation
We evaluated the algorithm using 100 random images from the FVC2002 DB1 database. The images were cropped to remove the background prior to processing. Figures 3 and 4 show the results for two such images.
Figure 4. (a) Original image (b) Orientation image (c) Spiral map (d) Saddle map (e) Results of the spiral/saddle map processing.
In order to evaluate the accuracy of the proposed method, we estimated the positions of the singular points in each of the 100 images and compared them with the ground truth locations manually marked by the authors. The deviation is measured in terms of the absolute distance between the estimated and actual locations of the singularities and is shown for each sample. The algorithm fails to detect singular points in certain cases; a majority of these instances are those where the singular points are located at the edge of the fingerprint foreground, and the absence of the surrounding regions makes it difficult to detect the deviation in the flow. The summary results are shown in Table 2. It can be seen that, on average, the core and delta positions can be localized to within a ridge width.
Figure 5. (a) Regression line for the X position of the core location (b) Regression line for the Y position of the core location. Similar plots can be obtained for the delta location and have been omitted for brevity.
5 Conclusions
In this paper we presented a novel approach for the estimation of singular point positions in fingerprint images using linear phase portraits. The proposed approach has several advantages over existing methods based on the Poincaré index, in particular its robustness to noise and its ability to provide a qualitative description of the flow at each point. We also presented an objective evaluation of the algorithm over 100 images of the FVC2002 DB1 database and showed that it provides a very accurate estimate of the singular point positions. Future work will involve providing an optimal reconstruction algorithm for the orientation image based on the piece-wise linear analysis. Currently, the flow parameters are evaluated at each location in the orientation image, leading to an over-determined system of piece-wise linear flows. We will explore the optimal sampling strategy in order to reduce the computational complexity.
References
1. Bazen A. M. and Gerez S. H., “Systematic Methods for the Computation of Direc-
tional Fields and Singular Points of Fingerprints”, IEEE Transactions on PAMI,
Vol. 24, No. 7, July 2002
2. Cappelli R., Lumini A., Maio D., Maltoni D., "Fingerprint Classification by Directional Image Partitioning", IEEE Transactions on PAMI, Vol. 21, No. 5, pp. 402-421, 1999.
3. Duda R. O, Hart P. E. “Use of the Hough transformation to detect lines and curves
in pictures”, Communications of the ACM, Vol. 15, No. 1, 1972
4. Kawagoe M., Tojo A., “Fingerprint Pattern Classification”, Pattern Recognition,
Vol. 17 No. 3, pp. 295-303, 1984.
5. Maltoni D., Maio D., Jain A. K. and Prabhakar S., "Handbook of Fingerprint Recognition", Springer-Verlag, 2003.
6. Rao R., “A Taxonomy for Texture Description and Identification”, Springer-Verlag
1990.
7. Rao R., and Jain R., "Analyzing Oriented Textures through Phase Portraits" ICPR
1990, pp. 336-340.
8. Sherlock B. and Monro D. "A Model for Interpreting Fingerprint Topology", Pat-
tern Recognition, Vol. 26, No. 7, pp. 1047-1055, 1993.
9. Shu C. F., Jain R., and Quek F., "A linear Algorithm for Computing the Phase
Portraits of Oriented Textures," , CVPR 1991, pp. 352-357.
10. Srinivasan V. S., Murthy N. N., "Detection of Singular Points in Fingerprint Images", Pattern Recognition, Vol. 25, No. 2, pp. 139-153, Feb. 1992.
11. Vizcaya P. and Gerhardt L., "A Nonlinear Orientation Model for Global Description of Fingerprints", Pattern Recognition, Vol. 29, No. 7, pp. 1221-1231, 1996.
12. Weisstein E. W., "Green's Theorem", http://mathworld.wolfram.com
13. Zhou J., Gu J., "Modeling Orientation Fields of Fingerprints with Rational Complex Functions", Pattern Recognition, Vol. 37, No. 2, pp. 389-391, 2004.