Fingerprint classification based on decision tree from singular points and orientation field

Jing-Ming Guo a, Yun-Fu Liu a, Jia-Yu Chang a, Jiann-Der Lee a,b

a Department of Electrical Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan
b Department of Electrical Engineering, Chang Gung University, Taipei, Taiwan

Article info

    Keywords:

    Fingerprint

    Fingerprint identification

    Fingerprint singularity

    Decision tree classifier

    Fingerprint analysis

Abstract

In this study, a high-accuracy fingerprint classification method is proposed to enhance the efficiency of fingerprint recognition systems. Fingerprint recognition has been considered a reliable mechanism for criminal identification and forensics owing to its invariance property, yet the huge database is the key issue that makes such systems sluggish. In former works, pre-classification has proven an effective way to speed up the process, yet the accuracy of the classification dominates the subsequent recognition rate and processing speed. In this paper, a rule-based fingerprint classification method is proposed, wherein two features, the types of singular points and the number of each type, are adopted to distinguish different fingerprints. Moreover, when fingerprints are indistinguishable by these features, the proposed Center-to-Delta Flow (CDF) and Balance Arm Flow (BAF) are employed for further classification. As documented in the experimental results, a good accuracy rate is achieved, which endorses the effectiveness of the proposed classification scheme for subsequent fingerprint recognition.

© 2013 Elsevier Ltd. All rights reserved.

    1. Introduction

Fingerprints are widely used for individual identification, largely due to the bio-invariant characteristic of human fingerprints, which also provide rich details for distinguishing different persons. Former fingerprint verification methods normally demand that users input their personal information through various means, such as a name or an ID card. This kind of system verifies the correspondence between the captured fingerprint and the user's personal information, yet it is inefficient since the users have to operate the system; for example, keying in a name or inserting an ID card is an additional operation for user identification. To avoid this inconvenience, an alternative approach, namely fingerprint identification (Maltoni, Maio, Jain, & Prabhakar, 2009), which does not require user interaction, was presented, yet it involves extensive processing caused by the cross-referencing of fingerprints in the database. To cope with this, former works (Ratha, Chen, Karu, & Jain, 1999; Tan, Bhanu, & Lin, 2003) pre-classified the fingerprints in the database into different categories. As a result, a fingerprint to be verified simply needs to be cross-referenced against the fingerprints in the category identical to that of the verified fingerprint. This manner effectively reduces the number of fingerprints for the further matching process.

The research on fingerprint classification is discussed below. In Henry's work (Henry, 1900), fingerprints were separated into four classes (namely 4C): Arch (A), Whorl (W), Left loop (L), and Right loop (R); some examples are illustrated in Fig. 1(a), (c)-(e). Another classification approach indicates that category A can be further divided into A and Tented arch (T) (Watson & Wilson, 1992), namely 5C, and Fig. 1(b) shows an example of the additional Tented arch category. Even larger numbers of fingerprint categories have also been employed (Cappelli, Lumini, Maio, & Maltoni, 1999), but these raise other issues such as reduced accuracy, since ambiguous categories cannot be classified even by experts (Maltoni et al., 2009; Tan & Bhanu, 2005). Thus, in this work, the 4C system is adopted as the standard for classification.

Feature extraction for fingerprint classification is another important issue. Several well-known types of features have been proposed, including the orientation field (OF), singular points (SP), ridge flow (RF), and Gabor filters (GF). Moreover, many classification methods based on these features have been established, such as rule-based (RB), syntactic (SY), structural (STR), statistical (STA), neural network (NN), and multiple classifiers (MC). Among these, the RB approach is the most straightforward; it relies on the number and positions of the extracted singular points (Karu & Jain, 1996; Kawagoe & Tojo, 1984; Msiza, Leke-Betechuoh, Nelwamondo, & Msimang, 2009; Wang & Dai, 2007; Zhang & Yan, 2004) to classify fingerprints.


Since singular points cannot be extracted directly from a fingerprint image, the SY classification approach (Chang & Fan, 2002) adopted the global distribution of 10 basic ridge patterns, the analysis of ridge shapes, and the sequence of ridge distributions. The STR method (Cappelli et al., 1999) partitions the orientation field of a fingerprint into different orientation regions, and the relational graphs of these regions are employed to classify fingerprints. Meanwhile, the STA approaches are based on different input features to determine multi-dimensional regression equations; the features are extracted directly and fed into the classifier, and the classified results can be obtained efficiently (Jung & Lee, 2009; Min, Hong, & Cho, 2006; Lia, Yau, & Wanga, 2008; Park & Park, 2005). The STA and NN approaches employ a special strategy to classify fingerprints, as they imitate human perception and empirical models. These approaches require a large amount of training data to build the classifier, and sufficient data must be available to yield an effective one (Bernard, Boujemaa, Vitale, & Brioct, 2001; Kristensen, Borthen, & Fyllingsnes, 2007; Senior & Boll, 2004); conversely, insufficient training data significantly degrades the accuracy of a system. In all classification methods, two key issues always affect the accuracy rate: (1) the quality of the fingerprint image, and (2) the ambiguity of the classification scheme.

In this work, clear and explicit rules are established to remove ambiguous fingerprint classes. The RB technique is then applied since it is easy to implement and does not require a training procedure for the classifier, while high accuracy can still be achieved. Moreover, the two issues affecting the accuracy rate indicated above are discussed. Finally, a decision tree is designed to realize an automatic fingerprint classification system.

The rest of this paper is organized as follows. Section 2 introduces the preprocessing and feature extraction steps. The results of our analysis and the description of the decision tree are provided in Section 3. Section 4 provides the experimental analyses and performance comparisons. Finally, conclusions are drawn in Section 5.

    2. Preprocessing and feature extraction

The quality of the captured fingerprint ridges is very important since it dominates the singular point extraction. In an ideal case, a captured fingerprint should include sharp ridges and valleys, yet these are obstructed by other factors (Amengual et al., 1997). Thus, to achieve better performance, an image enhancement process is highly demanded. In this study, three public FVC fingerprint databases (Fingerprint database FVC, 2000, 2002, 2004) are adopted for conducting experiments, in which the fingerprints are captured by different devices; thus both perfect and imperfect fingerprints are involved. The critical issues involved can be solved by the proposed fingerprint classification as shown in Fig. 2, and the detailed flow of the preprocessing is shown in Fig. 3, which is discussed below first.

    2.1. Preprocessing

    2.1.1. Histogram equalization

Since the foreground and background of the captured image are simple, global histogram equalization is utilized to obtain an image with a stable contrast distribution. The transformation function is formulated as below,

$$HE(i,j) = 255 \times \frac{\sum_{g=0}^{Image(i,j)} H(g)}{P \times Q}, \qquad (1)$$

where H(g) denotes the histogram value at grayscale g, and Image(i,j) denotes the grayscale value of the captured fingerprint image of size P × Q at location (i,j). Notably, in this study full black and full white are defined as 0 and 255, respectively. Fig. 4 shows a series of results of each sub-function in the preprocessing block, and the result corresponding to histogram equalization is shown in Fig. 4(b).
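For illustration, a minimal sketch of the global histogram equalization of Eq. (1) is given below; it assumes a NumPy grayscale image with values in [0, 255] and is not the authors' implementation.

```python
import numpy as np

def histogram_equalize(image: np.ndarray) -> np.ndarray:
    """Global histogram equalization following Eq. (1):
    HE(i,j) = 255 * sum_{g=0..Image(i,j)} H(g) / (P*Q)."""
    img = image.astype(np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)   # H(g)
    cdf = np.cumsum(hist)                            # sum_{g=0..v} H(g)
    lut = np.round(255.0 * cdf / img.size).astype(np.uint8)
    return lut[img]                                  # HE(i,j)
```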

    2.1.2. Grad field

The grad field represents the high-frequency energy distribution of the captured fingerprint image, and it yields a mask that assists the subsequent segmentation process.

$$G_H(i,j) = \frac{\partial}{\partial j} Image(i,j), \qquad (2)$$

$$G_V(i,j) = \frac{\partial}{\partial i} Image(i,j), \qquad (3)$$

$$Grad(i,j) = \frac{1}{W^2} \sum_{u=i-\mathrm{round}(W/2)}^{i+\mathrm{round}(W/2)} \; \sum_{v=j-\mathrm{round}(W/2)}^{j+\mathrm{round}(W/2)} \sqrt{G_H^2(u,v) + G_V^2(u,v)}, \qquad (4)$$

where the subscripts H and V denote horizontal and vertical, respectively; the constant W = 9 denotes the average filter size; and round() denotes the round-down operation. Fig. 4(c) shows two different examples of grad results.
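A minimal NumPy sketch of Eqs. (2)-(4) under the stated setting (W = 9), using simple finite differences for the derivatives, might look as follows; it is illustrative only.

```python
import numpy as np

def grad_field(image: np.ndarray, W: int = 9) -> np.ndarray:
    """Average gradient magnitude per Eq. (4) over a W x W window."""
    img = image.astype(np.float64)
    gH = np.gradient(img, axis=1)          # d/dj, Eq. (2)
    gV = np.gradient(img, axis=0)          # d/di, Eq. (3)
    mag = np.sqrt(gH ** 2 + gV ** 2)
    # Box-average the magnitude over a W x W neighborhood (the 1/W^2 sum).
    pad = W // 2
    padded = np.pad(mag, pad, mode="edge")
    out = np.zeros_like(img)
    for di in range(-pad, pad + 1):
        for dj in range(-pad, pad + 1):
            out += padded[pad + di : pad + di + img.shape[0],
                          pad + dj : pad + dj + img.shape[1]]
    return out / (W * W)
```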

    2.1.3. Segmentation

The HE(i,j) obtained by Eq. (1) is separated into foreground and background by the variance thresholding method (Mehtre, 1993). The processing steps are described as below:

$$AverageGrayScale(i,j) = 255 - \frac{\sum_{m=-r/2}^{r/2} \sum_{n=-r/2}^{r/2} HE(i+m, j+n)}{(r+1)^2}, \qquad (5)$$

$$SegmentMap(i,j) = \frac{AverageGrayScale(i,j) + Grad(i,j)}{2}, \qquad (6)$$

$$Mean = \frac{1}{P \times Q} \sum_{i=1}^{P} \sum_{j=1}^{Q} SegmentMap(i,j), \qquad (7)$$

$$Var = \frac{1}{P \times Q} \sum_{i=1}^{P} \sum_{j=1}^{Q} \big( SegmentMap(i,j) - Mean \big)^2, \qquad (8)$$

Fig. 1. Fingerprint categories. (a) Arch (A). (b) Tented arch (T). (c) Whorl (W). (d) Left loop (L). (e) Right loop (R).


$$ImageSegment(i,j) = \begin{cases} 255, & \text{if } SegmentMap(i,j) \le Var \\ HE(i,j), & \text{if } SegmentMap(i,j) > Var, \end{cases} \qquad (9)$$

where the empirical parameter r = 6 denotes the size of the referenced averaging range. An example of the obtained ImageSegment(i,j) is shown in Fig. 5. Yet, when ImageSegment(i,j) is obtained from a brighter captured fingerprint image (which causes low image contrast), parts of the fingerprint are wrongly classified as background, which causes difficulty in detecting the singular points. An instance is shown in Fig. 6(b), in which the circle indicates a singular point. To deal with this, the ratio between the area of the segmented background and the whole fingerprint image is estimated, and a parameter c, denoting the target ratio to be met, is used to adjust the threshold Var to enlarge the segmented fingerprint area. The parameter c is defined as below,

$$c = \frac{\text{Covered gray background area}}{P \times Q} \times 100\%. \qquad (10)$$

Fig. 6(c) shows an example of the improved segmented result. Although some fingerprint areas are wrongly segmented, the singular point of interest is clear. Some results of the segmentation are also shown in Fig. 4(d).
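A compact sketch of the variance-threshold segmentation of Eqs. (5)-(9) is given below. It follows the reconstruction above (a 255-complemented local mean averaged with the grad field), so it should be read as an illustration rather than the authors' exact code.

```python
import numpy as np

def segment(he: np.ndarray, grad: np.ndarray, r: int = 6) -> np.ndarray:
    """Variance-threshold segmentation sketch per Eqs. (5)-(9)."""
    he = he.astype(np.float64)
    pad = r // 2
    padded = np.pad(he, pad, mode="edge")
    local_mean = np.zeros_like(he)
    for dm in range(-pad, pad + 1):
        for dn in range(-pad, pad + 1):
            local_mean += padded[pad + dm : pad + dm + he.shape[0],
                                 pad + dn : pad + dn + he.shape[1]]
    local_mean /= (r + 1) ** 2
    avg_gray = 255.0 - local_mean                    # Eq. (5), assumed complement
    segment_map = (avg_gray + grad) / 2.0            # Eq. (6), assumed average
    var = np.var(segment_map)                        # Eqs. (7)-(8)
    # Eq. (9): background becomes white (255); foreground keeps HE(i,j).
    return np.where(segment_map <= var, 255.0, he)
```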

    2.1.4. Orientation field

To estimate the orientations of the local fingerprint ridges, the orientation field is obtained with the following steps; it is also necessary for the subsequent Gabor filtering process (Hong, Wan, & Jain, 1998).

$$V_H(i,j) = \sum_{u=i-\mathrm{round}(W/2)}^{i+\mathrm{round}(W/2)} \; \sum_{v=j-\mathrm{round}(W/2)}^{j+\mathrm{round}(W/2)} 2\, G_H(u,v)\, G_V(u,v), \qquad (11)$$

$$V_V(i,j) = \sum_{u=i-\mathrm{round}(W/2)}^{i+\mathrm{round}(W/2)} \; \sum_{v=j-\mathrm{round}(W/2)}^{j+\mathrm{round}(W/2)} \left[ G_H^2(u,v) - G_V^2(u,v) \right], \qquad (12)$$

$$O(i,j) = \frac{1}{2} \tan^{-1}\!\left( \frac{V_H(i,j)}{V_V(i,j)} \right), \qquad (13)$$

where GH(u,v) and GV(u,v) are obtained by Eqs. (2) and (3), and the window size W is identical to that used in Eq. (4). Fig. 4(e) shows the corresponding results of the obtained O(i,j) images.
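The block below is a minimal sketch of the least-squares orientation estimate in Eqs. (11)-(13) (the usual Hong et al. formulation), reusing the gradient images and a W x W window; treat it as illustrative.

```python
import numpy as np

def orientation_field(image: np.ndarray, W: int = 9) -> np.ndarray:
    """Blockwise ridge orientation O(i,j) per Eqs. (11)-(13), in radians."""
    img = image.astype(np.float64)
    gH = np.gradient(img, axis=1)
    gV = np.gradient(img, axis=0)
    pad = W // 2
    vH = np.zeros_like(img)                       # sum of 2*gH*gV, Eq. (11)
    vV = np.zeros_like(img)                       # sum of gH^2 - gV^2, Eq. (12)
    a = np.pad(2.0 * gH * gV, pad, mode="edge")
    b = np.pad(gH ** 2 - gV ** 2, pad, mode="edge")
    for di in range(-pad, pad + 1):
        for dj in range(-pad, pad + 1):
            vH += a[pad + di : pad + di + img.shape[0],
                    pad + dj : pad + dj + img.shape[1]]
            vV += b[pad + di : pad + di + img.shape[0],
                    pad + dj : pad + dj + img.shape[1]]
    return 0.5 * np.arctan2(vH, vV)               # Eq. (13)
```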

    2.1.5. Simplified Gabor filtering

The traditional Gabor filter used in Hong et al. (1998) is a filter for enhancing the ridges at a specific frequency and angle. This filter with frequency f and angle θ can be obtained by the following equation,

[Fig. 2 (flow chart): preprocessing, singular point extraction (top center, bottom center, and delta points), and the decision tree, in which Node 1 applies the CDF, Node 2 the BAF, and Node 3 the delta point position, leading to the classified fingerprint image.]


$$G(x, y; \theta, f) = \exp\!\left( -\frac{1}{2}\left[ \frac{x_\theta^2}{\sigma_x^2} + \frac{y_\theta^2}{\sigma_y^2} \right] \right) \cos(2\pi f x_\theta), \qquad (14)$$

where σx and σy denote the standard deviations, and the variables xθ and yθ are defined as below,

$$\begin{bmatrix} x_\theta \\ y_\theta \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}. \qquad (15)$$
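As an illustration of Eqs. (14) and (15), a small sketch that builds one Gabor kernel for a given frequency and angle is shown below; the kernel size and standard deviations are arbitrary example values, not taken from the paper.

```python
import numpy as np

def gabor_kernel(theta: float, f: float, sigma_x: float = 4.0,
                 sigma_y: float = 4.0, half: int = 8) -> np.ndarray:
    """Gabor kernel G(x, y; theta, f) per Eqs. (14)-(15)."""
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(np.float64)
    x_t = x * np.cos(theta) + y * np.sin(theta)      # Eq. (15)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-0.5 * (x_t ** 2 / sigma_x ** 2 + y_t ** 2 / sigma_y ** 2))
    return envelope * np.cos(2.0 * np.pi * f * x_t)  # Eq. (14)
```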

Supposing that the support region of the obtained Gabor filter is Mx × My, every pixel requires Mx × My computations. To speed up the processing, a simplified Gabor filtering is proposed in this study. The enhanced image is obtained by

$$Image_{enhanced}(i,j) = \sum_{u=-3}^{3} \sum_{v=-3}^{3} \begin{cases} filter_M(v)\, Image_{Segment}\big(\mathrm{Round}(i+u_\theta),\, \mathrm{Round}(j+v_\theta)\big), & \text{if } u = 0 \\ filter_O(u)\, Image_{Segment}\big(\mathrm{Round}(i+u_{\theta+90^\circ}),\, \mathrm{Round}(j+v_{\theta+90^\circ})\big), & \text{if } v = 0 \\ 0, & \text{otherwise,} \end{cases} \qquad (16)$$

$$filter_M(v) = \frac{1}{7}\,[\,1\;\;1\;\;1\;\;1\;\;1\;\;1\;\;1\,], \quad v \in [-3, 3], \qquad (17)$$

$$filter_O(u) = \frac{1}{7}\,[\,3\;\;1\;\;3\;\;9\;\;3\;\;1\;\;3\,], \quad u \in [-3, 3]. \qquad (18)$$

The values processed by filter_O(u) are orthogonal to those processed by filter_M(v), which is achieved by rotating the coordinates u_θ and v_θ by an angle θ obtained from O(i,j) in the Cartesian plane. The corresponding definition is shown below,

$$\begin{bmatrix} u_\theta \\ v_\theta \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix}. \qquad (19)$$

An example is shown in Fig. 7, where the gray lines represent the flows of a fingerprint. In this figure, the pixels labeled FH(x) are processed by filter_M(v), those labeled FV(x) are processed by filter_O(u), and 0 indicates that the pixel is not processed. The obtained results are shown in Fig. 4(f). In the otherwise case of Eq. (16), no calculation is performed. Thus, this modification only requires Mx + My computations (supposing the sizes of the main and orthogonal filters are Mx and My, respectively, for comparison). The obtained Image_enhanced(i,j) is used to acquire a more accurate orientation field O(i,j) through the processes discussed in Section 2.1.4. Notably, the variables GH(u,v) and GV(u,v) are then calculated from Image_enhanced(i,j).
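A rough sketch of the simplified filtering of Eqs. (16)-(19) follows. The filter taps and branch conditions are taken from the reconstruction above (the printed taps of filter_O may have lost their signs in this copy), and the orientation field is assumed to be in radians, so the block should be read as an illustration of the two 1-D passes along and across the local orientation, not as the authors' exact implementation.

```python
import numpy as np

FILTER_M = np.ones(7) / 7.0                              # Eq. (17)
FILTER_O = np.array([3, 1, 3, 9, 3, 1, 3]) / 7.0         # Eq. (18), taps as printed

def enhance_pixel(seg: np.ndarray, O: np.ndarray, i: int, j: int) -> float:
    """Simplified Gabor-like response at (i, j) per Eq. (16): a 7-tap average
    along the ridge orientation plus a 7-tap filter across it."""
    theta = O[i, j]
    total = 0.0
    for k in range(-3, 4):
        # u = 0 branch: samples along the local ridge direction, weighted by filter_M.
        ui = int(round(i - k * np.sin(theta)))
        vj = int(round(j + k * np.cos(theta)))
        if 0 <= ui < seg.shape[0] and 0 <= vj < seg.shape[1]:
            total += FILTER_M[k + 3] * seg[ui, vj]
        # v = 0 branch: samples along the direction rotated by theta + 90 degrees.
        phi = theta + np.pi / 2.0
        ui = int(round(i - k * np.sin(phi)))
        vj = int(round(j + k * np.cos(phi)))
        if 0 <= ui < seg.shape[0] and 0 <= vj < seg.shape[1]:
            total += FILTER_O[k + 3] * seg[ui, vj]
    return total
```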

Fig. 4. Results of the preprocessing block; two types of captured fingerprints are adopted. (a) Original image. (b) Histogram equalization. (c) Grad image. (d) Segmented image. (e) Orientation image. (f) Enhanced image.

Fig. 5. Example of the segmented result. (a) AverageGrayScale(i,j). (b) SegmentMap(i,j). (c) ImageSegment(i,j).

Fig. 6. Example of over-segmentation caused by a brighter captured fingerprint. (a) Original fingerprint image. (b) ImageSegment(i,j). (c) Segmented image obtained with an adjusted threshold Var, where c = 35%.

Fig. 7. Simplified Gabor filtering example, when O(i,j) lies in 37.5°-52.5°; FH(x) pixels are processed by filter_M(v), FV(x) pixels by filter_O(u), and 0-labeled pixels are skipped.


    2.2. Singular point extraction

To distinguish various types of fingerprints under unpredictable image conditions, robust features are highly demanded. In this study, the commonly used singular points, including the center and delta points illustrated in Fig. 8, are used. To extract these singular points, the orientation field obtained from Image_enhanced(i,j) is employed because of the dramatic variations in fingerprint texture. The Poincaré index is a summation of the orientation differences between each pair of neighboring pixels along a closed curve (Karu & Jain, 1996; Kawagoe & Tojo, 1984; Maltoni et al., 2009), and it is directly proportional to the variation of the orientation field. The mathematical definition is given below,

$$angle(i,j \mid m,n) = O(i,j) - O(m,n), \qquad (20)$$

$$P_{internal}(i,j) = \sum_{k=0}^{7} angle\big(r_k \mid r_{(k+1) \bmod 8}\big), \qquad (21)$$

$$P_{external}(i,j) = \sum_{k=0}^{15} angle\big(R_k \mid R_{(k+1) \bmod 16}\big), \qquad (22)$$

$$Point\ type = \begin{cases} \text{Center point}, & \text{if } 180^\circ - r \le P_{internal}(i,j) \le 180^\circ + r \\ \text{Center point}, & \text{if } 180^\circ - r \le P_{external}(i,j) \le 180^\circ + r \\ \text{Delta point}, & \text{if } -180^\circ - r \le P_{internal}(i,j) \le -180^\circ + r \\ \text{Delta point}, & \text{if } -180^\circ - r \le P_{external}(i,j) \le -180^\circ + r \\ \text{Non-singular point}, & \text{otherwise,} \end{cases} \qquad (23)$$

where the function angle() denotes the difference between two orientation field values; the vectors r_k and R_k denote the positions on the inside and outside rings centered at the current processing position (i,j), a corresponding example of which is illustrated in Fig. 9, where the notation x denotes the current processing position; and the empirical parameter r = 25° denotes a single-side error tolerance. The two Poincaré indexes calculated by Eqs. (21) and (22) represent the summed angle differences of the internal and external rings, respectively. In general, the P_internal(i,j) calculated at an ideal center singular point is close to 180°, and the P_external(i,j) calculated at an ideal delta singular point is close to -180°. Based upon this phenomenon, the decision rule in Eq. (23) is established with the tolerance parameter r to distinguish the different singular point types.
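A minimal sketch of the Poincaré-index test of Eqs. (20)-(23) is shown below; the ring offsets are illustrative (a 3 x 3 inner ring of 8 neighbors), the orientation field is assumed to be in degrees, the angle differences are wrapped into (-90°, 90°] as is usual for orientation fields, and the tolerance follows r = 25°.

```python
import numpy as np

# 8 neighbors of (i, j) traversed counter-clockwise (inner ring r_k).
RING8 = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def poincare_type(O_deg: np.ndarray, i: int, j: int, r: float = 25.0) -> str:
    """Classify (i, j) as 'center', 'delta', or 'none' per Eqs. (20)-(23)."""
    total = 0.0
    for k in range(8):
        (di1, dj1), (di2, dj2) = RING8[k], RING8[(k + 1) % 8]
        d = O_deg[i + di1, j + dj1] - O_deg[i + di2, j + dj2]   # Eq. (20)
        # Orientations are defined modulo 180 degrees, so wrap to (-90, 90].
        while d > 90.0:
            d -= 180.0
        while d <= -90.0:
            d += 180.0
        total += d                                               # Eq. (21)
    if 180.0 - r <= total <= 180.0 + r:
        return "center"
    if -180.0 - r <= total <= -180.0 + r:
        return "delta"
    return "none"
```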

Multiple candidates of center and delta points are obtained by the above methods. In this study, the center points are further separated into two classes, top center point and bottom center point, for the following fingerprint classification. The top center point is defined where the neighboring ridges flow downward, an example of which is shown in Fig. 8(a); conversely, the bottom center point is defined where the neighboring ridges flow upward. The decision rule to classify the two center point types is described as below,

$$TP_{value} = \big|O(i+1, j-1) - 45^\circ\big| + \big|O(i+1, j+1) - 135^\circ\big|, \qquad (24)$$

$$BP_{value} = \big|O(i+1, j-1) - 135^\circ\big| + \big|O(i+1, j+1) - 45^\circ\big|, \qquad (25)$$

$$Center\ point\ type = \begin{cases} \text{Top center point}, & \text{if } TP_{value} \le BP_{value} \\ \text{Bottom center point}, & \text{otherwise.} \end{cases} \qquad (26)$$
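As a small illustration of Eqs. (24)-(26), the helper below compares a candidate's neighboring orientations (in degrees) against the two reference patterns; the exact neighbor offsets are assumptions of this reconstruction.

```python
def center_point_type(O_deg, i: int, j: int) -> str:
    """Label a center-point candidate as top or bottom per Eqs. (24)-(26)."""
    # Deviation from the downward-flow pattern (top center point), Eq. (24).
    tp = abs(O_deg[i + 1, j - 1] - 45.0) + abs(O_deg[i + 1, j + 1] - 135.0)
    # Deviation from the upward-flow pattern (bottom center point), Eq. (25).
    bp = abs(O_deg[i + 1, j - 1] - 135.0) + abs(O_deg[i + 1, j + 1] - 45.0)
    return "top center point" if tp <= bp else "bottom center point"
```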

As a result, in total three types of singular points may appear in a captured fingerprint image. According to observations from the experiments, the candidates of each type of singular point cluster around the real singular point. Consequently, the averaged positions of the top and bottom center point candidates are considered the real top and bottom center points. Yet, because delta point candidates may exhibit obvious differences in distance (an example is shown in Fig. 1(c), which contains two real delta points distributed on the left and right sides of the fingerprint), the positions are averaged only within a circular region centered at the candidate itself with a radius of one-tenth of the image width. Since the number of remaining positions may still be higher than two, the two positions averaged from the largest numbers of candidates are considered the real delta points in this study.

    3. Fingerprint classification

In this section, various types of fingerprints are further investigated. From this analysis, a series of rules is introduced to assess and define the different fingerprint categories. The captured fingerprints in the FVC databases are adopted in this study to provide various quality levels. One of the critical cases is the incomplete fingerprint, in which the structures of the fingerprint are not entirely captured. The fingerprints in this class have low image quality, so some of the singular points are lost. Thus, the proposed method is discussed in terms of two fingerprint quality conditions: perfect and imperfect. At the end of this section, an overall feature analysis and a summary are given, and we show how to apply this feature analysis scheme to classify fingerprints with different methods. Moreover, two methods, namely Center-to-Delta Flow (CDF) and Balance Arm Flow (BAF), are used to classify a fingerprint when it cannot be classified simply by the types and numbers of the singular points.

    3.1. Symmetry estimation

Five different numbers of extracted singular points, from zero to four (at most two center points and two delta points), are possible after the feature extraction.

Fig. 8. Two types of singular points: (a) center point, and (b) delta point.

Fig. 9. Operational range matrix.


For the case in which a top center point is found but a bottom center point is not found, the fingerprint classification employs the orientation symmetry characteristic of the fingerprint to distinguish different fingerprint types. An example is shown in Fig. 10 to explain this characteristic, in which regions A and B are reflexively symmetric, and the line C dividing the two regions is called the balance line in this study. This case exists when the orientation values on the left-hand and right-hand sides of the balance line are identical. A computational concept is depicted in Fig. 10(b), in which the reflexive symmetry is demonstrated on the obtained orientation field. In the following, two different estimation methods are proposed for estimating whether the reflexive symmetry exists.

    3.1.1. Center-to-Delta Flow (CDF)

This method is performed when exactly one delta point is extracted from the classified fingerprint image. A corresponding example is depicted in Fig. 11(a), where θ denotes the angle between the horizontal line of the captured fingerprint image and the Center-to-Delta Line (CDL) connecting the top center point and the delta point. To calculate each φ_n of the points on the line CDL, as illustrated in Fig. 11(b), all of the orientation field values O(i,j) contained in a one-pixel-wide region of length R perpendicular to the CDL are averaged. The length R is defined as one quarter of the width of the captured image. According to the calculated θ and φ_n, the symmetry estimation of the CDF is defined as below,

$$CDF = \begin{cases} \text{Symmetric}, & \text{if } \left| \dfrac{1}{N} \displaystyle\sum_{n=1}^{N} (\phi_n - \theta) \right| \le r_{CDF} \\ \text{Asymmetric}, & \text{otherwise,} \end{cases} \qquad (27)$$

where the constant N denotes the length of the CDL (in discrete points), and the parameter r_CDF is set to 15° in this study. The pseudo code of the CDF is organized as below:

Pseudo code of the Center-to-Delta Flow (CDF)

Step 1. Determine whether exactly one top center point and one delta point are extracted; if not, perform the BAF process introduced in the following sub-section.
Step 2. Connect the extracted center and delta points with the line CDL, and calculate the angle θ.
Step 3. For n = 1 to N, where N denotes the number of discrete points that construct the line CDL: calculate the averaged orientation field φ_n. End For.
Step 4. Determine whether the two regions on the two sides of the CDL are symmetric by Eq. (27).
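A sketch of the CDF symmetry test is given below; it assumes an orientation field in degrees, samples the averaged orientation φ_n at discrete points along the center-to-delta line, and uses the reconstructed criterion of Eq. (27) with r_CDF = 15°. It illustrates the idea rather than the authors' implementation.

```python
import numpy as np

def cdf_symmetric(O_deg: np.ndarray, center, delta, r_cdf: float = 15.0) -> bool:
    """Center-to-Delta Flow (CDF) symmetry test, Eq. (27)."""
    (ci, cj), (di, dj) = center, delta                        # (row, col) positions
    theta = np.degrees(np.arctan2(di - ci, dj - cj))          # angle of the CDL
    R = O_deg.shape[1] // 4                                   # quarter of image width
    length = max(np.hypot(di - ci, dj - cj), 1e-9)
    n_points = int(length)                                    # N discrete CDL points
    pi_, pj_ = -(dj - cj) / length, (di - ci) / length        # unit normal to the CDL
    devs = []
    for n in range(1, n_points + 1):
        t = n / (n_points + 1.0)
        bi, bj = ci + t * (di - ci), cj + t * (dj - cj)       # point on the CDL
        samples = []
        for s in range(-R // 2, R // 2 + 1):                  # strip across the CDL
            ii, jj = int(round(bi + s * pi_)), int(round(bj + s * pj_))
            if 0 <= ii < O_deg.shape[0] and 0 <= jj < O_deg.shape[1]:
                samples.append(O_deg[ii, jj])
        if samples:
            devs.append(np.mean(samples) - theta)             # phi_n - theta
    return bool(devs) and abs(np.mean(devs)) <= r_cdf         # Eq. (27)
```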

3.1.2. Balance Arm Flow (BAF)

This method obtains a Balance Arm Line (BAL) constructed from successively calculated pivot points, and the orientation fields on the two sides of the BAL are used to determine whether the fingerprint is symmetric; Fig. 12 shows a corresponding example for better understanding. The top center point is considered the start pivot point, and the next pivot point is determined by Eq. (28), in which the φ_n illustrated in Fig. 12(b) and (d) is calculated.

$$P(x, y) = P\big(x + 4\cos\phi_n,\; y + 4\sin\phi_n\big). \qquad (28)$$

Subsequently, at the current pivot position P(x,y), all of the orientation field values O(i,j) contained in a one-pixel-wide region of length R perpendicular to φ_{n-1} (or to 90° for the first pivot) are averaged to obtain φ_n. The length R is defined as in the CDF. Afterward, all of the orientation field values O(i,j) used to obtain φ_n are also used to accumulate LeftFlow and RightFlow as below,

$$\begin{cases} LeftFlow = LeftFlow + 1, & \text{if } O(i,j) < 90^\circ \\ RightFlow = RightFlow + 1, & \text{if } O(i,j) > 90^\circ \\ \text{None}, & \text{otherwise.} \end{cases} \qquad (29)$$

According to these two variables, the symmetry estimation of the BAF is determined with the following criterion,

$$BAF = \begin{cases} \text{Left asymmetric}, & \text{if } \dfrac{LeftFlow}{RightFlow} > 1 + r_{BAF} \\ \text{Right asymmetric}, & \text{if } \dfrac{RightFlow}{LeftFlow} > 1 + r_{BAF} \\ \text{Symmetric}, & \text{otherwise,} \end{cases} \qquad (30)$$

where the parameter r_BAF is empirically set to 0.15 in this study. The pseudo code of the BAF is organized as below.

Pseudo code of the BAF

Step 1: Determine whether one top center point is extracted, or whether an asymmetric result is returned by the CDF.
Step 2: Initialize LeftFlow and RightFlow to 0, and consider the top center point as the pivot point P(x,y), which is the starting point of the BAF as shown in Fig. 12(a).
Step 3: Repeat the following steps I-III until the current P(x,y) is beyond the fingerprint area or the captured image:
I. For the first calculation: calculate the average orientation field φ_1, LeftFlow, and RightFlow over the strip of length R perpendicular to 90° at the current position P(x,y), where LeftFlow and RightFlow are updated by Eq. (29).
II. Otherwise: calculate the average orientation field φ_n, LeftFlow, and RightFlow over the strip of length R perpendicular to φ_{n-1} at the current position P(x,y), where LeftFlow and RightFlow are updated by Eq. (29).
III. Move the pivot point to the next position by Eq. (28).
Step 4: Determine whether the regions on the two sides of the BAL are symmetric by calculating Eq. (30).
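The block below sketches the BAF walk under the same assumptions as the CDF sketch (orientation field in degrees, a step of 4 pixels per Eq. (28), strip length R equal to a quarter of the image width, threshold r_BAF = 0.15); details such as the sign convention of the step direction are assumptions of this reconstruction.

```python
import numpy as np

def baf_classify(O_deg: np.ndarray, top_center, r_baf: float = 0.15,
                 max_steps: int = 1000) -> str:
    """Balance Arm Flow (BAF) sketch, Eqs. (28)-(30)."""
    H, W = O_deg.shape
    R = W // 4
    x, y = float(top_center[0]), float(top_center[1])    # pivot P(x, y) as (row, col)
    phi_prev = 90.0                                       # first strip uses 90 degrees
    left = right = 0
    for _ in range(max_steps):                            # safety bound on steps
        if not (0 <= int(round(x)) < H and 0 <= int(round(y)) < W):
            break                                         # pivot left the image
        perp = np.radians(phi_prev + 90.0)                # strip perpendicular to phi_{n-1}
        samples = []
        for s in range(-R // 2, R // 2 + 1):
            ii = int(round(x + s * np.cos(perp)))
            jj = int(round(y + s * np.sin(perp)))
            if 0 <= ii < H and 0 <= jj < W:
                o = float(O_deg[ii, jj])
                samples.append(o)
                if o < 90.0:                              # Eq. (29)
                    left += 1
                elif o > 90.0:
                    right += 1
        if not samples:
            break
        phi_prev = float(np.mean(samples))                # phi_n
        x += 4.0 * np.cos(np.radians(phi_prev))           # Eq. (28) step
        y += 4.0 * np.sin(np.radians(phi_prev))
    if right and left / right > 1.0 + r_baf:              # Eq. (30)
        return "left asymmetric"
    if left and right / left > 1.0 + r_baf:
        return "right asymmetric"
    return "symmetric"
```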

    3.2. Fingerprint analysis

The characteristics of the fingerprints in the various categories are introduced, and the discussion is separated into two conditions, perfect and imperfect, because of the high variety of samples in the databases. Moreover, the classification manner for each class is also explained.

3.2.1. Fingerprint definitions

This analysis concerns the above three types of singular points (top center, bottom center, and delta), whose types and amounts are utilized to construct the hierarchical decision strategy, namely the decision tree, shown in Fig. 2 (the fingerprint classes are revisited later). Each node inside this figure indicates that many fingerprint types may enter it with an identical judgment for classification; these selections in the decision tree make it feasible to choose a suitable way to classify the various fingerprint types. In addition, Table 1 shows the details of each fingerprint type under the perfect and imperfect conditions. The perfect condition indicates that a captured fingerprint image contains the entire fingerprint region and all of the singular points, while the imperfect condition indicates that a captured fingerprint image loses part of the region or of the singular points. Moreover, the term Query in Table 1 represents classifying a fingerprint


    according to its types or the numbers of the extracted singular

    points.

Fig. 13 shows the fingerprint type AQ without any singular points. Figs. 14 and 15 show the fingerprint types WP and WIP, which have one top center point, one bottom center point, and zero to two delta points. Since these two types both have the unique bottom center point feature, these two fingerprint types and AQ can be classified with the Query manner. Another three fingerprint types, AP, LP, and RP, all have one top center point and one delta point, as shown in the examples of Figs. 16 and 17. Since the three cases have the same singular points, the CDF of Node 1 is adopted for symmetry estimation for further classification. If the response of this decision is symmetric, the fingerprint is determined as AP; if not, the BAF of Node 2 is adopted for further symmetry estimation. According to the result of the BAF, a left-asymmetric result determines the fingerprint as LP; otherwise it is determined as RP. Considering the imperfect condition (AIP, LIP, and RIP, as shown in the examples of Figs. 18 and 19), where the delta point cannot be extracted or two delta points are extracted, the BAF of Node 2 is employed. If the result of this estimation is symmetric, then AIP is determined; LIP is determined for left asymmetry, and RIP for right asymmetry.

Except for the above conditions, the incompletely captured fingerprint images have only delta points extracted, as shown in Fig. 20(a) and (b). In this situation, four classes, AI, WI, LI, and RI, are considered: if two delta points are extracted, then WI is determined; if only one delta point is extracted, the function of Node 3 is used. As can be seen from Fig. 20(c) and (d), the width of the ridge area, its center point DCP, and the threshold r_R are obtained. If the delta point lies on the right side of the center point by more than r_R, then LI is determined; otherwise, if the delta point lies on the left side of the center point by more than r_R, then RI is determined. Notably, in this study the parameter r_R is empirically set to 1/10 of the input image width.
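To make the rule set concrete, the sketch below encodes the decisions described above and in Table 1 as a single function returning the 4C class; the symmetry tests (cdf_symmetric, baf_classify) are the sketches given earlier, and tie-breaking details such as the assumed DCP position are assumptions of this reconstruction.

```python
def classify(tops, bottoms, deltas, O_deg, width, r_R=None) -> str:
    """Decision-tree sketch over singular points (see Fig. 2 and Table 1).
    tops/bottoms/deltas are lists of (row, col) singular-point positions."""
    r_R = r_R if r_R is not None else width // 10
    if not tops and not bottoms and not deltas:
        return "A"                                    # AQ: no singular points
    if tops and bottoms:
        return "W"                                    # WP / WIP by Query
    if tops and len(deltas) == 1:                     # AP / LP / RP: Node 1 then Node 2
        if cdf_symmetric(O_deg, tops[0], deltas[0]):
            return "A"
        return "L" if baf_classify(O_deg, tops[0]) == "left asymmetric" else "R"
    if tops:                                          # 0 or 2 deltas: Node 2 only
        flow = baf_classify(O_deg, tops[0])
        return {"symmetric": "A", "left asymmetric": "L"}.get(flow, "R")
    # Only delta points extracted (incomplete fingerprints): Node 3.
    if len(deltas) >= 2:
        return "W"
    delta_col, dcp_col = deltas[0][1], width // 2     # DCP assumed at image center
    if delta_col - dcp_col > r_R:
        return "L"
    if dcp_col - delta_col > r_R:
        return "R"
    return "A"
```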

    3.2.2. The description of singular point quadrantal diagram

The relationships among the top center, bottom center, and delta singular points are constructed in a 3-D space as shown in Fig. 21, where the intersection point of the three axes is denoted (0, 0, 0). In this space, the positive direction of each axis denotes that the corresponding fingerprint category includes at least one singular point of that type, and the negative direction denotes that the category has no such singular point. One more fingerprint category, namely Undecided (U), is additionally added. This class comprises three fingerprint categories, A, L, and R, which share the common feature that one center point and one delta point are included; thus the classifier cannot rely on this information alone to distinguish the three categories, and the CDF and BAF are used instead. The distance between the extracted center point and the delta point matters here: the larger this distance, the larger the asymmetry value estimated by the CDF and BAF, and the more likely the fingerprint belongs to the L or R class; an example is shown in Fig. 22. The corresponding distances are calculated as below,

$$D_x = x_{center} - x_{delta}, \qquad (31)$$

$$D_y = y_{center} - y_{delta}. \qquad (32)$$
    4. Experimental results

Nowadays, two databases, namely NIST (Watson & Wilson, 1992) and FVC (Fingerprint database FVC, 2000, 2002, 2004), are widely used in fingerprint-related studies. In this work, the FVC databases are adopted in our experiments for the following two reasons: (1) about 17% of the captured fingerprints in the NIST database are difficult to classify even by an expert; and (2) compared with the NIST database, which contains ink-on-paper fingerprints, the images in the FVC databases are captured by optical devices, which are widely used in practical applications.
Fig. 10. Example of reflection symmetry. (a) Conceptual diagram. (b) Concept of calculation.

Fig. 11. Conceptual processing diagrams. (a) Center-to-Delta Line (CDL) and the angle θ. (b) Average orientations φ_n.

Fig. 12. Conceptual diagrams of the BAF. (a) The top center point regarded as the start pivot point. (b) Example of the average orientation φ_1 calculation. (c) Calculating the position of the next pivot point, and moving the pivot point. (d) Balance Arm Line (BAL) constructed from all of the pivot points.


Thus, nine FVC databases are adopted, including three from FVC2000 (Fingerprint database FVC, 2000), three from FVC2002 (Fingerprint database FVC, 2002), and three from FVC2004 (Fingerprint database FVC, 2004). The numbers of fingerprints of each class are organized in Table 2, where the counting follows the prior knowledge of the singular points of each fingerprint type as arranged in Table 1.

In Section 2.1.3, the parameter c is used to determine the background ratio in segmentation. Fig. 23 shows the Correct Classification Rate (CCR) versus various c, and the empirical parameter c is set to 35%. When 35% ≤ c ≤ 60%, the CCRs descend, since some of the erroneous background segmentations cannot be reduced, which leaves less ridge information for fingerprint classification with the CDF and BAF.

Table 1
Features of various singular point types and their corresponding classification methods.

Types | Top center point | Bottom center point | Delta point | Symmetry flow | Classification method
Perfect:
AP | ✓ | - | ✓ | Symmetric | CDF
LP | ✓ | - | ✓ | Left asymmetric | CDF and BAF
RP | ✓ | - | ✓ | Right asymmetric | CDF and BAF
AQ | - | - | - | - | Query
WP | ✓ | ✓ | ✓ | - | Query
Imperfect:
AIP | ✓ | - | - | Symmetric | BAF
LIP | ✓ | - | - | Left asymmetric | BAF
RIP | ✓ | - | - | Right asymmetric | BAF
WIP | ✓ | ✓ | - | - | Query
AI | - | - | ✓ | - | Query
LI | - | - | ✓ | - | Query
RI | - | - | ✓ | - | Query
WI | - | - | ✓ | - | Query

Fig. 13. Example of a perfect Arch (A) fingerprint.

Fig. 14. Examples of (a) a perfect whorl class fingerprint; (b) the ridges around the top center point and bottom center point.

Fig. 15. Example of the imperfect whorl class.

Fig. 16. Examples of (a) a perfect tented arch class fingerprint; (b) the ridges around the top center point and delta point; (c) using the CDF to determine the CDL and calculating the average orientations φ_n.


The CCR is defined as below,

$$CCR = \frac{Num_{Correct\ classification}}{Num_{Fingerprint\ class}}, \qquad (33)$$

Table 3 shows the correct rate of singular point extraction (Correct Extraction Rate, CER) for the above nine FVC databases, where the CER is defined as below,

$$CER = \frac{Num_{Correct\ extraction}}{Num_{Fingerprint\ class}}. \qquad (34)$$

Notably, the positions of the singular points in the fingerprint images are marked before this estimation. According to the observations, the extraction of each singular point type, including the top center point, bottom center point, and delta point, is of high quality. Yet, when the delta point is close to the boundary of the captured fingerprint, or when the image suffers from noise, the corresponding CER is significantly degraded. This influence can be seen from the CERs of L and R shown in Table 3 and Fig. 24(a).

Fig. 17. Examples of (a) a perfect loop class fingerprint; (b) the ridges around the top center point; (c) using the CDF to determine the CDL; (d) using the BAF to determine the BAL.

Fig. 18. Examples of (a) the imperfect tented arch class; (b) the ridges around the top center point and delta point; (c) using the BAF to determine the balance line.

Fig. 19. Examples of (a) an imperfect loop class fingerprint; (b) the ridges around the top center point; (c) using the BAF to determine the balance line.


Among these, the delta points of the A type are not close to the boundary of the fingerprint area, as shown in Fig. 24(b).

The CCRs of the four fingerprint types (A, W, L, and R) classified by the proposed method are organized in Table 4 in confusion-matrix form. Among these, the W, L, and R classes yield good performance in terms of CCR, yet the A type, whether in the perfect or imperfect condition, leads to unsatisfactory performance. Most of the errors in A-type classification are due to the sensitivity of the calculated balance line to the local fingerprint flow and to incomplete fingerprint images, as in the example shown in Fig. 25. Nonetheless, the proposed scheme achieves an accuracy rate of 92.74% on average.

Fig. 26 and Table 5 show the CCR comparison among Msiza et al.'s method (Msiza et al., 2009), Jung-Lee's method (Jung & Lee, 2009), and the proposed method. The proposed method is similar to Msiza et al.'s method (Msiza et al., 2009) in its feature types and classifier, yet their classification rules can hardly distinguish ambiguous classes, such as the cases shown in Fig. 22. Jung-Lee's study (Jung & Lee, 2009) adopted the ridges as features and considered the incomplete fingerprint case; nonetheless, the W fingerprint type is hardly classified in their work due to its various ridge orientations.

Since the proposed classification method cannot achieve a CCR of 100%, an input fingerprint may not be found in the class to which it is first assigned. Hence, a fingerprint searching strategy is required for searching the other fingerprint classes (beyond the class determined first) with high efficiency.

Fig. 20. Examples of (a) an incomplete class fingerprint with one delta point and (b) with two delta points. (c) The area surrounded by the dotted line is the ridge area, and the DCL width is the ridge area width in (a). (d) The classification threshold for the incomplete class when one delta point is extracted.

Fig. 21. Singular point quadrantal diagram, marking the Whorl, Arch, Incompletion, and Undecided regions.

Fig. 22. Relationships of the singular point positions among the tented arch, left loop, and right loop.

Table 2
Properties of the nine FVC databases.

Databases | Perfect A | Perfect W | Perfect L | Perfect R | Imperfect A | Imperfect W | Imperfect L | Imperfect R
FVC 2000 DB1 | 141 | 239 | 103 | 52 | 3 | 1 | 153 | 188
FVC 2000 DB2 | 125 | 240 | 101 | 59 | 3 | 0 | 171 | 181
FVC 2000 DB4 | 55 | 262 | 100 | 73 | 9 | 2 | 228 | 151
FVC 2002 DB1 | 53 | 205 | 134 | 130 | 3 | 3 | 162 | 190
FVC 2002 DB2 | 68 | 222 | 138 | 155 | 4 | 2 | 166 | 125
FVC 2002 DB4 | 27 | 192 | 87 | 69 | 5 | 8 | 257 | 235
FVC 2004 DB1 | 112 | 213 | 125 | 82 | 0 | 3 | 163 | 182
FVC 2004 DB2 | 66 | 272 | 59 | 75 | 6 | 8 | 173 | 221
FVC 2004 DB4 | 64 | 190 | 117 | 130 | 0 | 10 | 195 | 174
Sub-total | Perfect: 4535 | Imperfect: 3385
Total | 7920


The proposed searching strategy is illustrated in Fig. 27. An example is adopted to explain this figure: when a fingerprint is classified to the A class but an error occurs (the fingerprint cannot be found in this class), the branches L and R are the candidate classes, and LeftFlow and RightFlow are compared to choose between them; the W class is left for last since it has a unique feature (the bottom center point) and a higher CER. LeftFlow and RightFlow represent the counts of each flow, so the two are directly compared: if LeftFlow is greater than RightFlow, the L type is considered the next searched class; conversely, when RightFlow is greater than LeftFlow, the R type is considered the next searched class. Moreover, when a fingerprint is first classified to L or R, the connected branch is class A, since both L and R are similar to A, and the W class is the last consideration because it has a unique singular point and a higher CER. In another case, when the fingerprint is classified to W and an error occurs, the bottom center point is dismissed and the classification continues with the CDF and BAF. To compare the efficiency of the fingerprint recognition system with and without the proposed fingerprint classification method, the related analysis is provided as below,

$$Effect(NC, NS_{Total}, SCS) = \begin{cases} Branch(NC, NS_{Total}, SCS), & \text{first calculation} \\ Matching(NC, NS_{Total}, SCS), & \text{otherwise,} \end{cases} \qquad (35)$$

Fig. 23. Relationship between the Correct Classification Rate (CCR) and the segmentation threshold c (vertical axis: CCR, 89%-93%; horizontal axis: c, 0%-100%).

Table 3
Correct Extraction Rates (CERs) of the nine databases.

Databases | Arch (A) | Whorl (W) | Left (L) | Right (R)
FVC 2000 DB1 | 95.14% (137/144) | 95.42% (229/240) | 72.66% (186/256) | 82.66% (205/248)
FVC 2000 DB2 | 99.22% (127/128) | 96.25% (231/240) | 85.29% (232/272) | 88.33% (212/240)
FVC 2000 DB4 | 100.00% (64/64) | 98.48% (260/264) | 77.13% (253/328) | 71.43% (160/224)
FVC 2002 DB1 | 98.21% (55/56) | 97.12% (202/208) | 78.72% (233/296) | 88.13% (282/320)
FVC 2002 DB2 | 98.61% (71/72) | 94.20% (211/224) | 86.84% (264/304) | 86.07% (241/280)
FVC 2002 DB4 | 100.00% (32/32) | 98.00% (196/200) | 87.50% (301/344) | 78.62% (239/304)
FVC 2004 DB1 | 99.11% (111/112) | 96.76% (209/216) | 70.49% (203/288) | 79.17% (209/264)
FVC 2004 DB2 | 93.06% (67/72) | 97.50% (273/280) | 77.16% (179/232) | 87.84% (260/296)
FVC 2004 DB4 | 100.00% (64/64) | 94.00% (188/200) | 73.72% (230/312) | 69.74% (212/304)
Average CER | 98.12% | 96.48% | 79.07% | 81.45%

Fig. 24. Two examples of singular point locations in the tented arch.

Table 4
Correct Classification Rates (CCRs) of the fingerprint types classified by the proposed method (rows: true types; columns: classified results).

Types | Arch (A) | Whorl (W) | Left (L) | Right (R) | CCR (%)
Perfect Arch (A) | 595 | 5 | 63 | 48 | 83.68
Perfect Whorl (W) | 8 | 1988 | 21 | 18 | 97.69
Perfect Left (L) | 67 | 3 | 888 | 6 | 92.12
Perfect Right (R) | 58 | 2 | 4 | 761 | 92.24
Imperfect Arch (A) | 15 | 0 | 5 | 13 | 45.45
Imperfect Whorl (W) | 4 | 33 | 0 | 0 | 89.19
Imperfect Left (L) | 79 | 15 | 1561 | 13 | 93.59
Imperfect Right (R) | 122 | 6 | 15 | 1504 | 91.32
Average CCR | | | | | 92.74


$$Matching(NC, NS_{Total}, SCS) = \sum_{i=1}^{NC}\left\{ CP_{Ci}\, NS_{Ci} + (1 - CP_{Ci})\left[ NS_{Ci} + \sum_{j=1}^{NB} \frac{NB_{Cj}}{NBS_{Total}}\, Matching\big(NC-1,\, NS_{Total}-NS_{Ci},\, \{SCS - SCS_i\}\big) \right] \right\} X, \qquad (36)$$

$$Branch(NC, NS_{Total}, SCS) = \sum_{i=1}^{NC} \frac{NS_{Ci}}{NS_{Total}}\, \Big\{ Matching\big(NC-1,\, NS_{Total}-NS_{Ci},\, \{SCS - SCS_i\}\big) \Big\}, \qquad (37)$$

$$Efficiency = \frac{NS_{Total} \times X}{Effect(NC, NS_{Total}, SCS)}, \qquad (38)$$

where NC denotes the number of classes; NS_Total denotes the total number of fingerprints in the database; SCS denotes the currently searched fingerprint class; NS_Ci denotes the number of fingerprints in SCS_i; CP_Ci denotes the correct classification probability of the fingerprint class SCS_i; NB denotes the number of branches under SCS (for instance, the W class shown in the second row of Fig. 27 has three branches, so NB = 3); NB_Cj denotes the number of fingerprints in the class of the jth branch; NBS_Total denotes the number of fingerprints in all branched classes; and X denotes the time consumed for classifying a fingerprint image. The calculated efficiency is 3.62 (this number is explained below), as shown in Fig. 29, under the CCRs shown in Table 4.

For a fair comparison with the former methods, the numbers of each type of fingerprint are supposed to be equal. Fig. 28 shows the relationship between the efficiency and the CCR, which indicates a direct proportion between them. Fig. 29 shows the classification efficiencies of the various classification methods compared with searching the entire database (for example, a value of 2 in this figure denotes that a method is faster than searching the entire database by a factor of 2). Notably, "random" represents a naive way that randomly selects a fingerprint class as the first searched class, while the other three results denote that the searching strategy of Fig. 27 is used with the respective classification methods. According to this figure, the proposed method yields good efficiency, which is obviously superior to the other methods.

Fig. 25. Fingerprint classification errors of the A type: (a) caused by a wrong balance line, and (b) by an imperfect image.

Fig. 26. Comparison among various fingerprint classification methods (Jung-Lee's method, Msiza et al.'s method, and the proposed method) in terms of the Correct Classification Rate for Arch, Whorl, Left loop, Right loop, and the average CCR.

Table 5
CCR comparison of various fingerprint classification methods.

Methods | Features | Classifier | CCR | Comments
Msiza et al.'s method (Msiza et al., 2009) | Orientation field, singular point | Rule-based | 84.50% (364/431) | Only 431 samples selected from the database FVC2000 DB1_A (800 samples in total). Notably, the rule-based method does not require a training database.
Jung-Lee's method (Jung & Lee, 2009) | Ridge flow | Markov model | 80.01% (650/812) | All samples of FVC2000 DB1 and FVC2002 DB1 are adopted (each contains 880 samples). However, 136 unknown samples are excluded from their experiments; half of the remaining samples are used for training and the others for testing.
Proposed method | Orientation field, singular point | Rule-based | 92.74% (7345/7920) | All samples of nine FVC databases are adopted (FVC2000 DB1, DB2, DB4; FVC2002 DB1, DB2, DB4; FVC2004 DB1, DB2, DB4; each database contains 880 samples), and the rule-based method does not require a training database.

Fig. 27. Proposed fingerprint classification (searching) strategy over the four classes A, W, L, and R.

Fig. 28. Relationship between classification efficiency and CCR (the curve of relative benefit increase; efficiency value versus the average correct classification rate).


    5. Conclusions

A fingerprint classification system is an effective manner of further improving the correct recognition rate of fingerprints and of reducing the average cross-reference time between a test fingerprint and the fingerprints in an existing database. In this study, nine public FVC databases are adopted, and their samples can be classified into perfect and imperfect fingerprints. To cope with these complex databases, a decision-tree-based scheme comprising the Query, CDF, and BAF methods is proposed. As documented in the experimental results, the proposed method achieves a CCR of 92.74%, which outperforms the two former works, Jung-Lee's (Jung & Lee, 2009) and Msiza et al.'s (Msiza et al., 2009) methods. Moreover, the corresponding processing efficiency is also significantly improved.

    References

Amengual, J. C., Juan, A., Pérez, J. C., Prat, F., Sáez, S., & Vilar, J. M. (1997). Real-time minutiae extraction in fingerprint images. In Proceedings of the 6th international conference on image processing and its applications (pp. 871-875).

Bernard, S., Boujemaa, N., Vitale, D., & Brioct, C. (2001). Fingerprint classification using Kohonen topologic map. Proceedings of the International Conference on Image Processing, 3, 230-233.

Cappelli, R., Lumini, A., Maio, D., & Maltoni, D. (1999). Fingerprint classification by directional image partitioning. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(5), 402-421.

Chang, J. H., & Fan, K. C. (2002). A new model for fingerprint classification by ridge distribution sequences. Pattern Recognition, 35(6), 1209-1223.

Fingerprint database FVC2000: http://bias.csr.unibo.it/fvc2000/.

Fingerprint database FVC2002: http://bias.csr.unibo.it/fvc2002/.

Fingerprint database FVC2004: http://bias.csr.unibo.it/fvc2004/.

Henry, E. R. (1900). Classification and uses of finger prints. London: Routledge.

Hong, L., Wan, Y., & Jain, A. (1998). Fingerprint image enhancement: Algorithm and performance evaluation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8), 777-789.

Jung, H.-W., & Lee, J.-H. (2009). Fingerprint classification using the stochastic approach of ridge direction information. In Proceedings of the IEEE international conference on fuzzy systems (pp. 169-174).

Karu, K., & Jain, A. K. (1996). Fingerprint classification. Pattern Recognition, 29(3), 389-404.

Kawagoe, M., & Tojo, A. (1984). Fingerprint pattern classification. Pattern Recognition, 17(3), 295-303.

Kristensen, T., Borthen, J., & Fyllingsnes, K. (2007). Comparison of neural network based fingerprint classification techniques. In Proceedings of the IEEE international conference on neural networks.

Lia, J., Yau, W. Y., & Wanga, H. (2008). Combining singular points and orientation image information for fingerprint classification. Pattern Recognition, 41(1), 353-366.

Maltoni, D., Maio, D., Jain, A. K., & Prabhakar, S. (2009). Handbook of fingerprint recognition (2nd ed.). Springer.

Mehtre, B. M. (1993). Fingerprint image analysis for automatic identification. Machine Vision and Applications, 6(2-3), 124-139.

Min, J. K., Hong, J. H., & Cho, S. B. (2006). Effective fingerprint classification by localized models of support vector machines. Lecture Notes in Computer Science (LNCS), 3832, 287-293.

Msiza, I. S., Leke-Betechuoh, B., Nelwamondo, F. V., & Msimang, N. (2009). A fingerprint pattern classification approach based on the coordinate geometry of singularities. In Proceedings of the IEEE international conference on systems, man and cybernetics (pp. 510-517).

Park, C. H., & Park, H. (2005). Fingerprint classification using fast Fourier transform and nonlinear discriminant analysis. Pattern Recognition, 38(4), 495-503.

Ratha, N. K., Chen, S., Karu, K., & Jain, A. (1999). A real-time matching system for large fingerprint databases. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(8), 799-813.

Senior, A. W., & Boll, R. (2004). Fingerprint classification by decision fusion. In Automatic fingerprint recognition systems (pp. 27-53). New York: Springer.

Tan, X., & Bhanu, B. (2005). Fingerprint classification based on learned features. IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, 35(3), 287-300.

Tan, X., Bhanu, B., & Lin, Y. (2003). Fingerprint identification: Classification vs. indexing. In Proceedings of the IEEE conference on advanced video and signal based surveillance (AVSS '03) (pp. 151-156).

Wang, L., & Dai, M. (2007). Application of a new type of singular points in fingerprint classification. Pattern Recognition Letters, 28(13), 1640-1650.

Watson, C. I., & Wilson, C. L. (1992). NIST special database 4. US National Institute of Standards and Technology.

Zhang, Q., & Yan, H. (2004). Fingerprint classification based on extraction and analysis of singularities and pseudo ridges. Pattern Recognition, 37(11), 2233-2243.

Fig. 29. Comparison of classification efficiency with different methods (improvement factor over searching the entire database): Random 1.4628, Jung-Lee's method 3.1957, Msiza et al.'s method 3.385, proposed method 3.6212.
