
Biometric Verification with Correlation Filters

B. V. K. Vijaya Kumar, Marios Savvides, Chunyan Xie, Krithika Venkataramani, Jason Thornton, and Abhijit Mahalanobis

Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. © 2004 Optical Society of America

OCIS code: 100.6740.

1. Introduction

Subject verification is important in many applications, such as in control of access to physical spaces (e.g., buildings) and virtual spaces (e.g., computers). Most current authentication systems use passwords or personal identification numbers, making them susceptible to problems ranging from human error to theft. One way to overcome these problems is to employ biometrics for authentication. Biometrics are characteristics that differ from person to person (e.g., face images, fingerprints, iris images, palm geometry, etc.) and cannot be lost or stolen because they are physically attached to the authentic subject. Of course, a determined attacker would go to great lengths to spoof the verification system by synthesizing a fake biometric input that is a close match to the authentic biometric signature. Applications demanding very high levels of security may need a multilayer approach in which both passwords and biometrics are employed.

We will first clarify a few important terms. Verification (also known as 1:1 matching) refers to the process of accepting or rejecting a claimant's identity. Thus a live biometric is matched to a stored biometric or template during verification. In contrast, identification (or 1:N matching) refers to matching a live biometric to one of N biometrics or templates stored in a database. Recognition refers to the union of verification and identification. For some applications (e.g., accessing your ATM or entering a building), verification is the task at hand, whereas in other applications (e.g., looking for a particular face in a crowd), identification is the task. In this paper our focus is on verification. Also, we limit our investigation to those biometrics that provide two-dimensional (2-D) images (e.g., face images, fingerprints, and iris images) and do not consider three-dimensional representations.

Biometric verification involves an enrollment (or training) stage and a verification (or testing) stage. During enrollment, several samples of the subject's biometric signatures are collected, and the system creates one or more biometric templates. These training signatures should reflect the variability anticipated in the use of the biometric verification system. Also, the number of training signatures must be chosen carefully. Using too many training images leads to complex training and testing, whereas using too few images leads to poor generalization and poor verification performance.
During verification the subject's live biometric is matched against the stored template to judge whether the input is from an authentic subject or from an impostor. In many verification situations it is reasonable to assume that the subject (even an impostor) wants to be accepted by the system and thus will be cooperative in providing a biometric input that is of high quality.

B. V. K. Vijaya Kumar ([email protected]), M. Savvides, C. Xie, K. Venkataramani, and J. Thornton are with the Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213. A. Mahalanobis is with Lockheed Martin, Missiles and Fire Control, 5600 Sand Lake Road, Orlando, Florida 32819.

Received 16 May 2003; revised manuscript received 16 September 2003; accepted 8 October 2003.

In a highly controlled setting where the enrollment and verification environments are very similar, biometric matching can be relatively easy, leading to good verification performance. However, the reality is that the biometrics from the same person may differ considerably between enrollment and verification. For example, a face image might appear very different because the illumination levels between an indoor enrollment session and an outdoor verification may be extremely different. In fact, a recent face-recognition vendor test (FRVT 2002) indicates that outdoor illumination may be a severe problem for many face-recognition algorithms.1 Similarly, facial expressions often change, causing difficulty for matching algorithms. In fingerprint verification the appearance of a fingerprint can change significantly owing to the uneven application of pressure on the finger.

Thus accurate image matching in the presence of image variability is important for biometric authentication. Another problem domain where image matching (in the presence of distortions such as rotations and scale changes) is needed is automatic target recognition. Much research has been done in designing and applying advanced correlation filters2 for automatic target recognition. Correlation filters offer several advantages. The most important of these is the built-in shift invariance of the filtering operation: if the input image is translated by a certain amount, then the filter output shifts by exactly the same amount, and this shift is easily estimated from the location of the correlation peak. Another advantage is that correlation filters inherently involve an integration step and thus degrade gracefully in the presence of occlusions, noise, etc. Furthermore, correlation filters can be designed to achieve noise tolerance, discrimination, etc. Thus this paper is aimed at investigating the applicability of correlation filters to biometric verification.

The rest of this paper is organized as follows. Section 2 provides some background for advanced correlation filters. Sections 3, 4, and 5 illustrate the application of correlation filters to face, fingerprint, and iris verification, respectively. Finally, in Section 6 we present our conclusions.

2. Advanced Correlation Filters

Image matching can be performed by cross correlating an input image with a synthesized template and analyzing the resulting correlation output. Figure 1 shows schematically how the cross correlation is obtained. The correlation output is searched for peaks, and the heights of these peaks, as well as other related metrics (e.g., the peak-to-sidelobe ratio, or PSR), are used to determine whether the input biometric is from an authentic subject or from an impostor. The locations of the peaks indicate the positions of the objects.

Fig. 1. Block diagram of the correlation process. FFT, fast Fourier transform; IFFT, inverse fast Fourier transform.

The most basic correlation filter is the matched filter (MF), which is optimum for detecting a known reference image that has been corrupted by additive white noise.3 Also, VanderLugt4 showed how the MF can be implemented with coherent optical computers. However, the MF performs poorly when the reference image appears with distortions (e.g., rotations and scale changes). As a result, one MF is needed for each appearance of a biometric. A face image, for example, can exhibit changes due to pose differences, expressions, illumination changes, and aging, as well as intentional or unintentional occlusions, and one MF would be needed for each combination of these variations. Given the resulting combinatorial explosion, MFs are clearly impractical. Hester and Casasent5 introduced synthetic discriminant function (SDF) filters to address this challenge. The first SDF filter was a weighted sum of MFs in which the weights were chosen so that the correlation outputs corresponding to the training images would yield prespecified values at the origin. For example, the correlation output values corresponding to the training images of authentics can be set to 1, and the output values due to the impostor training images can be set to 0. We use "correlation output" and "correlation peak" synonymously, with the understanding that, for a well-designed correlation filter, the correlation peak will occur at the origin if the input image is that of an authentic subject and is centered.

Although the equal correlation peak (ECP) SDF filter produces prespecified correlation peak values, it also results in large sidelobes and does not consider noise in the input image. Minimum variance SDF (MVSDF) filters were introduced to minimize the output variance6 due to input noise, and minimum average correlation energy (MACE) filters7 can be used to improve correlation peak sharpness. Typically, MVSDF filters emphasize low frequencies, whereas MACE filters emphasize high frequencies; however, an optimal trade-off can be achieved between the MVSDF and MACE filters.8 Also, initial SDF filter designs used hard constraints.
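As an illustration of the correlation pipeline in Fig. 1, the following is a minimal sketch in Python with NumPy (the language, array sizes, and names are our choices for illustration; the paper prescribes no implementation):

import numpy as np

def cross_correlate(image, template):
    # FFT of the input, multiplication by the conjugate filter spectrum,
    # inverse FFT: the block diagram of Fig. 1.
    F = np.fft.fft2(image)
    H = np.fft.fft2(template, s=image.shape)   # zero-pad the template
    g = np.fft.ifft2(F * np.conj(H)).real      # correlation plane
    return np.fft.fftshift(g)                  # center the zero-shift bin

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))
shifted = np.roll(img, (5, -3), axis=(0, 1))   # translated input
g = cross_correlate(shifted, img)
peak = np.unravel_index(np.argmax(g), g.shape)
print(peak)   # peak offset from the center recovers the (5, -3) shift

Because the correlation here is computed with circular FFTs, the shift estimate wraps around the borders; a practical system would window or zero-pad the inputs.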

That is, correlation peak values were prespecified (e.g., 1 for authentics and 0 for impostors), providing no guarantee that the correlation peaks would take on the prespecified 1 and 0 values for nontraining images. We may be able to obtain better performance with nontraining images by using unconstrained correlation filters.9 One example of an unconstrained correlation filter is the maximum average correlation height (MACH) filter. Another extension of the original SDF filter approach is the distance classifier correlation filter (DCCF),10 in which the distance between a prototype correlation output array and the obtained correlation array, rather than just the peak value, is used for pattern recognition. When the image distortion is amenable to an algebraic mapping, as in the case of in-plane rotation, we can design a single correlation filter to yield a specified correlation peak in response to in-plane rotations. This is achieved with circular harmonic function (CHF) filters.11 Finally, there may be some benefit in extending linear correlation filters to include nonlinear mappings of the input. The resulting filter structure is known as the polynomial correlation filter (PCF).12 Although we use mostly MACE filters in our studies, we briefly review some of these advanced correlation filter designs in this section to provide a convenient summary of the many designs that may be useful for biometric verification.

A. Equal Correlation Peak Synthetic Discriminant Function Filters

Hester and Casasent5 introduced the concept of SDF filters in 1980. The first SDF filter required that the associated template be a weighted sum of training images, with the weights chosen so that the resulting correlation output values at the origin take on prespecified values. For example, for all training images from the same class, the correlation value should be equal, leading to the name ECP SDF. In this subsection we introduce the notation for and provide a brief review of the ECP SDF.

A set of linear equations describing the constraints on the correlation peaks can be written as

    X^+ h = u,    (1)

where h is the filter column vector with d elements and the superscript + denotes the conjugate transpose (the same as the transpose if the vector is real). Also, X = [x_1 x_2 ... x_N] is a d × N matrix with the N training image vectors (each with d pixels) as its columns, and u = [u_1 u_2 ... u_N]^T is an N × 1 column vector containing the desired peak values for the training images. Because the number of training images N is generally smaller than the dimension d of the filter, the system of linear equations in Eq. (1) is underdetermined. Given that h is a weighted sum of the training images, the ECP SDF has the unique expression shown in Eq. (2), and it is easy to verify that this h satisfies the hard constraints in Eq. (1):

    h = X (X^+ X)^{-1} u.    (2)
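The ECP SDF synthesis of Eq. (2) is a few lines of linear algebra. Below is a brief sketch in Python with NumPy (an assumed tool choice); the training images are Fourier transformed and stacked as the columns of X:

import numpy as np

def ecp_sdf(train_imgs, u):
    # h = X (X^+ X)^{-1} u, which satisfies the constraints X^+ h = u of Eq. (1).
    X = np.stack([np.fft.fft2(im).ravel() for im in train_imgs], axis=1)  # d x N
    G = X.conj().T @ X                       # N x N Gram matrix X^+ X
    h = X @ np.linalg.solve(G, u.astype(complex))
    return h, X

rng = np.random.default_rng(1)
imgs = rng.standard_normal((3, 32, 32))      # three synthetic training images
h, X = ecp_sdf(imgs, np.ones(3))             # authentic class: peaks set to 1
print(np.allclose(X.conj().T @ h, 1.0))      # the hard constraints hold exactly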

B. Minimum Average Correlation Energy Filters

The MACE filter7 is designed to minimize the average correlation energy (ACE) resulting from the training images while constraining the correlation values at the origin to prespecified values. Correlation outputs from MACE filters typically exhibit sharp correlation peaks, making peak detection and location relatively easy. Let D_i be a d × d diagonal matrix containing the power spectrum of training image i along its diagonal, and let the diagonal matrix D be the average of all the D_i. Then the MACE filter is given as follows7:

    h = D^{-1} X (X^+ D^{-1} X)^{-1} u.    (3)

The MACE filter in Eq. (3) yields sharp correlation peaks in response to training images and near-training images from the desired class, as well as small output values in response to images from the other classes. Because D is a diagonal matrix, we can see from Eq. (3) that the main computational task is the inversion of the N × N matrix X^+ D^{-1} X.

C. Minimum Variance Synthetic Discriminant Function Filters

In order to produce sharp correlation peaks, MACE filters amplify high spatial frequencies and thus also amplify noise. Suppose that we can model the noise in the input images as zero mean, additive, and stationary. Let C be a d × d diagonal matrix whose diagonal element C(k, k) represents the noise power spectral density at frequency k. Minimizing the output noise variance (ONV) subject to the constraints in Eq. (1) results in the following closed-form solution6:

    h = C^{-1} X (X^+ C^{-1} X)^{-1} u.    (4)

In many applications where the noise power spectral density is unknown, a good model is white noise, for which C = I, the identity matrix. In that case the MVSDF is the same as the original ECP SDF in Eq. (2).

D. Optimal Trade-Off Synthetic Discriminant Function Filters

The MACE filter in Eq. (3) emphasizes high spatial frequencies to produce sharp correlation peaks, whereas the MVSDF filter in Eq. (4) typically emphasizes low spatial frequencies to achieve noise tolerance. A way to optimally trade off8 noise tolerance against peak sharpness is to use the optimal trade-off filter (OTF) given below:

    h = T^{-1} X (X^+ T^{-1} X)^{-1} u,    (5)

where T = \alpha D + \sqrt{1 - \alpha^2} C and 1 >= \alpha >= 0. It is easy to see that when \alpha = 1 the OTF reduces to the MACE filter, and that when \alpha = 0 it simplifies to the MVSDF filter. Note that the h in Eq. (5) satisfies the SDF constraints in Eq. (1).
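A sketch of the OTF family of Eq. (5) in Python/NumPy under a white-noise model (C = I, an assumption for illustration); sweeping alpha moves the design between the MVSDF and MACE extremes:

import numpy as np

def otf_filter(train_imgs, u, alpha):
    # h = T^{-1} X (X^+ T^{-1} X)^{-1} u with T = alpha D + sqrt(1 - alpha^2) C.
    X = np.stack([np.fft.fft2(im).ravel() for im in train_imgs], axis=1)
    D = np.mean(np.abs(X) ** 2, axis=1)      # diagonal of D
    C = np.ones_like(D)                      # white noise: C = I
    T = alpha * D + np.sqrt(1.0 - alpha ** 2) * C
    TinvX = X / T[:, None]
    h = TinvX @ np.linalg.solve(X.conj().T @ TinvX, u.astype(complex))
    return h

rng = np.random.default_rng(3)
imgs = rng.standard_normal((3, 32, 32))
for alpha in (0.0, 0.5, 1.0):                # MVSDF ... trade-off ... MACE
    h = otf_filter(imgs, np.ones(3), alpha)
    print(alpha, float(np.linalg.norm(h)))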

E. Optimal Trade-Off Circular Harmonic Function Filters

If the distortion of interest is in-plane rotation, then coordinate transformations, such as polar mappings, can be combined with Fourier transforms (FTs) to design optimal trade-off circular harmonic function (OTCHF) filters11 that handle in-plane rotation.

In-plane rotation appears as a shift in the polar domain, and shifts are easily handled by correlation filters. The OTCHF filter design optimally trades off among various correlation filter performance criteria (such as ONV and ACE) while achieving a specified in-plane rotation response of the correlation peak. Polar mappings are used during the filter design; however, filter use during the verification stage is like that of any other correlation filter, and no coordinate transformation is needed.

Let F(u, v) denote the 2-D FT of the reference image f(x, y), and let F(r, \theta) be the polar transform of F(u, v), where r and \theta correspond to the magnitude and angle of the frequency pair (u, v), respectively. Because F(r, \theta) is periodic in \theta with period 2\pi, it can be expressed as the following CHF expansion:

    F(r, \theta) = \sum_{k=-\infty}^{\infty} F_k(r) \exp(jk\theta),    (6)

where the kth CHF F_k(r) is given as follows:

    F_k(r) = \frac{1}{2\pi} \int_0^{2\pi} F(r, \theta) \exp(-jk\theta) \, d\theta.    (7)

When the input image is rotated by \Delta\theta, its 2-D FT rotates by exactly the same amount, and the kth CHF becomes F_k(r) \exp(jk\Delta\theta). Let H(u, v) denote the filter and H_k(r) denote its kth CHF. Then the filter output (at the origin) for an in-plane rotation of \Delta\theta is given as follows:

    C(\Delta\theta) = \int_{r=0}^{\infty} \int_{\theta=0}^{2\pi} \left[ \sum_{k=-\infty}^{\infty} F_k(r) \exp(jk(\theta + \Delta\theta)) \right] \left[ \sum_{l=-\infty}^{\infty} H_l^*(r) \exp(-jl\theta) \right] r \, dr \, d\theta
                    = \int_{r=0}^{\infty} \sum_{k=-\infty}^{\infty} \sum_{l=-\infty}^{\infty} F_k(r) H_l^*(r) \exp(jk\Delta\theta) \left[ \int_{\theta=0}^{2\pi} \exp(j(k - l)\theta) \, d\theta \right] r \, dr
                    = 2\pi \int_{r=0}^{\infty} \sum_{k=-\infty}^{\infty} F_k(r) H_k^*(r) \exp(jk\Delta\theta) \, r \, dr,    (8)

where

    C_k = 2\pi \int_{r=0}^{\infty} F_k(r) H_k^*(r) \, r \, dr.    (9)

Thus the filter output C(\Delta\theta) can be expressed as follows:

    C(\Delta\theta) = \sum_{k=-\infty}^{\infty} C_k \exp(jk\Delta\theta).    (10)

Complete invariance to in-plane rotation can be obtained only if we allow the filter to contain a single CHF, resulting in the loss of much of the object-specific information; clearly this is not a good idea. More attractive is the use of many CHF components to achieve a specific desired rotation response C(\Delta\theta). Equation (10) shows that the rotation response is completely controlled by the C_k coefficients defined in Eq. (9). These coefficients can be determined by methods commonly used for finite impulse response filter design.13 Once the C_k coefficients are determined, we can find the filter CHFs H_k(r) from Eq. (9) while optimally trading off performance criteria such as ONV and ACE. The resulting filters are OTCHF filters.11
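Equations (6) and (7) amount to a Fourier series in the angular variable, so the CHFs can be approximated by sampling the 2-D FT on rings and taking an FFT along the angle. A minimal sketch, assuming NumPy and crude nearest-neighbor sampling (a real design would interpolate):

import numpy as np

def circular_harmonics(F, n_radii=16, n_angles=64):
    # Sample F(u, v) on circles of radius r, then FFT over theta:
    # row i of the result holds the harmonics F_k(r_i) of Eq. (7).
    Fc = np.fft.fftshift(F)                  # put the DC term at the center
    cy, cx = Fc.shape[0] // 2, Fc.shape[1] // 2
    radii = np.linspace(1, min(cy, cx) - 1, n_radii)
    theta = 2 * np.pi * np.arange(n_angles) / n_angles
    rings = np.empty((n_radii, n_angles), dtype=complex)
    for i, r in enumerate(radii):
        ys = np.round(cy + r * np.sin(theta)).astype(int)
        xs = np.round(cx + r * np.cos(theta)).astype(int)
        rings[i] = Fc[ys, xs]
    return radii, np.fft.fft(rings, axis=1) / n_angles

rng = np.random.default_rng(4)
F = np.fft.fft2(rng.standard_normal((64, 64)))
radii, Fk = circular_harmonics(F)
print(Fk.shape)                              # (16, 64): harmonics per ring

Choosing the C_k of Eq. (9) (e.g., with standard FIR design routines13) and solving for H_k(r) ring by ring then yields the desired rotation response.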

F. Unconstrained Synthetic Discriminant Function Filters

The optimal trade-off SDF filter and its special cases (e.g., the MACE filter) are designed to satisfy the constraints in Eq. (1). However, it may be advantageous to replace these hard constraints with softer requirements, such as maximizing the average of the correlation outputs from the desired class. There are a couple of reasons for doing this. First, nontraining images always yield values different from those specified for the training images, so it may be counterproductive to insist on attaining specific values for the training images. Also, relaxing or removing the hard constraints enlarges the solution domain. In this subsection we describe an approach that relaxes the peak constraints.

The key idea is to treat the correlation plane as a new pattern generated by the filter in response to an input image. If the filter is distortion tolerant, its output should not change much even if the input pattern exhibits some variations. Thus the emphasis is not only on the correlation peak but also on the entire shape of the correlation surface. If g_i(m, n) is the correlation output produced in response to the ith training image, we can quantify the variability in these correlation outputs by the average similarity measure (ASM), defined as follows:

    ASM = \frac{1}{N} \sum_{i=1}^{N} \sum_m \sum_n [g_i(m, n) - \bar{g}(m, n)]^2,    (11)

where

    \bar{g}(m, n) = \frac{1}{N} \sum_{j=1}^{N} g_j(m, n)

is the average of the N training image correlation outputs. Ideally, all correlation surfaces produced by a distortion-invariant filter (in response to a valid input pattern) would be the same, and the ASM would be zero.

In practice, minimizing the ASM improves the filter's stability. Toward that goal, we first express the ASM in terms of the filter vector h. Let m = (1/N) \sum_{i=1}^{N} x_i represent the average of the training image FTs. In the following discussion, M and X_i are diagonal matrices with the same elements along their main diagonals as in the vectors m and x_i. Using the frequency-domain relations g_i = X_i^* h and \bar{g} = M^* h, we rewrite the ASM as follows:

    ASM = \frac{1}{Nd} \sum_{i=1}^{N} |X_i^* h - M^* h|^2
        = \frac{1}{Nd} \sum_{i=1}^{N} h^+ (X_i - M)(X_i - M)^* h
        = h^+ \left[ \frac{1}{Nd} \sum_{i=1}^{N} (X_i - M)(X_i - M)^* \right] h
        = h^+ S h,    (12)

where the matrix

    S = \frac{1}{Nd} \sum_{i=1}^{N} (X_i - M)(X_i - M)^*

is also diagonal, making its inversion relatively easy.

In addition to being distortion tolerant, a correlation filter must yield large peak values to facilitate detection. Toward this end, instead of imposing the hard constraints in Eq. (1), we maximize the average correlation height (ACH), defined below:

    ACH = \frac{1}{N} \sum_{i=1}^{N} x_i^+ h = m^+ h.    (13)

Finally, we should also keep the ONV small. To make the ACH large while keeping the ASM and the ONV small, the filter is designed to maximize

    J(h) = \frac{|ACH|^2}{ASM + ONV} = \frac{|m^+ h|^2}{h^+ S h + h^+ C h} = \frac{h^+ m m^+ h}{h^+ (S + C) h}.    (14)

The maximum average correlation height (MACH) filter9 given below maximizes the ratio in Eq. (14):

    h = \gamma (S + C)^{-1} m,    (15)

where \gamma is a normalizing scale factor.
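A sketch of the MACH synthesis of Eq. (15) in Python/NumPy, assuming white noise (C = I) and absorbing the 1/(Nd) scale of S and the factor gamma into the overall filter scale:

import numpy as np

def mach_filter(train_imgs):
    X = np.stack([np.fft.fft2(im).ravel() for im in train_imgs], axis=1)  # d x N
    m = X.mean(axis=1)                                # mean training FT
    S = np.mean(np.abs(X - m[:, None]) ** 2, axis=1)  # diagonal of S (Eq. (12))
    C = np.ones_like(S)                               # white-noise covariance
    return m / (S + C)                                # h = (S + C)^{-1} m

rng = np.random.default_rng(5)
imgs = rng.standard_normal((5, 32, 32))
h = mach_filter(imgs)
X = np.stack([np.fft.fft2(im).ravel() for im in imgs], axis=1)
print(float(np.vdot(X.mean(axis=1), h).real))         # ACH = m^+ h (Eq. (13))

Note that no linear system needs to be solved: with the hard constraints removed, the design reduces to an elementwise division in the frequency domain.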

G. Distance Classifier Correlation Filters

The DCCF is aimed at transforming the training patterns so that the classes become more compact and more separated from one another. The filtering process can be expressed mathematically as multiplication by a diagonal matrix H in the frequency domain. Figure 2 is a schematic of the basic idea for a three-class example, where m_1, m_2, and m_3 represent the class centers (obtained by averaging the FTs of the corresponding training images) and z represents an unknown input to be classified. The transformation matrix H is designed to make the classes distinct by moving the class centers apart while shrinking the boundaries around each class, so that z can be more accurately identified with its correct class. The distance of a vector x to a reference m_k under a linear transform H is given as

    d_k = |Hx - Hm_k|^2 = (x - m_k)^+ H^+ H (x - m_k).    (16)

Fig. 2. Schematic of the DCCF transformation process, which increases the interclass distance and makes the classes more compact.

For the general C-class distance classifier problem, let x_{ik} be the d-dimensional column vector containing the FT of the ith image of the kth class, where 1 <= i <= N and 1 <= k <= C. We assume without loss of generality that each class has N training images. Let m_k be the mean FT of class k. Under the transform H, the separation between the C classes is represented by A(h), defined as follows:

    A(h) = \frac{1}{C^2} \sum_{k=1}^{C} \sum_{l=1}^{C} |\bar{v}_{kl}|^2
         = \frac{1}{C^2} \sum_{k=1}^{C} \sum_{l=1}^{C} |m_k^+ h - m_l^+ h|^2
         = \frac{1}{C^2} \sum_{k=1}^{C} \sum_{l=1}^{C} h^+ (m_l - m_k)(m_l - m_k)^+ h
         = h^+ T h,    (17)

where

    T = \frac{1}{C} \sum_{k=1}^{C} (m - m_k)(m - m_k)^+

is a d × d nondiagonal matrix of rank <= (C - 1) and m = (1/C) \sum_{k=1}^{C} m_k is the mean of the entire dataset. To improve distortion tolerance, we want to minimize the following criterion B(h), which represents compactness:

    B(h) = \frac{1}{C} \sum_{k=1}^{C} \frac{1}{N} \sum_{i=1}^{N} h^+ (X_{ik} - M_k)(X_{ik} - M_k)^* h = h^+ S h.    (18)

Our objectives of maximizing A(h) and minimizing B(h) are met by maximizing the ratio

    J(h) = \frac{A(h)}{B(h)} = \frac{h^+ T h}{h^+ S h}    (19)

with respect to h. The solution vector h is the dominant eigenvector of S^{-1} T, and we refer to this h as the DCCF. During testing, the distance between the transformed input and the ideal reference for class k is given by d_k:

    d_k = |H^* z - H^* m_k|^2 = p + b_k - (z^+ h_k + h_k^+ z),  1 <= k <= C,    (20)

where z is the input image, p = |H^* z|^2 is the transformed input image energy, b_k = |H^* m_k|^2 is the energy of the transformed kth class mean, and h_k = H H^* m_k is viewed as the effective filter for class k.

In general, the target may be anywhere in the input image. For the shift-invariant distance calculation, we are interested in the smallest value of d_k over all possible shifts of the target with respect to the class references (i.e., the best possible match between the input and the reference for class k). In Eq. (20), p and b_k are both positive and independent of the position of the target; thus the smallest value of d_k over all shifts is obtained when the third term (i.e., z^+ h_k) is as large as possible. Therefore this term is chosen as the peak value of the full space-domain cross correlation of z and h_k. Because distances must be computed to only C classes, we require C such filters. It should be noted that, for a given transform H, all the d_k, 1 <= k <= C, share the same p, which could therefore be dropped.
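The rank structure of T makes DCCF training cheap even though T is d × d: T h can be applied through the C vectors v_k = m - m_k without forming T. The following sketch, assuming NumPy, uses a simple power iteration for the dominant eigenvector of S^{-1} T and then classifies with the distance of Eq. (20); the synthetic three-class data are purely illustrative:

import numpy as np

def dccf_train(class_imgs, iters=100):
    # class_imgs: list of (N, rows, cols) arrays, one array per class.
    Ms = [np.stack([np.fft.fft2(im).ravel() for im in c], axis=1) for c in class_imgs]
    mk = [M.mean(axis=1) for M in Ms]              # class-mean FTs
    m = np.mean(mk, axis=0)                        # overall mean
    S = np.mean([np.mean(np.abs(M - mu[:, None]) ** 2, axis=1)
                 for M, mu in zip(Ms, mk)], axis=0) + 1e-12   # diagonal of Eq. (18)
    V = np.stack([m - mu for mu in mk], axis=1)    # columns v_k, shape d x C
    h = np.ones(m.size, dtype=complex)
    for _ in range(iters):                         # power iteration on S^-1 T
        h = (V @ (V.conj().T @ h)) / (len(mk) * S)
        h /= np.linalg.norm(h)
    return h, mk

def dccf_distance(z_img, h, mu):
    # d_k = |H* z - H* m_k|^2 of Eq. (20), with H = diag(h).
    z = np.fft.fft2(z_img).ravel()
    Hz, Hm = np.conj(h) * z, np.conj(h) * mu
    return float(np.vdot(Hz, Hz).real + np.vdot(Hm, Hm).real
                 - 2.0 * np.vdot(Hm, Hz).real)

rng = np.random.default_rng(6)
classes = [rng.standard_normal((4, 16, 16)) + 3.0 * rng.standard_normal((1, 16, 16))
           for _ in range(3)]                      # three synthetic classes
h, mk = dccf_train(classes)
probe = classes[1][0]
print(np.argmin([dccf_distance(probe, h, mu) for mu in mk]))   # expect 1

For a target at an unknown position, the cross term z^+ h_k would instead be taken as the peak of the full cross correlation, as described above.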

H. Polynomial Correlation Filters

The main idea underlying PCFs is illustrated in Fig. 3. Although any point nonlinearity can be used in a PCF, we consider powers (e.g., x^2, x^3, etc.) for reasons of analytical simplicity. The corresponding form of the output is then given by

    g_x = A_1 x^1 + A_2 x^2 + ... + A_N x^N,    (21)

where x^i represents the vector x with each of its elements raised to the power i, and A_i is a matrix of coefficients associated with the ith term of the polynomial. It should be noted that the output g_x is also a vector.

We refer to the form in Eq. (21) as the PCF. Thus, if x represents the input image in vector notation, then g_x is a vector that represents the output correlation plane as a polynomial function of x. To ensure that the output is shift invariant, all the coefficient matrices are required to be Toeplitz. It can then be shown that each term in the polynomial can be computed as a linear shift-invariant filtering operation, i.e.,

    A_i x^i = h_i(m, n) \otimes x^i(m, n),    (22)

or, in other words, filtering x^i(m, n) by h_i(m, n) is equivalent to multiplying x^i by A_i. The output of the polynomial correlation filter can thus be expressed as

    g_x(m, n) = \sum_{i=1}^{N} h_i(m, n) \otimes x^i(m, n).    (23)

The corresponding structure of the filter is shown in Fig. 3. The objective is to find the filters h_i(m, n) such that the structure shown in Fig. 3 optimizes a performance criterion similar to that in Eq. (19). We omit the details of the filter derivation here because they are available elsewhere.12 However, it is important to emphasize two aspects of PCFs. First, the component filters in all branches are designed jointly rather than independently. Second, all correlation outputs can be summed without the need to register them, because any input shift manifests itself as exactly the same shift in all correlation outputs (i.e., the entire structure is still shift invariant but no longer linear).

Fig. 3. The PCF architecture shown to the Nth order, with h_1-h_N representing the filters.

I. Peak-to-Sidelobe Ratio

In Subsections 2.A-2.H we reviewed many methods for designing correlation filters. In many of these methods the verification decision is based on the correlation peak, namely, the largest value in the correlation output. This correlation peak value, however, will change if the brightness level of the input image changes. To avoid this dependency on brightness and to make the recognition decision depend on a larger region of the correlation output, we can employ the following figure of merit, known as the PSR:

    PSR = \frac{peak - mean}{\sigma}.    (24)

The PSR estimation is illustrated in Fig. 4. First the correlation peak is located, and a small mask (e.g., of size 5 × 5) is centered at the peak. The sidelobe region is defined as the annular region between this small mask and a larger square (e.g., of size 20 × 20), also centered at the peak. The mean and the standard deviation \sigma of the sidelobe region are computed and are used to estimate the PSR with Eq. (24). If the PSR exceeds a fixed threshold, the input biometric is declared to be that of an authentic subject; otherwise, it is considered to be from an impostor. One advantage of the PSR metric is that it does not change with uniform amplification or attenuation of the image intensities.

Fig. 4. Estimation of the PSR.
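A sketch of the PSR estimate of Eq. (24) and Fig. 4, assuming NumPy and the 5 × 5 mask / 20 × 20 sidelobe region mentioned above:

import numpy as np

def psr(corr, mask=5, region=20):
    corr = np.fft.fftshift(corr)                 # move the peak away from borders
    py, px = np.unravel_index(np.argmax(corr), corr.shape)
    r, m = region // 2, mask // 2
    patch = corr[py - r:py + r, px - r:px + r].copy()
    patch[r - m:r + m + 1, r - m:r + m + 1] = np.nan   # excise the central mask
    side = patch[~np.isnan(patch)]               # annular sidelobe region
    return (corr[py, px] - side.mean()) / side.std()

rng = np.random.default_rng(8)
template = rng.standard_normal((64, 64))
F = np.fft.fft2(template)
auto = np.fft.ifft2(F * np.conj(F)).real         # autocorrelation: sharp peak
print(float(psr(auto)))                          # large PSR for a perfect match

Since the numerator and the denominator of Eq. (24) both scale linearly with the image intensity, a uniform gain change leaves the PSR unchanged, which is the property exploited here.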

3. Face Verification

In this section we show face verification results in the presence of expression changes and illumination variations.

A. Facial Expressions

For the facial expression experiments we used the database collected at the Advanced Multimedia Processing Laboratory in the Department of Electrical and Computer Engineering at Carnegie Mellon University.14 The database consists of 13 subjects whose facial images were captured with varying expressions; for each subject there are 75 images. The faces were captured in a video sequence in which a face tracker15 tracked the movement of the user's head and, based upon an eye localization routine, extracted registered face images 64 × 64 in size. Example images are shown in Fig. 5.

Fig. 5. Example images from the Advanced Multimedia Processing Laboratory's facial expression database.

We first used only 3 training images (images 1, 21, and 41 of the 75, each corresponding to a different expression) to design each person's MACE filter. To evaluate the performance of each person's MACE filter, cross correlations of all the images in the dataset (i.e., 13 × 75 = 975 images) were computed with that person's MACE filter, resulting in 975 correlation outputs (corresponding to 75 true-class images and 900 false-class images), and the corresponding PSRs were measured and recorded. Because there are 13 subjects and one filter per subject, we computed a total of 13 × 975 = 12675 correlations. The PSR should be large for authentic subjects and small for impostors.

Figure 6 shows the best MACE filter PSR performance (top plot, person 1) and the worst PSR performance (bottom plot, person 2) as a function of image index. The PSRs of authentic subjects are shown as solid curves and those of impostors as dotted curves.

Fig. 6. PSRs for person 1 (top) and person 2 (bottom). Solid curves, authentic subjects; dotted curves, impostors; "x" symbols, training images.

One important observation from all 13 PSR plots (only two of which are shown in Fig. 6) is that all the false-class images (12 × 75 = 900 per filter) consistently yielded PSRs below 10 (dotted curves at the bottom of both plots) for all 13 filters. The three "x" symbols indicate the PSRs for the three training images used to synthesize the MACE filter for each person; as expected, they yield high PSR values. The results for person 2, whose filter yields the worst performance (i.e., the smallest margin of separation between the true- and false-class PSR values), suggest that the distortions expected in the test set were not adequately captured by the three training images used. Indeed, a close look at the dataset shows that person 2 exhibits significantly more variation in facial expression than the other subjects. Thus more training images may be needed to improve authentication of face images belonging to person 2. Nevertheless, even person 2's filter, designed with only three training images, performed reasonably well, yielding a 99.1% verification performance.

Table 1 shows the error rates achieved with MACE filters designed from only three training images. FAR, FRR, and EER refer to the false-acceptance rate, the false-rejection rate, and the equal-error rate, respectively; the EER is obtained when the threshold is chosen such that the FRR equals the FAR. Table 1 shows that the overall EER (13 filters, each tested on 975 images) is only 0.15% for MACE filters designed from only 3 training images per person. When we increased the number of training images to 25, the verification accuracy improved to 100% on this facial expression database.

Table 1. Error Percentages for All 13 MACE Filters Synthesized with 3 Training Images

    Error Rate        Person:  1    2    3    4    5    6    7    8    9   10   11   12   13
    FAR at FRR = 0             0   1.3   0    0   1.0   0    0    0    0    0    0    0    0
    EER                        0   0.9   0    0   1.0   0    0    0    0    0    0    0    0
    FRR at FAR = 0             0   0.2   0    0   2.6   0    0    0    0    0    0    0    0

We compared the performance of the MACE filters for face verification with that of the eigenface method.16 The original eigenface approach used the training faces from different people to compute a single universal eigenface subspace. Although such a universal subspace is optimal for representing all the people's training faces in the minimum mean-squared-error sense, it may not adequately capture the detailed information that discriminates one person's face from another's. A better approach for face verification may be to use each person's training faces to build an individual eigenface subspace.17 To provide a benchmark for the MACE filter performance, we repeated the face verification experiment with the individual eigenface subspace method, using the same training images as for the MACE filters. For each person's eigenface subspace, we projected all the face images onto the subspace and reconstructed them to record the reconstruction error. If the reconstruction residual is smaller than a threshold value, the input is declared authentic; otherwise, it is deemed that of an impostor. Table 2 shows the error percentages for the individual eigenspace method based on three training images (numbers 1, 21, and 41) per person. From Table 2, the average EER for the individual eigenspace method is 0.85%, which is higher than the 0.15% obtained with the MACE filters. We also repeated the individual eigenspace simulations with 25 training images and observed that the 100% verification performance achieved on this dataset with the MACE filters was also achieved with the individual eigenspace method.

Table 2. Error Percentages for All 13 Individual Subspaces with 3 Training Images Per Person

    Error Rate        Person:  1    2    3    4    5    6    7    8    9   10   11   12   13
    FAR at FRR = 0             0   5.3  2.6   0    0    0    0   7.6   0    0    0    0    0
    EER                        0   3.5  2.1   0    0    0    0   5.4   0    0    0    0    0
    FRR at FAR = 0             0   8.0 10.6   0    0    0    0  14.7   0    0    0    0    0
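The threshold sweep behind the FAR/FRR/EER entries of Tables 1 and 2 can be written compactly; the sketch below assumes NumPy and uses synthetic PSR scores only to show the mechanics:

import numpy as np

def error_rates(authentic, impostor):
    # Sweep the decision threshold over all observed scores.
    t = np.unique(np.concatenate([authentic, impostor]))
    t = np.append(t, t[-1] + 1.0)                # one threshold above all scores
    far = np.array([(impostor >= x).mean() for x in t])   # false acceptances
    frr = np.array([(authentic < x).mean() for x in t])   # false rejections
    i = np.argmin(np.abs(far - frr))             # approximate EER crossing
    return (far[i] + frr[i]) / 2, far[frr == 0].min(), frr[far == 0].min()

rng = np.random.default_rng(9)
auth = rng.normal(40.0, 8.0, 75)                 # authentic PSRs (illustrative)
imp = rng.normal(8.0, 2.0, 900)                  # impostor PSRs, below ~10
eer, far_at_frr0, frr_at_far0 = error_rates(auth, imp)
print(eer, far_at_frr0, frr_at_far0)             # the three rows of Tables 1 and 2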

B. Illumination

To examine the performance of correlation filters for face verification in the presence of illumination changes, we used the illumination subset of the Carnegie Mellon University pose, illumination, and expression (PIE) database,18 which contains 65 subjects under 21 different illuminations. All face images were normalized for translation, scale, and rotation based on handpicked feature locations, such as the eyes and mouth. We show the results for the harder illumination dataset, which was captured with the room lights off (PIE-NL). Figure 7 shows an example set of images of person 2. We also investigated the PIE illumination subset with the room lights on and achieved even better performance than is reported here.

Fig. 7. Twenty-one images from the PIE database for person 2. Each of these images was captured under a different illumination condition and without background lighting.

We synthesized a single MACE filter for each person by using three extreme lighting variations, namely, image 3 (left shadow), image 7 (frontal lighting), and image 16 (right shadow). We then cross correlated the whole database with each person's filter and recorded the PSRs. This involves 65 correlation filters, each tested with 21 × 65 = 1365 images, for a total of 65 × 1365 = 88725 correlations. In this experiment we observed that the authentic PSRs were always greater than all the impostor PSRs, as shown in the example PSR plot for person 2 in Fig. 8. In fact, there is a clear margin of separation, yielding a single threshold that completely discriminates the authentic subjects from the impostors for all 65 people (i.e., 100% verification accuracy was achieved with both datasets). This can be partly explained by the fact that, for a Lambertian surface model, all possible illuminations lie in a three-dimensional linear subspace.19

Fig. 8. PSR comparison between MACE and unconstrained MACE (UMACE) filters on the PIE-L dataset (with background lighting) for person 2. Authentic subjects, top plot; impostors, bottom plot.

It is clear that when we use three linearly independent training images that capture the extreme lighting variations, any face image that lies in the convex hull of these training images will be perfectly recognized, yielding large PSRs that are near the training PSRs. In practice, images that fall near this subspace also yield high PSR values and thus are correctly verified.

To examine what happens when we train on near-frontal lighting instead, we repeated the experiment with images 7, 10, and 19 (near-frontal lighting) and tested on the whole database. We observed a verification accuracy of 93.5% at an FAR of 0%. This shows that MACE filters exhibit built-in tolerance to illumination variations. In other words, in order to produce sharp peaks, these filters must emphasize high spatial frequencies; as a result, they attenuate low spatial frequencies, which are the frequencies most affected by lighting changes, and therefore achieve some degree of tolerance to illumination variations.

We also conducted extensive experiments in face identification (matching the incoming face to one in the database). The performance of correlation filters in the identification task was even better, often providing 100% recognition accuracy with the PIE database.

4. Fingerprint Verification

Fingerprints have a long history of use in subject identification. Most fingerprint algorithms use minutiae20,21 for fingerprint verification and identification. In this section we show that correlation filters can also be used for fingerprint verification. NIST Special Database 24,22 which contains fingerprints with plastic distortion, is used here because it contains images from a digital fingerprint sensor and exhibits significant distortions. These images are from an optical fingerprint sensor of 500-dpi resolution.

We use a subset of the plastic distortion set: the thumbprints of 10 people, with 300 images (448 × 478 pixels, zero-padded to 512 × 512 pixels) for each thumb. The thumbprints of persons 9 and 10 are excessively twisted; sample fingerprints of person 10 are shown in Fig. 9. The thumbprints of persons 3 and 7 exhibit much less variation.

Fig. 9. Example images from the thumbprint of person 10.

Figure 10 shows the verification performance when 20 training images are used per class. The receiver operating characteristic (ROC) curves in Fig. 10 show small error rates for all classes, although it is desirable to further reduce the error rates for classes 4, 9, and 10. To improve the performance of those classes, we used five filters per class with the same training images; for the remaining classes, whose performance was adequate, only one MACE filter was built and used. The best performance observed with the MACE filters is an average EER of 0.006 and an FRR of 0.011 at an FAR of 0.

Fig. 10. ROC curves for the 10 fingerprint classes (see legend), with 20 training images used per class.

This overall EER of 0.6% for the 10-class fingerprint verification is comparable with the ~1% EERs reported for many minutiae-based methods. However, the next step is to evaluate the correlation filters for fingerprint verification with hundreds, if not thousands, of fingerprints.

5. Iris Verification

Iris patterns are attractive for personal identification because they are unique to individuals and are believed to be reasonably stable over an individual's lifetime. Earlier methods for iris recognition were based on generating 2048-bit iris codes23 from features produced by applying Gabor-wavelet processing to iris images. For identical iris pairs the codes should be the same, making their Hamming distance zero; conversely, for statistically independent iris codes (perhaps an authentic-impostor pair), roughly half the code bits should be the same. Thus the normalized Hamming distances between iris codes should be near 0.5 for impostors and near 0 for authentic subjects. Another approach to image recognition is the use of correlation filters. We compared the performance of correlation filters with our own implementation of the iris-code method.

Ideally, we would evaluate the iris verification algorithms on a large database containing image variability. Unfortunately, no such database exists. As we had access to only a small set of iris images, we synthesized a database by deliberately introducing distortions (additive white Gaussian noise, in-plane rotations, illumination changes, and random occlusions) and tested the algorithms on that database. The original iris images were kindly provided by the Miles Research Laboratory.24 We have 44 classes of iris images, with a single image of size 400 × 400 for each class. For each iris image, 15 synthetic variations were generated to obtain 16 images per class (shown in Fig. 11).

In Fig. 11 the first image is the original, the following four are versions with different amounts of additive white Gaussian noise, the next four are versions rotated by different angles, the next four are approximations of different illumination angles, and the last three images are randomly occluded (e.g., 3% refers to occlusion of 3% of the image).

Fig. 11. Sixteen synthesized iris images for class 1: (from left to right) one original, four Gaussian noised, four rotated, four with illumination angles changed, and three randomly occluded.

Let us first discuss the results for the undistorted iris images. For each of the 44 iris classes, an iris code was determined and a matched filter was created. Each iris code and matched filter was then tested against all 44 iris images (i.e., 1 authentic subject and 43 impostors in each case). Both the matched filter and the modified iris-code methods identified each iris class with FAR and FRR equal to zero. Thus both methods work very well in the absence of distortions.

Using the synthetic database, we tested four types of correlation-filter algorithms: MF, OTF, CHF, and OTCHF. We also tested the modified iris-code method as a benchmark. For each of the 44 classes, the MF was created from the nondistorted iris image. A single OTF with five training images was also created for each class; images 1, 3, 7, 11, and 16 from each class were used as the training set because they reflect the expected distortions. This left the remaining 11 images from each class to be used as authentic test images, and 16 × 43 = 688 images as impostor images. The CHF filter and the OTCHF filter were also created for each class with the same five training images as the OTF filter.

In our implementation of the iris-code method (we refer to it as the modified iris-code method), we chose the wavelet parameters so that the wavelets are matched well to the given iris images. The same five training images used for correlation filter design (1, 3, 7, 11, and 16) were chosen to construct five iris codes for each class. During testing, the iris code of each test image is compared with all five stored codes for that class to get five Hamming distances, the smallest of which is used as the Hamming distance for that test image.
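The modified iris-code decision rule just described reduces to a minimum Hamming distance over the five stored codes. A sketch assuming NumPy, with random bits standing in for real Gabor-phase codes (Ref. 23) and an illustrative, not prescribed, acceptance threshold:

import numpy as np

def hamming(a, b):
    return float(np.mean(a != b))                # normalized Hamming distance

def match(test_code, stored_codes, threshold=0.32):
    # Compare with all five stored codes; keep the smallest distance.
    d = min(hamming(test_code, c) for c in stored_codes)
    return d, d < threshold

rng = np.random.default_rng(10)
enrolled = rng.integers(0, 2, 2048)              # one 2048-bit iris code
stored = [enrolled ^ (rng.random(2048) < 0.03) for _ in range(5)]  # noisy enrollments
genuine = enrolled ^ (rng.random(2048) < 0.05)   # same eye, slight noise
impostor = rng.integers(0, 2, 2048)              # statistically independent code
print(match(genuine, stored))                    # distance near 0: accepted
print(match(impostor, stored))                   # distance near 0.5: rejected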

Table 3 shows the overall EERs as well as the EERs separated by type of distortion. As expected, the EER of the MF is large because one MF is usually not sufficient to cover all the distortions. Among the different distortions, we find that rotation proves to be the most problematic. The CHF and OTCHF filters are designed to handle in-plane rotations; as a result, they exhibit better rotation tolerance than the others and thus reduce the overall EER. Once again, it is clear that correlation filters can perform well in iris verification.

Table 3. EERs for Various Distortion Sets for Each Filter Type

    Verification Method    Overall (%)   Gaussian Noise (%)   Rotation (%)   Illumination (%)   Occlusion (%)
    MF                        18.3            16.3               20.5             21.6              16.4
    OTF                       12.4             0.7               28.7              0                 0
    CHF                        4.0             6.8                0.4              0.2               0.3
    OTCHF                      2.6             4.0                0.6              0.6               0.5
    Modified iris code        10.2             7.9               19.4              0                 0

6. Conclusions

In this paper we have shown how advanced correlation filters can be used for biometric verification. In particular, we showed that correlation filters can be successfully applied to face verification, fingerprint verification, and iris verification. In fact, correlation filter methods can be used with any biometric that comes in the form of an image.

The authors thank the Sony Corporation and the United States Army Research Office for partial support of this research.

References

1. P. J. Phillips, P. Grother, R. Micheals, D. M. Blackburn, E. Tabassi, and M. Bone, "Face recognition vendor test 2002: overview and summary," http://www.frvt2002.org.
2. B. V. K. Vijaya Kumar, "Tutorial survey of composite filter designs for optical correlators," Appl. Opt. 31, 4773-4801 (1992).
3. D. O. North, "An analysis of the factors which determine signal/noise discrimination in pulsed-carrier systems," Proc. IEEE 51, 1016-1027 (1963).
4. A. VanderLugt, "Signal detection by complex spatial filtering," IEEE Trans. Inf. Theory 10, 139-145 (1964).
5. C. F. Hester and D. Casasent, "Multivariant technique for multiclass pattern recognition," Appl. Opt. 19, 1758-1761 (1980).
6. B. V. K. Vijaya Kumar, "Minimum variance synthetic discriminant functions," J. Opt. Soc. Am. A 3, 1579-1584 (1986).
7. A. Mahalanobis, B. V. K. Vijaya Kumar, and D. Casasent, "Minimum average correlation energy filters," Appl. Opt. 26, 3633-3640 (1987).
8. P. Refregier, "Optimal trade-off filters for noise robustness, sharpness of the correlation peak, and Horner efficiency," Opt. Lett. 16, 829-831 (1991).
9. A. Mahalanobis, B. V. K. Vijaya Kumar, S. R. F. Sims, and J. F. Epperson, "Unconstrained correlation filters," Appl. Opt. 33, 3751-3759 (1994).
10. A. Mahalanobis, B. V. K. Vijaya Kumar, and S. R. F. Sims, "Distance classifier correlation filters for distortion tolerance, discrimination and clutter rejection," in Photonics for Processors, Neural Networks, and Memories, J. L. Horner, B. Javidi, S. T. Kowel, and W. J. Miceli, eds., Proc. SPIE 2026, 325-335 (1993).
11. B. V. K. Vijaya Kumar, A. Mahalanobis, and A. Takessian, "Optimal tradeoff circular harmonic function correlation filter methods providing controlled in-plane rotation response," IEEE Trans. Image Process. 9, 1025-1034 (2000).

12. A. Mahalanobis and B. V. K. Vijaya Kumar, "Polynomial filters for higher-order and multi-input information fusion," in Proceedings of the Eleventh Euro-American Optoelectronic Workshop (Spain, June 1997), pp. 221-231.
13. A. V. Oppenheim and R. W. Schafer, Digital Signal Processing (Prentice-Hall, Englewood Cliffs, N.J., 1975).
14. Advanced Multimedia Processing Laboratory web page, Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, Pa. (November 2003), http://amp.ece.cmu.edu.
15. F. J. Huang and T. Chen, "Tracking of multiple faces for human-computer interfaces and virtual environments," in IEEE International Conference on Multimedia and Expo (Institute of Electrical and Electronics Engineers, New York, 2000), pp. 1563-1566.
16. M. Turk and A. Pentland, "Eigenfaces for recognition," J. Cogn. Neurosci. 3, 71-86 (1991).
17. X. Liu, T. Chen, and B. V. K. Vijaya Kumar, "Face authentication for multiple subjects using eigenflow," Pattern Recogn. 36, 313-328 (2003).
18. T. Sim, S. Baker, and M. Bsat, "The CMU pose, illumination, and expression (PIE) database of human faces," Technical Report CMU-RI-TR-01-02 (Robotics Institute, Carnegie Mellon University, Pittsburgh, Pa., 2001).
19. P. Belhumeur, J. Hespanha, and D. Kriegman, "Eigenfaces vs. fisherfaces: recognition using class specific linear projection," IEEE Trans. Pattern Anal. Mach. Intell. 19, 711-720 (1997).
20. A. Jain, L. Hong, and R. Bolle, "On-line fingerprint verification," IEEE Trans. Pattern Anal. Mach. Intell. 19, 302-314 (1997).
21. A. Jain, L. Hong, S. Pankanti, and R. Bolle, "An identity-authentication system using fingerprints," Proc. IEEE 85, 1365-1388 (1997).
22. C. I. Watson, NIST Special Database 24 - Live-Scan Digital Video Fingerprint Database (1998), http://www.nist.gov/srd/nists24.htm.
23. J. G. Daugman, "High confidence visual recognition of persons by a test of statistical independence," IEEE Trans. Pattern Anal. Mach. Intell. 15, 1148-1161 (1993).
24. Miles Research Laboratory, http://www.milesresearch.com.

