Automatic localization of pupil using eccentricity and iris using gradient-based method



Optics and Lasers in Engineering 49 (2011) 177–187


Automatic localization of pupil using eccentricity and iris using gradient-based method

Tariq M. Khan a,*, M. Aurangzeb Khan a, Shahzad A. Malik a, Shahid A. Khan a, Tariq Bashir a, Amir H. Dar b

a Department of Electrical Engineering, COMSATS Institute of Information Technology, Islamabad, Pakistan
b College of Electrical and Mechanical Engineering, NUST, Rawalpindi, Pakistan

Article info

Article history:

Received 3 June 2010

Received in revised form 19 August 2010

Accepted 19 August 2010

Available online 8 October 2010

Keywords:

Biometrics

Iris recognition

Segmentation

Iris localization

0143-8166/$ - see front matter © 2010 Elsevier Ltd. All rights reserved.
doi:10.1016/j.optlaseng.2010.08.020

* Corresponding author.
E-mail addresses: [email protected] (T.M. Khan), [email protected] (M. Aurangzeb Khan), [email protected] (S.A. Malik), [email protected] (S.A. Khan), [email protected] (T. Bashir), [email protected] (A.H. Dar).
Abstract

This paper presents a novel approach for the automatic localization of the pupil and iris. The pupil and iris are nearly circular regions surrounded by the sclera, eyelids and eyelashes, and the localization of both is extremely important in any iris recognition system. In the proposed algorithm, the pupil is localized using an eccentricity-based bisection method, which looks for the region with the highest probability of containing the pupil, while iris localization is carried out in two steps. In the first step, the iris image is directionally segmented and a noise-free region (region of interest) is extracted. In the second step, angular lines in the region of interest are extracted and the edge points of the iris outer boundary are found through the gradient of these lines. The proposed method is tested on the CASIA Ver 1.0 and MMU iris databases. Experimental results show that this method is comparatively accurate.

© 2010 Elsevier Ltd. All rights reserved.

1. Introduction

Biometrics-based person identification is getting popular due to its accuracy and high reliability. During the last few decades many biometric identification techniques, such as those based upon voice, fingerprint, signature, face, retina and iris, have been introduced [1]. Iris recognition is comparatively new and is considered highly foolproof, since the iris cannot be artificially copied and remains the same during a person's lifetime. It possesses a rich texture which offers a strong biometric cue for recognizing individuals [2,3]. Iris texture is a very complex pattern characterized by brightness, color, slope and size. These characteristics give rise to informality, density, linearity, phase, directionality, smoothness, fitness, regularity, frequency, coarseness, randomness and granulation of the texture as a whole [4].

As public security and safety demands increase every day, secure identification becomes obligatory [5]. Thus, it is very important to develop an effective and accurate iris recognition system that not only overcomes the rigid constraints imposed during iris image acquisition [6] but is also able to offer real-time processing [7].


Generally, an iris recognition system can have four subsystems [8].

1. Image acquisition.
2. Segmentation.
3. Normalization and iris code generation.
4. Comparison and recognition.

This paper is about localization, which is the most important part of any iris recognition system. Iris segmentation means locating the inner (pupil) and outer boundary of the iris. Though eyelash and eyelid removal is also part of iris segmentation, in this paper we focus on locating the inner and outer boundaries of the iris.

In many machine vision applications, such as iris recognition, pupil tracking [9], ocular torsion measurement [10], pupil size estimation [11], pupillometry [12], and point-of-gaze extraction [13], pupil localization (locating the inner boundary) is considered one of the most important preprocessing steps in iris localization.

The pupil is a circular region located in the center of the iris [14]. Its basic function is to control the amount of light that enters the eye [15]. Different techniques are reported in the literature for locating the inner and outer boundaries of the iris. Normally these techniques are categorized into two classes: circular-edge based techniques and histogram based techniques.

Daugman [16–18] proposed a quite popular circular-edge based algorithm, which uses a circular-edge detector for iris segmentation. The algorithm of Wildes [19] is based on two stages. In stage 1, an edge map is constructed based on the gradient. In stage 2,


the pupil and iris are segmented using the Hough transform. Lui et al. [20] used an improved Hough transform to segment the pupil and iris region. Kooshkestani et al. [21] proposed pupil localization based on the wavelet transform and analytic geometry.

In histogram based techniques the pupil is considered the darkest region in an eye image, and thresholding is used to locate it. Bai et al. [22] used a global histogram and computed a binarization threshold to localize the pupil region. This method is effective to some extent, but if the gray level of another part of the eye falls below that of the pupil region, it is unable to detect the pupil correctly. Therefore, a new histogram based method for pupil localization is proposed in this paper which overcomes these flaws. For pupil localization, the histogram is first divided into two levels, upper and lower, by thresholding (dividing the histogram profile at its mid value). Then a region with minimum eccentricity is iteratively located.

For iris outer boundary localization, Basit et al. [23] extracted a horizontal line from the center of the pupil and calculated its gradient; edge points are extracted on the basis of this gradient. This approach works well when the central horizontal line is not affected by eyelashes. But when an eye is only partially open and the central horizontal line taken from the center of the pupil is fully occluded by eyelashes and eyelids, it is hard to find the edges of the iris outer boundary. For example, the MMU database contains a few images in which the eye is only partially open. Therefore, for outer boundary detection, we focus on these issues and propose a more accurate gradient based method which works well even when the eye is partially open.

The remainder of this paper is organized as follows. Section 2 details the proposed method. Sections 3 and 4 discuss experimental results for pupil and iris localization, respectively, and Section 5 concludes the work.

2. Proposed method

Our proposed method of iris localization is composed of two stages:

1. Pupil localization.
2. Iris localization.

Fig. 1. Flow control diagram of pupil localization.

2.1. Pupil localization (pupil/iris boundary)

For pupil localization, an iterative histogram based method is proposed. Gray scales in the histogram of an eye image can be divided into three regions: lower, medium and high. The lower region comprises gray levels that normally correspond to the pupil and eyelashes, the gray levels of the medium region correspond to the iris and eyelids, and the high region contains the sclera and other parts of the face. In most cases the gray levels of the eyelashes and pupil are almost the same, so it is difficult to locate the pupil on the basis of gray levels alone.

In this paper, we use eccentricity along with gray levels to locate the inner boundary of the iris (i.e. the pupil). Eccentricity is a parameter used to measure the circularity of different regions or sections; in other words, it measures how much a conic section deviates from being circular, and it uniquely characterizes shapes (ellipse, parabola, hyperbola, circle, etc.). The list below gives the eccentricity of different conic sections.

- For a perfect circular region, the eccentricity is always zero.
- The eccentricity of a non-circular ellipse always lies strictly between 0 and 1.
- A parabolic region has eccentricity equal to 1.
- The eccentricity of a hyperbolic region or shape is always greater than 1.

It can be observed that if two regions have the same eccentricity, then these regions are normally similar in shape.

A flow diagram of the proposed method for pupil localization is shown in Fig. 1. For explanation, it is divided into three phases.

2.1.1. Phase-I

Phase-I comprises the following four steps:

1. Convert the RGB image of the human eye into a gray-scale image I.
2. Stretch the contrast of the image to the full gray-scale range (0–255) using

   Inorm = 255 × (I − Imin)/(Imax − Imin),   (1)

   where Imin and Imax are the minimum and maximum gray levels in the input image I.
3. Calculate the histogram of the rescaled image, which is shown in Fig. 2.
4. As mentioned previously, the gray levels of the pupil are always in the low region of the histogram. For further processing we have
Fig. 2. (a) A sample eye image of size 320×280 from the CASIA Ver 1.0 iris database [24]. (b) Histogram of (a), in which the maximum peak below gray level 50 represents the pupil region; the X-axis shows the gray levels from 0 to 255 and the Y-axis the frequency of each gray level.

Fig. 3. (a) Region constructed by selecting the highest-frequency peak of Ll. (b) Region constructed by selecting the highest-frequency peak of Lu.

neglected all gray levels above 128; the remaining gray levels (0–128) are named new_range.
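The steps of Phase-I above can be sketched as follows. This is a minimal illustration, not the authors' code: the function names, the list-of-lists image representation, and rounding behavior are assumptions made here for clarity.

```python
def stretch_contrast(img):
    """Stretch the gray levels of a 2-D image (list of lists of ints) to the
    full 0-255 range, following Eq. (1): Inorm = 255*(I - Imin)/(Imax - Imin)."""
    flat = [p for row in img for p in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:
        return [[0 for _ in row] for row in img]  # flat image: nothing to stretch
    return [[round(255 * (p - lo) / (hi - lo)) for p in row] for row in img]

def low_histogram(img, cutoff=128):
    """Histogram of gray levels 0..cutoff only; levels above the cutoff are
    discarded, since the pupil always lies in the low end (the 'new_range')."""
    hist = [0] * (cutoff + 1)
    for row in img:
        for p in row:
            if p <= cutoff:
                hist[p] += 1
    return hist
```

Under these assumptions, `stretch_contrast([[10, 20], [30, 40]])` maps the darkest pixel to 0 and the brightest to 255, and `low_histogram` yields the truncated histogram that Phase-II operates on.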

2.1.2. Phase-II

Phase-II comprises the following four steps:

1. Find the minimum gray level (ming) and maximum gray level (maxg) among the gray levels of new_range.
2. Divide new_range into two levels, upper (Lu) and lower (Ll), on the basis of the threshold th given by

   th = (maxg + ming)/2.   (2)

   The gray levels from ming to th are named Ll and the gray levels from th+1 to maxg are named Lu.
3. Find the gray levels Pu and Pl with the maximum frequencies (number of pixels) in Lu and Ll, respectively.
4. Construct two regions by selecting gray levels ranging from Pu−e to Pu+e and from Pl−e to Pl+e, where e is a small real value. Two binary images corresponding to these regions are then constructed, as shown in Fig. 3.
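A compact sketch of the Phase-II bisection, assuming a histogram list indexed by gray level (function names and the integer midpoint are illustrative choices, not the authors' implementation):

```python
def bisect_levels(hist):
    """Split the surviving gray-level range at th = (maxg + ming)//2 (Eq. (2))
    and return the peak (most frequent) gray level of each half: (Pl, Pu)."""
    present = [g for g, c in enumerate(hist) if c > 0]
    ming, maxg = present[0], present[-1]
    th = (maxg + ming) // 2
    lower = range(ming, th + 1)           # Ll: ming .. th
    upper = range(th + 1, maxg + 1)       # Lu: th+1 .. maxg
    pl = max(lower, key=lambda g: hist[g])
    pu = max(upper, key=lambda g: hist[g]) if len(upper) else pl
    return pl, pu

def region_mask(img, peak, eps=5):
    """Binary mask of pixels within peak±eps, the candidate region of step 4
    (eps = 5 follows the small value quoted later for the pupil's narrow spread)."""
    return [[1 if peak - eps <= p <= peak + eps else 0 for p in row] for row in img]
```

The two masks returned by `region_mask` for Pl and Pu correspond to the two binary images of Fig. 3, which Phase-III then compares by eccentricity.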

2.1.3. Phase-III

Phase-III comprises the following five steps:

1. The task now is to select the true pupil region from the two binary images, constructed in the last step of Phase-II, on the basis of eccentricity. Two morphological operators, opening and closing, are used to erase the unwanted regions (noise) that are included in the selected region and can badly affect the eccentricity. A disk of radius 3 is used as the structuring element for both the closing and opening operations. These operators are applied in two steps:
   (a) First, the closing operator is applied. Closing involves two operations, dilation followed by erosion: the structuring element is slid over the image, and each output pixel is set to foreground or background according to how the structuring element overlaps the foreground pixels beneath it. Erosion is the reverse process of dilation.
   (b) Second, opening is performed. Opening is the reverse of closing: erosion is performed first and then dilation. Fig. 4(a) shows an image constructed from the lower levels of highest frequency, which is affected by eyelashes. It can be observed that the boundary of the pupil is not a compact region; if opening were applied directly, the radius of the pupil would decrease. Therefore closing is applied first to make it a compact region, as shown in Fig. 4(b). After closing, opening is applied; Fig. 4(c) shows the effect of opening applied to Fig. 4(b).
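The close-then-open sequence of steps (a) and (b) can be sketched in plain Python as below. This is an illustrative binary morphology implementation, not the authors' code; border pixels are evaluated only over the in-bounds part of the structuring element, which is one of several reasonable conventions.

```python
def _apply(mask, offsets, hit_any):
    """Dilation (hit_any=True) or erosion (hit_any=False) of a binary mask."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [mask[y + dy][x + dx]
                    for dy, dx in offsets
                    if 0 <= y + dy < h and 0 <= x + dx < w]
            out[y][x] = 1 if (any(vals) if hit_any else all(vals)) else 0
    return out

def disk(radius):
    """Offsets of a disk-shaped structuring element (radius 3 in the paper)."""
    return [(dy, dx) for dy in range(-radius, radius + 1)
            for dx in range(-radius, radius + 1)
            if dy * dy + dx * dx <= radius * radius]

def close_then_open(mask, radius=3):
    """Closing (dilate, then erode) to make the pupil blob compact, followed by
    opening (erode, then dilate) to shave off thin eyelash artifacts."""
    se = disk(radius)
    closed = _apply(_apply(mask, se, True), se, False)   # closing
    return _apply(_apply(closed, se, False), se, True)   # opening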

Fig. 4. (a) Eyelash-affected pupil. (b) Result after applying closing to (a). (c) Effect of opening applied to (b).

2. The eccentricities of the regions in Figs. 3(a) and (b) are calculated using the equations

   el = sqrt((Majl^2 − Mnl^2)/Mnl^2),   (3)

   eu = sqrt((Maju^2 − Mnu^2)/Mnu^2),   (4)

   where el and eu are the eccentricities of the images corresponding to the regions of the lower level Ll and upper level Lu, respectively; Majl and Mnl are the lengths of the major and minor axes of the Ll region, and Maju and Mnu are the lengths of the major and minor axes of the image corresponding to the Lu region.

3. If el < eu, then new_region = Ll and e = el; otherwise new_region = Lu and e = eu.

4. If e(i) ≠ e(i−1), return to Phase-II; otherwise the region corresponding to e(i) is declared the pupil region Pr. Fig. 5 shows the iterations that lead to the convergence of the lower region towards the true pupil region. In the first iteration, the upper region corresponds to the iris, which is also a circular region; its eccentricity should therefore be less than 1, and it could even be lower than that of the pupil if the iris formed a perfect circle. But the eccentricity corresponding to the pupil region always remains smaller, for two reasons. First, the variations between the gray levels of the pupil are very small compared to the iris region; to cater for this small variation, a small value equal to 5 is assigned to e (Phase-II, step 4) so that only the variation of the pupil region is captured, and the pupil therefore always forms a more compact and circular region than the iris. In Fig. 3(b), which is a case of a nearly fully opened eye, the iris region is selected along with some eyelashes; when the morphological operators are applied to remove this noise, the boundary of this less compact region is also affected, as shown in Fig. 5(b), so in the second iteration the pupil region is selected instead of the iris region. Second, in most cases the eye is partially open and some part of the iris is covered by eyelashes and eyelids, which badly affects the circularity of the iris.

5. After locating the true pupil, the next step is to find its radius Rp and center coordinates (Cpx, Cpy). Rp is calculated by

   Rp = (max(y) − min(y))/2,   (5)

Fig. 5. The first, second and third rows show the result of the morphological opening and closing operations applied to the region selected in the first, second and third iterations, respectively.

where y is the array of column coordinates of the pupil image shown in Fig. 5(f). The central column is determined as

Cpy = (max(y) + min(y))/2.   (6)

In the case of a partially opened eye, in which the upper or lower boundary of the pupil is covered by the eyelids, it is not possible to locate the correct vertical radius of the pupil. Therefore, instead of finding max(x) and min(x) for the vertical radius, Cpx is calculated using

Cpx = max(x) − Rp,   (7)

where x is the array of row coordinates of the pupil shown in Fig. 5. Figs. 12 and 13 show some results of our proposed pupil segmentation method.
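Eqs. (5)–(7) can be sketched as a single helper that reads the radius and center off the final binary pupil mask (an illustrative sketch; the function name and mask representation are assumptions):

```python
def pupil_geometry(mask):
    """Radius and center of the pupil blob (Eqs. (5)-(7)).  The vertical extent
    may be clipped by the eyelids, so the row center Cpx is taken as
    max(x) - Rp rather than the midpoint of the rows."""
    xs = [y for y, row in enumerate(mask) for v in row if v]          # row coords
    ys = [x for row in mask for x, v in enumerate(row) if v]          # column coords
    rp = (max(ys) - min(ys)) / 2        # Eq. (5): half the horizontal extent
    cpy = (max(ys) + min(ys)) / 2       # Eq. (6): central column
    cpx = max(xs) - rp                  # Eq. (7): row center from the bottom edge
    return rp, (cpx, cpy)
```

For a blob spanning columns 1–5, for instance, the radius is 2 regardless of how much of its top rows the eyelid has clipped away, which is the point of Eq. (7).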

2.2. Iris localization (iris/sclera boundary)

Outer boundary detection, or iris localization, is normally considered more difficult than inner boundary detection (pupil localization) for two reasons. First, the gray-level difference between the iris and sclera is very small. Second, the eye is sometimes partially open and most of the upper and lower regions of the iris are occluded by eyelashes and eyelids. To overcome these issues, a new method of iris localization is proposed in which we extract a most secure region, the ROI (region of interest), from the eye image; this region has the lowest probability of occlusion by eyelashes and eyelids. Fig. 6 shows the block diagram of the proposed method of iris localization, which involves the following steps:

1. In the first step, we directionally segment the eye image into three regions, as shown in Fig. 7:
   (a) Region-1: region affected by the upper eyelids and upper eyelashes.
   (b) Region-2: region affected by the lower eyelids and lower eyelashes.
   (c) Region-3: secure region (region of interest).

   For dividing the image into these three directional regions, the center of the pupil is taken as the reference point. The area of the eye image corresponding to angles from −30° to 30° is declared Region-3. Region-1 is formed from angles 30° to

150°. The region corresponding to angles 150° to −150° is also marked as Region-3. Finally, the last region, Region-2, is formed from angles −150° to −30°. It is observed from Fig. 7 that Region-1 and Region-2 are affected by eyelashes and eyelids, respectively, so these two regions are ignored in subsequent processing. Region-3 is considered the secure region since it is the region least affected by eyelashes and eyelids.

Fig. 6. Block diagram of the proposed method for iris (outer boundary) localization.

Fig. 7. Region-1: eyelash-affected region; Region-2: eyelid-affected region; Region-3: secure region (region of interest).

Fig. 8. (a) 1-D profile of an angular line; the X-axis shows the pixels of the 1-D profile and the Y-axis their gray-level values. (b) Result of the difference operator applied to (a); the X-axis shows the pixels of the 1-D profile and the Y-axis their difference values.
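The angular partition described above can be sketched as a per-pixel labeling (an illustrative sketch; the function name and the convention that image rows grow downward are assumptions made here):

```python
import math

def region_label(y, x, cy, cx):
    """Label a pixel by the directional region of Fig. 7, using the pupil
    center (cy, cx) as the reference point.  Angles within ±30° of horizontal
    (on either side of the pupil) form the secure Region-3; 30°..150° is
    Region-1 (upper wedge); -150°..-30° is Region-2 (lower wedge)."""
    theta = math.degrees(math.atan2(cy - y, x - cx))  # rows grow downward
    if -30 <= theta <= 30 or theta >= 150 or theta <= -150:
        return 3            # secure region (ROI)
    return 1 if theta > 0 else 2
```

A pixel directly to the right or left of the pupil center is labeled 3, one directly above is labeled 1, and one directly below is labeled 2, matching the wedges of Fig. 7.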

2. In this step, a median filter of size 11×11 is applied to smooth the secure region.

3. In this step, we extract 1-D gray-level profiles from the secure region, oriented at discrete angles, as given by the following equation and shown in Fig. 9(a):

   [xi; yi] = [r cos(θi); r sin(θi)],   (8)

   where xi and yi are the sets of x- and y-coordinates corresponding to the i-th angle θi, and r is an integer array given by

   r = {round(3/2 × Rp), round(3/2 × Rp) + 1, ..., 3 × Rp},   (9)

   where Rp is the radius of the pupil. The i-th gray-level profile Pθi taken from the secure region SR is then given by the following equation and also shown in Fig. 9(a):

   Pθi = SR(xi, yi).   (10)

4. In this step, we apply a difference operator to find the gradient at each point (xi, yi) of all these profiles Pθi, as given by

   dPθi = Pθi[j] − Pθi[j−1],   (11)

   where j indexes the elements of the profile oriented at angle θi. Fig. 8 shows one of the gray-level profiles and its gradient. In each profile, the point with the maximum gradient is taken as an iris boundary point. Fig. 9(b) shows the iris boundary points corresponding to the maximum change; because of eyelashes and eyelids, some of these points are not true boundary points.
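Steps 3 and 4 (Eqs. (8)–(11)) can be sketched as follows. This is an illustrative sketch, not the authors' code: the offsets are taken relative to the pupil center, coordinates are rounded to the nearest pixel, and out-of-image samples are simply skipped, all of which are assumptions.

```python
import math

def radial_profile(img, cx, cy, theta_deg, rp):
    """1-D gray-level profile along angle theta, sampled at radii
    r = round(1.5*Rp) .. 3*Rp from the pupil center (Eqs. (8)-(10))."""
    prof = []
    for r in range(round(1.5 * rp), 3 * rp + 1):
        x = cx + round(r * math.cos(math.radians(theta_deg)))
        y = cy + round(r * math.sin(math.radians(theta_deg)))
        if 0 <= y < len(img) and 0 <= x < len(img[0]):
            prof.append(img[y][x])
    return prof

def max_gradient_index(prof):
    """Index of the largest first difference dP[j] = P[j] - P[j-1] (Eq. (11));
    along a radial line this marks the jump from the iris to the brighter sclera."""
    diffs = [prof[j] - prof[j - 1] for j in range(1, len(prof))]
    return 1 + diffs.index(max(diffs))
```

Starting at 1.5×Rp skips the pupil/iris boundary itself, so the largest jump found in each profile is a candidate point on the iris/sclera boundary.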

5. In this step, these false points are removed and the true boundary of the iris is localized. For this purpose a distance matrix D is constructed, containing the Euclidean distance of each point from the center of the pupil:

   D[i] = sqrt((x0 − xi)^2 + (y0 − yi)^2),   (12)

   where (x0, y0) are the center coordinates of the pupil and (xi, yi) are the coordinates of the boundary points marked as edge points. A difference matrix Diff is then formed by calculating the absolute difference of each element of D with


Fig. 9. (a) 1-D profiles oriented at different discrete angles. (b) Iris boundary points corresponding to the maximum change. (c) The difference matrix. (d) The extracted iris.

Fig. 10. The two white dots mark the center of the pupil and the center of the iris; the white circle shows the extracted iris boundary.

all other elements. Fig. 9(c) shows the difference matrix. If more than 50% of the entries of Diff corresponding to an edge point exceed a pre-defined threshold (in our case 10), that edge point is not selected in the list of true points. After locating the true edge points, the radius of the iris is calculated as

Ri = (sum_{i=1..N} Dis(i))/N,   (13)

where Dis is the array containing the distances of all the true points from the center of the pupil and N is the total number of true points. As the pupil and iris are not concentric, as shown in Fig. 10, the center coordinates of the iris are calculated by

Ciy = ((sum_{i=1..N1} Dis1(i))/N1 − (sum_{i=1..N2} Dis2(i))/N2)/2,   (14)

Cix = Cpx,   (15)

where Dis1 is the distance array of all the true points taken from the region covered by angles 350° to 30°, N1 is their total number, Dis2 is the distance array of all the true points taken from the region covered by angles 150° to 210°, and N2 is their total number.
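The outlier rejection and radius estimate of step 5 (Eqs. (12)–(13)) can be sketched as below. This is an illustrative sketch under assumptions: the function name is hypothetical, and "more than 50% disagreement" is implemented as a simple majority count over the distance array.

```python
import math

def filter_edge_points(points, center, thresh=10):
    """Discard candidate edge points whose distance from the pupil center
    differs by more than `thresh` from more than half of the other candidates
    (the difference-matrix test), then average the survivors' distances to
    obtain the iris radius Ri (Eq. (13))."""
    x0, y0 = center
    d = [math.hypot(x - x0, y - y0) for x, y in points]     # Eq. (12)
    true_pts, dists = [], []
    for i, di in enumerate(d):
        disagree = sum(1 for dj in d if abs(di - dj) > thresh)
        if disagree <= len(d) / 2:       # kept only if it agrees with the majority
            true_pts.append(points[i])
            dists.append(di)
    ri = sum(dists) / len(dists)         # Eq. (13): mean distance of true points
    return true_pts, ri
```

For four points at distance 10 from the center plus one eyelash-induced outlier at distance 40, the outlier disagrees with all four others and is dropped, leaving Ri = 10.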

3. Experimental results for pupil localization

The proposed algorithm is tested on the two most widely used databases, the MMU database [25] and CASIA Ver 1.0 [24]. It is implemented in MATLAB 7.5 on a computer with a 1.86 GHz Core 2 Duo processor and 1 GB RAM. The experiments are divided into two sets.

3.1. Set 1

In Set-1, results are collected by testing the proposed algorithm on the whole MMU database. This database contains 450 images of 45 individuals, 10 images per individual; the resolution of each image is 320×240. In the MMU database each image contains a white spot in the pupil region due to specular reflection, which badly affects the eccentricity of the pupil region, as shown in Fig. 11(a). Therefore, for the MMU database, we have to minimize the

Fig. 11. (a) A sample image of the MMU database showing a white spot in the pupil region. (b) Result of the min filter applied to (a). (c) Result of the median filter applied to (b).

Fig. 12. Some results for the MMU database after applying the proposed method for pupil localization.

effect of this white spot by filtering. For this purpose, we use a nonlinear min filter of size 3×3, whose result is shown in Fig. 11(b). Some specular reflection noise remains in the pupil region and can still affect its eccentricity; to remove this noise, a median filter is applied, as shown in Fig. 11(c). The accuracy achieved on this database is 98%. Accuracy is measured against human decision with a precision of 2 pixels. In an

Fig. 13. Some results for the CASIA Ver 1.0 database after applying the proposed method for pupil localization.

Table 1. Comparison of some recent segmentation algorithms on CASIA Ver 1.0 (results supplied by the respective authors).

Author          Database used    Accuracy (%)
Wildes [19]     CASIA Ver 1.0    100
Ma [26]         CASIA Ver 1.0    99.86
Samira [21]     CASIA Ver 1.0    95
Proposed        CASIA Ver 1.0    100

Table 2. Comparison of some recent segmentation algorithms on CASIA Ver 1.0 (results taken from the published work).

Method          Database used    Accuracy (%)
Yuan [27]       CASIA Ver 1.0    99.45
Cui [8]         CASIA Ver 1.0    99.34
Mateo [28]      CASIA Ver 1.0    95
Wildes [19]     CASIA Ver 1.0    99.9
Daugman [16]    CASIA Ver 1.0    98.6
Basit [23]      CASIA Ver 1.0    99.6
Proposed        CASIA Ver 1.0    100


image, a circle that exceeds this criterion is marked as a wrong pupil. Fig. 12 shows some of the results on the MMU database.

Table 3. Comparison of some recent segmentation algorithms on the MMU database (results taken from [33]).

Method               Accuracy (%)
Masek [29]           83.92
Daugman [30]         85.64
Ma et al. [31]       91.02
Daugman new [32]     98.23
Somanth et al. [33]  98.41
Proposed             98.22

3.2. Set 2

In Set-2, the proposed algorithm is tested on the CASIA Ver 1.0 iris database. This database contains 756 images of 108 individuals, 7 images per individual; the resolution of each image is 320×280. Using the eccentricity property of a region and bisecting the gray-level ranges leads us towards the pupil region and mitigates the effects of eyelashes. Because of this improvement we achieve 100% accuracy on this database. Accuracy is measured against human decision with a precision of 2 pixels. Fig. 13 shows some of the results on the CASIA Ver 1.0 iris database. A comparison of the proposed algorithm with some existing algorithms is given in Table 1.

4. Experimental results for iris localization

The proposed algorithm for iris localization is tested on the MMU and CASIA Ver 1.0 databases. We achieve 98.22% accuracy on the MMU database and 100% on CASIA Ver 1.0. Accuracy is measured against human decision with a precision of 2 pixels; in an image, a circle that exceeds this criterion is marked as false. Tables 2 and 3 show the comparison of our proposed algorithm with some recent algorithms. Figs. 14 and 15 show some results for the MMU and CASIA Ver 1.0 databases. Fig. 16 shows some failed results on the MMU database after applying the proposed

Fig. 14. Some results for the CASIA Ver 1.0 database after applying the proposed method for iris localization.

Fig. 15. Some results for the MMU database after applying the proposed method for iris localization.

method of iris localization. From Fig. 16(a) it can be seen that the left side of the secure region is badly affected by specular noise caused by glasses, which causes the failure. From Figs. 16(b) and (c), it can be seen that the width of the right-side iris disk is greater than that of the left side. As we select the true points on the basis of distance differences with a threshold of 10, most of the left-side points are rejected as wrong points, although they are not actually wrong.

Page 11: Automatic localization of pupil using eccentricity and iris using gradient based method

Fig. 16. Some of the failed results of the MMU database after applying the proposed method of iris localization.


5. Conclusion

In this paper, a new method for pupil and iris localization is proposed. In order to locate the true pupil region, segmentation is performed first by bisecting the gray levels into two ranges: an upper range and a lower range. Then two regions are constructed, one in each range, each having the highest probability of occurrence. After constructing the regions, their eccentricity is calculated. The gray-level range whose region's eccentricity is lower than the other's is taken as the new gray-level range. This iterative process continues until the region repeats itself, and the final lowest-eccentricity region is considered the true pupil region. For iris localization, the eye image is first directionally decomposed into three regions and the region of interest is picked. In the region of interest, 1-D gray-level profiles are extracted and their gradient is calculated. By finding the maximum change in these gray-level profiles, the true iris boundary is located. This procedure resolves the problem of eyelashes and eyelids.
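As an illustration of the eccentricity measure that drives the bisection, the sketch below computes a binary region's eccentricity from its second-order central moments. This is one standard definition (as used in common region-properties toolboxes); the paper's exact formulation may differ, and the `region_eccentricity` helper and the synthetic disc/bar masks are illustrative.

```python
import numpy as np

def region_eccentricity(mask):
    """Eccentricity of a binary region from its second-order central moments.

    Returns a value in [0, 1): near 0 for a circular region (a clean pupil),
    approaching 1 for an elongated one (e.g. a blob merged with eyelashes)."""
    ys, xs = np.nonzero(mask)
    x0, y0 = xs.mean(), ys.mean()
    # Normalized second-order central moments of the region.
    mu20 = ((xs - x0) ** 2).mean()
    mu02 = ((ys - y0) ** 2).mean()
    mu11 = ((xs - x0) * (ys - y0)).mean()
    # Eigenvalues of the 2x2 covariance matrix give the squared axis lengths.
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam_max = (mu20 + mu02 + common) / 2
    lam_min = (mu20 + mu02 - common) / 2
    return float(np.sqrt(1 - lam_min / lam_max))

# A filled disc is far more circular than a thin horizontal bar.
yy, xx = np.mgrid[0:64, 0:64]
disc = (xx - 32) ** 2 + (yy - 32) ** 2 <= 15 ** 2
bar = np.zeros((64, 64), dtype=bool)
bar[30:34, 5:60] = True
print(region_eccentricity(disc))  # close to 0 (near-circular)
print(region_eccentricity(bar))   # close to 1 (elongated)
```

Iterating the bisection towards the lower-eccentricity region thus steers the search away from elongated eyelash artifacts and towards the round pupil.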

Experiments on the MMU and CASIA ver 1.0 iris databases and the results in Tables 1–3 show that the proposed method produces highly accurate results, comparable to previously reported ones.

Acknowledgments

The authors wish to thank Multimedia University for providing the MMU iris database and the Chinese Academy of Sciences, Institute of Automation, for providing the CASIA ver 1.0 iris database.

References

[1] Du Y, Chang C-I. 3D combinational curves for accuracy and performance analysis of positive biometrics identification. Optics and Lasers in Engineering 2008:477–90.

[2] Bowyer KW, Hollingsworth K, Flynn PJ. Image understanding for iris biometrics: a survey. Computer Vision and Image Understanding 2008;110(2):281–307.

[3] Ross A. Iris recognition: the path forward. IEEE Computer Society 2010 (February):30–6.

[4] Patnala SR, Sreenivasa Reddy E, Ramesh Babu I. Iris recognition system using fractal dimensions of Haar patterns. International Journal of Signal Processing, Image Processing and Pattern Recognition 2009;2(September):75–84.

[5] Chen Y, Adjouadi M, Han C, Wang J, Barreto A, Rishe N, et al. A highly accurate and computationally efficient approach for unconstrained iris segmentation. Image and Vision Computing 2009;28:261–9.

[6] Matey JR, Naroditsky O, Hanna K, Kolczynski R, LoIacono D, Mangru S, et al. Iris on the move: acquisition of images for iris recognition in less constrained environments. Proceedings of the IEEE 2006;94(11):1936–46.

[7] Chen Y, Wang J, Adjouadi M. A robust segmentation approach to iris recognition based on video. In: Applied imagery pattern recognition (AIPR) annual workshop, Washington; 2008. p. 15–7.

[8] Cui J, Wang Y, Tan T, Ma L, Sun Z. A fast and robust iris localization method based on texture segmentation. Proceedings of the SPIE 2004:401–8.

[9] Tian Y, Kanade T, Cohn JF. Dual-state parametric eye tracking. Face and Gesture Recognition 2000:110–5.

[10] Chung DWJ, Eizenman M, Cheung BSK, Frecker RC. Estimation of ocular torsion with dynamic changes in pupil. In: Proceedings of the 16th annual IEEE engineering in medicine and biology conference, vol. 2, 1994. p. 924–5.

[11] Kim J, Park K. An image processing method for improved pupil size estimation accuracy. In: Proceedings of the 25th annual international conference of the IEEE EMBS, vol. 2, 2003.

[12] Iskander DR, Collins MJ, Mioschek S, Trunk M. Automatic pupillometry from digital images. IEEE Transactions on Biomedical Engineering 2004;51:1619–27.

[13] Kim SI, Lee DK, Kim SY, Kwon OS, Cho J. An algorithm to detect a center of pupil for extraction of point of gaze. In: Proceedings of the 26th annual international conference of the IEEE EMBS, 2004.

[14] Belcher C, Du Y. Region-based SIFT approach to iris recognition. Optics and Lasers in Engineering 2008:139–47.

[15] Cassin B, Solomon S. Dictionary of eye terminology. Gainesville, FL: Triad Publishing Company; 1990.

[16] Daugman JG. High confidence visual recognition of persons by a test of statistical independence. IEEE Transactions on Pattern Analysis and Machine Intelligence 1993;15:1148–61.

[17] Daugman JG. The importance of being random: statistical principles of iris recognition. Pattern Recognition 2003;36(2):279–91.

[18] Daugman JG. How iris recognition works. IEEE Transactions on Circuits and Systems for Video Technology 2004;14(1):21–30.

[19] Wildes RP. Iris recognition: an emerging biometric technology. Proceedings of the IEEE 1997;85:1348–63.

[20] Liu X, Bowyer K, Flynn P. Experiments with an improved iris segmentation algorithm. In: 4th IEEE workshop on automatic identification advanced technologies (AutoID), October 2005. p. 118–23.

[21] Kooshkestani S, Pooyan M, Sadjedi H. A new method for iris recognition systems based on fast pupil localization. In: ICCSA, vol. 5072, 2008. p. 555–64.

[22] Bai X, Wenyao L, et al. Research on iris image processing algorithm. Journal of Optoelectronics Laser 2003;14:741–4.

[23] Basit A, Javed MY. Localization of iris in gray scale images using intensity gradient. Optics and Lasers in Engineering 2007;45:1107–14.

[24] Specifications of CASIA iris image database. Chinese Academy of Sciences. Available at: http://www.nlpr.ia.ac.cn/english/irds/irisdatabase.htm

[25] MMU iris image database. 2007. Available at: http://www.persona.mmu.edu.my.ccteo/

[26] Ma L, Tan T, Wang Y, Zhang D. Efficient iris recognition by characterizing key local variations. IEEE Transactions on Image Processing 2004;13(6):739–50.

[27] Yuan W, Lin Z, Xu L. A rapid iris location method based on the structure of human eyes. In: 27th annual conference on engineering in medicine and biology, 2005.

[28] Otero-Mateo N, Vega-Rodríguez M, Gómez-Pulido JA, Sánchez-Pérez JM. A fast and robust iris segmentation method. In: Proceedings of the 3rd Iberian conference on pattern recognition and image analysis, Part II. Berlin: Springer; July 2007. p. 162–9.

[29] Masek L, Kovesi P. Matlab source code for a biometric identification system based on iris patterns. The School of Computer Science and Software Engineering, The University of Western Australia, 2003.

[30] Daugman J. How iris recognition works. In: Proceedings of the 2002 international conference on image processing, vol. 1, 2002.

[31] Ma L, Tan TN, Wang YH, Zhang D. Local intensity variation analysis for iris recognition. Pattern Recognition 2004;37(June):1284–98.

[32] Daugman J. New methods in iris recognition. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 2007;37(October):1167–75.

[33] Dey S, Samanta D. A novel approach to iris localization for iris biometric processing. International Journal of Biological, Biomedical and Medical Sciences 2008;3(September):180–91.