
Retrieval of Palmprint from Natural Image Using Skin Color Model for Biometric Authentication

Dr. Dhananjay, Prof., GNDEC, Bidar, India ([email protected])

Dr. I. V. Murali Krishna, Former Director R&D, JNTU Hyderabad, Hyderabad, India ([email protected])

Dr. C. V. Guru Rao, Exam Controller, SR University, Warangal, India ([email protected])

Abstract: Access to secured resources is controlled through proper authentication and authorization processes. Authentication is the process of ascertaining the claimed identity of a person, and biometric systems are an effective modality for it. Biometric traits are natural and have good discriminating ability. The palmprint is one such trait, possessing the features required for unique identification and discrimination. Palmprints are usually acquired with specialized capture devices; the use of a generic camera to acquire a palmprint from a cluttered natural image containing the palm has motivated this paper. Perceptual skin color models, together with a set of morphological operations, are used to retrieve the palmprint from the cluttered captured image. During post-processing, a vector-based method is used for palmprint alignment and registration. A cepstrum-based method together with a score confidence is used to organize the palmprint samples for optimized search time.

Keywords: Morphological operations, heart line, anisotropic diffusion, cepstrum, score confidence.

1. Introduction

Unique physiological and behavioural characteristics of human beings have long been used for authentication. Biometrics is the process of automated authentication using traits such as the fingerprint, face, lip, iris, hand geometry and palmprint [1]. Fingerprint-based biometric authentication is widely used for identity verification, but a genuine user may fail to register a fingerprint if the finger is worn by use or covered with colour or a tattoo. A suitable alternative to the fingerprint is the palmprint, owing to the large number of discriminating features spread across it. The palmprint is the area below the fingers and above the wrist [2]. Features of the palmprint useful for establishing a person's identity include the principal lines, wrinkles, minutiae and ridges. A palmprint is normally captured using specialized hardware such as a palmprint scanner; however, the principal lines and wrinkles can also be captured with an ordinary web camera of sufficient resolution. A palmprint captured with a web camera can have a cluttered and noisy background. We can separate the palmprint from background objects using skin color models based on human perception. It is observed that the skin color of the palm is easily distinguishable from other skin areas, provided the palm region occupies a prominent part of the captured image (ref. Fig. 1). We apply the log-opponent color model [3] followed by the HCL color model [4] to retrieve the palmprint, which is then submitted for registration and identification. The rest of this paper is organized as follows. Section 2 reviews existing work on skin color models and palmprint processing. Section 3 describes the palmprint retrieval process. Section 4 discusses registration of the retrieved palmprint. Section 5 deals with extraction of the feature vector from the ROI. Experimental results are presented in Section 6, followed by the conclusion and scope for further work.


Figure 1. Image captured using a web camera, containing the palm.

2. Related Work
2.1 Skin color model

A variety of color spaces is in use for color representation: RGB; the perceptual color spaces HSL (hue, saturation, luminance), HSV/HSB (hue, saturation, value or brightness) [5] and HSI (hue, saturation, intensity); and perceptually uniform color spaces such as L*u*v* and L*a*b* (luminance L*, chrominances u*, v*, a*, b*) [6]. Many existing skin segmentation techniques classify image regions into skin and non-skin categories from the color of individual pixels [7]. It is observed that separating illumination from chrominance helps in detecting skin pixels [8]. In general, human skin is characterized by a combination of red and melanin (yellow, brown) hues. The RGB color space was used in [9] for separating skin and non-skin regions. Normalized RGB, which separates luminance from chrominance, was proposed in [10, 11]. Skin detection in the HSV color space classifies skin regions using the hue and saturation values [12]. Perceptual color spaces are also used for modelling skin color. Skin detection methods can be classified as non-parametric or parametric. Non-parametric methods work on the basis of spatial statistical information about the skin color; parametric methods include the snake model, level sets, Gaussian models, etc. In this paper we use the non-parametric log-opponent model along with the HCL model for skin detection.

2.2 Palmprint processing

Palmprint authentication systems have received considerable attention due to the discriminating features and large area of the palm. Existing methods for palmprint recognition are broadly classified as line-based, subspace-based, local statistical, global statistical, frequency-domain or coding-based [13]. The line-based approach uses edge detection to extract the palmprint lines [14]. Principal line extraction using a Sobel mask and morphological operations [15] showed good performance, but fails to capture the minutiae. Pre-processing of the palmprint using a Gabor filter [16] improved the accuracy. Line-feature-based PCA and fused templates using wavelets [26] have also been reported. Subspace-based methods include principal component analysis (PCA), independent component analysis (ICA), linear discriminant analysis (LDA) and combinations of their variants. The local statistical approach transforms an image into the frequency domain and then extracts local or global statistical measurements [27]. Use of the Fourier transform for extracting palmprint features [17] yielded improved results. A transformation-invariant global statistical approach computes moments [18] for palmprint authentication. Coding methods encode the filtered coefficients as features; the competitive scheme that codes the output of elliptical Gabor filters with different orientations [19] enhances the fine features of the palmprint. Intra-modal fusion has shown promising results in palmprint authentication; such a system used a fusion score to increase accuracy with multimodal input [20]. Feature fusion aims to introduce extra discriminative information for classification.


Multi-spectral palmprint processing is proposed in [21, 22]. A reliable palmprint-based personal authentication system should discriminate between individuals.

3. Palmprint retrieval using the skin color
3.1 Log opponent color model

Color perception of the human eye has inspired many color models, called perceptual color models. A log-opponent color space for skin color detection was suggested in [3]. In this model the RGB color values are represented in log-opponent form using eqn. (1):

$I = \log(G)$
$R_g = \log(R) - \log(G)$
$B_y = \log(B) - \big(\log(G) + \log(R)\big)/2$    …(1)

The above model uses the green channel to model the intensity, while the red and blue channels are made independent of illumination. The hue value is obtained using eqn. (2):

$\mathrm{Hue} = \tan^{-1}(R_g, B_y)$    …(2)

The values computed from eqns. (1) and (2) are used to define a log-opponent color image. The output image is passed through an averaging filter to produce a filtered image having a zero mean over the given mask size (in our paper we use a 32 × 32 mask). The filtered image responds more strongly to areas of exposed skin than to any other object present in the image (ref. Fig. 2).

Figure 2. (a) Image captured using web camera (b) Filtered image

Figure 3. (a) Threshold image (b) Retrieved palmprint

We have used the mean value of the filtered image as a threshold to separate the skin region from the non-skin region; this output is presented in Fig. 3(a). The thresholded image is then subjected to simple morphological operations that remove the smaller objects and retain the one with the maximum size. The output obtained after the morphological operations is combined with the input image using a logical 'AND' to retrieve the palmprint, as shown in Fig. 3(b). These operations are summarized in Algorithm-I (Log Opponent Model).

Algorithm-I Log Opponent Model


1. Use Eqn. (1) and Eqn. (2) to generate the log-opponent color image.
2. Apply an averaging filter to the log-opponent color image.
3. Use the mean value of the filtered image as a threshold.
4. Generate the threshold image.
5. Apply suitable morphological operations to remove objects with smaller area and obtain the segmented image.
6. Retrieve the palmprint by applying a logical 'AND' between the input image and the segmented image.
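As a concrete illustration, the following is a minimal Python sketch of Algorithm-I, assuming NumPy/SciPy and an H×W×3 RGB array; the helper name log_opponent_palm_mask and the simple keep-largest-component morphology are our own choices for the sketch, not prescriptions from the paper.

```python
import numpy as np
from scipy import ndimage

def log_opponent_palm_mask(rgb, mask_size=32):
    """Generate a palm mask following Algorithm-I (log-opponent model)."""
    rgb = rgb.astype(float) + 1.0            # offset avoids log(0)
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Eqn (1): green channel carries intensity; opponents remove illumination
    Rg = np.log(R) - np.log(G)
    By = np.log(B) - (np.log(G) + np.log(R)) / 2.0
    # Eqn (2): hue from the two opponent channels
    hue = np.arctan2(Rg, By)
    # Averaging filter over the chosen mask size (the paper uses 32 x 32)
    filtered = ndimage.uniform_filter(hue, size=mask_size)
    # Mean-value threshold separates skin-like from non-skin pixels
    skin = filtered > filtered.mean()
    # Morphology: keep only the largest connected component (the palm)
    labels, n = ndimage.label(skin)
    if n == 0:
        return skin
    sizes = ndimage.sum(skin, labels, index=range(1, n + 1))
    return labels == (int(np.argmax(sizes)) + 1)

# Logical 'AND' of the mask with the input image retrieves the palmprint:
# palm = image * log_opponent_palm_mask(image)[..., None]
```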

The above algorithm works well for images captured using a low-cost web camera, but fails to retrieve the palmprint when the captured image contains objects whose hue value is close to the skin color, as depicted in Fig. 4(a, b). We therefore use the HCL color model for proper retrieval of the required palmprint.

Figure 4. (a) Sample input (b) Retrieved palmprint area

3.2 HCL based skin color model
The HCL skin color model was proposed in [4]. In this model H represents the hue, C the chrominance and L the luminance component of the image. The luminance in the HCL color space is calculated as a linear blend of the minimum and maximum RGB values, as in eqn. (3):

$L = \big(Q \cdot \mathrm{Max}(R,G,B) + (1 - Q) \cdot \mathrm{Min}(R,G,B)\big)/2$    …(3)

The value of Q is defined as an exponential term in the tuning factors α and γ, which are used to produce a better quality of color. On a similar basis the chromaticity is defined as in eqn. (4):

$C = Q \cdot \big(|R - G| + |G - B| + |B - R|\big)/3$    …(4)

$H = \arctan\!\big((G - B)/(R - G)\big)$    …(5)

The chromaticity (C) also represents the average spread of the RGB values. The value of the hue (H) is calculated using eqn. (5).

The spread of the hue value is controlled between −180° and +180° by a set of case corrections, eqns. (6)-(11): depending on the signs of (R − G) and (G − B), the hue H is either left unchanged (eqns. 6, 7), scaled by 2/3 (eqn. 8), or scaled by 4/3 (eqns. 9-11).

It has been observed that the palmprint skin area is dominated by the R component; hence, in the HCL model we use only eqns. (8) and (9) to detect the skin pixels of the palmprint region. Non-palm skin areas give lower hue values, while in the palm region the response of the R component is larger in comparison with the G component. This intuition allows us to differentiate between palmprint skin and non-palmprint skin areas. The palmprint retrieved using this process is presented in Fig. 5, and a short sketch of the HCL computations is given below. The palmprint registration process then reduces the computational burden through a uniform alignment of all retrieved palmprints.
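The sketch below illustrates how the HCL quantities of eqns. (3)-(5) and the R-dominance test could be coded. The fixed constant Q, the helper names and the simplified hue condition are assumptions made for illustration; the full case corrections of eqns. (6)-(11) from [4] are not reproduced.

```python
import numpy as np

def hcl_components(rgb, Q=1.0):
    """Approximate HCL luminance, chroma and hue (eqns. 3-5).
    Q stands in for the exponential tuning term of [4] and is fixed here."""
    rgb = rgb.astype(float)
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    L = (Q * mx + (1.0 - Q) * mn) / 2.0                              # eqn (3)
    C = Q * (np.abs(R - G) + np.abs(G - B) + np.abs(B - R)) / 3.0    # eqn (4)
    H = np.arctan2(G - B, R - G)                                     # eqn (5)
    return L, C, H

def palm_skin_mask(rgb):
    """Keep pixels where the R component dominates G and the hue is positive,
    mimicking the palm-skin intuition behind eqns. (8) and (9)."""
    R, G = rgb[..., 0].astype(float), rgb[..., 1].astype(float)
    _, _, H = hcl_components(rgb)
    return (R > G) & (H > 0)
```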


Figure 5. Resulting palm image retrieved after applying HCL color model

4. Palmprint registration
4.1 Palmprint alignment
The palmprint retrieved using the process discussed in the previous section does not lie in a uniformly defined space; the samples differ in their direction of inclination, and efficient classification requires palmprints aligned in a uniform direction. Establishing the center of the image and passing a line in the direction of inclination, intersected at right angles by a vertical line, establishes two vectors, and the inclination of the palmprint is calculated as the angle between these two vectors [23]. In our work we establish the perimeter pixels of the palmprint to find the point farthest from the center of the image. If V1 represents the line joining this farthest point to the center and V2 the vertical line intersecting it at the center, eqn. (12) is used to calculate the angle of inclination:

$\theta = \tan^{-1}\!\big(\lVert V1 \times V2 \rVert \,/\, (V1 \cdot V2)\big)$    …(12)
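A minimal sketch of this angle computation, assuming V1 and V2 are 2-D vectors measured from the palmprint center (the function name and the degree conversion are ours):

```python
import numpy as np

def inclination_angle(v1, v2):
    """Angle between alignment vectors V1 and V2 (eqn. 12), in degrees.
    v1, v2 are (dx, dy) pairs measured from the palmprint centre."""
    cross = v1[0] * v2[1] - v1[1] * v2[0]      # 2-D cross product (scalar)
    dot = v1[0] * v2[0] + v1[1] * v2[1]        # dot product
    return np.degrees(np.arctan2(cross, dot))

# The palm is then rotated by this angle, e.g. scipy.ndimage.rotate(palm, -angle).
```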

The aligned palmprint sample is further classified at a coarser level using features of the principal lines of the palmprint.

4.2 Principal line detection
Principal lines are the prominent features of the palmprint. One among the principal lines is the heart line []. The heart line is inclined in opposite directions in the left and right palmprints; this property is used for coarse classification of the palmprints and for assigning a flag to each palmprint sample. Marking the principal lines is an important step in palmprint biometrics, and we deal with this process by applying the model proposed in [24]. Selective smoothing, or edge-preserved smoothing, smooths the intra-region pixels of an image without disturbing the boundaries or edges. The solution of the heat-diffusion partial differential equation is given by eqn. (13):

$I_t = \nabla^2 I = (I_{xx} + I_{yy})$    …(13)

The anisotropic diffusion equation is given by eqn. (14):

$I_t = \mathrm{div}\big(c(x,y,t)\,\nabla I\big) = c(x,y,t)\,\Delta I + \nabla c \cdot \nabla I$    …(14)

where div is the divergence operator and Δ and ∇ are the Laplacian and gradient operators, respectively, serving the intra- and inter-region smoothing. The coefficient c is chosen such that c = 1 within a region and c = 0 at the boundaries. The nonlinear diffusion keeps the region boundaries intact and also enhances the edges, i.e. the principal lines, in the palmprint image. We apply the modified line masks of [23] for better image quality and enhancement of the principal lines; the output obtained after applying the modified line masks is used as the input image for the anisotropic diffusion process. We then use the mean value of the image as a threshold to pick up only the edge and line pixels. The output of this process is depicted in Fig. 6(a-d), and a minimal diffusion sketch is given after Algorithm-II below.

Algorithm-II Palmprint Registration
1. Find the center of the retrieved palmprint.
2. Determine V1 as the line joining the center of the palmprint to the farthest perimeter point.
3. Determine V2 using the index finger as a reference, passing a line that intersects at the center.
4. Find the inclination angle between V1 and V2 using eqn. (12).
5. Rotate the image clockwise by the inclination angle.

6. Apply the modified line masks to the palmprint.
7. Apply the anisotropic diffusion method to mark the principal lines.
8. Use a pixel-labelling algorithm to detect the principal lines.
9. Mark the heart line using the labelled pixels.

10. Determine the inclination of the heart line.
11. If the inclination gradient is less than zero, set flag = 0 to mark a left palmprint; else set flag = 1 to mark a right palmprint.
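Step 7 of Algorithm-II relies on the anisotropic diffusion of eqn. (14) [24]. The listing below is a minimal Perona-Malik sketch; the iteration count, kappa and step size are assumed values rather than parameters reported in this paper.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=15, kappa=30.0, lam=0.2):
    """Perona-Malik diffusion: smooth within regions while preserving
    the region boundaries and principal lines (eqn. 14)."""
    img = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences towards the four neighbours
        dN = np.roll(img, -1, axis=0) - img
        dS = np.roll(img, 1, axis=0) - img
        dE = np.roll(img, -1, axis=1) - img
        dW = np.roll(img, 1, axis=1) - img
        # Conduction coefficient c: close to 1 inside regions, close to 0 at edges
        cN, cS = np.exp(-(dN / kappa) ** 2), np.exp(-(dS / kappa) ** 2)
        cE, cW = np.exp(-(dE / kappa) ** 2), np.exp(-(dW / kappa) ** 2)
        img += lam * (cN * dN + cS * dS + cE * dE + cW * dW)
    return img
```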


Figure 6. Result of applying Algorithm-II to the input image: (a) thresholded input palm, (b) aligned palm, (c) output after applying anisotropic diffusion, (d) marked principal lines.

4.3 Detecting the heart line inclination

We use a connected-component labelling method to identify the different lines and edges in the palmprint obtained after marking the edges and lines by the process discussed in Section 4.2. An RGB representation of the labelled palmprint is shown in Fig. 7(a). It is observed that the boundary pixels form the largest connected component, followed by the principal lines. The total number of pixels assigned to each connected component is calculated and the totals are sorted in ascending order. If we exempt the component with the maximum pixel count, the remaining components must belong to the principal lines (ref. Fig. 7(b)). The start and end pixel positions are used to compute the gradient of the inclination. If the computed gradient is negative, the sample is flagged as a left palmprint and a flag value of '0' is assigned; if the gradient is positive, the sample is flagged as a right palmprint and '1' is assigned.
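A possible sketch of this flagging step, assuming SciPy connected-component labelling; picking the start and end pixels along the x direction and the handling of image coordinates (rows grow downwards, which can flip the sign of the slope) are simplifications:

```python
import numpy as np
from scipy import ndimage

def heart_line_flag(line_img):
    """Label the thresholded line image, discard the largest component
    (the palm boundary), and flag the palm from the slope of the rest."""
    mask = line_img > 0
    labels, n = ndimage.label(mask)
    if n < 2:
        return 0                                    # nothing beyond the boundary component
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    boundary = int(np.argmax(sizes)) + 1            # largest component = palm boundary
    lines = (labels > 0) & (labels != boundary)     # remaining components = principal lines
    ys, xs = np.nonzero(lines)
    order = np.argsort(xs)                          # start and end pixels along x
    x0, y0 = xs[order[0]], ys[order[0]]
    x1, y1 = xs[order[-1]], ys[order[-1]]
    gradient = (y1 - y0) / max(x1 - x0, 1)
    return 0 if gradient < 0 else 1                 # 0 = left palm, 1 = right palm
```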


Figure 7. (a) Labeled image (b) Retrieved principal lines

For each palmprint, along with the flag value we generate a feature vector. The feature vector is generated by selecting an ROI of size 128 × 128 pixels, using the center of the palmprint as the center of the ROI; this ROI is then used to generate the feature vector.

5. Feature vector generation

The ordinary 2D cepstrum of a 2D signal is defined as the inverse Fourier transform of the logarithmic magnitude spectrum of the signal, and it is computed using the 2D fast Fourier transform (FFT) [25]. The cepstrum values are independent of pixel amplitude variations or grayscale changes. The process of generating the 2D cepstrum is given in the block diagram of Fig. 8.

Figure 8. Block diagram of the process involved in generating the 2D cepstrum: 2D image, 2D FFT, non-uniform grid and weighting, ABS()/LOG(), 2D IFFT, 2D cepstrum.

For a given image I(x, y), the cepstrum $\hat{I}(x,y)$ is given by eqn. (15):

$\hat{I}(x,y) = F_{2}^{-1}\big(\log\!\big(|I(p,q)|^{2}\big)\big)$    …(15)

Here I(p, q) is the 2D FFT of the input image I(x, y). The frequency-domain data I(p, q) is divided into a non-uniform grid and the energy of each grid cell is calculated. The logarithms of the energy values are passed to the inverse Fourier transform to calculate the normalized cepstrum values. $\hat{I}(x,y)$ has lower dimensions than I(p, q) due to the non-uniform sampling. In our proposed method the cepstrum matrix used for feature vector generation is of size 16 × 16. We calculate the cepstrum values for each of the R, G and B planes of the extracted ROI. To generate the feature vector we compute the eigenvalues of each cepstrum matrix and convert them into a binary string using a threshold. The flag value determined in Section 4 is appended as the MSB of the feature vector. The feature vectors generated for different palmprint samples are shown in Table 1.
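The feature-vector construction can be sketched as follows. Block averaging stands in for the non-uniform grid of Fig. 8 and the mean-value bit threshold is an assumption; the 16 × 16 cepstrum size, the per-channel processing and the flag as leading bit follow the text.

```python
import numpy as np

def cepstrum_feature(roi_rgb, flag, grid=16):
    """Binary feature vector: left/right flag followed by 48 thresholded
    eigenvalue bits (16 per R, G, B cepstrum of the 128 x 128 ROI)."""
    bits = [int(flag)]
    for ch in range(3):                                     # R, G, B planes
        spec = np.fft.fft2(roi_rgb[..., ch].astype(float))
        ceps = np.real(np.fft.ifft2(np.log(np.abs(spec) + 1e-8)))   # eqn (15)
        # Reduce the cepstrum to a grid x grid matrix by block averaging
        h, w = ceps.shape
        small = ceps[: h - h % grid, : w - w % grid]
        small = small.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
        # Eigenvalues of the symmetrised cepstrum matrix, thresholded to bits
        eig = np.linalg.eigvalsh(small + small.T)
        bits.extend((eig > eig.mean()).astype(int).tolist())
    return np.array(bits)                                    # length 1 + 48
```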

$SC_{samp} = \big(F_{samp} \sim F_{db}\big)\cdot\Big(\textstyle\sum_{i=1}^{16}\big(\tilde{R}_{samp}(i) \sim R_{db}(i)\big) + \sum_{i=1}^{16}\big(\tilde{G}_{samp}(i) \sim G_{db}(i)\big) + \sum_{i=1}^{16}\big(\tilde{B}_{samp}(i) \sim B_{db}(i)\big)\Big)\big/\,48$    …(16)



where $SC_{samp}$ is the score confidence of the sample, $F_{samp}$ and $F_{db}$ are the flags of the query sample and the database sample, $\tilde{R}_{samp}, \tilde{G}_{samp}, \tilde{B}_{samp}$ are the sample's thresholded eigenvalue bit vectors for the R, G and B cepstra, $R_{db}, G_{db}, B_{db}$ are the corresponding database values, and $\sim$ denotes the match between corresponding bits.

The score confidence ($SC$) is calculated for each sample stored in the database; it is higher for samples that are nearer to each other. Using a threshold SC value of 50%, the samples in the database are reorganized so that closely matching samples are stored within a small distance of one another, which results in an optimized search time.

It has been observed that many palmprint databases capture samples of only the left palm or only the right palm; in such cases eqn. (16) can be modified by removing the term representing the flag ($F$). The score confidence values calculated for the samples during our simulation are used to populate Table 2.
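Under the reading of eqn. (16) given above, the score confidence is the fraction of matching feature-vector bits, gated by the left/right flag. A minimal sketch (the function name and the bit layout from Section 5 are assumed):

```python
import numpy as np

def score_confidence(sample_bits, db_bits, use_flag=True):
    """Score confidence between two 49-bit vectors [flag, 48 eigenvalue bits].
    Returns a value in [0, 1]; set use_flag=False for single-hand databases."""
    match = (np.asarray(sample_bits[1:]) == np.asarray(db_bits[1:])).astype(float)
    sc = match.sum() / 48.0                       # fraction of matching R, G, B bits
    if use_flag:                                  # flags must agree, otherwise SC = 0
        sc *= float(sample_bits[0] == db_bits[0])
    return sc

# Samples whose pairwise score confidence exceeds the 50% threshold are stored
# close together in the reorganised database, shortening the search.
```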

Table 1. Feature Vector generated using Cepstrum

Sample  Flag  Thresholded eigenvalues for cepstrum of size 16 × 16

1 1 1 0 0 0 0 1 1 1 1 1 1 1 1 1 0 0

2 0 1 0 0 0 0 1 1 1 1 1 1 0 0 1 1 1

3 0 1 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0

4 1 1 0 0 0 0 0 0 1 1 1 0 0 1 1 1 1

5 1 1 0 1 0 0 1 1 0 0 0 0 0 1 1 1 1

6 1 1 0 0 0 0 1 1 0 0 0 0 1 1 1 1 1

7 1 1 1 1 0 0 1 1 1 1 1 0 0 0 0 0 0

8 0 1 0 0 0 0 1 1 0 0 0 0 1 1 1 1 1

9 0 1 0 0 1 1 1 0 1 1 0 0 0 1 1 1 1

10 0 1 0 1 1 1 1 1 0 1 1 0 0 0 0 0 0

11 0 1 0 0 0 0 0 0 1 1 1 1 1 0 0 1 1

12 1 1 1 1 1 1 1 0 0 0 0 0 0 1 1 0 0

13 1 1 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0

14 1 1 0 0 0 0 0 1 1 1 1 1 0 1 1 1 0

15 0 1 0 1 1 1 0 0 0 1 1 0 1 1 0 0 1

16 1 1 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0

Table 2. Score confidence of samples calculated using eqn(16).

Sample No  S1 S2 S3 S4 S5 S6 S7 S8 S9 S10 S11 S12 S13 S14 S15

1 1.00 0.00 0.00 0.58 0.60 0.50 0.60 0.00 0.00 0.00 0.00 0.52 0.50 0.48 0.54

2 0.00 1.00 0.63 0.00 0.00 0.00 0.00 0.42 0.65 0.56 0.60 0.00 0.00 0.00 0.00

3 0.00 0.63 1.00 0.00 0.00 0.00 0.00 0.58 0.56 0.65 0.56 0.00 0.00 0.00 0.00

4 0.58 0.00 0.00 1.00 0.44 0.46 0.48 0.00 0.00 0.00 0.00 0.69 0.63 0.56 0.50

5 0.60 0.00 0.00 0.44 1.00 0.44 0.67 0.00 0.00 0.00 0.00 0.42 0.65 0.54 0.48

6 0.65 0.00 0.00 0.48 0.46 1.00 0.63 0.00 0.00 0.00 0.00 0.38 0.40 0.58 0.73

7 0.60 0.00 0.00 0.48 0.67 0.31 1.00 0.00 0.00 0.00 0.00 0.38 0.52 0.67 0.56

8 0.00 0.42 0.58 0.00 0.00 0.00 0.00 1.00 0.31 0.65 0.48 0.00 0.00 0.00 0.00

9 0.00 0.65 0.56 0.00 0.00 0.00 0.00 0.31 1.00 0.54 0.67 0.00 0.00 0.00 0.00

10 0.00 0.56 0.65 0.00 0.00 0.00 0.00 0.65 0.54 1.00 0.46 0.00 0.00 0.00 0.00

11 0.00 0.60 0.56 0.00 0.00 0.00 0.00 0.48 0.67 0.46 1.00 0.00 0.00 0.00 0.00

12 0.52 0.00 0.00 0.69 0.42 0.65 0.38 0.00 0.00 0.00 0.00 1.00 0.60 0.54 0.52

13 0.50 0.00 0.00 0.63 0.65 0.50 0.52 0.00 0.00 0.00 0.00 0.60 1.00 0.48 0.42

14 0.48 0.00 0.00 0.56 0.54 0.56 0.67 0.00 0.00 0.00 0.00 0.54 0.48 1.00 0.56

15 0.54 0.00 0.00 0.50 0.48 0.58 0.56 0.00 0.00 0.00 0.00 0.52 0.42 0.56 1.00

16 0.65 0.00 0.00 0.48 0.46 0.65 0.63 0.00 0.00 0.00 0.00 0.38 0.40 0.58 0.73


6. Results

The experimental setup uses Matlab 7.0. The images are captured using a 1.8-megapixel web camera. To check the effect of different illumination values we used Microsoft Picture Editor to change the illumination and stored the modified images in the same database. The average size of the images in our database is 640 × 480 pixels, occupying 50 KB of memory.

Using the score confidence value we retrieve the samples whose score confidence is more than 60%. The palmprints retrieved for an input sample are depicted in Fig. 9(a, b, c). The samples drawn overcome the variation in illumination.

Figure 9a. Nearest samples retrieved for sample input

Figure 9b. Nearest samples retrieved for sample input

Figure 9c. Nearest samples retrieved for sample input


Conclusion and scope for future work

Biometric traits are being used successfully for authentication, and a palmprint biometric system performs well because of its discriminating features. Our proposed method of acquiring the palmprint from a cluttered scene makes palmprint-based authentication usable in hostile environments. While acquiring the palm image, the direction of the palm and the illumination on the palm may affect the authentication process; the proposed method is able to retrieve the palmprint independent of rotation and illumination. Arranging the palmprint samples that have the maximum likelihood of matching close together has reduced the search time. The proposed method thus allows palmprints captured at different instances and in different environments to be used for authentication. This work can be extended by capturing a finer-scale image for authentication processing; the palmprint ROI can then be used for detecting and retrieving features that allow fine sample matching.

References

[1] A. Jain, R. Bolle and S. Pankanti, "Biometrics: Personal Identification in Networked Society," Boston: Kluwer Academic, 1999.
[2] N. Duta, A. K. Jain and K. V. Mardia, "Matching of palmprints," Pattern Recognition Letters, vol. 23, no. 4, pp. 477-485, 2002.
[3] J. Berens and G. Finlayson, "Log-opponent chromaticity coding of colour space," Proc. 15th International Conference on Pattern Recognition, vol. 1, pp. 206-211, 2000. doi: 10.1109/ICPR.2000.90304
[4] Sarifuddin, "A new perceptually uniform color space with associated color similarity measure for content-based image and video retrieval," Proc., 2005, pp. 3-7.
[5] B. Hill, T. Roger and F. Vorhagen, "Comparative analysis of the quantization of color spaces on the basis of the CIELAB color-difference formula," ACM Transactions on Graphics, vol. 16, pp. 109-154, April 1997.
[6] N. Moroney, "A hypothesis regarding the poor blue constancy of CIELAB," Color Research and Application, vol. 28, no. 3, pp. 371-378, 2003.
[7] M. Fleck, D. Forsyth and C. Bregler, "Finding naked people," Proc. ECCV, vol. 2, pp. 592-602, 1996.
[8] G. Wyszecki and W. S. Stiles, "Color Science: Concepts and Methods, Quantitative Data and Formulae," John Wiley and Sons, second edition, 1982.
[9] D. A. Forsyth and M. M. Fleck, "Automatic detection of human nudes," International Journal of Computer Vision, vol. 32, no. 1, pp. 63-77, August 1999.
[10] D. Brown, I. Craw and J. Lewthwaite, "A SOM based approach to skin detection with application in real time systems," BMVC, 2001.
[11] H. Greenspan, J. Goldberger and I. Eshet, "Mixture model for face color modelling and segmentation," Pattern Recognition Letters, vol. 22, no. 14, pp. 1525-1536, 2001.
[12] Qiong Liu and Guang-zheng Peng, "A robust skin color based face detection algorithm," 2nd International Asia Conference on Informatics in Control, Automation and Robotics, 2010. 978-1-4244-5194-4/10.
[13] D. Zhang, "Palmprint Authentication," Norwell: Kluwer Academic Publishers, 2004.
[14] Xiangqian Wu, David Zhang, Kuanquan Wang and Bo Huang, "Palmprint classification using principal lines," Pattern Recognition, vol. 37, pp. 1987-1998, 2004.
[15] Chin-Chuan Han, Hsu-Liang Cheng, Chih-Lung Lin and Kuo-Chin Fan, "Personal authentication using palm-print features," Pattern Recognition, vol. 36, no. 2, pp. 371-381, 2003.
[16] Jing Wei, Wei Jia, Hong Wang and Dan-Feng Zhu, "Improved competitive code for palmprint recognition using simplified Gabor filter," LNCS, vol. 5754, pp. 371-377, 2009.
[17] Wenxin Li, David Zhang and Zhuoqun Xu, "Palmprint identification by Fourier transform," International Journal of Pattern Recognition and Artificial Intelligence, vol. 16, no. 4, pp. 417-432, 2002.
[18] L. Liu and D. Zhang, "Palm-line detection," Proc. IEEE International Conference on Image Processing (ICIP), pp. 269-272, 2005.
[19] K. Kong and D. Zhang, "Palmprint texture analysis based on low-resolution images for personal authentication," Proc. 16th International Conference on Pattern Recognition, vol. 3, pp. 807-810, 2002.
[20] A. Kumar and D. Zhang, "Personal authentication using multiple palmprint representation," Pattern Recognition, vol. 38, pp. 1695-1704, 2005.
[21] Zhen Guo, Lei Zhang and David Zhang, "Feature band selection for multispectral palmprint recognition," 20th International Conference on Pattern Recognition, pp. 1136-1139, 2010.
[22] Wei Li, Zhen Guo, David Zhang, Lei Zhang, Guangming Lu and Jingqi Yan, "Three dimensional palmprint recognition with joint line and orientation features," BATR, vol. 5, no. 11, Nov. 2011.
[23] C. V. Guru Rao, Dhananjay D. M. and I. V. Muralikrishna, "Automated PolyU palmprint sample registration and coarse classification," IJCSI, vol. 8, issue 4, July 2011. ISSN (Online): 1694-0814.
[24] P. Perona and J. Malik, "Scale-space and edge detection using anisotropic diffusion," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 7, pp. 629-639, 1990.


[25] S. Cakir and A. E. Cetin, "Mel-cepstral methods for image feature extraction," 17th IEEE International Conference on Image Processing (ICIP), pp. 4577-4580, 26-29 Sept. 2010. doi: 10.1109/ICIP.2010.5652293
[26] S. Ribaric and I. Fratric, "A biometric identification system based on eigenpalm and eigenfinger features," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 11, pp. 1698-1709, Nov. 2005. doi: 10.1109/TPAMI.2005.209
[27] M. Choras, R. Kozik and A. Zelek, "A novel shape-texture approach to palmprint detection and identification," Eighth International Conference on Intelligent Systems Design and Applications (ISDA '08), vol. 3, pp. 638-643, 26-28 Nov. 2008. doi: 10.1109/ISDA.2008.221