
Knuckle Print Biometrics and Fusion Schemes – Overview,

Challenges, and Solutions


GAURAV JASWAL, AMIT KAUL, and RAVINDER NATH,
National Institute of Technology, Hamirpur

Numerous behavioral or physiological biometric traits, including iris, signature, hand geometry, speech, palm
print, face, etc. have been used to discriminate individuals in a number of security applications over the last
30 years. Among these, hand-based biometric systems have come to the attention of researchers worldwide
who utilize them for low- to medium-security applications such as financial transactions, access control, law
enforcement, border control, computer security, time and attendance systems, dormitory meal plan access,
etc. Several approaches for biometric recognition have been summarized in the literature. The survey in
this article focuses on the interface between various hand modalities, summary of inner- and dorsal-knuckle
print recognition, and fusion techniques. First, an overview of various feature extraction and classification
approaches for knuckle print, a new entrant in the hand biometrics family with a higher user acceptance and
invariance to emotions, is presented. Next, knuckle print fusion schemes with possible integration scenarios,
and traditional capturing devices have been discussed. The economic relevance of various biometric traits,
including knuckle print, for commercial and forensic applications is debated. Finally, conclusions related to
the scope of knuckle print as a biometric trait are drawn and some recommendations for the development of
hand-based multimodal biometrics have been presented.
Categories and Subject Descriptors: A.2 [General and Reference]: Surveys and Overviews; I.3.1 [Security
and Privacy]: Authentication; K.3.8 [Computing Methodologies]: Object Recognition
General Terms: Algorithms, Security, Performance, Verification/Identification
Additional Key Words and Phrases: Hand Biometrics, Fusion, FKP (finger knuckle print), MCP (metacar-
pophalangeal joint), ROI (region of interest), CRR (correct recognition rate), ERR (equal error rate), FAR
(false acceptance rate), IKP (inner knuckle print), ROC (receiver operating characteristic)
ACM Reference Format:
Gaurav Jaswal, Amit Kaul, and Ravinder Nath. 2016. Knuckle print Biometrics and Fusion Schemes –
Overview, challenges, and solutions. ACM Comput. Surv. 49, 2, Article 34 (November 2016), 46 pages.
DOI: http://dx.doi.org/10.1145/2938727

1. INTRODUCTION
Traditionally, personal characteristics like name, rank, ID, address, date of birth, password, etc., have been employed to validate the identity of an individual, but they do not offer a reliable solution for authentication. The limitation of these traditional approaches is that they are insecure and unsuitable for personal authentication in the modern world. It is therefore desirable to develop more consistent and practical methods for personal authentication to counter the day-to-day increase in crime and fraud in various social and commercial activities such as e-commerce, e-passports, online financial transaction systems, cross-border security, crime scene analysis, voter identification in India, and controlling

Authors’ address: G. Jaswal, A. Kaul, and R. Nath, Electrical Engineering Department, National Institute
of Technology, Hamirpur, Himachal Pradesh, India-177005; email: {er.shgaurav, amitkaul9, r.nath1964}@
gmail.com.
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted
without fee provided that copies are not made or distributed for profit or commercial advantage and that
copies show this notice on the first page or initial screen of a display along with the full citation. Copyrights for
components of this work owned by others than ACM must be honored. Abstracting with credit is permitted.
To copy otherwise, to republish, to post on servers, to redistribute to lists, or to use any component of this
work in other works requires prior specific permission and/or a fee. Permissions may be requested from
Publications Dept., ACM, Inc., 2 Penn Plaza, Suite 701, New York, NY 10121-0701 USA, fax +1 (212)
869-0481, or permissions@acm.org.
© 2016 ACM 0360-0300/2016/11-ART34 $15.00
DOI: http://dx.doi.org/10.1145/2938727


Fig. 1. Classification of biometric traits.

access to restricted sites, etc. Biometric authentication refers to recognition of an individual by his/her unique characteristics, namely physiological and behavioral traits or
both [Garcia and Alberola 2003]. Out of these, behavioral biometric traits are acquired
due to the repeated action of a person such as typing rhythm, voice, gait, etc., whereas
physiological characteristics are inherent characteristics of human body such as fin-
gerprint, iris, ear pattern, DNA, finger knuckle print, palm print, face, etc. [Jain and
Kumar 2010]. The advantages of these biometric characteristics are that one cannot
easily manipulate, misplace, steal, forge, or share them. Therefore, the authentication
based on these traits is more reliable and secure than traditional token- (e.g., driv-
ing license) or knowledge- (e.g., PIN or password) based security mechanisms [Jain
et al. 2006]. Physiological and behavioral biometrics are categorized as hard biomet-
rics. Further, there are a few more features known as soft biometrics including age,
height, weight, color of eye/skin/hair, scars, gender, marks, and tattoos, etc. that can be
used to identify human beings. These soft biometrics are unable to distinguish individuals within a large population, as they are not unique and stable [Jain et al. 2004a].
However, it has been observed that soft biometrics work well when used in conjunction
with hard biometric traits. For an overview and clarity, biometric traits are classified
as shown in Figure 1. In physiological biometrics, the ocular and facial regions are treated
separately keeping in mind a situation where the entire face is captured from a dis-
tance, resulting in loss of significant iris/retina information. Similarly, when the iris
is captured at close distance, the entire face information may not be available in that
image. Further, the hand-bacteria-based biometric is placed in the hand biometrics group due to
its stable and distinct bacterial DNA structure. Likewise, voice has been classified as
a behavioral biometric trait because it is affected by an individual's state of mind and health. Ideally, a physiological or behavioral characteristic cannot serve as a biometric trait unless it satisfies criteria such as the following [Jain et al. 2004b]:
(i) Universality. The biometric trait should be present in all who access the
application.
(ii) Uniqueness. The feature should be as unique as possible, so that the same feature does not appear in any two different individuals, even in the case of identical twins.
(iii) Permanence. The biometric feature should be invariant to any environmental
change.
(iv) Collectability. It should be easily collectable in terms of acquisition, digitization,
and feature extraction.
(v) Acceptability. It refers to the willingness of an individual to provide his/her bio-
metrics to the device.
(vi) Circumvention. The biometric feature cannot be easily imitated using artifacts.
(vii) Performance. The feature should be robust and capable of achieving high recognition accuracy at high speed.
Based on physiological or behavioral biometric traits, there exists a variety of
commercial and forensic applications. Depending upon the application, any biometric
system is commonly a part of either a verification or an identification task [Jain et al.
2004b]. A verification system performs one-to-one matching and validates the claimed identity against the previously collected samples of the same individual, whereas identification performs one-to-many matching and compares the presented sample against the collected samples of all individuals in the database. The question of which biometric characteristic should be used for a given authentication problem depends upon the exact requirements of the anticipated application [Jain et al. 2006]. In the
present scenario, the commonly used biometric traits for various security applications
are fingerprint, voice, signature, iris, and facial features. This is due to their ideal
characteristics like feasibility, distinctiveness, permanence, accuracy, and reliability
[Jain and Kumar 2010]. It has been found that no biometric feature is superior to
others or can replace the other trait, because each has its own merits and demerits.
In the past few years alone, hand biometrics have gained popularity due to their extensive use in access control applications and hold around a 60% share of the biometric market [Bera et al. 2014]. Hand-based biometric patterns have a highly distinctive anatomy and thus provide rich information to recognize individuals
[Zhang et al. 2012]. Also, they can be captured with low-cost and small-size imaging
devices without mounting extra hardware, leading to smaller-size templates, and are
appropriate for large population practices [Travieso et al. 2014].
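To make the verification (one-to-one) and identification (one-to-many) tasks described earlier in this section concrete, the short sketch below contrasts the two; the cosine-similarity score, the threshold value, and the dictionary-based gallery are illustrative assumptions and not part of any system surveyed here.

import numpy as np

def match_score(probe, template):
    # Illustrative similarity measure (cosine similarity); higher means more similar.
    return float(np.dot(probe, template) /
                 (np.linalg.norm(probe) * np.linalg.norm(template) + 1e-12))

def verify(probe, claimed_id, gallery, threshold=0.85):
    # One-to-one matching: accept or reject the claimed identity.
    return match_score(probe, gallery[claimed_id]) >= threshold

def identify(probe, gallery):
    # One-to-many matching: return the enrolled identity with the highest score.
    return max(gallery, key=lambda user_id: match_score(probe, gallery[user_id]))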
In recent studies, it has been reported that the patterns on the inner and outer surfaces of the finger knuckle joints, because of their abundant line and texture features, constitute a novel and highly distinctive biometric modality [Neware et al. 2012]. However, no study has yet recommended a deployed application based on the use of knuckle joint patterns for personal identification. Therefore, in spite of the presence of unique structural texture, the knuckle pattern has not been introduced in commercial or forensic applications [Choras 2013]. In this survey, an effort has been made to present an overview of the existing
inner- and outer-knuckle-print-based recognition techniques, their limitations, and fu-
sion strategies. Further, the state-of-the-art related to finger knuckle pattern, focusing
on the development of highly precise multimodal hand biometrics, has been examined.
Finally, the future research directions for the development of next-generation finger-
knuckle recognition techniques and devices have also been discussed. This discussion
has been presented in nine sections in the remainder of the survey. In Section 2, the
anatomy and the importance of knuckle print is discussed. The Knuckle Print Biometric
model is described in Section 3. Literature survey of existing techniques is presented
in Section 4. In Section 5, an overview of fusion schemes of knuckle print with other biometric patterns is provided. Section 6 summarizes traditional knuckle-print devices and their applications. Section 7 describes knuckle-print standard benchmark databases and evaluation parameters. Next, Section 8 highlights the economic relevance of various biometric patterns. Lastly, conclusions and future research directions are listed in Section 9.

Fig. 2. Front and back view of hand.

2. KNUCKLE PRINT ANATOMY AND COMPARISON WITH OTHER HAND TRAITS


Hand-based biometric systems have found wide acceptance in the past few years. These systems have been found to be more convenient to use, as they employ a contact-free or touchless imaging mechanism, which is economical, simple in architecture,
and suitable for large population usage. Hand-based biometric systems are capable of
attaining a high degree of discrimination characteristics over time except in the case
of children and elders [Bera et al. 2014]. Moreover, recent studies have shown that
hand features from both the palmar and dorsal sides can be simultaneously obtained from
a hand image itself [Ferrer et al. 2007; Zhu and Zhang 2010]. Therefore, the present
research trend is to extract novel features either from the front or back surface of
the hand and utilize them in large-scale personal recognition systems. Further, it has
always been a challenge for researchers to decide which properties of the hand can be
used in biometrics. The hand is a multi-fingered appendage at the end of the arm, which
includes multiple features like hand geometry or shape [Zheng et al. 2007], fingerprint
[Cappelli et al. 2010], finger geometry [Malassiotis et al. 2006], nail bed [Kumar et al.
2014], palm print [Zhang et al. 2003], palm vein [Yan et al. 2015], dorsal hand vein
[Kumar and Prathyusha 2009], finger dorsal knuckle print [Choras 2013], finger vein
[Yang et al. 2014a], finger inner-knuckle print [Liu et al. 2013], hand print [Lemes
et al. 2011], hand bacteria [Holbert et al. 2015], and their combinations. The biometric
patterns occurring on the palmar side of the hand are perceived to be more informative than those on the dorsal side. The unique biometric characteristics of the front and
back surface of the hand region are shown in Figure 2.
Among hand biometrics, the fingerprint1 is the oldest and most widely used identifier
[Cappelli et al. 2010]. However, the minutiae and singular points of the fingerprint are sensitive to injury or wounds on the finger surface, unlike the finger knuckle pattern, an

1 Web Link:http://biolab.csr.unibo.it/DatabaseSoftware.asp?organize=Software.

upcoming biometric trait [Ozkaya and Kurat 2014]. Moreover, finger knuckle patterns
on the inner side of the palm can be easily captured with a contactless imaging device.
Also, fingerprint technology requires high-resolution images (>400 dpi) for better recognition results [Li et al. 2004], whereas lines and wrinkles on the outer surface of the finger knuckle can be clearly observed even in low-quality samples [Zhang et al. 2011b]. Additionally, the skin patterns on the finger knuckle outer surface arise at an early stage of development and survive longer, unlike fingerprints, which are difficult to acquire, especially from cultivators and manual workers [Ferrer et al. 2007]. Similarly, the palm region
of the hand provides more informational detail than other traits because it offers the principal lines, wrinkles, palm geometry, and datum points for identification [Zhu and Zhang 2010]. But there are issues of template size, complexity, and impostor attacks, because people unintentionally leave their fingerprints/palm prints when they touch any object [Kumar et al. 2013a]. In comparison to the palm print, the finger dorsal knuckle print near the Proximal Inter Phalangeal (PIP) joint contains suitable lines and creases within a small region. Nevertheless, copying and forgery of a palm print is slightly more difficult to perform than for a fingerprint, and hence it is well suited for low-resolution-based security applications [Zhang et al. 2003]. Recently, researchers have concentrated on 3D palm
images [Zhang et al. 2015], multispectral palm images [Zhang et al. 2016], and are
performing fusion with 2D features to enhance anti-forgery ability so as to make it
suitable for higher security forensic applications. It’s worth mentioning that along with
the 3D palm print, there are other 3D hand biometric traits like 3D fingerprint [Kumar
and Kwong 2015], 3D hand geometry [Kanhangad et al. 2009], and 3D finger geometry
[Malassiotis et al. 2006]. However, no biometric work based on 3D finger knuckles
has been reported yet. Another related biometric is the handprint which includes
lines, creases, and ridges, prominently present on the entire inner surface of finger
and palm. Although the small and unclear palm surface of infants looks nearly identical and does not provide very distinctive and stable features, it is still considered suitable for missing-child identification [Lemes et al. 2011]. However, the full hand surface and high-resolution images (>1000 ppi) are required for better results. The performance of handprint recognition is not satisfactory when the main emphasis is on cost and computation [Bera et al. 2014]. In addition to these traits, hand geometry is another hand-based biometric, but its large physical size increases the storage requirement. The
peg-based hand geometry scanners used earlier were affected by hygienic concerns
and could not be embedded in mobile devices [Zheng et al. 2007]. Moreover, hand
or finger geometry features vary during illness and do not provide very distinctive information for identifying individuals in the case of a large population [Sharma
et al. 2015]. Further, the biometric features that lie on the back surface of the hand,
such as finger dorsal knuckle print, nail-bed, etc. cannot be easily duplicated and the
possibility of information loss from this region is also low [Kumar et al. 2014].
Apart from these, veins based biometrics such as dorsal hand vein [Kumar and
Prathyusha 2009], finger vein [Yang et al. 2014b], and palm vein [Yan et al. 2015] have
emerged as the most distinctive and user-friendly identifiers as they depend on the
genetic information of inner blood vessels. Thus, in terms of image acquisition, both finger dorsal knuckle print and vein biometrics use a contactless and lightweight setup, but an additional near-infrared (NIR) light source is required for vein acquisition. This makes vein imaging devices slightly expensive and complex for bulk deployment. But in comparison to the finger dorsal knuckle print or fingerprint, vein schemes have strong anti-forgery characteristics, as the patterns underneath the skin are unique and remain relatively stable throughout adulthood. It has been reported that finger
geometry and finger knuckle print [Hegde et al. 2011a] or finger vein and finger knuckle
[Yang et al. 2014b] can be combined to improve the performance of the biometric system.
In the recent past, hand surface bacteria based identification has emerged as a new


Table I. Comparison of Hand Biometrics Traits

Trait/ Sensor Characteristics


Application Type Cost Feature Sets Universality Uniqueness Permanence Benchmark Datasets
Palm print Contact High Principal lines, High High High IIT Delhi Touch less
(verification/ creases, ridges, Palm print Database,
identification) singular points, CASIA Palm print
wrinkles, minutiae Image Database, Poly
points, texture, U Multispectral Palm
geometry print Database
Finger Contact-less Low Knuckle lines, creases High High Moderate IIT Delhi Finger
knuckle print and texture Knuckle Database,
(verification/ THU-FVFDT
identification) Database, Poly U
FKP Database
Inner knuckle Contact-less Low Horizontal lines, High Moderate Moderate Hebei University IKP
print corner points, and Database
(verification) gray-mutation regions
Fingerprint Contact Low Singular points, ridge Moderate High High Poly U HRF
(verification/ skeleton, pores and Database, CASIA
identification) ridge shape, minutiae Fingerprint Image
points, ridge Edge Database, FVC 2006
Protrusions, ridge (DB1, DB2, DB3)
flow, incipient ridges,
distal creases and
scars, 3D features
Fingernail bed Contact High Nail shape, ridges, High High Moderate Biometrics
(verification) texture ResearchLaboratory,
IIT Delhi Database
(self -collected)
Hand Shape/ Contact High Boundary of hand Moderate Moderate Moderate Bosphorus Hand
Geometry (2D/3D), Length/ Database, MSU Hand
(verification) width of fingers or Geometry Database,
palm, finger GPDS 150 Hand
perimeter, aspect Database, HGC 2011
ratio of finger or palm, Hand Database
finger area, angles
between inter fingers
Finger Contact-less High Contour of finger Moderate Moderate Moderate Bogazici Hand
Geometry (2D/3D), length/ Database
(verification) width, area of finger
Hand Vein Contact-less High Lines, Vein High High High Bosphorus Hand Vein
(verification/ bifurcation points and Database
identification) ending points
Finger Vein Contact-less High Lines, edges, Vein High High High THU-FVFDT
(verification/ bifurcation points and Database
identification) ending points
Palm Vein Contact-less High Lines, curvatures, High High High CASIA Multi-spectral
(verification/ local statistical Palm Vein Image
identification) features and minutiae Database
Handprint Contact High Lines, creases, and High Moderate Moderate NUDT Lab handprint
(verification/ ridges on palm print, samples
identification) fingerprint, and finger (self-collected)
surface
Hand Bacteria Contact High Hand surface High Moderate Moderate WVU’s Health
(verification) Sciences Center hand
bacteria samples
(self-collected)

research area that finds its role in forensic investigations [Holbert et al. 2015]. In the
existing works, it is not directly treated as a hand biometric pattern, but its bacterial texture configuration is highly unique and stable over time, even among twins
[Fierer et al. 2010; Bera et al. 2014]. Table I displays a comparison of characteristics
related to various hand-based modalities.

Fig. 3. Finger anatomy.
Fig. 4. Finger dorsal knuckle print around the MCP, PIP, and DIP joints.

In a finger knuckle image biometric system, an individual is verified by extracting the lines, creases, and texture of the knuckle print2, which lie near the three knuckle joints of the finger [Choras 2013]. The three joints in each human finger lie between the three bone groups, called the distal, middle, and proximal phalanges [Kumar 2012].
The first and largest knuckle joint is the junction between hand and fingers called
the Metacarpophalangeal joint (MCP). The next joint is the Proximal Inter Phalangeal
joint (PIP) and the farthest joint is the Distal Inter Phalangeal joint (DIP). These
joints of finger anatomy are shown in Figure 3. Accordingly, the basic convex skin
patterns formed around the outer surface of any of the three finger knuckle joints consist of highly rich lines and creases, known as finger-dorsal or finger-back-surface knuckle prints, as shown in Figure 4. The image patterns formed on the outer surface by slightly bending the finger near the PIP joint are highly unique and are considered the finger dorsal knuckle print, or simply the Finger Knuckle Print (FKP), a significant biometric [Zhang et al. 2010].
In the literature, there are other finger knuckle image studies too that make use of
flexion creases inside the finger surface as a biometric trait. In particular, the epidermal
ridges appearing on the inside surface of finger knuckle joints are called the finger-
inner-surface-based knuckle prints. They also consist of rich horizontal-like lines, corner points, and gray-mutation regions. These unique lines, formed even before birth, rarely change during an individual's life. The image patterns formed on the inner surface near the middle knuckle joint (PIP) contain more stable lines and are referred to as the Inner Knuckle Print (IKP) [Liu et al. 2014]. Moreover, these line features are clearly visible to the naked eye
and they can be captured using low-resolution cameras. Due to these highly robust
features and the ease of collection, knuckle prints on either side of finger surface can
be used for various security applications. IKP recognition is a new research area and
very few scholars have explored its use as a biometric identifier.
In the majority of the prior works related to finger knuckle patterns, it has been ob-
served that researchers have utilized information present either between the distal and
middle phalanx bones or middle phalanx and proximal phalanx bones. In other words,
mostly the DIP and PIP joints have been explored for utilizing knuckle
patterns for biometrics [Kumar 2012; Kumar and Xu 2014]. Knuckle print is a new
modality in the biometric field, but it is more convenient to use for different security
applications and offers several advantages over popular biometric characteristics such
as faces, fingerprints, iris, etc. This modality has the following promising advantages:
(1) The finger dorsal knuckle print is difficult to rub off or abrade because we use the inner side of the hand to carry, write, or perform other tasks.

2 The term 'knuckle print' refers to the skin patterns on the inner and outer sides around the knuckle joints of the finger; the terms finger knuckle, FKP, and IKP are used somewhat interchangeably across different works.

Fig. 5. Photographs showing knuckle prints at different positions.

(2) A slight bending of the major middle finger joint produces complex texture patterns; however, no significant work utilizing this information has been reported in the literature. These patterns may provide informative evidence for crime scene investigation, law enforcement, etc.
(3) It is user friendly, contactless, and provides unrestricted access control. Such im-
ages can be acquired online or offline using inexpensive sensors and used to provide
scale, translation, and rotation-invariant knuckle features for user identification.
(4) It is invariant to emotions and other behavioral aspects such as tiredness, kidnap-
ping, or sexual attack.

Out of all these advantages, the major one is its contactless nature, which provides no
chance of spoofing [Ravikanth and Kumar 2007; Zhang et al. 2010]. But still, there is no
commercial knuckle-print-based biometric system reported in the literature. Recently,
it has been verified, by matching images acquired after a time interval of 5 years with the same algorithms, that the shape of the prominent creases and lines in minor and major knuckle patterns is highly stable across individuals of varying age groups [Kumar 2014].
The author [Kumar 2014] also revealed some forensic photographs in which the finger
knuckle pattern was the only major source of information available to scientifically
determine the identity of individuals/suspects. It is further stated that there are more
situations, as shown in Figure 5, which also depict the potential of the finger knuckle pattern as a unique biometric identifier.

3. AUTOMATED KNUCKLE PRINT BIOMETRIC SYSTEM


To design an automated biometric system, certain points need to be kept in mind: user
acceptance, controlled environmental conditions, accuracy, computational time, device
cost, and security [Jain et al. 2006]. It has been observed that all these objectives can-
not be satisfied simultaneously to fulfill the design requirements. The computational
speed of the system can be increased by compromising on accuracy and vice-versa, but
both cannot be achieved together. Moreover, the recognition task becomes increasingly
complex with the decrease in user cooperation, while there are privacy and cost issues
if more sensors are embedded to increase accuracy. There are few extra requirements
in some specific circumstances, such as the size of the template, power intake, memory
storage, etc. [Travieso et al. 2014]. In summary, a biometric system that balances all these practical considerations may be perceived as a good choice for recognition and authentication tasks. As far as an automated knuckle print recognition system is concerned, the main
parts are: knuckle print acquisition device, preprocessing methods, feature extraction/
classification, decision module, and database development.

(1) The inner and outer skin patterns around the finger joints are highly unique and
can be obtained online or offline for further usage. In the literature, the following FKP
scanners have been used:
(a) A peg-free imaging system that uses Minolta 900/910 Sensor with dimension
213x413x271mm [Woodard and Flynn 2005].
(b) A digital camera having multiple resolutions under the effect of uniform light-
ing [Li et al. 2005].
(c) A CCD camera equipped with a finger bracket, a single lens, a ring LED, and a
frame grabber [Zhang et al. 2009b, 2010].
(d) A smartphone (HTC Desire HD A9191) consisting of an 8-megapixel color cam-
era [Cheng and Kumar 2012].
(e) A digital camera having a 10.2-megapixel DX format CCD under varying light-
ing conditions or distances, etc. [Sulthana and Kanmani 2014].
(2) Pre-processing aims to enhance input images for further processing. It includes
region-of-interest (ROI) localization, image segmentation, image enhancement,
filtering, normalization, illumination settings, etc.
(3) The features represent the different properties of an object, and selection of the
most dominant features for further classification is always crucial. In most of the
prior FKP studies [Kumar and Ravikant 2009; Zhang et al. 2011], the main char-
acteristics selected for feature extraction are the U-shaped line around the middle
phalanx, the length and spacing between lines, the number of vertical and horizon-
tal lines, the creases present in the middle of phalangeal joint, etc. Existing feature
extraction techniques are discussed in the following section.
(4) To analyze the similarity of the claimed FKP image against the stored templates,
a classifier of statistical or non-statistical basis may be employed. Undoubtedly,
the performance of a classifier is somehow dependent on the effectiveness of the
features extracted. On the basis of the matching score generated by a classifier,
the claimed identity is either accepted or rejected. For example, in the matching
module of a knuckle-print-based biometric system, the degree of similarity between
the input and the template knuckle print images is determined and expressed in terms of a matching score (a minimal sketch combining the preprocessing, feature-extraction, and matching modules is given after this list).
(5) A database stores the biometric templates of the enrolled users in the digital form.
Usually, several samples of an individual are recorded in different sessions to analyze the variations in the biometric trait, and these templates are also updated frequently.
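Combining the modules listed above, a minimal verification pipeline might be sketched as follows; the fixed ROI window, the normalized-correlation matcher (standing in for the coding, subspace, or texture features surveyed in Section 4), and the decision threshold are illustrative assumptions only, not a description of any published system.

import numpy as np

ROI = (slice(60, 170), slice(40, 260))   # assumed fixed ROI window (rows, cols)

def preprocess(image):
    # Crop the assumed ROI and normalize illumination to zero mean, unit variance.
    roi = image[ROI].astype(np.float64)
    return (roi - roi.mean()) / (roi.std() + 1e-12)

def extract_features(roi):
    # Toy feature vector: the flattened, normalized ROI; a real system would use
    # CompCode, subspace, or texture descriptors instead (see Section 4).
    return roi.ravel()

def matching_score(f1, f2):
    # Normalized correlation between two feature vectors (higher = more similar).
    return float(np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2) + 1e-12))

def verify_claim(probe_image, enrolled_template, threshold=0.8):
    # Decision module: accept the claimed identity if the score exceeds the threshold.
    score = matching_score(extract_features(preprocess(probe_image)), enrolled_template)
    return score >= threshold, score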
Recently, to alleviate hygiene concerns in hand-acquisition devices and provide a
user-friendly environment, platform-free, non-contact, or touchless formats have been
considered. Despite these benefits, there may be a risk of large variation in line and texture features across different images of the same person. But overall, these devices are more convenient for the user than guiding platforms and improve the scope of application. Therefore, the overall aim of an automated finger-knuckle-based
authentication system is to minimize the manual work at different stages and to pro-
vide speedy, better, and more deterministic results. However, the major drawbacks of
existing finger-knuckle-based systems cannot be ignored in making future improve-
ments in this area. These include:
(i) Inability to authenticate the individual, because of hand injury or damage in finger
knuckle position.
(ii) Difficulty in accurately extracting the ROI, which contains highly distinctive texture, under skewed finger poses.
(iii) Lack of skilled experts who are trained to perform error-free matching.
(iv) Non-suitability of low-resolution FKP images for high-security applications.


Table II. Summary of Coding Algorithms


Publication
Detail Database Used Feature Extraction Classifier Accuracy (%) Speed (s)
Zhang et al. 5760 images CompCode Angular distance GAR-97%, 1s
[2009] EER-1.09%
Kumar and IIT Delhi finger Localized radon Euclidean distance CRR-98.6%, -
Zhou [2009] knuckle database transform based EER-1.08%
(790 images) knuckleCode
Kumar and IIT Delhi finger Modified finite radon Euclidean distance CRR-98.6%, -
Zhou [2009] knuckle database transform knuckle EER-1.14%
(790 images) code
Zhang et al. PolyU FKP database Improved CompCode Angular distance EER-1.48%, 0.3s
[2010] (7920 images) and magnitude code 0 (S-Rule)
Zhang et al. PolyU FKP database Monogenic Code Hamming distance EER-1.72% 0.02s
[2010] (7920 images)
Zhang et al. PolyU FKP database CompCode, fourier Hamming distance, EER- 0.402% 0.52s
[2011] (7920 images) transform BLPOC
Belguechi et al. PolyU FKP database Biocode Minkowski distance EER-25.9% -
[2011] (7920 images)
Zhang et al. PolyU FKP database Riesz CompCode Hamming distance EER- 1.26% 0.06s
[2011] (7920 images)
Zhang and Li PolyU FKP database RCode1, RCode2 Hamming distance EER-1.661%, 17.6s,
[2012] (7920 images) 1.610% 17.6s
Li et al. [2012] PolyU FKP database Adaptive steerable Angular matching EER-1.221% -
(7920 images) Orientation coding function
Gao and Yang PolyU FKP database Weighted competitive Hamming distance, EER-1.203% -
[2013] (7920 images) code weight pattern
Shariatmadar PolyU FKP database Log Gabor filter, LBP, Hamming distance 100% -
and Faez [2013] (7920 images) PCA+LDA
Gao et al. [2014] PolyU FKP database MoriCode &MtexCode Fragility masks EER-1.048% 0.39s
(7920 images) hamming distance (MW-Rule)
Yang et al. THU-FVFDT C2 Code Nearest neighbor EER-0.435% -
[2014]

(v) Both FKP and IKP are challenged by the shortage of publicly available databases.
(vi) Difficulty of collecting IKP and FKP consistently under varying environmental conditions.
So, it can be concluded that an automated FKP system is still in its nascent stage and
needs further improvement/development in order to implement it as a practical system
in real-time environments.

4. LITERATURE SURVEY OF KNUCKLE PRINT TECHNIQUES


Feature extraction and classification play important roles in a pattern recognition problem. Most of the conventional knuckle-print recognition algorithms can be grouped into the following categories: coding methods [Zhang et al. 2009a, 2010], subspace
methods [Guru et al. 2010; Shariatmadar and Faez 2011a], texture analysis methods
[Zhang et al. 2011] and other image processing methods. Among all, the coding-based
methods given in Table II are the most popular [Zhang et al. 2010; Zhang and Li
2012] as they are robust against illumination variations. These techniques are widely
used for extraction of orientation or edge information, but are unable to preserve all
of the orientation features against image rotation. These include Gabor filter-based
CompCode, Improved CompCode&MagCode [Zhang et al. 2010], knuckle code [Kumar
and Zhou 2009a], Adaptive Steerable Orientation Coding (ASOC) [Li et al. 2012],
etc. On the other hand, the unsupervised subspace analysis methods project the raw
image into a low-dimensional subspace and use the resulting subspace coefficients as
features. But the subspace coefficients do not possess the high discriminatory ability
of knuckle prints. The popular subspace methods are Principal Component Analysis


Table III. Summary of Subspace Algorithms


Publication Database Feature
Detail Used Extraction Subspace Classifier Accuracy % Speed
Woodard and CVRL Hand Curvature and PCA Correlation CRR(Group A) -
Flynn [2005] image dataset shape based coefficient 94.5%- 98%
(1191 3D index
Hand images)
Kumar and IIT Delhi Canny edge PCA, LDA and Euclidean GAR-97.5%, EER- 0.53s
Ravikant finger knuckle detector ICA distance 1.39, DI-2.35%
[2009] Database
(630 images)
Nanni and 720 right hand Haar wavelet, PCA, Parzen EER-0.3, 0.9 -
Lumini [2009] images radon nonlinear window
transform fisher distance
transform
Guru et al. PolyU FKP Zernike PCA Euclidean CRR-92.24% -
[2010] database moments distance
(7920 images)
Yin et al. PolyU FKP PCA, LDA, LDE, WLE Distance CRR-WLE (66.5%, -
[2010] database measure 97.3%)
(7920 images)
Wankou et al. PolyU FKP Gabor filter PCA, OLDA Nearest CRR- LI (98%), -
[2011] database neighbor with LM (98.6%),
(7920 images) cosine RI (97.7%),
distance RM (97%)
Shariatmadar PolyU FKP Gabor filter PCA, LDA Euclidean CRR-LI+LM+ -
and Faez database distance RI+RM (98.79%)
[2011] (7920 images)
Wankou et al. PolyU FKP Gabor filter PCA, MMDA Cosine CRR- LI (98.30%), -
[2011] database distance LM (98.7%),
(7920 images) RI (97.5%),
RM (97.9%)
Jing et al. PolyU FKP CLPP, OCLPP, PCA and CPCA Fused angle CRR (OCLPP)- LI -
[2011] database andeuclidean (87.8%), LM
(7920 images) distance (87.4%), RI(86.9%),
RM (87.3%)
A1Mahafzah PolyU FKP Log Gabor PCA, LPP Z Score GAR-93% -
et al. [2012] database filter, LPQ normalization (LG+LPP+PCA)
(7920 images)
Swati and PolyU FKP Gabor filter LDA, KPCA Euclidean CRR- 91.67% -
Ravishankar database Distance
[2013] (7920 images)
Shariatmadar PolyU FKP Log Gabor PCA, LDA Euclidean EER (4 fingers)- -
and Faez database filter distance, 0.104% (without
[2014] (7920 images) modified normalization),
min-max 0 (with
normalization Normalization)
AlMahafzah PolyU FKP LG, LPQ PCA, LPP, and decision-level GAR-86.33% -
et al. [2014] database fusion
(7920 images)
Yang et al. 1232 FKP CompCode 2DPCA Euclidean 97.08% 0.079s
[2014] images distance
El-Tarhouni PolyU FKP MSLBP PCA RLDA 94.70% (LMF), -
et al. [2014] database 94.80% (RMF)
(7920 images)
Kazhagamani PolyU FKP Contourlet PCA Euclidean CRR-98.72%, -
and database transform distance EER-0.82%
Murugasen (7920 images)
[2015]

(PCA) [Meraoumia et al. 2013; Manjunath et al. 2013], Linear Discriminant Analysis
(LDA) [Kumar and Ravikant 2009] and Independent Component Analysis (ICA)
[Zhang et al. 2006], which are used for feature extraction and matching. Table III provides a
summary of different subspace methods. Another category in the knuckle-print recog-
nition algorithms is texture analysis methods that include transform, local descriptor,

and statistical-based approaches applied for knuckle print recognition. The texture
schemes shown in Table IV have resulted in satisfactory performances, though at the
cost of computational time. In transform-based methods, the image is transformed
into a specific domain that represents the characteristics of the original texture.
The popular transforms applied are DCT [Saigaa et al. 2012], Radon Transform
[Hegde et al. 2011a], Haar Wavelet [Gomaa et al. 2012], DFT [Aoyama et al. 2011],
S Transform [Mahesh, and Premalatha 2015], etc. Many other representations using
local texture descriptors have also been applied on low-resolution FKP biometrics such
as Speeded-Up Robust Features (SURF) [Choras and Kozik 2010], Scale Invariant
Feature Transform (SIFT) [Badrinath et al. 2011], Local Binary Pattern (LBP) [Kumar
2012], etc. In addition to these, the statistical texture analysis methods consider the
relationship among intensity points of the FKP image. In the last few years, many
statistical or non-statistical classification methods have been proposed. These methods
give reasonably satisfactory results in terms of Equal Error Rate (EER), computation
time, Correct Recognition Rate (CRR), verification accuracy, False Acceptance Rate
(FAR), and False Rejection Rate (FRR), but further modifications need to be
carried out to make this trait suitable for stringent security applications. For further
improvement, multi-biometric fusion using strong feature extraction/classification
techniques may be an alternate solution. An overview of the related work in knuckle-
print biometrics to date is presented in the remainder of this section. FKP and IKP studies are presented separately in chronological order, and similar works are highlighted together.
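To illustrate the coding-based family discussed above, the sketch below assigns each pixel the index of its strongest-responding oriented filter, in the spirit of competitive coding (CompCode), and compares two code maps with a normalized angular distance; the filter parameters, number of orientations, and distance normalization are illustrative assumptions rather than the settings of any particular published method.

import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(theta, ksize=17, sigma=3.5, lam=8.0, gamma=0.7):
    # Real part of a Gabor filter oriented at angle theta (radians).
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(np.float64)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def orientation_code(image, n_orient=6):
    # Winner-take-all orientation map: index of the filter with the strongest
    # (most negative) response at each pixel, following the competitive-coding idea.
    responses = [convolve(image.astype(np.float64), gabor_kernel(k * np.pi / n_orient))
                 for k in range(n_orient)]
    return np.argmin(np.stack(responses), axis=0)

def angular_distance(code1, code2, n_orient=6):
    # Normalized angular distance between two code maps (0 = identical, 1 = maximal).
    diff = np.abs(code1 - code2)
    diff = np.minimum(diff, n_orient - diff)   # orientation indices wrap around
    return diff.mean() / (n_orient / 2)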
Jungbluth [1989] was the first to reveal the novelty of finger knuckle texture and
found its utility in forensic identification. Based on this work, Colbert [1997] developed
an apparatus for identifying or verifying individuals using their knuckle contours. The
apparatus comprised a microcomputer connected to a video camera whose output was used to extract the digitized knuckle profile when the fingers were oriented in the form
of a grip. In the following year, Joshi et al. [1998] proposed the use of inner-side creases
of fingers for biometric identification. The matching was carried out using the normal-
ized correlation function. Initially, due to lack of datasets and low recognition rates,
this area did not receive much attention and a limited number of works/efforts were
reported by various authors [Colbert 1997; Joshi et al. 1998]. Subsequently, continuous
efforts were made to find simple ways to develop finger knuckle acquisition devices,
methods, etc. A few years later, Li et al. [2004] utilized the inside flexion lines of the
finger knuckle for recognition and achieved an accuracy of 96.88%. The lines, wrinkles,
and location of these features were extracted from middle knuckle and a hierarchical
classification method was used for testing 1432 finger images. During the same time,
Woodard and Flynn [2005] examined the folds and creases on the finger (index, middle,
and ring) outer surface. The authors had used a Minolta 900/910 sensor to collect 3D
finger samples (http://www.cse.nd.edu/∼cvrl/CVRL/Data_Sets.html), but it was a quite
expensive, bulky, and slow processing device, which limited its use in commercial or
real-time systems. This study is one of the most cited works in the area, but it did not
provide the solution to completely extract the 3D texture information from the finger
outer surface. Instead, 2D images were used, which established FKP as a possible biometric identifier. In-
spired by earlier studies, Ribaric and Fratric [2005] used a scanner for collecting finger
and palm surface. But it was also not fast enough to be considered for online applica-
tions. In the following year, Wang et al. [2006] performed segmentation of particular
skin texture by utilizing the Canny operator and a moving window. A method applying a scale-, rotation-, and translation-invariant technique for knuckle texture representation was
employed by Ferrer et al. [2006]. In their study, Hidden Markov Model (HMM) and a
Support Vector Machine (SVM) were tested over a small dataset of 160 images and
achieved BER-0.096 and VFR-0.023, BER-0.094 and VFR-0.015, respectively. Subse-
quently, Sricharan et al. [2006] focused on isolating the knuckle print from the hand
surface and used the correlation function as a classifier.

Table IV. Summary of Texture Analysis Algorithms


Publication Database Feature
Detail Used Extraction Classifier Accuracy % Speed
Ferrer et al. Self-collected Ridge SVM and BER-0.096, VFR-0.023(SVM) and -
[2006] (160 Images) feature-based HMM BER-0.094, VFR-0.015 (HMM)
algorithm
Zhang et al. PolyU FKP Fourier BLOPC EER-0.31% 0.22s
[2009a] database (7920 transform
images)
Choras and IIT Delhi PHT, SURF K nearest EER-1.02% -
Kozik [2010a] Finger knuckle neighbor
database
(630 images)
Meraoumia PolyU FKP 1D Log Gabor Hamming CRR-88.96% (RIF-FKP), 99.850% -
et al. [2011a] database (7920 filter distance (min rule)
images)
Hegde et al. PolyU FKP Gabor Matching FAR- 1.24%, FRR 1.11% -
[2011b] database (7920 wavelet distance
images) transform
Badrinath et al. PolyU FKP SIFT, SURF Nearest- CRR-100%, EER-0.215% 0.081s
[2011] database (7920 neighbor
images) ratio
Morales et al. PolyU FKP Gabor filter, Euclidean EER-0.09% <1s
[2011] database (7920 OE-SIFT norm
images)
Hegde et al. PolyU FKP Radon Karl FAR-1.55%, FRR-1.02% -
[2011a] database (7920 transform pearson’s
images) correlation
Nigam and PolyU FKP ELBP, LK FTS CRR-99.71%, EER-3.3% -
Gupta [2011] database (7920 tracking
images)
Xiong et al. PolyU FKP LGBP, Gabor Chi square CRR CRR (RI)-100%, -
[2011] database (7920 filters distance (LI)-99.39%, (RM)-99.70%
images) (LM)-99.85%
Kekre and PolyU FKP Kekre’s ANN EER EER -
Bharadi [2011] database (7920 wavelet (TAR-TRR) (FAR-FRR) 20%
images) transform 80%
Aoyama et al. PolyU FKP 2D-DFT BLPOC EER- 6.352%, 0.748%, 0.556% -
[2011] database (7920
images)
Saigaa et al. PolyU FKP 2D-BDCT, Matching Verification Identification -
[2012] database (7920 2D-DCT objective (4 fingers) (4 fingers)
images) mod2 function GAR-97.91%, EER-0.20%
EER-2.09%
Cheng and IIT Delhi finger 1D Log Gabor Hamming CRR-98% -
Kumar [2012] knuckle (630 filter distance
images), 561
images
Gomaa et al. PolyU FKP HAAR, SURF K means, CRR-100% (fusion) -
[2012] database (7920 Bayesian
images) classifier
Amraoui et al. PolyU FKP LBP, DCT SVM CRR CRR (RI)-98.%, -
[2012] database (7920 (LI)-98.2%, (RM)-97.1%
images) (LM)-98%
Ajay Kumar PolyU LBP, ILBP Hamming EER-1.04% -
[2012] Contactless and 1D log distance
FKP database Gabor filter
(1010 images)
Meraoumia PolyU FKP 2D block DCT GMM, log EER-0.37% -
et al. [2013] database (7920 likelihood
images)
Shariatmadar PolyU FKP LBP, PCA, Hamming EER-0%, CRR-100% (4 fingers) -
and Faez [2013] database (7920 LDA Distance
images)
Kumar et al. PolyU FKP Ant colony FBDT, FAR-LI FRR-LI FAR-RI FRR-RI -
[2013a] database optimization Hamming (0.003%), (4.5%), (0.003%), (4%), RM
(7920 images) (fusion) Distance LM LM (4%) RM (2.5%)
(0.03%) (0.009%)
Muthukumar PolyU FKP SIFT K means CRR-99.4%, FRR-0.006% -
and Kannan database
[2013b] (7920 images)
Kong et al. PolyU FKP Gabor filter, Hierarchal EER-0.22%, 0.218% 0.175s
[2014] database SURF method
(7920 images)
Raut et al. Self-collected KWT Euclidean TAR-TRR 80%, 89.99% FAR-FRR 20%, 10.01% -
[2014] (500 Images) distances
Mahesh and PolyU FKP FOST BLPOC EER-1.14%, CRR-99.46% (RI) -
Premalatha database
[2015] (7920 images)

Later, numerous unique techniques for finger knuckle print recognition were pro-
posed and knuckle print research has received significant consideration since 2007.
Morales et al. [2007] proposed knuckle codes for texture feature extraction. Ravikant
and Kumar [2007] used the finger knuckle image for personal recognition. The authors
clubbed both the finger knuckle texture and geometrical features of the hand, and
tested their results on finger knuckle images acquired by a peg-free and non-contact
imaging setup. The results were verified and found to be satisfactory for a database
of 105 users. At the same time, Luo et al. [2007] investigated a novel ROI method
to locate the IKP effectively and extracted finger creases using radon transform and
singular value decomposition algorithms. Their method achieved 2.25% EER with a
nearest neighbor classifier. In one of the articles reported in 2008, Xianguang et al.
[2008] proposed to detect line features of the IKP by Radon projection and wavelet
multi-resolution analysis. In the following year, Zhu et al. [2009] discussed a method
of matching the IKP line features, which also overcame the problem of finger rotation.
Their method resulted in 0.67% EER, when the wavelet-decomposed images were clas-
sified using cosine function. Zhang et al. [2009b] proposed the use of finger’s middle
joint (PIP) patterns. They employed 2D Gabor filter to extract local orientation infor-
mation, and saved it in a feature vector called competitive code. Similarity of features
was computed based on angular distance, and results showed the robustness of the ap-
proach. Additionally, Kumar and Ravikant [2009] resolved the problems occurring in
finger knuckle recognition due to the variations of the pose or the presence of artifacts.
The authors suggested a multi-algorithmic approach based on matching scores of prin-
cipal component analysis (PCA), linear discriminant analysis (LDA), and independent
component analysis (ICA) with EER of 1.39%. A study aimed at the establishment of
a database, performed by Zhang et al. [2009a], resulted in a collection of 7,920 FKP
images with an inexpensive low-resolution camera. With their imaging setup, while
capturing the image, the user had the option to put the finger either flat or slightly
bent. The experiments were conducted using the Band Limited Phase Only Correlation
(BLPOC) method and showed high accuracy and speed. Kumar and Zhou [2009a] de-
veloped a knuckle codes approach that performed better in comparison to the method
given by Zhang et al. [2009a]. The knuckle curved lines and creases were obtained
by applying radon transform on an enhanced knuckle image and resulted in 1.14%
EER and a 98.6% rank-one recognition rate. In particular, Nanni and Lumini [2009]3

3 Web link: http://bias.csr.unibo.it/nanni/diffusion.rar.

put forward a multi-algorithmic approach based on the Parzen Window Classifier (PWC), radon transform, and Haar wavelet. In this method, the extracted knuckle features were further transformed by a non-linear Fisher transform.
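Several of the studies above (e.g., Zhang et al. [2009a] and Aoyama et al. [2011]) rely on Band-Limited Phase-Only Correlation (BLPOC), in which only a low-frequency band of the normalized cross-phase spectrum is retained before the inverse transform and the height of the resulting correlation peak serves as the matching score. The sketch below illustrates this idea on whole images; the band-limit ratio and the absence of block-wise matching are simplifying assumptions.

import numpy as np

def blpoc_score(img1, img2, band_ratio=0.5):
    # Band-Limited Phase-Only Correlation between two equally sized grayscale images.
    # Returns the correlation peak height (near 1 for genuine pairs, near 0 otherwise).
    F1 = np.fft.fft2(img1.astype(np.float64))
    F2 = np.fft.fft2(img2.astype(np.float64))
    cross = F1 * np.conj(F2)
    phase = cross / (np.abs(cross) + 1e-12)          # keep only the phase information
    phase = np.fft.fftshift(phase)
    rows, cols = phase.shape
    kr, kc = max(1, int(rows * band_ratio) // 2), max(1, int(cols * band_ratio) // 2)
    cr, cc = rows // 2, cols // 2
    band = phase[cr - kr:cr + kr, cc - kc:cc + kc]   # retain the low-frequency band
    poc = np.fft.ifft2(np.fft.ifftshift(band))
    return float(np.abs(poc).max())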
In another work, Kumar and Prathyusha [2009] designed a new hand-based multi-
modal authentication system by combining hand vein with knuckle shape, resulting
in an EER of 1.14% on a database of 100 users. Rui et al. [2009] described an effec-
tive IKP approach based on the line creases of a single finger. With this approach, the
authors claimed 95.68% accuracy when the method was tested on a database of 1579
knuckle images. A year later, Zhang et al. [2010] demonstrated the advantages of fus-
ing information in biometric systems by combining features of palm print and middle
finger images and achieved an accuracy of 98.71%. Furthermore, Zhang et al. [2010]
improved their previous work by combining the magnitude (magnitude code) and ori-
entation information (improved competitive code). For this work, a dataset of cropped
images (220 × 110) was adopted for further feature extraction and matching. In the
same year, Kekre and Bharadi [2010] suggested a novel knuckle print ROI segmenta-
tion method based on image gradient. Unlike the other methods, this technique took
less time (110 ms) for ROI localization and resulted in 96.21% accuracy. Likewise, Li
et al. [2010] created a new ROI corner detection method for a contactless knuckle print
authentication. The algorithm was tested over 500 hand images, out of which the ROI
was correctly found in 485 images. In addition to this, another scheme that introduced
the fusion of finger geometry, knuckle print, and palm print features at decision level
(AND rule) was given by Zhu and Zhang [2010]. Goh et al. [2010a, 2010b] conducted
two different studies in the meantime. In their first research work, they employed the
popular competitive code scheme for a palm print whereas, for the knuckle print, they
adopted a localized ridgelet transform for feature extraction. Thereafter, fusing these
features yielded a promising EER of 1.25%. In the second study, palm print and
knuckle features were detected using directional coding and ridgelet transform meth-
ods, respectively. Their output scores were fused using SVM with RBF kernel. The
results reported in these articles were satisfactory but the approach was dependent on
creases and wrinkles lying on inner side of fingers. Zhang et al. [2010] later presented
another significant method called the monogenic code (3-bit vector) which represented
the phase and orientation information of knuckle images. Their study resulted in a
lower EER of 1.72% and gave a speedy response in comparison to other existing works
like CompCode [Zhang et al. 2009a], Knucklecode [Kumar and Zhou 2009a]. Hemery
et al. [2010] studied the SIFT descriptor performance on enhanced knuckle images and
the results obtained were found to be equivalent or better than the earlier methods
[Kumar and Zhou 2009a]. Likewise, Choras and Kozik [2010a] also conducted local
descriptor-based experiments on IIT Delhi Finger Knuckle database to quantify the
Speeded Up Robust Features (SURF) and Probabilistic Hough Transform (PHT). Guru
et al. [2010] made another contribution to the same area by developing a multi-instance
finger knuckle print recognition using zernike moments. Recognition rate of 92.24%
was achieved by employing feature level fusion. In the meantime, Shen et al. [2010]
designed a multimodal recognition system using knuckle and palm print features ex-
tracted by utilizing fusion code, which provided an accuracy of more than 89% when
fusion was carried out on the decision level. Yin et al. [2010] have utilized local and
nonlocal information simultaneously using a new feature extraction method known as
weighted linear embedding (WLE).
Additionally, Nanni et al. [2010]4 described the performance of several texture de-
scriptors for knuckle print, palm print, and fingerprint traits with respect to different
distance measures. The Local Phase Quantization (LPQ) was reported as the best local
4 Web link: http://www.cubs.buffalo.edu/resources/enchancement.zip.

descriptor among all the descriptors discussed in this work. For boosting the perfor-
mance, their work recommends a combination of LPQ, Local Gabor Phase (LGP) and
Local Binary Pattern Histogram Fourier (LBP-HF). Subsequently, Zhang et al. [2011]
have used a weighted sum rule to fuse local and global information for FKP recogni-
tion. In their work, Fourier transform and Gabor filter have been exhaustively used to
extract the global and local features. In the continuation of the last work, Rui et al.
[2011] applied a decision-level fusion (sum rule) on matching scores of multiple finger
knuckle prints (1579 FKP images). This method resulted in recognition rate of 96.62%.
Similarly, Meraoumia et al. [2011a] applied score level fusion on palm and knuckle
print datasets. For recognition, 1D log Gabor filter and Hamming distance were con-
sidered and a satisfactory recognition accuracy of 99.850% was attained. In particular,
Wankou et al. [2011a] have used a Gabor filter and orthogonal-LDA for FKP-based
recognition. The achieved results were slightly better than CompCode [Zhang et al.
2009a]. Next, Belguechi et al. [2011] described a cancelable biometric system utilizing
binary-output-based coding algorithm called biocode by using FKP and randomly gen-
erated token number. The authors evaluated the proposed method against the knucklecode [Kumar and Zhou 2009a] using a database of 660 images, and a considerably low performance with an EER of 25.9% was obtained. Furthermore,
Shariatmadar and Faez [2011a] performed FKP recognition by combining a circular Gabor filter with PCA, followed by LDA projection. The authors reported high identification and verification rates with feature-level fusion (98.79% and 91.8%, respectively).
In addition, the same authors, Shariatmadar and Faez [2011b], focused on extracting distinctive information from FKP using the Gabor filter and gray-level intensity, and then fused these features at the feature level. Hegde et al. [2011b] generalized an FKP
recognition method by using Gabor wavelet transform. This method performed user
recognition on the basis of the threshold value and produced promising results. In an-
other study, Hegde et al. [2011a] performed a radon-transform-based FKP recognition
and attained satisfactory results. A novel scheme assuming phase correlation function
for palm-print- and knuckle-print-based feature extraction and classification was in-
troduced by Meraoumia et al. [2011a] and then fusion at the score level was performed.
Additionally, Morales et al. [2011] suggested an orientation-enhancement-based SIFT
method that used the Gabor filter for image smoothing and the SIFT descrip-
tor for feature extraction. Motivated by these results, Badrinath et al. [2011] conducted
similar experiments on enhanced FKP images. However, in their method, SURF
and SIFT descriptors were combined using the weighted sum fusion rule and the tech-
nique was found to be robust against the change in scale and rotation. A personal
recognition via Gabor features and multi-manifold discriminant analysis was car-
ried out by Wankou et al. [2011b], leading to a recognition rate of 98.79%. Further,
Hegde et al. [2011a] obtained promising results by applying radon transform on pre-
processed knuckle print images and computing Karl Pearson’s correlation coefficient of
Eigen values and probability for authenticating a person. Further, Nigam and Gupta
[2011] introduced two algorithms namely Edge-Based Local Binary Pattern (ELBP)
and Features Tracked Successfully (FTS) for FKP-based recognition. Likewise, Xiong
et al. [2011] incorporated Local Gabor Binary Patterns (LGBP) and chi-square distance
statistics for FKP recognition. Moreover, Jing et al. [2011] proposed a novel dimension
reduction subspace method named as complex locality preserving projections and con-
sidered both angle and distance as similarity measures. In addition to this, Kekre and
Bharadi [2011] applied Kekre’s Wavelet Transform (KWT) to compute wavelet energy
features for FKP recognition. Zhang et al. [2011] modified their own work and proposed
a new 6-bit coding scheme via the Riesz transform, known as Riesz CompCode. Subse-
quently, Zhu [2011] proposed to extract local features by using SURF descriptor and em-
ployed RANSAC-based matching strategy for FKP recognition. Also, Choras and Kozik


Table V. Summary of Inner Knuckle Print Algorithms


Publication | Feature Set and Database | Feature Extraction | Classifier | Accuracy (%) | Speed
Joshi et al. [1998] | Creases on inside finger profile | Wide line integrated | Correlation fn. | EER 0.1%, 0.04% | -
Li et al. [2004] | Line, wrinkle, and location features; palm and 1,432 knuckle images | Radon transform, Wiener filter, Euclidean distance, Hausdorff distance | Hierarchical classifier | 96.88% | 0.9s
Ribaric and Fratric [2005] | Inside finger surface; palm and 1,820 hand images | K-L transform / PCA | Euclidean distance, KNN | 100%, EER 0.58% | 0.54s
Luo et al. [2007] | Inside finger creases; 488 images | Radon transform and SVD | KNN classifier | EER 2.51% | -
Xianguang et al. [2008] | Lines and creases | Wavelet decomposition | - | - | -
Zhu et al. [2009] | Finger phalangeal print; 1,900 images | DB4, Haar, and Morlet wavelets | Cosine function | EER 0.67%, 0.79%, 0.67% | 0.72s
Nanni and Lumini [2009] | Lines in the inner skin of knuckle; 720 images | Radon transform, Haar wavelet, and nonlinear Fisher transform | PWC | EER 0%, 0.18% | -
Rui et al. [2009] | Line creases of a single finger; 1,579 images | Line feature detection | Cross-correlation | 95.68% | -
Li et al. [2010] | Inner side of finger; 500 images | Corner detection | K-means and Euclidean distance | 97% | -
Michael et al. [2010] | Horizontal ridge lines, ridges, palm print; 136 individuals | Ridgelet transform and wavelet transform | Radial basis kernel | EER 1.95% | 2.7s
Rui et al. [2011] | Flexor knuckle ridges of multiple fingers; 1,579 images | Line feature detection method | 9th matching | 96.62% (score-level fusion) | -
Tian et al. [2012] | Inner side of finger | Gabor filter and PCA | - | - | -
Liu et al. [2013] | Line features; 2,000 images | Improved LBP | Cross-correlation | EER 3.22% | -
Liu et al. [2014] | Line features; 2,000 images | Gabor filter, derivative line detection | Cross-correlation, score-level fusion | EER 2.4% | 0.81s
Xu et al. [2015] | 400 images (DB-1), 800 images (DB-2) | Magnitude maps of CompCode | Earth mover's distance | 100%, EER 0.5% | <0.05s

[2011] contributed in area of palm print and FKP based recognition by employing PHT
and SURF algorithms for feature extraction. Aoyama et al. [2011] obtained phase information using the DFT, and FKP recognition was carried out by applying Band-Limited Phase-Only Correlation (BLPOC) to improve the local block-wise matching.
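Band-limited phase-only correlation computes the normalized cross-phase spectrum of two images and keeps only the low-frequency band where knuckle texture carries reliable phase information; the height of the resulting correlation peak serves as a similarity score. The sketch below is a simplified, assumption-laden version (a square band controlled by a single band parameter), not the exact formulation of Aoyama et al.

```python
import numpy as np

def blpoc_score(img_a, img_b, band=0.5):
    """Peak of the band-limited phase-only correlation between two ROIs."""
    Fa = np.fft.fftshift(np.fft.fft2(img_a))
    Fb = np.fft.fftshift(np.fft.fft2(img_b))
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12                     # keep phase information only
    h, w = cross.shape
    kh, kw = max(1, int(h * band) // 2), max(1, int(w * band) // 2)
    limited = cross[h // 2 - kh:h // 2 + kh, w // 2 - kw:w // 2 + kw]
    poc = np.real(np.fft.ifft2(np.fft.ifftshift(limited)))
    return float(poc.max())                            # higher peak means more similar
```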
Next, Saigaa et al. [2012] presented a 2D block Discrete Cosine Transform (DCT)-based knuckle texture extraction method and also fused information
from multiple fingers. In another study, Zhang et al. [2012b] presented a score level
fusion of phase congruency and BLPOC features which gave an EER of 0.358%. Cheng
and Kumar [2012] were the first to utilize finger knuckle texture as an identifier for smartphone applications; they employed a 1D log-Gabor filter and the Hamming distance for the recognition task. In the same year, Gomaa et al. [2012]
utilized OAuth protocol for FKP-based personal authentication. The knuckle minu-
tiae and SURF key points were considered to perform texture feature analysis. Zhang
and Li [2012] introduced two more novel coding methods by using Riesz transform,
namely RCode1 and RCode2 for multimodal recognition. Bharadi [2012] presented
performance comparison of Walsh transform, DCT, Kekre transform, Hartley trans-
form, and Kekre wavelets for multiple traits. Later, AlMahafzah et al. [2012c] carried
out a detailed study and verified the performance of various algorithms on a standard
FKP dataset. The algorithms tested include log Gabor filter based feature extraction,


fusion of multiple fingers at score level (max Rule) and decision level (OR rule), com-
bining multiple features using log Gabor, LPQ,5 PCA, Locality Preserving Projections
(LPP) methods. In the same year, Amraoui et al. [2012] combined the LBP-based micro
(spatial domain) and DCT-based macro texture features (frequency domain). Mittal
et al. [2012] also achieved good results in their research based on the DAISY descrip-
tor. Mathivanan et al. [2012] recognized individuals by extracting hand geometry and
finger knuckle features from the dorsal side of single hand images. Kulkarni and Rout
[2012] presented the anatomy of FKP and discussed its different acquisition devices.
Their work was inspired by Kekre’s wavelet transform (KWT). In another study, Li
et al. [2012] implemented a high order-steerable filter to obtain local orientation infor-
mation from knuckle print images. Then, a multi-level histogram thresholding-based
method was used for efficient and robust FKP matching. Kumar [2012], in a further work, described the importance of finger minor knuckle patterns and applied LBP, ILBP, and a 1D log-Gabor filter with the normalized Hamming distance for a recognition
task. Apart from this, a hybrid feature selection method was adopted by Islam et al.
[2012] by using ANN and scaled conjugate gradient. Zhai et al. [2012] adopted Sparse
Representation based Classifier (SRC) with l0 norm. Tian et al. [2012] also contributed
by developing a new ROI location algorithm for the inner side of knuckles by using
dimension reduction criteria. Kirthika et al. [2012] included Strong Speeded up Fea-
tures (SSF) and Scale Invariant Quality Transform (SIQT) for enhanced FKP-based
biometric security. In comparison to other IKP recognition methods [Li et al. 2004; Ribaric and Fratric 2005; Luo et al. 2007], Liu et al. [2013] introduced a modified LBP6 operator for the detection of local features. In this technique, the spatial location of feature points was exploited, and cross-correlation-based matching was utilized to overcome the problem of translational displacement. Their method, when evaluated on a database of 2000 IKP images, achieved an EER of 3.22%.
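For readers unfamiliar with LBP-style descriptors, the sketch below shows a plain uniform-LBP block-histogram feature, which is the baseline that modified operators such as Liu et al.'s build upon; the neighborhood size, radius, and grid are illustrative assumptions, and the cross-correlation matching stage of the original work is not reproduced here.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_block_histograms(roi, P=8, R=1, grid=(4, 4)):
    """Uniform LBP codes pooled into per-block histograms (one feature vector)."""
    lbp = local_binary_pattern(roi, P, R, method="uniform")
    n_bins = P + 2                      # number of distinct uniform-pattern labels
    h_step = roi.shape[0] // grid[0]
    w_step = roi.shape[1] // grid[1]
    feats = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = lbp[i * h_step:(i + 1) * h_step, j * w_step:(j + 1) * w_step]
            hist, _ = np.histogram(block, bins=n_bins, range=(0, n_bins), density=True)
            feats.append(hist)
    return np.concatenate(feats)
```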
Meanwhile, Gao and Yang [2013] updated the CompCode [Zhang et al. 2009a] matching by incorporating a weight matrix and a modified Hamming distance. According to Gao et al. [2013], the Gabor-filtering-based competitive coding scheme [Zhang et al. 2009a] was sensitive to finger pose variation, which caused false rejections. Therefore, Gao et al. [2013]
incorporated a dictionary-learned reconstruction algorithm to improve the matching
results. In the same year, Hegde et al. [2013] modified their previous approach [Hegde
et al. 2011a] by incorporating modularization methods for FKP recognition. The mod-
ularization technique was robust even when there was little damage in finger knuckle
and achieved 4.5% EER on testing the algorithm over 450 damaged FKP images, while
the overall system achieved good performance with 95.33% accuracy. During the same
time, Kong et al. [2013] worked on developing a novel ROI extraction algorithm by
using contrast-enhanced and -corrected skewed images. The developed method was
robust against finger displacement and rotation in the horizontal direction, and its
performance was better when compared with other’s works [Zhang et al. 2010; Kekre
and Bharadi 2010]. Further, Nigam and Gupta [2013a] tested the Lucas-Kanade feature-tracking algorithm over enhanced palm print and FKP images, which resulted in a new multi-modal personal authentication system. They achieved a CRR of 100% with an EER of less than 0.1%.
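The Lucas-Kanade idea used here is essentially to detect corners in the enrolled image and check how consistently they can be tracked into the probe image; the fraction of successfully tracked points then acts as a similarity measure. A hedged OpenCV sketch is shown below; the corner and tracking parameters are illustrative, and the authors' exact dissimilarity measure is not reproduced.

```python
import cv2
import numpy as np

def tracking_similarity(enrolled, probe, max_corners=200, err_thresh=8.0):
    """Fraction of enrolled-image corners successfully tracked into the probe image."""
    corners = cv2.goodFeaturesToTrack(enrolled, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=5)
    if corners is None:
        return 0.0
    tracked, status, err = cv2.calcOpticalFlowPyrLK(enrolled, probe, corners, None,
                                                    winSize=(21, 21), maxLevel=3)
    ok = (status.ravel() == 1) & (err.ravel() < err_thresh)   # successfully tracked points
    return float(ok.sum()) / len(corners)
```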
Further, in order to improve the accuracy significantly, a combination of multiple fingers of the hand was considered by Meraoumia et al. [2013]. The authors implemented 2D block DCT, GMM, and log-likelihood for the feature extraction and matching processes. Experimental outcomes showed a lowest EER of 0.269% using the SUM (sum-score) rule. In one of the two studies carried out by Muthukumar and Kannan [2013b], FKP features were extracted by the SIFT

5 Web link: http://www.cse.oulu.fi/CMV/Downloads/LPQMatlab.


6 Web link: http://www.cse.oulu.fi/CMV/Downloads/LBPMatlab.


algorithm and then the extracted key points were clustered into groups by K-Means
algorithm. Whereas in the second study, Muthukumar and Kannan [2013a] presented
a multi-modal biometric system based on FKP and fingerprint patterns with feature
level fusion. An unsupervised K-Means clustering algorithm was employed for further
processing which yielded satisfactory results in terms of Genuine Acceptance Rate
(GAR) 99.4%, FRR 0.6%, and FAR 0%. Apart from this, another contribution was made
by Shariatmadar and Faez [2013] who computed LBPs histograms from Gabor filtered
images. After this, the authors encoded the feature vectors with BioHashing algorithm
which produced biocodes. To achieve robustness in the classification task, Kumar et al. [2013b] proposed a fuzzy binary decision tree, on the basis of which any query subject can be categorized into either the genuine or the impostor class. Further, the
same authors revised their earlier work by utilizing an ant-colony-optimization-based
fuzzy binary decision tree. Various possible combinations of fingers were considered to
make a bimodal FKP verification system. Additionally, Nigam and Gupta [2013b] made
an effort in the direction of assessing the quality of FKP images. In total, six quality
attributes were identified and methods were suggested to compute them. Based on the
overall quality score, which was computed using likelihood-ratio-based fusion method,
relationship was established between the quality of FKP images and recognition per-
formance. In that same year, Manjunath et al. [2013] presented a novel dimension
reduction scheme known as Two-Directional Two-Dimensional Locality Preserving Indexing ((2D)²LPI) for multi-instance FKP recognition. Alternatively, Swati and Rav-
ishankar [2013] used KPCA for dimension reduction of their Gabor-filter-based knuckle
features and LDA for class separation. Their work was similar to others [Wankou et al.
2011a; Shariatmadar and Faez 2011a], but the use of KPCA improved the results and
an accuracy of 91.67% was obtained. In addition to this, Aly et al. [2013a, 2013b]
in two different articles illustrated the improvement in performance of a multimodal
biometric system with feature-level or score-level fusion using iris, palm-print, and
knuckle-print patterns. In this work, Particle Swarm Optimization (PSO) was used
for proper feature selection or adaptive combinations of multiple features. Aside from
this, Perumal and Ramachandran [2013] developed a FKP biometric system based
on fusing SIFT, empirical mode decomposition, and SURF algorithms together. This
combination was applied on palm and FKP images, resulting in highly distinctive ro-
tation and scale invariant local and frequency features. Whereas, in their earlier work,
Aoyama et al. [2011] focused on phase-based matching of single FKP pattern, later they
combined multiple FKP patterns for high security applications [Aoyama et al. 2013].
This recognition algorithm was tested on a special door-handle database consisting of
900 images. In addition to this, Kale et al. [2013] proposed to integrate distinctiveness
of FKP and fingernail patterns by carrying out feature level fusion. An accuracy of 97%
was achieved when classification was done using a feed forward neural network trained
with back propagation. Likewise, Peng et al. [2013] suggested finger-based multi-modal
biometric system which used finger shape features, fingerprint, finger vein, and FKP.
In comparison to existing approaches, results improved on the application of a novel
feature-level fusion approach called Linear Discriminant Multi-set Canonical Correla-
tions Analysis (LDMCCA). During the same period, Sumangali et al. [2013] came up
with another hand-based multi-modal biometric system which combined features from
finger knuckles, palm print, hand geometry, and fingerprint. Additionally, Neware et al.
[2013] presented an FKP recognition using PCA and the nearest mean classifier. While,
Kudu et al. [2013] collected finger knuckle images of 50 volunteers by using a digital
camera placed at a distance of 10 cm. Three-level decomposition was done using KWT
to extract fine resolution features followed by Euclidean distance matching. Also in this
same year, Rani and Shanmugalakshmi [2013] discussed the important characteristics
of FKP trait and presented a survey of various knuckle print recognition techniques.


Subsequently, Kong et al. [2014] proposed a hierarchical classification method in which two-stage recognition was employed using a Gabor filter (major features) and
SURF (minor features). Later, Sulthana and Kanmani [2014] found that SIFT-based
FKP authentication system took higher processing time and storage space. Therefore,
the modified feature vector was created by normalization, which resulted in EER of
0.65% and 99.87% accuracy. In the meantime, to improve their earlier work, Liu et al.
[2014] made efforts to extract line features from the ROIs of inner side knuckles (total
2000 images) with the use of Gabor filtering and derivative line detection method.
A cross-correlation method was employed to perform point-wise matching, leading to
high recognition performance in real-time scenario. Also, Aoyama et al. [2014] in con-
tinuation to their previous work [Aoyama et al. 2011] presented a FKP recognition
method using BLPOC-based local block matching and even focused on handling non-
linear deformation. Similarly Shariatmadar and Faez [2014] expanded their earlier
work [Shariatmadar and Faez 2013] and developed a multiple-instance FKP recog-
nition algorithm with score level fusion by combining information from index/middle
fingers of an individual's left or right hand. Gao et al. [2014] attempted to overcome the drawbacks of the CompCode scheme by integrating multiple-orientation and texture information using score-level fusion. The multiple-orientation code was extracted by passing the Gabor filter output through a multi-level thresholding scheme, while the texture information was extracted through the use of LBP.
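The competitive coding scheme that several of these works modify assigns to each pixel the index of the Gabor orientation giving the strongest (most negative real) response and compares two code maps with an angular distance. A minimal sketch is given below; the kernel size, frequency, and number of orientations are assumptions for illustration, not the parameters of Zhang et al. [2009a].

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(theta, ksize=17, sigma=3.0, freq=0.12):
    """Real (even-symmetric) Gabor kernel oriented at angle theta."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr ** 2 + (0.5 * yr) ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * freq * xr)
    return g - g.mean()

def competitive_code(roi, n_orient=6):
    """Per-pixel index of the orientation with the strongest (most negative) response."""
    responses = np.stack([convolve(roi.astype(float), gabor_kernel(k * np.pi / n_orient))
                          for k in range(n_orient)])
    return np.argmin(responses, axis=0)

def angular_distance(code_a, code_b, n_orient=6):
    """Normalized angular distance between two orientation-code maps (0 = identical)."""
    diff = np.abs(code_a - code_b)
    diff = np.minimum(diff, n_orient - diff)
    return diff.mean() / (n_orient / 2)
```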
ture information was extracted through the use of LBP. In the same year, Raut et al.
[2014] exploited the advantages of Kekre wavelet over Haar wavelet [Wankou et al.
2011a] for extracting FKP features using KWT from enhanced images. The proposed
technique achieved accuracy of 89.99% and EER of 10.01%. Additionally, Kazhagamani
and Murugasen [2014a] employed a robust algorithm known as elliptical Hough trans-
form for FKP feature extraction for missing and noisy data. The experimental results
showed minimum EER of 0.78%. At the same time, AlMahafzah et al. [2014] presented
a multi-algorithm feature-level fusion using LG, LPQ, PCA, and LPP methods. Higher
performance was achieved with this algorithm when applied on 7920 images of PolyU
FKP database. Another contribution was made by Jayaraman et al. [2014] who dis-
cussed a boosted geometric hashing based indexing technique by which features of FKP
were found to be well distributed in a hash table. In addition to this, Yu et al. [2014]
implemented a well-known local feature descriptor called LBP over cropped FKP im-
ages. The computed binary patterns of FKP image blocks are formed as histograms by
which the feature vectors are formulated. Yet another work on finger-based multimodal
identification system using feature-level fusion was presented by Yang et al. [2014a].
The finger-vein and finger-dorsal patterns were extracted using magnitude preserved
competitive code (C2Code) and classification was done by a nearest neighbor classifier.
Besides these techniques, Kim and Flynn [2014] utilized SIFT and phase correlation
based approaches for reliable detection of corresponding interest points between differ-
ent images of same FKP. Moreover, Ibrahim and Tharwat [2014] performed image-level
fusion and multi-level fusion methods for ear and knuckle print images. Interestingly,
experiments showed that combining ear and finger knuckle images was better than
combining two different finger knuckle images. Later, Kumar and Xu [2014] made an
attempt to highlight finger knuckle patterns between proximal and the metacarpal
phalanx bones of human finger. This automated contactless imaging method was not
explored earlier and used three different matching algorithms namely BLPOC, local
radon transform and ordinal representation. Their efforts obtained an EER of 8.36%.
In one more contribution, Lu and Peng [2014] modified their earlier work, introduced a
cryptographic-based multi-biometric system so as to secure the biometric template. In
their research, the multiple finger patterns like finger shape or geometrical features,
finger vein, FKP, and fingerprint were combined at the feature level. Similarly, Yang
et al. [2014b] also modified their previous work and reduced the dimension of large


feature vector by using 2DPCA along with competitive coding scheme. The authors
proposed fusion of FKP and finger vein patterns, resulting in comparable recognition
rates with other existing methods.
In another multi-modal approach, Nigam and Gupta [2014] proposed to combine
iris and FKP patterns. Both the patterns were pre-processed using local Gabor binary
pattern and then corners features were extracted using tracking-based Corners having
Inconsistent Optical Flow (CIOF) dissimilarity measure. In the same year, Yan et al.
[2014] developed a novel sub-space-based dimensionality reduction approach called
genetic generalized discriminant analysis, which was simply the generalization of clas-
sical exponential discriminant analysis by using genetic algorithm. Further, Natarajan
et al. [2014] also realized cryptographic-based multi-modal (FKP and fingerprint) bio-
metric system to keep the information safe from unauthorized users and to get higher
accuracy. In the meantime, Abe and Shinzaki [2014] proposed to use a simple web camera for capturing FKP images without holding the hand in a fixed position. The developed system showed considerably improved results with an EER of 7%. Another contribution
in same year was made by Ozkaya and Kurat [2014] who proposed to implement Eu-
clidean distance and discriminative common vector for FKP-based recognition. The
system had excellent performance for small-sized databases with 100% accuracy. In
another work, Kumar [2014] explored the in-depth facts about the finger knuckle sur-
face and its practicality in forensic and civilian applications. Kumar [2014] stressed the
importance of upper knuckle region and developed an automated approach to simulta-
neously segment the upper and middle finger knuckle region which leads to significant
improvement in recognition performance. Moreover, he also claimed that features ex-
tracted remain stable even for finger knuckle images acquired over a time period of
6 years (public database of 2515 middle finger dorsal images). Neware et al. [2014]
proposed an FKP-based personal identification method by using a 2D Gabor filter for
feature extraction and angular distance for matching. The proposed algorithm estimated the local orientation and, based on this information, achieved a satisfactory recognition rate of 99%. Subray [2014] utilized different edge detection algorithms for feature extraction and verified that the performance of Canny edge detection for FKP was better in comparison to other edge detectors. Similarly, Zaw and Khaing [2014] em-
ployed Canny edge detection and PCA methods for efficient feature representation. The
classification was performed by ANN (Artificial Neural Network). On the other hand,
Boucenna and Latifa [2014] applied a multi-scale wavelet edge detection technique for
FKP feature extraction and compared its performance with Sobel and Canny opera-
tors. In their second work, Kazhagamani and Murugasen [2014b] presented a literature
article on finger knuckle print and explored the various issues with implementation
of knuckle print in real-time situations. Further, Amraoui et al. [2014] proposed an
authentication system based on texture analysis of finger back knuckle surface us-
ing a uniform LBP and a minimum distance classifier. Next, Verma and Sinha [2014]
designed a minimum average correlation filter for FKP verification. Apart from this,
Kusanagi et al. [2014] modified the previous work reported by Aoyama et al. [2013]
and recommended a new ROI extraction algorithm for a video sequence of about 2
seconds. The developed multi-finger knuckle recognition system showed better perfor-
mance when evaluated using a database of 25 subjects. In the same year, Zeinali et al.
[2014] performed Directional Filter Bank (DFB)-based filtering on ROI images and
extracted features were further processed by LDA. This combination achieved 99.29%
accuracy when information from four fingers was fused together. Later, Xu et al. [2014]
adopted a local phase quantization method for feature extraction and achieved a reli-
able performance. In addition to this, Sulthana and Kanmani [2014] proposed to use
SIFT descriptor for FKP recognition and achieved 0.65% EER with 99.87% accuracy.


In the following year, Xu et al. [2015] suggested an IKP recognition method for low-
resolution images which was robust to varying lighting conditions and adaptable to the
hand-position variation. This approach yielded very strong recognition results. At the
same time, Dey et al. [2015] developed multimodal biometric system in which wavelet
transform and SIFT descriptor were respectively employed for fingerprint and FKP
feature extraction. Similarly, an efficient multi-modal biometric system was presented
by Kang et al. [2015] in which fingerprint, FKP, and finger vein were fused at matching
score level resulting in an EER of 0.109%. In another technique, El-Tarhouni et al.
[2015] employed the Multi-scale Shift Local Binary Pattern (MSLBP) descriptor to get
more robust and discriminative representation of FKP features. Further, Jayaprakash
and Arumugam [2015] designed a Kernel intra-class Finger-Knuckle Pose density as-
sessment method to increase the robustness of FKP biometric system. Another con-
tribution was made by Kumar and Premalatha [2015] who employed fast Discrete
Orthonormal Stockwell Transform (DOST) efficiently to extract global as well as lo-
cal information from the FKP images and this fused information provided significant
improvement of results. Also, Nigam and Gupta [2015] implemented a multi-modal
system using transformation of ROI of palm print and knuckle print into vcode and
hcode, respectively, based on the sign of local gradient. The authors suggested a highly
uncorrelated features measure to match the images and obtained 100% CRR with
0.01% EER. Another transform domain approach was put forward by Kazhagamani
and Murugasen [2015] who applied contourlet transform on finger knuckle images to
decompose the input image into low- and high-frequency components. Further, PCA
approach was applied to reduce the dimension of obtained feature set and it was fol-
lowed by matching process. In the experiments, an accuracy of 98.72% was achieved.
In addition to this, Grovera and Hanmandlu [2015] presented a hybrid fusion rule for
FKP-based authentication and achieved improved results with reference to individual
fusion methods. They utilized the adaptive fuzzy decision level fusion for biometric au-
thentication. To improve the sparse representation problem in FKP recognition, Li et al.
[2015] employed Group Collaborative Representation-based Classification (GCRC) be-
tween the query sample and training groups to control the sparse constraint as group
information. Authors found that recognition improves by adding the group information.
To make ROI extraction robust under variable skew conditions, Yu et al. [2015] proposed a center-point ROI detection and localization method for FKP images. This method was highly effective in reducing image skew in both the horizontal and vertical directions.
Moreover, a study aimed at verification of human identities using lower finger dorsal
region around the MCP joint was proposed by Ozkaya [2015]. In this work, the finger
knuckle image is subjected to Discriminative Common Vector (DCV) and then match-
ing and thresholding stages were performed using the Euclidean distance measure.
Ozkaya [2015] also contributed a non-uniform hand-image database consisting of 600
hand images. In continuation of previous findings on the finger dorsal surface, Kumar
and Wang [2015]7 addressed the issues of automatic recovery and matching of minu-
tiae patterns from knuckle images. The acquired finger dorsal images were initially
subjected to segmentation and enhancement processes to further locate and extract
the knuckle minutiae. This work implemented various methods to compute quality of
knuckle minutiae and finally three popular approaches, namely minutiae cylindrical
code, minutiae triangulation and spectral minutiae were used to match the recovered
knuckle minutiae. Recently, Khellat-Kihel et al. [2016] proposed a novel multi-modal system based on FKP, finger vein, and fingerprint traits using enhanced feature-level fusion, which not only improves the accuracy but also optimizes the computational time. They achieved the best results in terms of a 0.04% EER and a verification rate of 99.53%.
7 Web link: http://www.comp.polyu.edu.hk/csajaykr/Knuckle_Minutiae.rar.


In this section, research works of different authors have been investigated and arranged in chronological order. The various feature-extraction and matching techniques for knuckle prints, as well as the combination of the knuckle print with other modalities like palm print or hand geometry, have also been presented. The following conclusions are drawn on the basis of the review presented in this section:
• The finger knuckle print is still a less-investigated biometric for a wide range of
applications, and, as such, a practical finger knuckle biometric system is yet to be
developed.
• Most of the algorithms reported in the literature have been tested on benchmark knuckle print databases such as the PolyU FKP database and the IIT Delhi Finger Knuckle image database, which only include images captured from the index and middle fingers of both hands.
• The data collection techniques reported in the literature suffer from variations in finger pose and lighting conditions that contribute to false rejections in the matching process. Moreover, there is a lack of FKP databases whose images incorporate real-world situations such as variations in the bending of fingers.
• There is a need to test IKP recognition methods on standard databases. However,
there are no publicly available benchmark IKP databases.
• The recognition performance of IKP under varying lighting and dry skin conditions
has been found to be quite low.
• There is no reported study on an FKP or IKP system that investigates the possibility
of ensuring liveness of the recognized subjects by including some special hardware
or fusion with some other biometric modality.
• Sufficient discriminant information is present in both the outer- and inner-side knuckle prints, but no bimodal biometric system using both of them has yet been proposed.

5. FUSION SCHEMES
There are good examples of unimodal biometrics such as voice, signature, fingerprint,
DNA, hand geometry, etc. However, there is no single biometric trait that can fulfill the security and performance requirements desired for different applications in today's interconnected world [Hong et al. 1999]. Moreover, most of them often suffer from common complications such as non-uniqueness, noise in sensed data, interclass similarities, non-universality, spoofing, intra-class variations, and poor discrimination ability, leading to a high FAR and FRR [Ross and Jain 2003]. In order to alleviate these problems, a multi-modal biometric system is a possible alternative that combines two or more physiological or behavioral modalities to increase the probability of success or, particularly, to improve the accuracy of an identification or verification task. Each biometric pattern has a definite origin, so according to the type of information that a particular biometric provides, a multi-component biometric system can be categorized into five sub-categories: multi-modal, multi-sensor, multi-instance, multi-sample, and multi-algorithm. For example, an FKP recognition system can comprise multiple cameras to capture the 2D/3D knuckle
image around the finger joint [Woodard and Flynn 2005; Ravikant and Kumar 2007].
Similarly, photosensitive sensors are required to capture the FKP or fingerprint of a
hand [Rui et al. 2009; Luca and Roll 2004]. A FKP recognition system that combines
Gabor Filter, PCA,8 and LDA is a good example of a multiple algorithm system [Zhara
and Frej 2011a]. Additionally, FKP, hand geometry, fingerprint, and palm print can
be extracted from a single hand image and used to verify the human identity [Goh
et al. 2010b; Ferrer et al. 2007]. Likewise, knuckle print and ear [Ibrahim and Tharwat

8 Web link: http://www.face-rec.org/algorithms/#PCA.


2014] or a knuckle print and iris-based bimodal system [Nigam and Gupta 2014] depict multimodal systems. However, a system may integrate any of the previously mentioned categories to form a mixed design known as a hybrid system [Mathivanan et al. 2012].
The multi-modal approach to biometrics may be applied in different ways. One of the best-known procedures for combining multiple sources of biometric information is called fusion. The main objective of biometric fusion in a multimodal system is to provide high robustness, adaptability, applicability, precision, and recall. However, the selection of biometric traits and the type of fusion rule depend upon the type of application [Rodriguez et al. 2008]. The probable fusion scenarios in a multi-modal system are defined at the sensor, feature, matching-score, decision, and rank levels [Rui et al. 2009; Shameem and Kanmani 2014]. Although fusion at the feature level can be employed, this scheme is sometimes inappropriate because of incompatibility between the multiple feature vectors. A second challenge is to process the longer feature vector obtained by integrating features from unimodal traits before feeding it to the classifier; this results in additional hardware/memory requirements and larger enrollment and recognition times [Li et al. 2004; Mingxing et al. 2013]. In the case of a single sensor, there is a possibility that the acquired data may be noisy. As far as rank-level fusion is concerned, it is not suitable for verification tasks. However, among all these fusion schemes, score-level fusion is mostly preferred due to the ease of computing a single match score from multiple classifiers. A normalization technique is utilized to bring the different scores into the same range before combining them.
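As a concrete illustration of the score-level fusion just described, the sketch below min-max normalizes the score vectors coming from different matchers and combines them with a weighted sum; the choice of min-max normalization and equal default weights is an assumption made for the example (z-score or tanh normalization and learned weights are equally common).

```python
import numpy as np

def min_max_normalize(scores):
    """Map raw matcher scores to [0, 1] so scores from different matchers become comparable."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min() + 1e-12)

def weighted_sum_fusion(score_sets, weights=None):
    """Combine normalized score vectors from several matchers into one fused score vector."""
    normalized = [min_max_normalize(s) for s in score_sets]
    if weights is None:
        weights = np.full(len(normalized), 1.0 / len(normalized))   # equal weights by default
    return np.sum([w * s for w, s in zip(weights, normalized)], axis=0)
```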
Further, on the architectural front, a multimodal biometric system functions in one of five different modes: serial, parallel, hierarchical, pipelining, or sequential [Ross and Jain 2003; Sahoo et al. 2012; Marsico et al. 2014].

(1) Serial Mode. In this mode, the output of the first modality is used to prune the list of candidates to a shortlist of the most likely subjects. Thus, the output of the first trait serves as an input to the next stage, which means that multiple traits are not processed simultaneously. For example, a multi-modal biometric system comprising hand geometry and knuckle print could, in the first stage, use hand information for selecting the top N best-matching samples. Then, knuckle print information can be used for recognizing the individual from among the N candidates shortlisted in the previous stage (a minimal sketch of this two-stage scheme follows this list).
(2) Parallel Mode. Here, the information from different traits is processed simulta-
neously and the obtained results are combined together to obtain a final match
score. For example, finger knuckles taken from multiple fingers of both hands can
be processed simultaneously.
(3) Hierarchical Mode. In this mode, the various classifiers are supposed to combine
in an ordered form like a tree. It is beneficial if multiple systems are present.
(4) Pipelining Mode. In this mode, the various modules of a multi-modal system, such as data collection, feature extraction, etc., are performed simultaneously. The required time should be the same for each module.
(5) Sequential Mode. The final output is the result of successive tests with a reject option. If the first trait fails for any reason, then the system can use another trait, or two of them, to provide the result.
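The serial (cascaded) mode in item (1) can be summarized in a few lines: the first matcher only prunes the candidate list, and the second matcher decides among the survivors. The sketch below assumes higher scores mean better matches and that both score vectors are indexed by the same candidate identities; it is an illustration, not a prescribed implementation.

```python
import numpy as np

def serial_identification(hand_scores, knuckle_scores, top_n=10):
    """Stage 1: shortlist the top-N candidates by hand-geometry score.
       Stage 2: decide among the shortlist using knuckle-print scores."""
    hand_scores = np.asarray(hand_scores, dtype=float)
    knuckle_scores = np.asarray(knuckle_scores, dtype=float)
    shortlist = np.argsort(hand_scores)[::-1][:top_n]      # higher score = better match
    best = shortlist[np.argmax(knuckle_scores[shortlist])]
    return int(best)                                       # index of the identified subject
```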

An overview of various fusion techniques, with a focus on FKP-based multi-modal systems, has been presented in this section. Table VI presents a summary of various works related to the fusion of the finger knuckle print with other hand biometric traits. Additionally, various combinations lead to different advantages and varied performances. A brief comparison of such biometric groupings on the basis of ease of acquisition setup, robustness, anti-forging, and acceptability is presented in Table VII. From the


Table VI. Summary of Knuckle Print Fusion with Other Modalities


Features
Extraction
Publication Detail Hand Features Techniques Database Size Fusion Level Recognition Rate
Kumar and FKP and finger PCA, LDA, ICA 630 images, 105 Score Level (Sum EER- 1.94%, D.
Ravikant [2009] geometry users rule, product rule) Index- 2.35
Zhang et al. [2009] FKP from multiple Gabor filtering and 7920 images Score level (sum EER-0.26 %
fingers angular distance (PolyU FKP rule and min. rule) (R-index, R-middle)
database) and EER-0%
(R-index, L-index,
R-middle, L-middle)
Kumar and Hand vein and Delaunay 100 users Score level EER-1.14 %
Prathyusha [2009] knuckle shape triangulation,
shape features
Zhu and Zhang Finger geometry, Mean, ordinal 1900 images Decision level FAR-2.52e-6,
[2010] IKP and palm print codes and wavelet (AND rule) FRR-0.0089
denoise
Goh et al. [2010] Palm print and Wavelet Gabor, 125 users Decision level EER-1.25%
FKP comp code and
ridget transform
Goh et al. [2010] Palm print and Directional coding 1360 images SVM-based score CRR-99.84%,
IKP and ridgelet level EER-0.0034%
transform
Guru et al. [2010] Multi-instance Zernike moments, 7920 images Feature level CRR-92.24%
FKP PCA (PolyU FKP
database)
Aoyama et al. FKP 2D DFT and 7920 images Score level EER-0.321%
[2011] BLPOC (PolyU FKP
database)
Shen et al. [2010] Palm print and 2D Gabor wavelet, PolyU palm, Decision level CRR-89.20%
knuckle print hamming distance PolyU FKP
database
Rui et al. [2011] Multiple FKP Line and edge 1579 knuckle Decision level CRR-96.62%.
detection images
algorithms
Meraoumia et al. FKP and palm 1D Log Gabor filter 7920 images Score level CRR-99.850%
[2011] print and hamming PolyU FKP
distance database
Shariatmadar and FKP (Single Gabor filter, PCA 7920 images Feature level 98.79% (4 finger
Faez [2011] Biometric feature) +LDA and (PolyU FKP combination)
euclidean distance database)
Meraoumia et al. Palm Print and 2D DFT and phase 1800 images Score level 99.647 % (fusion)
[2011] FKP correlation
function
Badrinath et al. Knuckle print SIFT, SURF and 7920 images Score level EER-0.215%
[2011] (single biometric nearest neighbour (PolyU FKP (weighted sum
feature) ratio database) rule)
Morales et al. Knuckle print from Gabor filtering, 7920 images Score level (SUM EER-0.45%
[2011] multiple fingers SIFT and (PolyU FKP rule)
euclidean distance database)
Saigaa et al. [2012] FKP from multiple 2D DCT 7920 images Score level EER-2.09%
fingers (PolyU FKP (Verification Phase)
database) and EER-0.20%
(Identification
Phase)
AlMahafrah et al. Knuckle print from Log Gabor filter 7920 images Feature level with GAR-81.13%(3
[2012] multiple fingers (PolyU FKP different instances)
database) normalization
AlMahafrah et al. Knuckle print LG, LPQ, PCA and 7920 images Feature Level with 93.00% (LG+LPQ+
[2012] (Single Biometric LPP (PolyU FKP different PCA), 92.33%
feature) database) normalization (LG+LPQ+LPP)
93.00
(LG+PCA+LPP)
and 90.00
(LPQ+PCA+LPP)
Mathivanan et al. Hand dorsum Prewitt edge filter, 120 images with Natural fusion CRR- 97%
[2012] geometry and median filter, 72 dpi
knuckle print Euclidean Distance
Gao et al. [2013] Knuckle print Competitive coding 7920 images Score level EER-1.10%
(single biometric and reconstruction (PolyU FKP (adaptive binary
feature) based matching database) fusion)
scheme


Hegde et al. [2013] Knuckle print Radon transform, 7920 images CRR- 95%
Gabor wavelet (PolyU FKP
transform database)
Nigam and Gupta Palm print and Lukas and Kanade CASIA palm print Score level CRR- 100%, EER
[2013] knuckle print feature tracking and PolyU FKP < 0.1%
algorithm database
Meraoumia et al. Knuckle print 2D-DCT, GMM and 7920 images Score level (MUL EER-0.370% (Score
[2013] (single biometric Log- likelihood (PolyU FKP Rule) and decision MUL Rule) and
feature) database) level FAR-0.705%,
FRR-1.037%
(Decision Rule
RIF-RMF-LIF)
Kumar et al. [2013] Knuckle print Ant colony 7920 images Score Level (sum
(2 instances) optimization, gini (PolyU FKP rule) FAR(%)
Index, information database) LI RI LM RM
gain, hamming 0.003 0.003 0.003 0.009
distance
LI RI LM RM
0.003 0.027 0.051 0.009
Aly et al. [2013] Iris, palm print, LDA, PSO CASIA iris Feature level GAR- 98.83%,
FKP database, PolyU TER- 1.16%
palm print and
7200 FKP images
Kale et al. [2013] Fingernail and MFCC, Second 100 users, self Feature level CRR-97%
FKP level wavelet prepared
decomposition,
Back propagation
NN
Peng et al. [2013] Finger vein, FKP, PCA, LDMCCA Poly U databases Feature level EER- 2.3900e-04
fingerprint and
finger shape
Kong et al. [2014] Knuckle print from Gabor Filtering, 7920 images Decision level CRR-98.5%
multiple fingers SURF and (PolyU FKP
Euclidean Distance database)
AlMahafzah et al. FKP LG, LPQ, PCA, 7920 images Decision- GAR-86.33% (LG+
[2014] LPP level(AND/ OR LPQ+LPP) with
rules) OR rule, 86.33%
(LG+ LPQ+PCA)
with AND rule
Yang et al. [2014] Finger vein and Magnitude THU-FVFDT Feature level EER- 0.889%
finger dorsal preserved database, 220
images competitive code users
and nearest
neighbor classifier
Ibrahim and Ear and finger LDA, DCT and Ear Images in Multi-level CRR- 100% with
Tharwat [2014] knuckle image DWT PGM format and score level fusion
PolyU FKP
Lu and Peng [2014] Finger vein, FKP Gabor filter, log FVC2002 database FMC-MCCA GAR-100% for
finger shape, Gabor phase DB1, finger image feature level FMC-MCCA EER-
fingerprint, congruency, GLBP, database version 0.36 % for finger
fourier descriptor, 1.0 (finger vein), Vein
fuzzy commitment PolyU FKP
scheme
Yang et al. [2014] Finger vein and 2DPCA, 616 FKP and 616 Score level CRR-97.08%
knuckle print competitive coding FV images
scheme and
Euclidean distance
Nigam and Gupta Iris and knuckle Local Gabor binary CASIA 4.0 interval Score level CRR_99.79%
[2014] print pattern and LK and lamp iris,
tracking based PolyU FKP
CIOF
Kang et al. [2015] Fingerprint, finger LBP, ORB and 1890 image Score level EER-0.109%
vein, knuckle print nearest neighbor
distance ratio
Nigam and Gupta Palm print and Sign of local CASIA palm print, Score level CRR-100%,
[2015] FKP gradient, highly PolyU FKP EER-0.01%
uncorrelated database
features


Table VII. Comparison of FKP Combinations


Biometric Combination | Acquisition Setup | Flexibility | Anti-forging | Acceptability | Accuracy
FKP and Ear [Ibrahim and Tharwat 2014] | Complex | Low | Low | Low | >95%
FKP, Fingerprint, and Finger vein [Kang et al. 2015; Khellat-Kihel et al. 2016] | Moderate | High | High | Medium | 0.109% (EER)
Finger dorsal image and Finger vein [Yang et al. 2014] | Moderate | High | High | High | 0.43% (EER)
FKP and Iris [Nigam and Gupta 2014; Aly et al. 2013] | Complex | Medium | Low | Low | 99.9%
FKP (multiple fingers) [Kong et al. 2014] | Simple | High | Medium | High | >95%
FKP and Finger vein [Yang et al. 2014] | Moderate | High | High | High | <1% (EER)
Finger knuckle and Fingernail [Kale et al. 2013] | Simple | Medium | Medium | Medium | >95%
FKP and Palm print [Nigam and Gupta 2015] | Simple | High | High | High | 99.9%
FKP, Finger vein, Fingerprint, and Finger shape [Peng et al. 2013; Lu and Peng 2014] | Moderate | High | High | Medium | 99.9%
Knuckle shape and Hand vein [Kumar and Prathyusha 2009] | Moderate | Medium | High | Medium | 1.14% (EER)
FKP and Hand dorsum geometry [Mathivanan et al. 2012] | Simple | High | High | High | >95%
IKP, Finger geometry, and Palm print [Zhu and Zhang 2010] | Simple | High | High | High | <1% (EER)

above-mentioned comparison, it can be stated that the combination of finger vein, finger knuckle print, fingerprint, and finger shape is the most straightforward and consistent. It is possible to conceptualize a sensor that can acquire all these traits together and thereby enhance the quality of the recognition task.

6. KNUCKLE PRINT TRADITIONAL DEVICES


Hand-based biometric traits mostly utilize impression-based capturing devices such
as optical, thermal, silicon, or ultrasonic imaging sensors [Woodard and Flynn 2005,
Ravikant and Kumar 2007, Zhang and Zhang 2010; Luca and Roll 2004; Lee et al.
2011]. The advantage of these economical imaging sensors, which use small memory
size to store the templates, make hand region features a beneficial choice for biometric


Table VIII. Summary of FKP Devices and Performance


Sensor Resolution
Dimension (dpi)/
Sensor Type (L x W x H) Background Database Features Pros and Cons
CCD digital camera C- mount Not given 206 users, Creases on inner Performance depends on
[Joshi et al. 1998] lens 412 images with size side of finger controlled image acquisition
(25 mm) (512x512, 8 bit) settings. A use of micro-switch
and consistent light source,
make it unsuitable for real
time applications.
CCD digital camera Not given <70 dpi/ 1432 images with Line, and location Position of hand is fixed with
[Li et al. 2005] black size 1792x1200 features of IKP uniform illumination effects.
It was an attempt to detect
inner knuckles from middle
finger, and ROI <70 dpi is
segmented.
Minolta 900/910 213x413x Not given/ 1191 hand images Skin folds and Impractical to use in real time
sensor with 271mm black (3D) with size crease patterns applications due to large size
dimension [Woodard 640x480 from back surface device (11kg), intrusive, time
and Flynn 2005] of finger consuming, and require large
memory. However, device is
contactless and a distance of
1.3 m is kept between sensor
and hand.
Sony DSC-P8 sensor Not given Not given/ 160 images with size Wrinkles Close positioning between
[Ferrer et al. 2006] black 2048x1536 ofKnuckle image sensor and hand (12cm)
provide high resolution
images. A peg-based setup
with black background is
used.
Cannon Power Shot Not given Not given/ 630 images with size FKP surface, Hand Images are captured using 2D
A90 Camera white 1600x1200 Geometry peg-free imaging setup
[Ravikant and against white background
Kumar 2007] under uniform brightness.
A ring shaped LED, Not given Not given 5760 images from Line features from A constant LED light source
lens, finger bracket, 120 users outer surface of is used to maintain uniform
CCD camera, frame finger joints illumination which results in
grabber [Zhang et al. high quality images.
2009]
Peg-free imaging Not given Not given/ 630 images with size Knuckle texture 2D peg-free imaging setup
system with camera white 1600x1200 and geometrical against white background is
located at 20cm features installed to acquire finger
[Kumar and images. An extra LCD is
Ravikant 2009] mounted to check the position
of hand.
CCD camera, 160 x125 400 dpi 165 users, 7920 Internal skin Basal and triangular blocks
ring-shaped LED, x100 mm images with size pattern around are used to fix the position of
lens, finger bracket, 768x576, Poly U FKP phalangeal joint finger so that curve patterns
and a frame grabber database can be clearly captured. The
[Zhang et al. 2010] size of sensor is small, and
constant LED light is used.
Peg-free imaging Not given Not given 790 images in .bmp Curved knuckle Digital camera with
system with camera format, IIT Delhi lines and creases of unconstrained imaging setup
located at 20cm Finger Knuckle enhanced knuckle is used in indoor environment
[Kumar and Zhou database image which reduces the effects of
2009] shadows and reflections. The
fixed size ROI (100x80) is
segmented.
Infrared camera with Not given Not given 300 images with size Hand vein A near IR camera is used for
LED located at 21 cm 768 x 576 structure and simultaneous acquisition of
[Kumar and shape of knuckles knuckle tip and vein patterns
Prathyusha 2009] from palm dorsal image.
1.3-megapixel CCD Not given Not given/ 1250 hand images, Palm print and Full hand image acquisition
camera [Goh et al. black with size 640x480 inner knuckles in the form of video sequence
2010] enclosure is taken at 25 fps under
semi-controlled environment.
A black enclosure with light
bulb is used to reduce the
effect of background light.
Capture box, Not given 96dpi/ black 1900 hand images Palm Print, finger Camera with high video frame
web-camera and LED with size 640x480 geometry and rate (30fps) and fixed LED
light [Lee-qing and knuckle print light source is used. The setup
Zhang 2010] is enclosed with white box
which maintains the range as
well as uniform illumination.


Smartphone HTC Not given Not given Database-1 (561 Creases of knuckle Contactless imaging using
desire HD A9191 images with size 80 x image smart phone with auto focus
with 8-megapixel 100), Database-2 facility but illumination and
color camera [Cheng (790 images) position variation may
and Kumar 2012] decrease the accuracy. All
images are resampled to a
fixed size of 80x100.
Camera and a panel Not given Not given 2000 images with Inner knuckle Nonuniform illumination may
with two pegs [Liu resolution 576x768 print from middle affect the performance but
et al. 2013] collected from 100 and ring fingers peg-based imaging setup
users reduces the rotation variation
of hand. The size of images
are fixed to 576x768.
Camera with white Not given Not 500 images collected FKP from index A fixed distance of 10 cm
background fixed at a given/white from 50 users with and middle finger against white background is
distance of 10 cm size 4320x3240 kept to maintain fast
[Kudu et al. 2013] processing.
HP color laser jet Not given Not given 600 images collected FKP from index, Use of scanner may blur the
CM1312-MFP series from 100 users middle, ring and quality of images as it is not
PLC6 [Kale et al. little fingers always possible to keep the
2013] hand with same strength.
Point grey Focal Not 900 images with size FKP from index, NIR light source is not
GRAS-14S5M-C length – given/dark 1280x960 middle, and ring sufficient to enhance the
camera with 8 MP 8 mm fingers quality of finger knuckles, but
and NIR light source it reduces the background
[Aoyama et al. 2013] noise.
Camera with white Not given 96dpi/Not 220 users with 720 x Finger vein, finger Complex sensor design,
light, NIR (JSP given 576 (THU-FVFDT) dorsal require more user cooperation
MODEL:DF-2112) but use of two white-light
[Yang et al. 2014] LED’s provides constant
illumination and enhances
the image quality (96dpi)
Digital single-lens Not given Not given 600 hand images Knuckle print from Unconstrained acquisition
reflex camera with with size 4608 × Index, Middle and environment including
10.2 mp [Ozkaya and 2592/ 2400 FKP Ring fingers non-uniform lighting, variable
Kurat 2014] images with size range (15–25cm) increases
90x90 the user flexibility, but free
position of the hands may
affect the accuracy in case of
large-size database.
Point grey Focal Not given 500 video sequences Knuckle print from Acquisition is non-intrusive,
FL3-U3–13Y3M length– 4 with size 1280×1024 Index, Middle, and a cylindrical door handle
camera with lens mm, frame Ring and little is used with visible light
SPACECOM rate-30fps fingers source which maintains good
JHF4M-MP and NIR image quality of finger
and Visible light knuckles.
[Kusanagi et al.
2014]
Two USB cameras Not given Not given 1890 image with size Finger vein, FKP A uniform visible light
with visible-light, 800x600 and fingerprint illuminator (500 nm) is placed
NIR light source to enhance the quality of
[Kang et al. 2015] images.
Ordinary camera [Xu Not given Not given/ 1200 IKP images Inner knuckles Images are taken under
et al. 2015] no color (DB-1, DB-2) unconstrained illumination
environment along with
peg-free set-up. It raises the
user freedom during the
capture but these effects may
degrade quality of images.
Sony DSC-PS, CCD Not given Not given/ 160 images (DB-1) Finger knuckles of A non-uniform database is
10.2 megapixel DX black with size 2048×1536, MCP joint established using no fixed
[Ozkaya 2015] 660 images (DB-2) environment conditions such
4608x2592 as pegs, lighting, distance
between the camera and
hand. The performance of
system may get reduce, thus
preprocessing tasks are
needed to maintain the
accuracy.


Fig. 6. IIT Delhi finger knuckle database.

applications [Unar et al. 2014]. However, hand-region attributes suffer from some common challenges such as distorted images, the need for high user cooperation, hand diseases (e.g., arthritis), natural imaging contaminants (dead cells, scars, cuts, wet and dry skin), as well as the dirty or oily surface of imaging sensors [Bera et al. 2014; Carlos et al. 2014].
In this section, we extend the discussion presented in Section 3 and summarize various FKP acquisition devices in chronological order. Various camera-based knuckle print biometric systems exist in the literature. All existing systems make numerous assumptions about the acquisition of inner and outer knuckle prints. For the existing acquisition methods, the following observations can be made:
(i) Low-resolution ordinary cameras are sufficient for the commercial authentication
purpose.
(ii) Outer knuckle (FKP) patterns are prominent when the finger is bent, whereas inner knuckle (IKP) patterns are inborn structures whose formation begins before birth.
(iii) ROI extraction of the knuckle print (FKP) is easily affected by posture variation.
(iv) The need to develop contactless or peg-free sensor devices for online FKP applications is an open research issue.
(v) Collecting inner and outer knuckle prints together would be a novel study in this area.
(vi) There is a need to conceptualize a finger sensor that can extract knuckle patterns and simultaneously ensure the liveness of an individual.

7. DATABASES AND EVALUATION PARAMETERS


The design of the database is a very important issue in the recognition process, as it reduces the possibility of drawing wrong conclusions. As mentioned earlier, there are a few standard finger knuckle print databases publicly available for non-commercial use. Six different sets of finger knuckle image databases are presented below. The majority of the algorithms discussed in the literature have been tested on these benchmark datasets by different researchers. Mostly, images of the index and middle fingers of both hands are used to create the datasets. Further, these finger knuckle datasets are acquired in multiple sessions from subjects of varied age groups using different kinds of capturing devices.
(1) IIT Delhi Finger Knuckle Database (http://www4.comp.polyu.edu.hk/∼CSajaykr/
IITD/iitd_knuckle.htm). It was the first freely available finger knuckle image
database in the public domain, developed by IIT Delhi during 2006–2007 using
a digital camera. It consists of 790 images available in bitmap format with res-
olution 80 × 100 and collected from 158 users (faculty, staff, and students from
IITD). All the individuals were in the 16–55-year age group. Figure 6 illustrates
the sample finger knuckle images.
(2) The Hong Kong Polytechnic University Finger Knuckle Print Database (http://
www4.comp.polyu.edu.hk/∼biometrics/FKP.htm). The database was prepared by
Biometric Research Center (UGC/CRC) at the Hong Kong Polytechnic University
and is freely available for academic, noncommercial use. Sample FKP images of different subjects are depicted in Figure 7. It contains 7920 images in BMP image


Fig. 7. PolyU finger knuckle print database.

Fig. 8. PolyU contactless finger knuckle image database.

Fig. 9. THU-FVFDT1 database.

format with resolution 110 × 220. There were 165 (125:40) individuals partici-
pating in the enrollment process, including males and females. Among them, 143
subjects were 20 to 30 years old and the others were 30 to 50 years old. For each
subject, six images per index/middle finger were acquired in two different sessions
(time gap between 14 to 96 days). This dataset is also available in ROI form.
(3) The Hong Kong Polytechnic University Contact-less Finger Knuckle Images
Database (http://www.comp.polyu.edu.hk/∼csajaykr/fn1.htm). The contactless fin-
ger knuckle database includes 2515 middle-finger dorsal images collected from 503
volunteers from the Hong Kong Polytechnic University and IIT Delhi within two
separate sessions with a gap of 7 years. The images are acquired by a contactless
ordinary camera and available in BMP image format for further research. Fig-
ure 8 shows the sample images of finger knuckle pattern collected from different
individuals.
(4) THU-FVFDT Database (http://www.sz.tsinghua.edu.cn/labs/vipl/thu-fvfdt.html).
THU-FVFDT was developed by Graduate School at Shenzhen, Tsinghua University,
which is freely available for academic use. The database contains a separate raw
finger vein and finger dorsal texture images of 220 different users with resolution
of 720×576 pixels. The database was collected from students and staff volunteers
of the institute in two separate sessions with an interval of about dozens of seconds.
The ROI images are also available in a separate dataset and some illustrations are
given in Figure 9.
(5) IKP Database (http://www.ceie.hbu.cn). The IKP image database was developed by
the College of Electronic and Information Engineering, the Hebei University. The
database comprises 2000 images from 100 different volunteers of the university with
a resolution of 576 × 768. IKP images were collected on two separate instances at
an interval of around two months. A few ROI samples of IKP data set are shown
in Figure 10.


Fig. 10. IKP database.

Fig. 11. ROI of MCP hand database.

(6) Hand Database (http://aves.erciyes.edu.tr/neclaozkaya/dokumanlar). This non-uniform-
illumination MCP joint pattern database was acquired by the Institute of Science
and Computer Engineering Department, Erciyes University. The database consists
of 660 images taken under variable conditions of lighting and of distance between
the camera and the hand. The images were collected over a period of 2 weeks, each
of size 4608 × 2592 pixels. An example of cropped images is shown in Figure 11.
Any identification or verification algorithm is tested on a database to evaluate its
performance and to determine whether it is capable of being used in a variety of
applications. Generally, a few important performance metrics, such as Correct
Recognition Rate (CRR), Cumulative Match Characteristics (CMC), Equal Error Rate
(EER), False Acceptance Rate (FAR), verification accuracy, computation time, and
False Rejection Rate (FRR), are used for the evaluation of a biometric system [Jain et al.
2004b; Surya and Gupta 2015]. Further, there are instances in which samples are not
captured properly, or in which a user is unable to enroll in a system due to the absence
of a physical trait or the poor quality of the sample. These acquisition errors are
quantified in terms of Failure To Capture (FTC) and Failure To Enroll (FTE), which
deteriorate the overall performance of the system [Sahoo et al. 2012]. These factors are
described below, and a short code sketch after the list illustrates how the main rates
can be computed from match scores.
(1) FAR, the probability that the system incorrectly matches the user input to a non-
matching template in the database. It measures the percentage of invalid matches
that are incorrectly accepted. It is also sometimes termed the False Match Rate
(FMR) [Jain et al. 2004b; Surya and Gupta 2015; Vorobyeva et al. 2014].

    \mathrm{FAR} = \frac{\text{Number of impostor match scores wrongly accepted}}{\text{Total number of impostor matches}} \times 100 \qquad (1)

(2) FRR, the probability that the system fails to detect a match between the user input
and a matching template in the database. It measures the percentage of valid matches
that are rejected. It is also referred to as the False Non-Match Rate (FNMR) [Surya
and Gupta 2015; Vorobyeva et al. 2014].

    \mathrm{FRR} = \frac{\text{Number of genuine match scores wrongly rejected}}{\text{Total number of genuine matches}} \times 100 \qquad (2)


Fig. 12. Graph between threshold and FAR/FRR.
Fig. 13. Hypothetical EER graph.
Fig. 14. ROC graph.

However, FAR and FRR are not exact synonyms for FMR and FNMR, although they
are commonly treated as equivalent [Sahoo et al. 2012]. A complementary measure
for authorized users is the Genuine Acceptance Rate (GAR).

    \mathrm{GAR} = \frac{\text{Number of genuine match scores accepted}}{\text{Total number of genuine matches}} \times 100 \qquad (3)

Any classification decision is made using a threshold value; the threshold is not
chosen to favor a higher or lower FAR or FRR alone, but is typically set at the FAR
and FRR corresponding to the EER point. In other words, increasing the threshold
makes the system less accessible to impostors, but the probability that genuine users
will be rejected increases, and vice versa [Jain et al. 2004b]. Besides these numerical
measures, graphical methods such as the DET (Detection Error Tradeoff) curve (a
plot of FAR versus FRR) and the Receiver Operating Characteristic (ROC), which
plots (1 − FRR) against various FARs, are used to assess the performance of
verification systems [Jain et al. 2004b]. Each point on an ROC or a DET curve
corresponds to a specific chosen decision threshold.
(3) EER, the common value of FAR and FRR at the threshold where the two error
curves intersect; that is, the EER is the point of intersection of the FAR and FRR
curves when both are plotted against the decision threshold. A lower EER indicates
better system performance.
(4) Verification Accuracy. Defines the performance of the verification algorithm, with
FAR and FRR expressed in percent [Surya and Gupta 2015].

    \text{Accuracy} = 100 - \frac{\mathrm{FAR} + \mathrm{FRR}}{2} \qquad (4)

(5) CRR, the probability of correctly identifying a person out of the total number of
individuals available in the dataset. Besides comparing the training and test images,
it also ranks them on the basis of matching scores. It is also termed the rank-1
recognition rate [Surya and Gupta 2015].

    \mathrm{CRR} = \frac{\text{Number of rank-1 genuine matches}}{\text{Total number of test matches performed}} \times 100 \qquad (5)


The CMC curve plots the rank-k recognition rate (the proportion of probes whose
genuine match occurs within the top k ranked matches) against k [Surya and
Gupta 2015].
(6) FTE and FTC, the acquisition errors, are defined as [Sahoo et al. 2012]:

    \mathrm{FTC} = \frac{\text{Number of capture attempts that fail}}{\text{Total number of capture attempts}} \qquad (6)

    \mathrm{FTE} = \frac{\text{Number of users who fail to enroll}}{\text{Total number of users}} \qquad (7)

(7) Computation Time. Describes the average execution time for preprocessing as well
as verification/identification tasks.
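
To make these definitions concrete, the following minimal Python sketch (hypothetical
function and variable names, assuming NumPy is available and that match scores are
similarity scores, where a higher score means a better match) shows how FAR, FRR,
verification accuracy, an approximate EER, and the rank-k recognition rate can be
computed from sets of genuine and impostor scores. It is only an illustration of the
formulas above, not a reference implementation.

import numpy as np

def far_frr(genuine, impostor, threshold):
    """FAR and FRR (in %) at one decision threshold for similarity scores;
    a match is accepted when its score is >= threshold (Eqs. (1) and (2))."""
    far = 100.0 * np.mean(np.asarray(impostor) >= threshold)
    frr = 100.0 * np.mean(np.asarray(genuine) < threshold)
    return far, frr

def equal_error_rate(genuine, impostor):
    """Approximate EER: scan candidate thresholds and return the value where
    the FAR and FRR curves intersect (|FAR - FRR| is smallest)."""
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    rates = np.array([far_frr(genuine, impostor, t) for t in thresholds])
    far, frr = rates[:, 0], rates[:, 1]
    i = int(np.argmin(np.abs(far - frr)))
    return (far[i] + frr[i]) / 2.0, thresholds[i]

def verification_accuracy(far, frr):
    """Eq. (4): accuracy (in %) from FAR and FRR expressed in percent."""
    return 100.0 - (far + frr) / 2.0

def rank_k_rate(score_matrix, probe_ids, gallery_ids, k=1):
    """Rank-k recognition rate for identification; k = 1 gives the CRR (Eq. (5)).
    score_matrix[i, j] is the similarity of probe i against gallery entry j."""
    order = np.argsort(-score_matrix, axis=1)               # best matches first
    top_k = np.asarray(gallery_ids)[order[:, :k]]
    hits = [probe_ids[i] in top_k[i] for i in range(len(probe_ids))]
    return 100.0 * np.mean(hits)

# Toy example with synthetic (randomly generated) score distributions.
rng = np.random.default_rng(0)
genuine = rng.normal(0.75, 0.08, 500)     # same-subject match scores
impostor = rng.normal(0.45, 0.10, 5000)   # different-subject match scores
eer, thr = equal_error_rate(genuine, impostor)
far, frr = far_frr(genuine, impostor, thr)
print(f"EER ~ {eer:.2f}% at threshold {thr:.3f}; "
      f"accuracy ~ {verification_accuracy(far, frr):.2f}%")

Plotting the rank-k rate for k = 1, 2, ..., N yields the CMC curve, and sweeping the
threshold over its full range produces the (FAR, FRR) pairs needed for ROC or DET
curves.
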
8. BIOMETRICS MARKET SURVEY AND APPLICATIONS
Due to the capability of recognizing individuals by unchangeable physical or behavioral
characteristics of living organisms, authorities from every facet of human life are em-
ploying biometrics in countless civil or commercial applications such as passports and
customer ID, banking and finance (online transactions, ATMs), mobile and computer
industry, healthcare industry, military and defense, attendance system for allowing
staff to clock in and out etc. [Marketsandmarkets.com 2015]. Most extensively, it is
being used in government projects, including e-passports, e-driving licenses, border
management, voting systems, and national IDs across the world, in countries such as
India, China, South Africa, Malaysia, and Pakistan [Unar et al. 2014].
India started a multi-biometric-based national ID program, the Aadhar Card, in 2011;
it is the largest biometric database in the world, with 8.4 crore (84 million) Aadhar
enrollments, as discussed in a report titled "The Role of Biometric Technology in
Aadhar Enrollment" by the Unique Identification Authority of India (UIDAI). Likewise,
in Pakistan, a multi-biometric-based national ID card and passport facility was started
through the National Database and Registration Authority [Kwon 2014]. Additionally,
the military services are expected to be one of the promising sectors for biometrics, so as
to provide controlled access to military equipment and authorized areas [Koltzsch et al.
2007]. Within banking, too, there are examples of biometrics use: in early 2012,
Turkey's largest commercial bank introduced biometric technology at over 3,000 bank
ATMs, permitting its clients to withdraw cash without any further verification [Unar
et al. 2014]. Similarly, in May 2015, India's private sector bank, ICICI, also announced
the use of a voice recognition system for allowing consumers to accomplish bank-
ing transactions [icici.com 2015]. According to the reports of the Biometrics Research
Group, “the total income streams for biometrics utilized in the global banking sector
will rise from US$900 million in 2012 to US$1.8 billion by the end of 2015.” In 2013,
Apple Inc. introduced biometric technology in its personal devices, the iPhone 5s smart-
phone and the iPad Air 2, by embedding a fingerprint sensor for security purposes. Recently,
in 2014, LG Electronics, Samsung Electronics, and various other mobile companies
launched their mobile variants with face-detection and iris-recognition features. More-
over, in the healthcare industry, biometrics have been applied to patient record storage,
medical monitoring, access control, mobile-healthcare, workforce management, etc. As
per the “Acuity Market Intelligence” report, presented in Figure 15, biometrics-based
identity management systems showed a continuously increasing market demand from
2007 to 2015. Also, according to Acuity Market Intelligence “The Global Biometrics
and Mobility Report: The Convergence of Commerce and Privacy,” the mobile bio-
metric revenues will reach a total of 34.6 billion dollars by the end of 2020, and the
yearly growth projection is shown in Figure 16.

Fig. 15. Biometric revenue [ACUITY: Market Intelligence].
Fig. 16. Mobile biometric revenue [ACUITY: The Global Biometrics and Mobility Report].

Apart from this, fingerprint, face,
and iris technologies are also being used as safety mechanisms in laptops/notebooks
[Socolinsky et al. 2003]. Biometric technology not only offers device security and theft
prevention in e-banking, mobile phones, and laptops, but also avoids the need to
memorize user names and passwords.
In general, it has been seen that fingerprint, iris, speech, and face modalities globally
rule over the biometrics market and are widely acceptable in numerous areas rang-
ing from identification of documents to consumer applications [Marsico et al. 2014].
However, other biometric traits, such as hand biometrics, are steadily gaining
importance and receiving increasing attention for real-time applications, owing to their
highly distinctive and informative structural properties and the advantages they offer
in day-to-day business and public utilities [Unar et al. 2014]. Fingerprint and hand
geometry are the longest-used biometrics, which may be due to low-cost sensors and
higher user acceptance [Luca and Roll 2004; Sanchez-Reillo et al. 2000]. Palm print and
handprint are essentially extensions of fingerprint technology, and their high-resolution
images find significance in forensics, missing-child identification, etc. [Lemes et al. 2011].
A Biolink APIS (Automated Palm Print Identification System) has been utilized
for law enforcement and criminal forensic case studies throughout the world [Hand-
based Biometrics 2003], whereas hand geometry has been implemented only for ac-
cess control, time and attendance, and e-commerce applications [Sanchez-Reillo and
Gonzalez-Marcos 2000]. Commercially, the hand-geometry-based Immigration and Nat-
uralization Service Passenger Accelerated Service System (INSPASS) is available
at airports and land ports in the United States to facilitate passage through entry barriers
[INSPASS 1996]. In contrast, a fingerprint is an appropriate choice for border control,
forensics, and criminal identification [Maltoni et al. 2009]. Furthermore, vein patterns
have improved the accuracy, by overcoming spoofing attacks, in various financial trans-
actions, computer logins, and mobile devices [Lee et al. 2011; Zhou and Kumar 2011].
In 1997, the Central Research Institute at Hitachi, Ltd. designed the first-ever
touchless finger vein recognition device, and approximately 80% of banks in Japan,
Korea, and Poland use this facility for user verification. Likewise, Fujitsu, Ltd.
developed Palmsecure-SL, a highly reliable portable biometric authentication system
based on palm vein pattern recognition technology [Palmsecure-SL 2015]. In addition,
another up-and-coming modality is hand-bacteria-based identification; although not
thoroughly studied, it appears to have great potential for resolving forensic cases.


As already mentioned, amongst the hand modalities, in recent years a huge interest
has grown in finger knuckle based personal authentication. Finger-knuckle-based
personal recognition is still not commercially viable, although it has tremendous
scope in access control, crime scene investigation, surveillance, and kidnapping-case
investigations [Kumar 2014]. Various successful instances of the use of the finger
knuckle pattern as a forensic mark have been reported in the past. In late 1941, law enforcement
agencies convicted gangster Robert Phillips when the ridges around his second knuckle
were found to be similar to prints collected at the scene of a crime [Crimemuseum.org
2009]. Likewise, in 2012, Dean Hardy was jailed for 10 years when unique freckles
on his finger joint were found to match the hand seen in images of a crime
[Bexleytimes.co.uk 2012]. Also, researchers have developed a preliminary version of
an Android-based smartphone application that enhances security by using the fin-
ger knuckle image [Cheng and Kumar 2012]. According to the "Biometric Market Report
2003–2007," all market studies show that the biometric market has grown quickly,
at a yearly world market growth rate of 30% to 60%, in the last few years. Further, it has
been estimated that the biometric marketplace will grow faster in the Asia Pacific
region and African countries in the coming years than in Japan, the United States,
and European nations, because these emerging regions have already started working
on biometrics-based identity validation systems [Kwon 2014]. As mentioned in the report titled "Market
Shares, Strategies, and Forecasts Worldwide, 2013 to 2019," the globally recorded bio-
metrics income in 2012 was approximately $5.2 billion and is anticipated to reach $16.7
billion by 2019. These reports reflect the expected growth in the biometric market,
and therefore efforts need to be made in developing efficient, faster, and user-friendly bio-
metric technologies, e.g., based on the knuckle-print, for both commercial and forensic
applications.

9. CONCLUSION AND DISCUSSION


This article presents a comprehensive survey of different algorithms, devices, and
progress in the field of knuckle print technology, which has emerged in the last few
years as a new biometric trait. The knuckle print contains curved, line-like structures
and is rich in texture information on both the back and front surfaces of the fingers.
The merits and drawbacks of feature extraction and matching techniques were
discussed. From the review, it is observed that finger-knuckle-based authentication
systems have not been explored much for security applications. The directions in
which efforts can be made for further improvement are as follows:

• Although FKP physiological characteristics are unique, there is still a technological
gap between the proposed methods and industry due to the lack of specialized
acquisition devices and large-scale databases.
• Standardization of knuckle print databases is required for evaluating and comparing
the performances of various proposed algorithms.
• Many earlier methods have now been applied to FKP recognition and found to
perform well. However, to optimize performance in terms of recognition rate and
computation time, new algorithms need to be designed in the future based on prior
knowledge of the knuckle print.
• Mostly, knuckle prints from the index and middle fingers have been investigated.
Therefore, a reliable bimodal system can be proposed by selecting various combina-
tions of finger knuckles, which can offer higher accuracy. The important issue of ROI
extraction for FKP recognition can also be explored.
• Efforts also need to be made to acquire knuckle prints from the inside and outside of
the hand at the joint formed between the proximal phalanx and the metacarpal bones.


• More work is required for overcoming the problem of finger pose variation in FKP
recognition.
• It is possible to develop highly precise hand-based multi-modal or multi-sensor bio-
metric systems by integrating finger knuckles with other hand patterns for civilian
and forensic applications.
• It is necessary to develop large-scale non-uniform FKP and IKP databases incorpo-
rating a wide range of variations.
• Novel IKP recognition methods need to be developed leading to a hybrid knuckle
print system based on IKP and FKP.
• To verify whether the biometric trait captured by a sensor has been taken from a
living person, a liveness detection mechanism based on the finger surface can be
introduced and integrated with knuckle prints.
• A multi-sensor system may be proposed and developed for capturing the inner- and
outer-side knuckles together, and the two may be concatenated into a single template
for next-level processing (a simple sketch of such feature-level concatenation is given
after this list).
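
As an illustration of the last recommendation, the sketch below (hypothetical feature
dimensions and function names, using NumPy; it is not taken from any particular method
reviewed above) shows one simple way to fuse inner- and outer-knuckle feature vectors
at the feature level, namely min-max normalization of each vector followed by
concatenation into a single template.

import numpy as np

def min_max_normalize(v):
    """Scale a feature vector to [0, 1] so that features extracted from
    different knuckle surfaces are comparable before concatenation."""
    v = np.asarray(v, dtype=float)
    span = v.max() - v.min()
    return (v - v.min()) / span if span > 0 else np.zeros_like(v)

def fuse_templates(inner_features, outer_features):
    """Feature-level fusion: normalize each modality's feature vector and
    concatenate them into a single template for the next processing stage."""
    return np.concatenate([min_max_normalize(inner_features),
                           min_max_normalize(outer_features)])

# Hypothetical example: 1-D feature vectors extracted from the inner (IKP)
# and outer (FKP) knuckle images of the same finger.
inner = np.random.rand(128)    # placeholder IKP feature vector
outer = np.random.rand(256)    # placeholder FKP feature vector
template = fuse_templates(inner, outer)
print(template.shape)          # (384,) -- a single fused template

Other normalization or dimensionality-reduction schemes (z-score, PCA, etc.) could be
substituted; the essential point is that both knuckle surfaces end up in one template
that downstream matching treats as a single feature vector.
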

In conclusion, it can be stated that the knuckle print offers many promising benefits as a
biometric modality, such as stable, unique biological properties across individuals,
inexpensive capturing equipment, and low processing time. Also, as supplementary hand
information, knuckle prints can easily be combined with other patterns, or with each
other, to develop a highly precise and computationally economical multi-modal hand
biometric system.

REFERENCES
Narishige Abe and Takashi Shinzaki. 2014. On the fly finger knuckle print authentication. Proceedings of
International Society for Optics and Photonics. Vol. 9075, 907504–8.
AADHAR. 2010. Communicating to a billion, Unique Identification Authority of India, An Awareness
and Communication Report, ACSAC. Retrieved from http://uidai.gov.in/UID_PDF/Front_Page_Articles/
Events/AADHAAR_PDF.pdf.
ACUITY. 2007. Market intelligence, biometrics market development: Mega trends and meta drivers. Re-
trieved from http://www.acuity-mi.com/hdfsjosg/euyotjtub/Biometrics%202007%20London.pdf.
ACUITY. 2015. The Global Biometrics and Mobility Report: The convergence of commerce and pri-
vacy market analysis and forecasts 2014 to 2020. Retrieved from http://www.acuity-mi.com/GBMR_
Report.php#sthash.xqnXhlDa.dpuf.
Harbi AlMahafzah, H. S. Sheshadri, and Mohammad Imran. 2012a. A case study on multi-instance finger
knuckle print score and decision level fusions. International Journal of Scientific & Engineering Research
3 (2012).
Harbi AlMahafzah, H. S. Sheshadri, and Mohammad Imran. 2014. Multi-algorithm decision-level fusion
using finger-knuckle-print biometric. In Emerging Research in Electronics, Computer Science and Tech-
nology. Springer, 39–47.
Harbi AlMahafzah, Mohammad Imran, and H. S. Sheshadri. 2012b. Multibiometric: Feature level fusion
using FKP multi-instance biometric. International Journal of Computer Science 9, 4 (2012).
Harbi AlMahafzah, Mohammad Imran, and H. S. Sheshadri. 2012c. Multi-algorithm feature level fusion
using finger knuckle print biometric. In Computer Applications for Communication, Networking, and
Digital Contents. Springer, Berlin, 302–311.
Ola M. Aly, Hoda M. Onsi, Gouda I. Salama, and Tarek A. Mahmoud. 2013a. A multimodal biometric
recognition system using feature fusion based on PSO. International Journal of Advanced Research in
Computing and Communication Engineering 2, I1 (2013): 4336–4343.
Ola M. Aly, Tarek A. Mahmoud, Gouda I. Salama and Hoda M. Onsi. 2013b. An adaptive multimodal
biometrics system using PSO. International Journal of Advanced Computer Science and Applications 4,
7 (2013), 158–165.
Mounir Amraoui, El Aroussi Mohamed, Saadane Rachid, and Wahbi Mohammed. 2012. Finger-knuckle-
print recognition based on local and global feature sets. Journal of Theoretical & Applied Information
Technology 46, 1 (2012).


M. Amraoui, J. Abouchabaka, and M. El Aroussi. 2014. Finger knuckle print recognition based on multi-
instance fusion of local features sets. In Proceedings of the IEEE International Conference on Multimedia
Computing and Systems. 87–92.
BEXLEYTIMES.co.uk. 2012. Bromley Paedophile Dean Hardy jailed for 10 years, Bexley Times. Re-
trieved from http://www.bexleytimes.co.uk/news/crime-court/bromley_paedophile_dean_hardy_jailed_
for_10_years_1_1176957.
Vinayak Ashok Bharadi. 2012. Texture feature extraction for biometric authentication using partitioned
complex planes in transform domain. In Proceedings of the International Conference & Workshop on
Emerging Trends in Technology. 39–45.
G. S. Badrinath, Aditya Nigam, and Phalguni Gupta. 2011. An efficient finger-knuckle-print based recog-
nition system fusing SIFT and SURF matching scores. In Information and Communications Security.
Springer, Berlin, 374–387.
Rima Belguechi, Estelle Cherrier, Mohamad El Abed, and Christophe Rosenberger. 2011. Evaluation of
cancelable biometric systems: Application to finger knuckle prints. In Proceedings of IEEE International
Conference on Hand-Based Biometrics. 1–6.
Asish Bera, Debotosh Bhattacharjee, and Mita Nasipuri. 2014. Hand biometrics in digital forensics. In
Computational Intelligence in Digital Forensics: Forensic Investigation and Applications. Springer In-
ternational Publishing, 145–163.
Hamza Boucenna and Hamami Latifa. 2014. Finger Print Knuckle Feature Extraction Using Multi
Scale Wavelet Edge Detection Method. http://manifest.univ-ouargla.dz/index.php/component/search/
?searchword=Hamza%20Boucenna&searchphrase=all&Itemid=101.
Raffaele Cappelli. 2010. Biometric System laboratory, DISI- University of Bologna. Web link:
http://biolab.csr.unibo.it/DatabaseSoftware.asp?organize=Software.
Raffaele Cappelli, Matteo Ferrara, and Davide Maltoni. 2010. Minutia cylinder-code: A new representation
and matching technique for fingerprint recognition. IEEE Transactions on Pattern Analysis and Machine
Intelligence 32, 12 (2010), 2128–2141.
M. Carlos Travieso, Jaime R. Ticay Rivas, Juan C. Briceno, and Marcos del Pozo Banos. 2014. Hand shape
identification on multirange images. Information Sciences 275 (2014), 45–56.
Center for machine vision and signal analysis. 2012. Web link: http://www.cse.oulu.fi/CMV/Downloads/
LPQMatlab.
Center for machine vision and signal analysis. 2014. Web link: http://www.cse.oulu.fi/CMV/Downloads/
LBPMatlab.
Vassilios Chatzis, Adrian G. Bors, and Ioannis Pitas. 1999. Multimodal decision-level fusion for person
authentication. IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans 29,
6 (1999), 674–680.
Michal Choras and Rafal Kozik. 2010a. Knuckle biometrics based on texture features. In Proceedings of
IEEE International Workshop on Emerging Techniques and Challenges for Hand-Based Biometrics. 1–5.
Michal Choraś and Rafal Kozik. 2010b. Knuckle biometrics for human identification. In Image Processing
and Communications Challenges 2. Springer, Berlin, 91–98.
Michal Choras and Rafal Kozik. 2011. Contactless palm print and knuckle biometrics for mobile devices. In
Pattern Analysis Application. Springer, Berlin, 73–85.
Michal Choras. 2013. A short overview of feature extractors for knuckle biometrics. In Proceedings of the 8th
International Conference on Computer Recognition Systems. Springer International Publishing. 519–
526.
Eui Chul Lee, Hyunwoo Jung, and Daeyeoul Kim. 2011. New finger biometric method using near infrared
imaging. Sensors 11, 3 (2011), 2319–2333.
Charles Colbert. 1997. Knuckle profile identity verification system. U.S. Patent No. 5,594,806 (14 Jan 1997).
College of Electronics and Information Engineering, the Hebei University IKP Database. 2013. Homepage
Retrieved from http://www.ceie.hbu.cn.
CRIMEMUSEUM.org. 2009. John Dillinger- fingerprint obliteration, Crime Museum. Retrieved from
http://www.crimemuseum.org/blog/john-dillinger-fingerprint-obliteration.
Aritra Dey, Akash Pal, Aroma Mukherjee, and Karabi Ganguly Bhattacharjee. 2015. An approach for iden-
tification using knuckle and fingerprint biometrics employing wavelet based image fusion and SIFT
feature detection. In Advancements of Medical Electronics. Springer, 149–159.
Wafa El-Tarhouni, Muhammad K. Shaikh, Larbi Boubchir, and Ahmed Bouridane. 2015. Multi-scale shift
local binary patterns-based descriptor for finger-knuckle-print recognition. In Proceedings of the 26th Inter-
national Conference on Microelectronics. 184–187.


Miguel A. Ferrer, Aythami Morales, Carlos M. Travieso, and Jesws B. Alonso. 2007. Low cost multimodal
biometric identification system based on hand geometry, palm and finger print texture. In Proceedings
of the 41st Annual IEEE International Carnahan Conference on Security Technology. 52–58.
Miguel A. Ferrer, Carlos M. Travieso, and Jesus B. Alonso. 2006. Using hand knuckle texture for biometric
identifications. IEEE Aerospace and Electronic Systems Magazine (2006), 23–27.
N. Fierer, C. L. Lauber, N. Zhou, D. McDonald, E. K. Costello, and R. Knight. 2010. Forensic identification
using skin bacterial communities. PNAS 107, 14, 6477–6481.
Guangwei Gao, Jian Yang, Jianjun Qian, and Lin Zhang. 2014. Integration of multiple orientation and
texture information for finger-knuckle-print verification. Neurocomputing 135 (2014), 180–191.
Guangwei Gao, Lei Zhang, Jian Yang, Lin Zhang, and David Zhang. 2013. Reconstruction based finger
knuckle print verification with score level adaptive binary fusion. IEEE Transactions on Image Process-
ing. 22, 12, 5050–5062.
Rodrigo de Luis Garcia and Carlos Alberola. 2003. Biometric identification systems. Signal Processing 83,
12, 2539–2557.
Guangwei Gao and Jian Yang. 2013. Weight competitive coding for finger-knuckle-print verification. In
Biometric Recognition. Springer International Publishing, 185–192.
Marcialis Gian Luca and Fabio Roll. 2004. Fingerprint verification by fusion of optical and capacitive sensors.
Pattern Recognition Letters 25, 11, 1315–1322.
Ibrahim A. Gomaa, Gouda I. Salama, and Ibrahim F. Imam. 2012. Biometric OAuth service based on finger
knuckles. In Proceedings of 7th International Conference on Computer Engineering & Systems (IC-
CES’12). 170–175.
Mislav Grgic. Face Recognition Home page. Web link: http://www.face-rec.org/algorithms/#PCA.
Jyotsana Grovera, and Madasu Hanmandlub. 2015. Hybrid fusion of score level and adaptive fuzzy decision
level fusions for the finger-knuckle-print based authentication. Applied Soft Computing 31 (2015), 1–13.
D. S. Guru, K. B. Nagasundara, and S. Manjunath. 2010. Feature level fusion of multi-instance finger
knuckle print for person identification. In Proceedings of the 1st International Conference on Intelligent
Interactive Technologies and Multimedia. ACM, 186–190.
Hand-based Biometrics. 2003. Biometric Technology Today 11, 7, 9–11.
Chetana Hegde, J. Phanindra, P. Deepa Shenoy, K. R. Venugopal, and Lalit M. Patnaik. 2011a. Human au-
thentication using finger knuckle print. In Proceedings of the 4th Annual ACM Bangalore Conference. 9.
Chetna Hegde, P. Deepa Shenoy, K. R. Venugopal, and L. M. Patnaik. 2011b. FKP biometrics for human
authentication using Gabor wavelets. In Proceedings of the TENCON 2011 IEEE Region 10 Conference.
1149–1153.
Chetana Hegde, P. D. Shenoy, K. R. Venugopal, and L. M. Patnaik. 2013. Authentication using finger knuckle
prints. Signal, Image and Video Processing 7, 4, 633–645.
Baptiste Hemery, Romain Giot, and Christophe Rosenberger. 2010. SIFT based recognition of finger knuckle
print. In Proceediings of the 3rd Norsk Information Security Conference. 45–56.
Amanda B. Holbert, Holly P. Whitelam, Letha J. Sooter, Larry A. Hornak, and Jeremy M. Dawson. 2015.
Hand bacteria as an identifier: A biometric evaluation. Network Modeling Analysis in Health Informatics
and Bioinformatics 1, 4(1), 1–1.
L. Hong, A. K. Jain, and S. Pankanti. 1999. Can multi biometrics improve performance? In Proceedings of
Auto ID. 59–64.
Abdelhameed Ibrahim and A. Tharwat. 2014. Biometric Authentication Methods Based on Ear and Finger
Knuckle Images. Int. J. Comput. Sci. Issues (IJCSI) 11, no. 3 (2014): 134–138.
International Biometric Group. 2003. Biometric Market Report 2003–2007, New York, 2.
IIT Delhi. 2009. Finger Knuckle Database. Retrieved from http://www4.comp.polyu.edu.hk/∼CSajaykr/IITD/
iitd_knuckle.htm.
Institute of Science and Computer Engineering Department, Erciyes University. 2015. Hand Database.
Retrieved from http://aves.erciyes.edu.tr/neclaozkaya/dokumanlar.
ICICIBANK.com. 2015. ICICI Bank introduces voice recognition for biometric authentication. Retrieved
from http://www.icicibank.com/managed-assets/docs/about-us/2015/voice-recognition-for-biometric-
authentication.pdf.
INSPASS. 1996. Biometric Consortium: INS passenger accelerated service system. Retrieved from http://
www.biometrics.org/html/REPORTS/INSPASS.html.
M. Islam, M. M. Hasan, M. M. Farhad, T. R. Tanni. 2012. Human authentication process using finger knuckle
surface with artificial neural networks based on a hybrid feature selection method. In Proceedings of
15th International Conference on Computer and Information Technology. 61–64.


Anil K. Jain, Arun Ross, and Salil Prabhakar. 2004b. An introduction to biometric recognition. IEEE Trans-
actions on Circuits and Systems for Video Technology 14, 1, 4–20.
Anil Kumar Jain, Arun Ross, and Sharath Pankanti. 2006. Biometrics: A tool for information security. IEEE
Transactions on Information Forensics and Security 2006, 125–143.
A. K. Jain, S. C. Dass, and K. Nandakumar. 2004a. Can soft biometric traits assist user recognition? In
Defense and Security, International Society for Optics and Photonics, 561–572.
Anil K. Jain and Ajay Kumar. 2010. Biometrics of next generation: An overview. Second Generation Biometrics
12, 1 (2010), 2–3
Umarani Jayaraman, Aman Kishore Gupta, and Phalguni Gupta. 2014. Boosted geometric hashing based
indexing technique for finger-knuckle-print database. Information Sciences 275 (2014), 30–44.
W. Jia, D. Huang, and D. Zhang. 2008. Palm print verification based on robust line orientation code. Pattern
Recognition 41 (2008), 1504–1513.
Xiaoyuan Jing, Wenqian Li, Chao Lan, and Yongfang Yao. 2011. Orthogonal complex locality preserving
projections based on image space metric for finger-knuckle-print recognition. In Proceedings of the IEEE
International Conference on Hand-Based Biometrics. 1–6.
D. G. Joshi, Y. V. Rao, S. Kar, V. Kumar, and R. Kumar. 1998. Computer vision based approach to personal
identification using finger crease patterns. Pattern Recognition 31 (Jan. 1998), 15–22.
William Q. Jungbluth. 1989. Knuckle print identification. Journal of Forensic Identification 39 (1989), 375–
380.
Karbhari V. Kale, Yogesh S. Rode, Majharoddin M. Kazi, Siddharth B. Dabhade, and Shrinivas V. Chavan.
2013. Multimodal biometric system using fingernail and finger knuckle. In Proceedings of the IEEE
International Symposium on Computational and Business Intelligence. 279–283.
Wenxiong Kang, Xiaopeng Chen, and Qiuxia Wu. 2015. The biometric recognition on contactless multi-
spectrum finger images. Infrared Physics & Technology 2015, 19–27.
V. Kanhangad, A. Kumar, and D. Zhang. 2009. Combining 2d and 3d hand geometry features for biometric
verification. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern
Recognition Workshops. 39–44.
Usha Kazhagamani and Ezhilarasan Murugasen. 2014a. A Hough transform based feature extraction al-
gorithm for finger knuckle biometric recognition system. In Advanced Computing, Networking and
Informatics, vol. 1. 463–472.
Usha Kazhagamani and Ezhilarasan Murugasen. 2015. Contourlet transform based feature extraction
method for finger knuckle recognition system. In Computational Intelligence in Data Mining, vol. 3.
Springer, 407–416.
Usha Kazhagamani and M. Ezhilarasan. 2014b. Finger knuckle biometrics—A review. Computers and Elec-
trical Engineering.
H. B. Kekre and V. A. Bharadi. 2010. Finger-knuckle-print region of interest segmentation using gradient
field orientation and coherence. In Proceedings of 3rd International Conference on Emerging Trends in
Engineering and Technology (ICETET’10). 130–133.
H. B Kekre and V. A. Bharadi. 2011. Finger-knuckle-print verification using Kekre’s wavelet transform.
In Proceedings of the International Conference & Workshop on Emerging Trends in Technology. ACM,
32–37.
S. Khellat-Kihel, R. Abrishambaf, J. L. Monteiro, and M. Benyettou. 2016. Multimodal fusion of the finger
vein, fingerprint and the finger-knuckle-print using Kernel Fisher analysis. Applied Soft Computing 42,
439–447.
Min Ki Kim and Patrick J. Flynn. 2014. Finger knuckle print verification based on vector consistency of
corresponding interest points. In Proceedings of the IEEE Winter Conference on Applications of Computer
Vision. 992–997.
Alagar Kirthika and Arumugam Subbanna. 2012. Combined SIQT and SSF matching score for feature
extraction evaluation in finger knuckle print recognition. WSEAS Transactions on Computers 12, (2012),
384–393.
Gregor Koltzsch. 2007. Biometrics market segments and applications. Journal of Business Economics and
Management 2007, 119–122.
Tao Kong, Gongping Yang, and Lu Yang. 2013. A new finger-knuckle-print ROI extraction method based on
probabilistic region growing algorithm. International Journal of Machine Learning and Cybernetics 5,
4 (2014), 569–578.
Tao Kong, Gongping Yang, and Lu Yang. 2014. A hierarchical classification method for finger knuckle print
recognition. EURASIP Journal on Advances in Signal Processing 2014: 44.


Neha Kudu, Disha Suru, and Sunil Karamchandani. 2013. Finger knuckle print authentication based on
KWT transform using FKP capture device. In Proceedings on the International Conference on Commu-
nication Technology. 12–16.
S. S. Kulkarni and R.D. Rout. 2012. Secure biometrics: Finger knuckle print. International Journal of
Advanced Research in Computer Engineering 1, 10 (2012).
A. Kumar and C. Kwong. 2015. Towards contactless, low-cost and accurate 3D fingerprint identification.
IEEE Transactions on Pattern Analysis and Machine Intelligence, 37(3), 681–96.
Ajay Kumar and Ch. Ravikant. 2009. Personal authentication using finger knuckle surface. IEEE Transac-
tions on Information Forensics and Security 4, 1, 98–109.
Ajay Kumar and K. Venkata Prathyusha. 2009. Personal authentication using hand vein triangulation and
knuckle shape. IEEE Transactions on Image Processing 18 (2009), 2127–2136.
Ajay Kumar and Yingbo Zhou. 2009a. Human Identification Using Knuckle Codes. In Proceedings of the 3rd
International Conference on Biometrics, Theory and Applications (BTAS’09). 147–152.
Ajay Kumar and Yingbo Zhou. 2009b. Personal identification using finger knuckle orientation features.
Electronics Letters 45, 20 (2009), 1023–1025.
Ajay Kumar and Zhihuan Xu. 2014. Can we use second minor finger knuckle patterns to identify humans? In
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops. 106–112.
Soyuj Kumar Sahoo, Tarun Choubisa, and S. R. Mahadeva Prasanna. 2012. Multimodal biometric person
authentication: A review. IETE Technical Review 29, 1 (2012), 54–75.
K. Kumar Sricharan, A. Aneesh Reddy, and A. G. Ramakrishnan. 2006. Knuckle based hand correlation for
user authentication. In Proceedings of the Defense and Security Symposium. International Society for
Optics and Photonics, 62020X-62020X.
Ajay Kumar and Bichai Wang. 2015. Recovering and matching minutiae patterns from finger knuckle images.
Pattern Recognition Letters 68, 361–367.
Amioy Kumar, Madasu Hanmandlu, and H. M. Gupta. 2013a. Ant colony optimization based fuzzy binary
decision tree for bimodal hand knuckle verification system. Expert Systems with Applications 40, 2
(2013), 439–449.
Amioy Kumar, Madasu Hanmandlu, and H. M. Gupta. 2013b. Fuzzy binary decision tree for biometric based
personal authentication. Neurocomputing (2013), 87–97.
Amioy Kumar, Shruti Garg, and Madasu Hanmandlu. 2014. Biometric authentication using finger nail
plates. Expert Systems with Applications 41, 2014, 373–386.
Ajay Kumar. 2012. Can we use minor finger knuckle images to identify humans? In Proceedings of IEEE
Fifth International Conference on Biometrics: Theory, Applications and Systems, 55–60.
Ajay Kumar. 2014. Importance of Being Unique from Finger Dorsal Patterns: Exploring Minor Finger
Knuckle Patterns in Verifying Human Identities. Information Forensics and Security, IEEE Transactions
on 9, no. 8 (2014): 1288–1298.
Ajay Kumar. 2015. Web link: http://www.comp.polyu.edu.hk/csajaykr/Knuckle_Minutiae.rar.
Daichi Kusanagi, Shoichiro Aoyama, Koichi Ito, and Takafumi Aoki. 2014. Multi-Finger knuckle Recognition
from video sequence: Extracting Accurate Multiple Finger Knuckle Regions. In Proceedings of the IEEE
International Joint Conference on Biometrics. 1–8.
Young-Bin Kwon. 2014. Biometrics in Asia. Chung-Ang University, Korea. http://biometrics.org/
bc2009/presentations/tuesday/Kwon%20MR%2014%20Tue%20345%20PM%20-%20400%20PM.pdf.
R. P. Lemes, O. R. Bellon, L. Silva, and A. K. Jain. 2011. Biometric recognition of newborns: Identification
using palmprints. In Proceedings of the IEEE International Joint Conference on Biometrics. 1–6.
Kunlun Li, Hongxia Yuan, and Ming Liu. 2010. A novel preprocessing algorithm of knuckle print. In Pro-
ceedings of International Conference on Artificial Intelligence and Computational Intelligence, vol. 2.
49–53.
Zichao Li, Kuanquan Wang, and Wangmeng Zuo. 2012. Finger-knuckle-print recognition using local orienta-
tion feature based on steerable filter. In Emerging Intelligent Computing Technology and Applications.
Springer, Berlin, 224–230.
Fei Li, Mingyan Jiang, Xianye Ben, Tingting Pan, and Menglei Sun. 2015. Group collaborative representation
with L2 norm regularization in finger-knuckle-print recognition. Journal of Computational Information
Systems 11, 3, 1053–1062.
Qiang Li, Zhengding Qiu, Dongmei Sun, and Jie Wu. 2004. Personal identification using knuckle print.
SINOBIOMETRICS, Guangzhou. 680–689.
M. Liu, Y. Tian, and Y. Ma. 2013. Inner knuckle-print recognition based on improved LBP. In Proceedings
of the International Conference on Information Technology and Software Engineering. Springer, Berlin,
623–630.


Ming Liu, Yongmei Tian, and Li Lihua. 2014. A new approach for inner-knuckle-print recognition. Journal
of Visual Languages & Computing 25 (2014), 33–42.
Li Lu and Jialiang Peng. 2014. Finger multi-biometric cryptosystem using feature-level fusion. International
Journal of Signal Processing, Image Processing and Pattern Recognition 7, 3 (2014), 223–236.
Rong-Fang Luo, Tu-Sheng Lin, and Ting Wu. 2007. Personal recognition with finger crease pattern. Opto-
Electronics Review 34, 6 (2007), 116–121.
MARKETSANDMARKETS.com. 2015. Biometrics System Market- Global Forecast to 2020. Retrieved from
http://www.marketsandmarkets.com/Market-Reports/next-generation-biometric-technologies-market-
697.html.
N. B. Mahesh Kumar and K. Premalatha. 2015. Personal authentication using FDOST in finger knuckle-
print biometrics. International Journal of Computer, Control, Quantum and Information Engineering 9,
15, 323–328.
Sotiris Malassiotis, Niki Aifanti, and Michael G. Strintzis. 2006. Personal authentication using 3-D finger
geometry. IEEE Transactions on Information Forensics and Security 1, 1 (2006), 12–21.
D. Maltoni, D. Maio, A. K. Jain, and S. Prabhakar. 2009. Handbook of Fingerprint Recognition, 2nd ed.
Springer-Verlag.
S. Manjunath, D. S. Guru, K. B. Nagasundara, and M. G. Suraj. 2013. 2D2LPI: Two directional two dimen-
sional locality preserving indexing. International Journal of Computer Vision and Image Processing 3,
2 (2013), 17–31.
Maria De Marsico, Chiara Galdi, Michele Nappi, and Daniel Riccio. 2014. FIRME: Face and iris recognition
for mobile engagement. Image and Vision Computing 32, 12 (2014), 1161–1172.
B. Mathivanan, V. Palanisamy, and S. Selvarajan. 2012. A hybrid model for human recognition system using
hand dorsum geometry and finger knuckle print. Journal of Computer Science 8, 11 (2012).
Abdallah Meraoumia, Salim Chitroub, and Ahmed Bouridane. 2011a. Palmprint and finger-knuckle-print for
efficient person recognition based on Log-Gabor filter response. Analog Integrated Circuits and Signal
Processing 69, 1 (2011), 17–27.
Abdallah Meraoumia, Mohammed Saigaa, Salim Chitroub, and Ahmed Bouridane. 2011b. Fusion of finger-
knuckle-print and palm print for an efficient multi-biometric system of person recognition. In Proceed-
ings of IEEE International Conference on Communication. 1–5.
Abdallah Meraoumia, Salim Chitroub, and Ahmed Bouridane. 2013. On-line finger-knuckle-print identifi-
cation using Gaussian mixture models & discrete cosine transform. In Proceedings of the International
Conference on Electronics and Oil. 5–6.
He Mingxing, Shi-Jinn Horng, Pingzhi Fan, Ray-Shine Run, Rong-Jian Chen, Jui-Lin Lai, Muhammad
Khurram Khan, and Kevin Octavius Sentosa. 2013. Performance evaluation of score level fusion in
multimodal biometric systems. Pattern Recognition 43 (2013), 1789–1800.
Neha Mittal, Madasu Hanmandlu, and Ritu Vijay. 2012. A finger-knuckle-print authentication system based
on DAISY descriptor. In Proceedings of 12th International Conference on Intelligent Systems Design and
Applications (ISDA’12). 126–130.
Maruf Monwar and Marina L. Gavrilova. 2009. Multimodal biometric system using rank-level fusion ap-
proach. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics (2009), 867–878.
Nafiseh S. Moosavi Ehteshami, Mahmoud Tabandeh, and Emad Fatemizadeh. 2012. A new ROI extrac-
tion method for FKP images using global intensity. In Proceedings of 6th International Symposium on
Telecommunication. 1147–1150.
A. Morales, C. M. Travieso, M. A. Ferrer, and J. B. Alonso. 2011. Improved finger-knuckle-print authentication
based on orientation enhancement. Electronics Letters (2011), 380–381.
A. Morales, M. A. Ferrer, C. M. Travieso, and Jesws B. Alonso. 2007. A knuckles texture verification method
in a transformed domain. In Proceedings of 1st Spanish Workshop on Biometrics Girona.
A. Muthukumar and S. Kannan. 2013a. K-Means Based Multimodal Biometric Authentication Using Finger-
print and Finger Knuckle print with Feature Level Fusion. IJST, Transactions of Electrical Engineering
37 (2013), 133–145.
A. Muthukumar and S. Kannan. 2013b. Finger Knuckle print Recognition with SIFT and K-Means Algorithm.
ICTACT Journal on Image and Video Processing 3, 583–588.
Loris Nanni and Alessandra Lumini. 2009. A multi-matcher system based on knuckle-based features. Neural
Computing and Applications 18, 1 (2009), 87–91.
Loris Nanni and Alessandra Lumini. 2009. Web link: http://bias.csr.unibo.it/nanni/diffusion.rar.
Loris Nanni, Alessandra Lumini, and Sheryl Brahnam. 2010. High performance set of features for biometric
data. International Journal of Automated Identification Technology 2, 1, 1–7.
Loris Nanni. 2010. Web link: http://www.cubs.buffalo.edu/resources/enchancement.zip.


M. Natarajan, T. Mekala, and R. Vikram. 2014. Multi-modal crypto-biometric system based on session key
navigation for secure transaction. In Proceedings of IEEE International Conference on Innovations in
Engineering and Technology.
Shubhangi Neware, Kamal Mehta, and A. S. Zadgaonkar. 2012. Finger knuckle surface biometrics. Interna-
tional Journal of Emerging Tech. and Advanced Engineering 2, 12, 452–455.
Shubhangi Neware, Kamal Mehta, and A. S. Zadgaonkar. 2013. Finger knuckle identification using principal
component analysis and nearest mean classifier. International Journal of Computer Applications 70, 9
(2013), 18–23.
Shubhangi Neware, Kamal Mehta, and A. S. Zadgaonkar. 2014. Finger knuckle print identification using
Gabor features. International Journal of Computer Applications 98, 14–17.
Aditya Nigam and Phalguni Gupta. 2011. Finger knuckle print based recognition system using feature
tracking. Biometric Recognition, Springer Berlin Heidelberg, 125–132.
Aditya Nigam and Phalguni Gupta. 2013a. Multimodal Personal Authentication System Fusing palm print
and knuckle print. In Emerging Intelligent Computing Technology and Applications. Springer, Berlin,
188–193.
Aditya Nigam and Phalguni Gupta. 2013b. Quality assessment of knuckle print biometric images. In IEEE
International Conference on Image Processing 2013. 4205–4209.
Aditya Nigam and Phalguni Gupta. 2014. Multimodal Personal Authentication using Iris and Knuckle print.
In Intelligent Computing Theory. Springer International Publishing, 819–825.
Aditya Nigam and Phalguni Gupta. 2015. Designing an accurate hand biometric based authentication system
fusing finger knuckleprint and palmprint. Neurocomputing 15, 1120–1132.
V. Ojansivu and J. Heikkila. 2008. Blur insensitive texture classification using local phase quantization. In
Image and Signal Processing, vol. 5099. 236–243.
Michael Kah Ong Goh, Tee Connie, and Andrew Teoh Beng Jin. 2010a. Bi-modal palm print and knuckle
print recognition system. Journal of IT in Asia 3 (2010), 53–66.
Michael Kah Ong Goh, Tee Connie, and Andrew Teoh Beng Jin. 2010b. An innovative contactless palm print
and knuckle print recognition system. Pattern Recognition Letters, 1708–1719.
Michael Kah Ong Goh, Tee Connie, and Andrew Teoh Beng Jin. 2010c. Robust palm print and knuckle print
recognition system using a contactless approach. In Proceedings of 5th IEEE Conference on Industrial
Electronics and Applications (ICIEA’10). 323–329.
Necla Ozkaya and Neslihan Kurat. 2014. Discriminative common vector based finger knuckle recognition.
Journal of Visual Communication and Image Representation 25, 7 (2014), 1647–1675.
Necla Ozkaya. 2015. Metacarpophalangeal joint patterns based personal identification system. Applied Soft
Computing 37, 288–295.
PALMSECURE-SL. 2015. Fujitsu to Launch Portable Palm Vein Authentication Sensor, Ideal for Mobile
Devices. Retrieved from. http://www.fujitsu.com/global/about/resources/news/press-releases/2015/0616–
01.html.
Jialiang Peng, Qiong Li, Qi Han, and Xiamu Niu. 2013. Feature-level fusion of finger biometrics based on
multi-set canonical correlation analysis. In Biometric Recognition. Springer International Publishing,
216–224.
Jialiang Peng. 2013. Linear discriminant multi-set canonical correlations analysis (LDMCCA): An efficient
approach for feature fusion of finger biometrics. Multimedia Tools and Applications 2013, 1–18.
Esther Perumal and Ramachandran Shanmugalakshmi. 2013. A multimodal biometric system based on
palm print and finger knuckle print recognition methods. International Arab Journal of Information
Technology (IAJIT) 12, 2.
Kyi Pyar Zaw and Aung Soe Khaing. Implementation of contactless finger knuckle identification system.
International Journal of Science, Engineering and Technology Research 3, 6 (2014), 1599–1605.
Esther Rani and R. Shanmugalakshmi. 2013. Finger knuckle print recognition techniques—A survey. IJES
2 (2013), 62–69.
R. D. Raut, Sujata Kulkarni, and Neha N. Gharat. 2014. Biometric authentication using Kekre’s wavelet
transform. In Proceedings of the IEEE International Conference on Electronic Systems, Signal Processing
and Computing Technologies. 99–104.
Ch. Ravikanth and Ajay Kumar. 2007. Biometric authentication using finger back surface. In Proceedings of
the IEEE Conference on Computer Vision and Pattern Recognition. 1–6.
Slobodan Ribaric and Ivan Fratric. 2005. A biometric identification system based on eigenpalm and eigenfin-
ger features. IEEE Transactions on Pattern Analysis and Machine Intelligence 27, 11 (2005), 1698–1709.


L. P. Rodriguez A. G. Crespo, M. Lara, and B. R. Mezcua. 2008. Study of Different Fusion Techniques for
Multimodal Biometric Authentication. In Proceedings of IEEE International Conference on Wireless and
Mobile Computing Networking and Communications. 666–671.
Zhao Rui, Kunlun Li, Ming Liu, and Xue Sun. 2009. A novel approach of personal identification based on
single knuckle print image. In Proceedings of the Asia-Pacific Conference on Information Processing.
18–19.
Zhao Rui, Lv Tao, Hou Shunyan, and Shi Jianying. 2011. A novel approach of personal identification based
on the fusion of multifinger knuckle prints. In Advances in information Sciences and Service Sciences
(AISS’11), Vol. 3.
Mohammed Saigaa, Abdallah Meraoumia, Salim Chitroub, and Ahmed Bouridane. 2012. Efficient person
recognition by finger-knuckle-print based on 2D discrete cosine transform. In Proceedings of Interna-
tional Conference on Information Technology and e-Services. 1–6.
Raul Sanchez-Reillo and Ana Gonzalez-Marcos. 2000. Access control system with hand geometry verification
and smart cards. IEEE Aerospace and Electronic Systems Magazine 15, 2 (2000), 45–48.
Raul Sanchez-Reillo, Carmen Sanchez-Avila, and Ana Gonzalez-Marcos. 2000. Biometric identification
through hand geometry measurements. IEEE Transactions on Pattern Analysis and Machine Intelli-
gence 22, 10 (2000), 1168–1171.
E. S. Shameem Sulthana and S. Kanmani. 2014. Implementation and evaluation of SIFT descriptors based
finger-knuckle-print authentication system. Indian Journal of Science and Technology 2014, 374–382.
Zahra S. Shariatmadar and Karim Faez. 2011a. Novel approach for finger knuckle print recognition based
on Gabor feature fusion. In Proceedings of 4th International Congress on Image and Signal Processing,
vol. 3. 1480–1484.
Zahra S. Shariatmadar and Karim Faez. 2011b. An efficient method for finger knuckle print recognition
based on information fusion. In Proceedings of IEEE International Conference on Signal and Image
Processing Applications. 210–215.
Zahra S. Shariatmadar and Karim Faez. 2013. Finger-knuckle-print recognition via encoding local-binary-
pattern. Journal of Circuits, Systems, and Computers 22, 06.
Zahra S. Shariatmadar and Karim Faez. 2014. Finger-Knuckle-Print recognition performance improvement
via multi-instance fusion at the score level. Optik-International Journal for Light and Electron Optics
125, 3 (2014), 908–910.
Shefali Sharma, Shiv Ram Dubey, Satish Kumar Singh, Rajiv Saxena, and Rajat Kumar Singh. 2015.
Identity verification using shape and geometry of human hands. Expert Systems with Applications
42(2015), 821–832.
Linlin Shen, Li Bai, and Zhen Ji. 2010. Hand-based biometrics fusing palm print and finger-knuckle-print. In
Proceedings of IEEE International Workshop on Emerging Techniques and Challenges for Hand-Based
Biometrics. 1–4.
A. Shoichiro, I. Koichi, and A. Takafumi. 2011. Finger-knuckle-print recognition using BLPOC-based local
block matching. In Proceedings of 1st Asian Conference on Pattern Recognition (ACPR’11). 525–529.
Aoyama Shoichiro, Koichi Ito, and Takafumi Aoki. 2013. A multi-finger knuckle recognition system for door
handle. In Proceedings of the IEEE 6th International Conference on Biometrics: Theory, Applications
and Systems (BTAS-2013). 1–7.
Aoyama Shoichiro, Koichi Ito, and Takafumi Aoki. 2014. A finger-knuckle-print recognition algorithm using
phase-based local block matching. Information Sciences 268 (2014), 53–64.
Diego A. Socolinsky, Andrea Selinger, and Joshua D. Neuheisel. 2003. Face recognition with visible and
thermal infrared imagery. Computer Vision and Image Understanding 91, 1(2003), 72–114.
J. Stanly Jayaprakash and S. Arumugam. 2014. Efficient biometric security system using intra-class finger-
knuckle pose variation assessment. International Journal of Computer Science & Engineering Technol-
ogy 5, 12 (2014), 1114–1119.
Shivaraj Subray, J. Aruna, and Anand Bhat. 2014. Biometrics: Access control and authorization based
on finger knuckle print Identification. International Journal of Science, Engineering and Technology
Research 3 (2014), 1751–1756.
R. Sumangali, B. Srinivasan, and P. Narendran. 2013. Multi-modal feature extraction scheme for hand
surface verification systems. International Archive of Applied Sciences & Technology 4, 2 (2013), 46–55.
Prakash Surya and Phalguni Gupta. 2015. Ear Biometrics in 2D and 3D: Localization and Recognition,
vol. 10, Springer.
M. R. Swati and M. Ravishankar. 2013. Finger Knuckle print recognition based on Gabor feature and KPCA+
LDA. In Proceedings of the International Conference on Emerging Trends in Communication, Control,
Signal Processing & Computing Applications. 1–5.


Tsinghua Finger-Vein Finger-Dorsal Image Database. 2012. Homepage. Retrieved from http://www.sz.
tsinghua.edu.cn/labs/vipl/thu-fvfdt.html.
The Hong Kong Polytechnic University Finger Knuckle Print Database. 2009. Homepage. http://www4.comp.
polyu.edu.hk/∼biometrics/FKP.htm.
The Hong Kong Polytechnic University Contactless Finger Knuckle Image Database, Version 1.0. 2012.
Homepage. Retrieved from http://www.comp.polyu.edu.hk/∼csajaykr/fn1.htm.
Yong-mei Tian, Ying-hui Ma, and Le-tao Qi. 2012. A location algorithm of inner-knuckle-print based on
dimension reduction. Journal of Hebei University of Science and Technology 5 (2012).
J. A. Unar, Woo Chaw Seng, and Almas Abbasi. 2014. A review of biometric technology along with trends
and prospects. Pattern Recognition 47 (2014), 2673–2688.
Gaurav Verma and Aloka Sinha. 2014. Finger knuckle print based verification using minimum average
correlation energy filter. International Journal of Electronic Commerce Studies 5 (2014), 233–246.
Irina Vorobyeva, Darci Guriel, Mary Ferguson, and Helina Oladapo. 2014. Benefits and issues of biometric
technologies. Are biometrics worth using? In IEEE SOUTHEASTCON 2014. 1–8.
C. Y. Wang, S. L. Song, F. R. Sun, and L. M. Mei. 2006. A novel biometrics technology finger back articular
skin texture recognition. Acta Automat (2006), 360–367.
Yang Wankou, Sun Changyin, and Zhenyu Wang. 2011b. Finger-knuckle-print recognition using Gabor
feature and MMDA. Frontiers of Electrical and Electronic Engineering in China 6, 2 (2011), 374–380.
Damon L. Woodard and Patrick J. Flynn. 2005. Finger surface as a biometric identifier. Computer Vision and
Image Understanding 100, 3, 357–384.
Damon L. Woodard and Patrick J. Flynn. 2005. Personal identification utilizing finger surface features. In
Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition,
vol. 2. 1030–1036.
Damon L. Woodard. 2004. Exploiting Finger Surface as a Biometric Identifier. Ph.D. dissertation. Graduate
School of the University of Notre Dame, Indiana.
M. Xianguang, L. Xiaozheng, L. Shengli, and D. Hongyue. 2008. A new segmentation algorithm of finger
crease based on wavelet and radon projection. In Proceedings of IEEE 4th International Conference on
Wireless Communications, Networking and Mobile Computing. 1–4.
Ming Xiong, Wankou Yang, and Changyin Sun. 2011. Finger-knuckle-print recognition using LGBP. In
Advances in Neural Networks. Springer, Berlin, 270–277.
Xuemiao Xu, Qiang Jin, Le Zhou, Jing Qin, Tien-Tsin Wong, and Guoqiang Han. 2015. Illumination-invariant
and deformation-tolerant inner knuckle print recognition using portable devices. Sensors 15, 4326–4352.
Ying Xu, Yi-Kui Zhai, Jun-Ying Gan, Jun-Ying Zeng, and Yu Huang. 2014. Finger-knuckle print recognition
based on image sets and convex optimization. In Proceedings of the IEEE International Conference on
Machine Learning and Cybernetics, vol. 1. 58–64.
Yang Wankou, Sun Changyin, and Sun Zhonzxi. 2011a. Finger Knuckle print Recognition using Gabor
feature and OLDA. Frontiers of Electrical and Electronic Engineering in China 6, 2 (2011), 374–380.
Lijun Yan, Linlin Tang, Shu-Chuan Chu, Xiaorui Zhu, Jun-Bao Li, and Xiaochuan Guo. 2014. Genetic
generalized discriminant analysis and its applications. In Modern Advances in Applied Intelligence.
Springer International Publishing, 246–255.
Xuekui Yan, Wenxiong Kang, Feiqi Deng, and Qiuxia Wu. 2015. Palm vein recognition based on multi-
sampling and feature-level fusion. Neurocomputing 151 (2015), 798–807.
Wenming Yang, Xiaola Huang, Fei Zhou, and Qingmin Liao. 2014a. Comparative competitive coding for
personal identification by using finger vein and finger dorsal texture fusion. Information Sciences 268
(2014), 20–32.
Wenming Yang, Yi Chao Li, and Qing Min Liao. 2014b. Fast and robust personal identification by fusion of
finger vein and finger knuckle print images. Applied Mechanics and Materials 556 (2014), 5085–5088.
Jun Yin, Jingbo Zhou, Zhong Jin, and Jian Yang. 2010. Weighted linear embedding and its applications to
finger-knuckle-print and palmprint recognition. In Proceedings of International Workshop on Emerging
Techniques and Challenges for Hand-Based Biometrics. 1–4.
Hongyang Yu, Gongping Yang, Zhuoyi Wang, and Lin Zhang. 2015. A new finger-knuckle-print ROI extraction
method based on two-stage center point detection. International Journal of Signal Processing, Image
Processing and Pattern Recognition 8, 2 (2015), 185–200.
Peng Fei Yu, Hao Zhou, and Hai Yan Li. 2014. Personal identification using finger-knuckle-print based on
local binary pattern. Applied Mechanics and Materials 441 (2014), 703–706.
Kam Yuen Cheng and Ajay Kumar. 2012. Contactless finger knuckle identification using smartphones. In
Proceedings of the International Conference of the Biometrics Special Interest Group (BIOSIG’12). 1–6.
Behnam Zeinali, Ahmad Ayatollahi, and Mohammad Kakooei. 2014. A novel method of applying directional
filter bank (DFB) for finger-knuckle-print (FKP) recognition. In Proceedings of IEEE 22nd Iranian
Conference on Electrical Engineering. 500–504.
Yikui Zhai, Junying Gan, Ying Xu, and Junying Zeng. 2012. Fast sparse representation for finger-knuckle-
print recognition based on smooth L0 norm. In Proceedings of IEEE 11th International Conference on
Signal Processing (ICSP’12), vol. 3. 1587–1591.
Lin Zhang and D. Zhang. 2010. Monogenic code: A novel fast feature coding algorithm with applications
to finger-knuckle-print recognition. In Proceedings of IEEE International Workshop on Emerging Tech-
niques and Challenges for Hand-Based Biometrics. 1–4.
Lin Zhang and Hongyu Li. 2012. Encoding local image patterns using Riesz transforms: With applications
to palm print and finger-knuckle-print recognition. Image and Vision Computing (2012), 1043–1051.
Yanqiang Zhang, Dongmei Sun, and Zhengding Qiu. 2010. Hand-based feature level fusion for single sample
biometrics recognition. In Proceedings of IEEE International Workshop on Emerging Techniques and
Challenges for Hand-Based Biometrics. 1–4.
Yanqiang Zhang, Dongmei Sun, and Zhengding Qiu. 2012. Hand-based single sample biometrics recognition. Neural Computing and Applications 21, 8 (2012), 1835–1844.
Lin Zhang, Hongyu Li, and Ying Shen. 2011. A novel Riesz transforms based coding scheme for finger-
knuckle-print recognition. In Proceedings of IEEE International Conference on Hand-Based Biometrics
(ICHB’11). 1–6.
Lin Zhang, Huaqing Li, and Jianwei Niu. 2012a. Fragile bits in palmprint recognition. IEEE Signal Process-
ing Letters 19, 10 (2012), 663–666.
Lin Zhang, Lei Zhang, David Zhang, and Zhenhua Guo. 2012b. Phase congruency induced local features for
finger knuckle print recognition. Pattern Recognition 45, 7 (2012), 2522–2531.
Lin Zhang, Lei Zhang, David Zhang, and H. Zhu. 2010. Online finger-knuckle-print verification for personal authentication. Pattern Recognition 43, 7 (2010), 2560–2571.
Lin Zhang, Lei Zhang, and David Zhang. 2009a. Finger-knuckle-print verification based on band-limited phase-only correlation. In Proceedings of the International Conference on Computer Analysis of Images and Patterns (CAIP'09), LNCS 5702. 141–148.
Lin Zhang, Lei Zhang, and David Zhang. 2009b. Finger knuckle print: A new biometric identifier. In Pro-
ceedings of the IEEE International Conference on Image Processing. 1981–1984.
Lin Zhang, Lei Zhang, David Zhang, and H. Zhu. 2011. Ensemble of local and global information for finger-knuckle-print recognition. Pattern Recognition 44 (2011), 1990–1998.
D. Zhang, W. K. Kong, J. You, and M. Wong. 2003. Online palmprint identification. IEEE Transactions on Pattern Analysis and Machine Intelligence 25, 9 (2003), 1040–1050.
L. Zhang, Y. Shen, H. Li, and J. Lu. 2015. 3D palmprint identification using block-wise features and collaborative representation. IEEE Transactions on Pattern Analysis and Machine Intelligence 37, 8 (2015), 1730–1736.
David Zhang, Zhenhua Guo, and Yazhuo Gong. 2016. An online system of multispectral palmprint verifica-
tion. In Multispectral Biometrics. Springer International Publishing, 117–137.
Gang Zheng, Chia-Jiu Wang, and E. Boult. 2007. Application of projective invariants in hand geometry
biometrics. IEEE Transactions on Information Forensics and Security 2 (2007), 758–768.
Yingbo Zhou and Ajay Kumar. 2011. Human identification using palm-vein images. IEEE Transactions on
Information Forensics and Security 6 (2011), 1259–1274.
Le-qing Zhu. 2011. Finger knuckle print recognition based on SURF algorithm. In Proceedings of the 8th
IEEE International Conference on Fuzzy Systems and Knowledge Discovery, vol. 3. 1879–1883.
Le-qing Zhu and San-yuan Zhang. 2010. Multimodal biometric identification system based on finger geometry, knuckle print and palm print. Pattern Recognition Letters 31 (2010), 1641–1649.
Le-qing Zhu, San-yuan Zhang, and R. Xing. 2009. Automatic personal authentication based on finger phalangeal prints. Acta Automatica Sinica 35, 7 (2009), 875–881.

Received September 2015; revised March 2016; accepted May 2016