
Jour of Adv Research in Dynamical & Control Systems, Vol. 10, 01-Special Issue, 2018
ISSN 1943-023X. Received: 5 Dec 2017 / Accepted: 15 Jan 2018

Local Extreme Edge Binary Patterns for Face Recognition and Image Retrieval
G. Sucharitha, Department of Electronics and Communication, KL University, India. E-mail: sucharithasu@gmail.com
Ranjan K. Senapati, Department of Electronics and Communication, KL University, India. E-mail: ranjan.senapati@kluniversity.in
Abstract--- In this paper, a new approach for image retrieval, the local extreme edge binary pattern (LEEBP), is
proposed. The standard local binary pattern (LBP) encodes the sign information of the local differences calculated
between the center pixel and its neighbours. The proposed method differs from the existing LBP in that it collects
the sign code using the magnitudes of the local differences in all directions, i.e. 0°, 45°, 90° and 135°. The local
difference in each direction is calculated between the center pixel and its directional neighbours. Then, the binary
pattern in the respective direction is generated from the signs of the magnitude-sorted local differences. Finally,
LEEBP utilizes the information of all four edges in generating the binary code for each edge. The performance of
the proposed method is verified on three different databases and compared with LBP, block-based local binary
pattern (Blk_LBP), center-symmetric local binary pattern (CS_LBP), directional local extrema pattern (DLEP) and
local maximum edge binary pattern (LMEBP). The results show an appreciable improvement in the assessment
measures compared with the existing methods on the respective databases.
Keywords--- Local Extreme Edge Patterns, Local Quantized Patterns, Image Retrieval, Histogram, Corel-10K,
AT&T Face Database, Texture Database.

I. Introduction
Retrieving images similar to a particular query image from a database is called content-based image retrieval
(CBIR). CBIR utilizes the low-level features of an image, such as color, texture, shape and spatial layout, in order
to characterize and index the image. CBIR aims to reduce the semantic gap between the low-level features of an
image and the richness of human semantics. A complete and extensive literature survey is available in [1-4].
Among all the low-level image features, texture classification and texture feature extraction form an active
research area. As noted in [5], texture provides important information for image classification, since it describes
the content of many real-world images such as bricks, fabrics, clouds, trees and leaves. Texture analysis is often
concerned with detecting aspects of an image that are rotationally invariant, and it has gained extensive attention
in the fields of face recognition, medical imaging, image retrieval and object-based image coding. The mean and
variance of wavelet coefficients have been used as texture features for image retrieval [6]. Gabor and discrete
wavelet transforms are widely used for texture feature analysis [7-8]. Moghaddam et al. proposed the Gabor
wavelet correlogram (GWC) for CBIR [9]. Ahmadian et al. used the wavelet transform for texture classification
[10]. Ojala et al. [11] proposed local binary patterns (LBP) for extracting the local information of each pixel from
its neighbouring pixels; in addition, LBP was extended to uniform and rotation-invariant patterns [12]. Moreover,
LBP has been used for facial expression analysis and recognition [13-14], object tracking [15], texture
classification, etc.
Extensions of LBP were introduced for better results. Heikkilä et al. [16] proposed a modified version of LBP, the
center-symmetric local binary pattern (CS-LBP), which combines LBP with the scale-invariant feature transform
(SIFT) to describe regions of interest. The completed LBP considers both sign and magnitude in generating the
binary pattern [17]. Dominant LBP [18], line edge patterns for segmentation and image retrieval (LEPSEG &
LEPINV) [19], the local ternary pattern (LTP) [20], etc. have been proposed for image texture feature analysis and
extraction. Qian et al. [21] proposed the pyramid LBP, which extracts multi-resolution images from the original
image using a low-pass filter and LBP. The homogeneity of LBP restricts its ability to capture the edge
information of an image. To overcome this issue, Murala et al. [22] proposed directional local extrema patterns
(DLEP) to extract the edge information in all possible directions and applied them to CBIR. Hussain et al. [23]
proposed local quantized patterns for visual recognition. The block-based texture feature [24], which uses the LBP
texture feature as the source of image description, was proposed for CBIR. Subrahmanyam et al. proposed various
local patterns for texture feature extraction, such as local tetra patterns (LTrP) [25], local maximum edge binary
patterns (LMEBP) [26], local mesh patterns (LMeP) [27], directional binary wavelet patterns (DBWP) [28] and
local ternary co-occurrence patterns (LTCoP) [29]. Verma et al. [30] proposed texture- and color-based image
retrieval using LBP and gray-level co-occurrence features.
Vipparthi et al. [31] proposed a combination of LMEBP and a magnitude local operator for image retrieval. The
concepts of LQP [23] and LMEBP [26] motivated the proposal of the local extreme edge binary patterns (LEEBP)
for CBIR.
1.1. Main Contributions
1. The proposed method collects local extreme edge binary patterns (LEEBP) for the query and database images.
2. The edge binary patterns are calculated by taking LMEBP as a reference.
3. The proposed method is tested on the AT&T face, Corel-10k and STex databases.
The paper is organized as follows. Section 1 presents the introduction, comprising the motivation, literature survey
and main contributions of the proposed method. A concise review of existing methods and of the proposed method
is given in Section 2. In Section 3, the framework of the proposed method is discussed. Section 4 presents the
experimental analysis and validation of the proposed method. Finally, conclusions are drawn in Section 5.

II. Review of Local Patterns


2.1. Local Binary Pattern
LBP was introduced by Ojala et al. [11] for rotation-invariant texture classification. Each pixel in the image is
considered as a centre pixel in turn. For a greyscale image I of size m×n pixels, I(g) denotes the gray level of the
g-th pixel in the image. The centre pixel acts as the threshold used to derive the local binary pattern over a small
3×3 spatial structure. The mathematical expression for LBP is given in Eqs. (1) and (2):

LBP_{P,R} = \sum_{i=1}^{P} 2^{(i-1)} f(g_i - g_c)   (1)

f(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}   (2)

where g_c is the gray value of the centre pixel, g_i is the gray value of the i-th pixel of the circularly symmetric
neighbourhood, P is the number of neighbours and R is the radius of the neighbourhood. After deriving the LBP
code for the whole image, a histogram is built to represent the image as per Eqs. (3) and (4):

His_{LBP}(bins) = \sum_{i=1}^{m} \sum_{j=1}^{n} H_1\big(LBP(i,j), bins\big), \quad bins \in [0, 2^{P} - 1]   (3)

H_1(u, v) = \begin{cases} 1, & u = v \\ 0, & \text{else} \end{cases}   (4)

where the size of the image is m×n.
Fig. 1 shows an example of calculating the LBP pattern for a 3×3 matrix. The histogram of these patterns captures
the distribution of edges in the image.

Figure 1: Calculation of LBP for a 3x3 Pattern
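To make Eqs. (1)-(4) concrete, the following Python sketch is a minimal, unoptimized implementation for the
common P = 8, R = 1 case using the eight immediate square neighbours (rather than interpolated circular sampling);
the function name compute_lbp and the neighbour ordering are our own choices and are not prescribed by the paper.

import numpy as np

def compute_lbp(img):
    """Basic LBP (P=8, R=1), Eqs. (1)-(4): threshold the 8 neighbours of each
    interior pixel against the centre and pack the resulting signs into a byte."""
    img = np.asarray(img, dtype=np.int32)
    m, n = img.shape
    # Neighbour offsets; the ordering fixes which neighbour gets which bit weight.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    lbp = np.zeros((m, n), dtype=np.uint8)
    for i in range(1, m - 1):
        for j in range(1, n - 1):
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                # f(g_i - g_c) = 1 if the neighbour is >= the centre pixel (Eq. 2)
                if img[i + di, j + dj] - img[i, j] >= 0:
                    code |= 1 << bit          # weight 2^(i-1) as in Eq. (1)
            lbp[i, j] = code
    # Histogram over all 2^P = 256 possible codes (Eqs. 3-4)
    hist, _ = np.histogram(lbp[1:-1, 1:-1], bins=256, range=(0, 256))
    return lbp, hist

# Example usage on a random 64x64 "image"
if __name__ == "__main__":
    test = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    codes, hist = compute_lbp(test)
    print(codes.shape, hist.sum())   # (64, 64); histogram counts the 62*62 interior pixels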


2.2. Local Maximum Edge Binary Patterns
Subrahmanyam et al. [26] proposed this method to capture the maximum edge information of an image. The first
maximum edge is obtained from the magnitudes of the local differences between the center pixel and all of its
neighbours. After the differences are calculated, all the values are arranged in descending order of magnitude, as
shown in Fig. 2.

Figure 2: Example for Local Maximum Edge Binary Pattern

ISSN 1943-023X 645


Received: 5 Dec 2017/Accepted: 15 Jan 2018
Jour of Adv Research in Dynamical & Control Systems, Vol. 10, 01-Special Issue, 2018

Similarly, the remaining seven patterns are calculated after obtaining the first maximum edge. In total, eight
maximum edges are evaluated using nine binary values.
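As a rough illustration of this step only (not of the full LMEBP descriptor), the sketch below, under our own
assumption of a row-major neighbour ordering, sorts the eight centre-to-neighbour differences of a single 3×3
window in descending order of magnitude and returns their signs; the first element then corresponds to the first
maximum edge.

import numpy as np

def lmebp_signs(window):
    """Sketch of the first-maximum-edge idea for one 3x3 window: sort the eight
    centre-to-neighbour differences by magnitude (descending) and return their
    signs as a binary vector (1 for >= 0, else 0)."""
    window = np.asarray(window, dtype=np.int32)
    centre = window[1, 1]
    neighbours = np.delete(window.flatten(), 4)     # the 8 surrounding pixels
    diffs = neighbours - centre
    order = np.argsort(-np.abs(diffs))              # descending by magnitude
    return (diffs[order] >= 0).astype(np.uint8)

print(lmebp_signs([[12, 200, 3], [40, 50, 90], [5, 60, 77]]))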
2.3. Local Extreme Edge Binary Patterns
The proposed LEEBP computes four edges for each pixel of a given image. The four edges are calculated from the
local differences between the center pixel and its neighbours along the four directions, i.e. ±0°, ±45°, ±90° and
±135°, as shown in Fig. 3. The four edges of each pixel are calculated using the following equations, where the
pixel for which the edges are calculated is denoted I(g_c) = I(i, j).
The 0° edge:
f(0^\circ) = \begin{cases} I(i, j-1) - I(g_c) \\ I(i, j-2) - I(g_c) \\ I(i, j+1) - I(g_c) \\ I(i, j+2) - I(g_c) \end{cases}   (5)
The 45° edge:
f(45^\circ) = \begin{cases} I(i-1, j+1) - I(g_c) \\ I(i-2, j+2) - I(g_c) \\ I(i+1, j-1) - I(g_c) \\ I(i+2, j-2) - I(g_c) \end{cases}   (6)
The 90° edge:
f(90^\circ) = \begin{cases} I(i-2, j) - I(g_c) \\ I(i-1, j) - I(g_c) \\ I(i+2, j) - I(g_c) \\ I(i+1, j) - I(g_c) \end{cases}   (7)
The 135° edge:
f(135^\circ) = \begin{cases} I(i-1, j-1) - I(g_c) \\ I(i-2, j-2) - I(g_c) \\ I(i+1, j+1) - I(g_c) \\ I(i+2, j+2) - I(g_c) \end{cases}   (8)
After calculating the local differences in all directions, the differences of each edge are sorted by magnitude using
the following equations:

f(I_{0^\circ}) = \mathrm{Sort}\big(I_{0^\circ}(1), I_{0^\circ}(2), I_{0^\circ}(3), I_{0^\circ}(4)\big)   (9)
f(I_{45^\circ}) = \mathrm{Sort}\big(I_{45^\circ}(1), I_{45^\circ}(2), I_{45^\circ}(3), I_{45^\circ}(4)\big)   (10)
f(I_{90^\circ}) = \mathrm{Sort}\big(I_{90^\circ}(1), I_{90^\circ}(2), I_{90^\circ}(3), I_{90^\circ}(4)\big)   (11)
f(I_{135^\circ}) = \mathrm{Sort}\big(I_{135^\circ}(1), I_{135^\circ}(2), I_{135^\circ}(3), I_{135^\circ}(4)\big)   (12)

where Sort(·) arranges the four directional local differences according to their magnitudes while retaining their
signs. A binary value is then assigned to each sorted difference: '1' if it is positive (or zero), otherwise '0':

f(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}   (13)

Finally, the four edge codes are obtained from the following equations:

LEEBP_{0^\circ} = 2^{0} f\big(I_{0^\circ}(1)\big) + 2^{1} f\big(I_{45^\circ}(1)\big) + 2^{2} f\big(I_{90^\circ}(1)\big) + 2^{3} f\big(I_{135^\circ}(1)\big)   (14)
LEEBP_{45^\circ} = 2^{0} f\big(I_{0^\circ}(2)\big) + 2^{1} f\big(I_{45^\circ}(2)\big) + 2^{2} f\big(I_{90^\circ}(2)\big) + 2^{3} f\big(I_{135^\circ}(2)\big)   (15)
LEEBP_{90^\circ} = 2^{0} f\big(I_{0^\circ}(3)\big) + 2^{1} f\big(I_{45^\circ}(3)\big) + 2^{2} f\big(I_{90^\circ}(3)\big) + 2^{3} f\big(I_{135^\circ}(3)\big)   (16)
LEEBP_{135^\circ} = 2^{0} f\big(I_{0^\circ}(4)\big) + 2^{1} f\big(I_{45^\circ}(4)\big) + 2^{2} f\big(I_{90^\circ}(4)\big) + 2^{3} f\big(I_{135^\circ}(4)\big)   (17)

After the edge codes are calculated for every pixel, individual histograms are constructed for the four directions
and concatenated to form the feature vector:

H_{LEEBP}^{\alpha}(k) = \sum_{i=1}^{M} \sum_{j=1}^{N} f_1\big(LEEBP^{\alpha}(i, j), k\big), \quad k \in [0, 15], \; \alpha \in \{0^\circ, 45^\circ, 90^\circ, 135^\circ\}   (18)

f_1(u, v) = \begin{cases} 1, & u = v \\ 0, & \text{else} \end{cases}   (19)
The calculation of LEEBP for a centre pixel (marked in red) is shown in Fig. 3. The local differences between the
centre pixel and its directional pixels in the horizontal, vertical, diagonal and anti-diagonal directions are
calculated. These local differences are then sorted in ascending order of magnitude for every direction. The sorted
local differences are coded as '1' or '0' based on their signs. Finally, the codes from the 0° edge to the 135° edge are
formed using all the direction edges (H, V, D, A).
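The construction above can be summarised in the following Python sketch, a minimal implementation of
Eqs. (5)-(19) written for this description; the function names leebp_codes and leebp_feature, the offset dictionary
and the ascending-magnitude sort are our own reading of the equations, and border handling is limited to skipping
a 2-pixel margin.

import numpy as np

# Offsets of the four directional neighbours (distance 1 and 2 on both sides)
# in the 0, 45, 90 and 135 degree directions, per Eqs. (5)-(8).
DIRECTIONS = {
    0:   [(0, -1), (0, -2), (0, 1), (0, 2)],
    45:  [(-1, 1), (-2, 2), (1, -1), (2, -2)],
    90:  [(-2, 0), (-1, 0), (2, 0), (1, 0)],
    135: [(-1, -1), (-2, -2), (1, 1), (2, 2)],
}

def leebp_codes(img):
    """Return the four LEEBP edge maps (values 0..15), one per direction."""
    img = np.asarray(img, dtype=np.int32)
    m, n = img.shape
    maps = {a: np.zeros((m, n), dtype=np.uint8) for a in DIRECTIONS}
    for i in range(2, m - 2):
        for j in range(2, n - 2):
            sorted_signs = {}
            for angle, offs in DIRECTIONS.items():
                diffs = np.array([img[i + di, j + dj] - img[i, j] for di, dj in offs])
                order = np.argsort(np.abs(diffs))              # ascending by magnitude
                sorted_signs[angle] = (diffs[order] >= 0).astype(int)   # Eq. (13)
            # Eqs. (14)-(17): the k-th edge code combines the k-th sorted sign
            # from every direction into a 4-bit value.
            for k, angle in enumerate(DIRECTIONS):             # k = 0..3 -> edges 1..4
                code = (sorted_signs[0][k] * 1 + sorted_signs[45][k] * 2 +
                        sorted_signs[90][k] * 4 + sorted_signs[135][k] * 8)
                maps[angle][i, j] = code
    return maps

def leebp_feature(img):
    """Concatenate the four 16-bin histograms (Eqs. (18)-(19)) into a length-64 vector."""
    maps = leebp_codes(img)
    hists = [np.histogram(maps[a][2:-2, 2:-2], bins=16, range=(0, 16))[0]
             for a in (0, 45, 90, 135)]
    return np.concatenate(hists).astype(np.float64)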


The calculation and its result on a face image are shown in Fig. 3 and Fig. 4. A color face image is taken, and the
edges in all directions are shown in Fig. 4(a)-(e). All four edges provide the edge information in their respective
directions. The 0° direction edge carries more edge information compared with the 45°, 90° and 135° edges.

Figure 3: Structure of Local Extreme Edge Binary Pattern

III. Proposed System Framework


1. First, convert the RGB image into a gray-scale image.
2. Compute the local differences between the center pixel and its directional pixels in the 0°, 45°, 90° and 135°
directions for each pixel in the image using Eqs. (5)-(12).
3. Calculate the LEEBP code for each edge using Eqs. (13)-(17).
4. Construct the histogram for each direction using Eqs. (18) and (19).
5. Concatenate all four histograms to construct the feature vector.
6. Compare the database images with the query image using the d1 distance of Eq. (20).
7. Retrieve the images based on the best matches (see the sketch after this list).
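As an informal sketch of steps 1-7 (our own wrapper, not code from the paper), the helper below ranks the database
images against a query given any feature extractor (e.g. the leebp_feature sketch of Section 2.3) and any distance
function (e.g. the d1 distance of Eq. (20) given in Section 3.1).

import numpy as np

def retrieve(query_img, database_imgs, feature_fn, distance_fn, top_k=10):
    """Rank the database images by distance to the query and return the indices
    of the top_k best matches (smaller distance = better match)."""
    fq = np.asarray(feature_fn(query_img), dtype=float)
    dists = [distance_fn(fq, np.asarray(feature_fn(img), dtype=float))
             for img in database_imgs]
    return np.argsort(dists)[:top_k]

# Toy usage with stand-in feature/distance functions (replace them with the
# LEEBP feature extractor and the d1 distance for the actual system):
imgs = [np.random.randint(0, 256, (32, 32)) for _ in range(5)]
ranked = retrieve(imgs[0], imgs,
                  feature_fn=lambda im: im.mean(axis=0),
                  distance_fn=lambda a, b: np.abs(a - b).sum(),
                  top_k=3)
print(ranked)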

Figure 4: Results of LEEBP on a Face Image: a) Sample Image b) 0° Direction Feature Map c) 45° Direction
Feature Map d) 90° Direction Feature Map e) 135° Direction Feature Map


Figure 5: Framework of the Proposed Method


3.1. Similarity Metrics for Query Matching
Feature extraction has to be performed for all images, including the query image, and a feature vector database has
to be constructed for all the images in the database. Table 2 gives the feature vector length for all methods. After
the features of all images have been extracted, the similarity between the query image and the database images has
to be calculated. Various distance metrics are available; some are given in Eqs. (20)-(22). The proposed method
uses the d1 distance of Eq. (20) to measure the distance between the query and database image feature vectors.

d1 distance: d(q, b) = \sum_{i=1}^{f_{len}} \left| \frac{f_b(i) - f_q(i)}{1 + f_b(i) + f_q(i)} \right|   (20)

Canberra distance: d(q, b) = \sum_{i=1}^{f_{len}} \left| \frac{f_b(i) - f_q(i)}{f_b(i) + f_q(i)} \right|   (21)

Manhattan distance: d(q, b) = \sum_{i=1}^{f_{len}} \left| f_b(i) - f_q(i) \right|   (22)

where q is the query image, b is a database image, f_q and f_b are their feature vectors and f_len is the feature
vector length.
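For reference, the three distance measures of Eqs. (20)-(22) can be written as the following Python functions (a
straightforward transcription; skipping zero denominators in the Canberra distance is our own practical choice and
is not discussed in the paper).

import numpy as np

def d1(fq, fb):
    """d1 distance, Eq. (20)."""
    fq, fb = np.asarray(fq, float), np.asarray(fb, float)
    return np.sum(np.abs((fb - fq) / (1.0 + fb + fq)))

def canberra(fq, fb):
    """Canberra distance, Eq. (21); terms with a zero denominator are skipped."""
    fq, fb = np.asarray(fq, float), np.asarray(fb, float)
    denom = fb + fq
    mask = denom != 0
    return np.sum(np.abs((fb - fq)[mask] / denom[mask]))

def manhattan(fq, fb):
    """Manhattan (L1) distance, Eq. (22)."""
    fq, fb = np.asarray(fq, float), np.asarray(fb, float)
    return np.sum(np.abs(fb - fq))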

IV. Experimental Results


To validate the performance of the proposed method, one face image database and two texture image databases
have been used. Every image of each database is taken as a query image once. The advantage of the proposed
algorithm is confirmed using the evaluation measures recall and precision, against several recent local patterns for
texture-based image retrieval. Precision and recall are given in Eqs. (23)-(24); average precision and recall are
defined in Eqs. (25)-(26).

Precision (P) = \frac{\text{Number of relevant images retrieved}}{\text{Total number of images retrieved}}   (23)

Recall (R) = \frac{\text{Number of relevant images retrieved}}{\text{Total number of relevant images in the database}}   (24)

Avg. P = \frac{1}{N} \sum_{i=1}^{N} P_i   (25)

Avg. R = \frac{1}{N} \sum_{i=1}^{N} R_i   (26)

where N is the number of images in the database.
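In practice, Eqs. (23)-(26) can be evaluated per query as in the sketch below, assuming every image carries a group
(category) label; the toy labels in the usage example are hypothetical.

import numpy as np

def precision_recall(retrieved_labels, query_label, n_relevant_in_db):
    """Eqs. (23)-(24) for one query: precision over the retrieved set and
    recall against all relevant images of that group in the database."""
    retrieved_labels = np.asarray(retrieved_labels)
    hits = np.sum(retrieved_labels == query_label)   # relevant images retrieved
    return hits / len(retrieved_labels), hits / n_relevant_in_db

# Eqs. (25)-(26): average precision and recall over all queries (toy example)
queries = [(["cat", "cat", "dog"], "cat"), (["bus", "cat", "bus"], "bus")]
P, R = zip(*[precision_recall(labels, q, n_relevant_in_db=100)
             for labels, q in queries])
print(np.mean(P), np.mean(R))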
Experiment 1
The Corel-10k database [32] is larger and more diverse than the other Corel databases. It comprises 10,000 images
from 100 different groups, where each group has 100 images. It includes images of animals (e.g. fox, tiger, deer),
humans, natural scenes, ships, food, buses, army, ocean, cats, airplanes, etc. Each image in the database is of size
85×128. Some sample images from Corel-10k are shown in Fig. 6. The retrieved images for a query image are
shown in Fig. 7; the top-left image is the query image, and the remaining images are the retrieved results according
to their rank. Fig. 8(a) and (b) show the performance of the proposed algorithm in terms of precision and recall for
each group in the database. The average precision and average recall are shown in Fig. 8(c) and (d).
Table 1: Results (ARP and ARR) of the Proposed Method and Previous Methods on the Three Databases

           Corel-10K        S-Tex            AT&T Face
Method     ARP     ARR      ARP     ARR      ARP     ARR
CS_LBP     27      10.3     38      61       90      45.7
Blk_LBP    38.5    14.5     40      62.5     89      44.9
LBP        38      14.3     52      72       85      42.1
DLEP       40      14.9     53.5    72.5     92      50.7
LMEBP      39.5    15.1     53      73.5     94      51.8
LEEBP      47.3    17.7     56      77       97      56.5


Experiment 2
The S-Tex (Salzburg texture) image database [33] has 7,616 images, each of size 128×128. The database has 476
categories in total, and each category has 16 images. Some sample images are shown in Fig. 9. Different types of
textures, e.g. rock, wood, water, rubber, food, etc., exist in this database.
Every image of every category in the 7,616-image database is taken as a query image to evaluate the effectiveness
of the proposed method. Average precision and recall graphs are shown in Fig. 10(a) and (b). The proposed
method shows a major improvement in comparison with LBP, CS_LBP, Blk_LBP, DLEP and LMEBP.

Figure 6: Sample Images from Corel-10k Database


Experiment 3
The AT&T database [34] is a face database. It contains face images of 40 different subjects, with 10 images per
subject in various poses. Each image in the database is of size 92×112. Some sample images from this database are
shown in Fig. 11. The first image of each face category is taken as a query image. The retrieved results for a query
image are shown in Fig. 12.
The performance of the proposed method is shown in Fig. 13(a) and (b) in terms of precision and recall. The
proposed method shows a significant improvement in comparison with LBP, CS_LBP, Blk_LBP, DLEP and
LMEBP. Table 1 lists the average precision rate (ARP) and average retrieval rate (ARR) on the two texture image
databases and the AT&T face image database for the compared methods and the proposed algorithm. The ARR
results clearly show that the proposed algorithm performs better than the existing methods.

Figure 7: Retrieved Results for a Query Image from Corel-10k Database


Figure 8: Corel-10k Comparative Results for CS_LBP, Blk_LBP, LBP, DLEP, LMEBP and LEEBP: a) Group-wise
Precision b) Group-wise Recall (both versus image category) c) Average Precision d) Average Recall (both versus
number of images retrieved)


Figure 9: Sample images from S-Tex database

Figure 10: STex Database Results for CS_LBP, Blk_LBP, LBP, DLEP, LMEBP and LEEBP: (a) Average Precision
(b) Average Recall, both versus number of images retrieved (16 to 112)

Figure 11: Sample Images from AT&T face Database


Table 2: Feature Vector Length for a Given Query Image using Several Methods
Method Feature vector length
CS-LBP 16
LBP 256
DLEP 2048
LMEBP 4096
LEEBP 64
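The LEEBP entry in Table 2 follows directly from the construction in Section 2.3: each directional code of
Eqs. (14)-(17) packs four sign bits, so each of the four histograms of Eq. (18) has 2^4 = 16 bins and the
concatenated feature vector has 4 × 16 = 64 elements. This is why LEEBP stays compact compared with DLEP
(2048) and LMEBP (4096) while still using all four directional edges.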


Figure 12: Results on AT&T Database

Figure 13: Results on the AT&T Database for CS_LBP, Blk_LBP, LBP, DLEP, LMEBP and LEEBP: (a) Precision
(b) Recall, both versus number of images retrieved (1 to 10)

4.1. Computational Complexity

The retrieval time of similar images from the database depends strongly on the length of the feature vector. In
general, a distance metric takes more time for a lengthy feature vector when computing the difference between the
query image and the database images. A comparison of the feature vector lengths of the proposed method and the
other methods is given in Table 2 for speed evaluation. As per Table 2, the feature vector of the proposed method is
shorter than those of all the other methods except CS_LBP, even though the proposed method performs better than
the other methods in terms of retrieval, as shown in the experiments on the different databases. Moreover, DLEP
and LMEBP have lengthy feature vectors; hence, those algorithms take more time than the proposed method in
retrieving the images related to the query image.


V. Conclusion
A new image retrieval method has been proposed and evaluated on several image databases. The idea for this
method came from the local quantized pattern and the local maximum edge binary pattern. The four edges of each
pixel in all possible directions are calculated from the signs of the magnitude-sorted local differences. The feature
vector is obtained by concatenating the four directional histograms. The proposed method was tested on the STex
texture database, the Corel-10k database and the AT&T face image database. The efficiency of the proposed
algorithm is demonstrated by the experiments and supported by the comparison with the other algorithms.

References
[1] Wang, J.Z., Wiederhold, G., Firschein, O. and Wei, S.X. Content-based image indexing and searching
using Daubechies' wavelets. International Journal on Digital Libraries 1 (4) (1998) 311-328.
[2] Rui, Y., Huang, T.S. and Chang, S.F. Image retrieval: Current techniques, promising directions, and open
issues. Journal of visual communication and image representation 10 (1) (1999) 39-62.
[3] Long, F., Zhang, H. and Feng, D.D. Fundamentals of content-based image retrieval. Multimedia
Information Retrieval and Management. Springer, Berlin, Heidelberg, 2003, 1-26.
[4] Smeulders, A.W., Worring, M., Santini, S., Gupta, A. and Jain, R. Content-based image retrieval at the end
of the early years. IEEE Transactions on pattern analysis and machine intelligence 22 (12) (2000)
1349-1380.
[5] Liu, Y., Zhang, D., Lu, G. and Ma, W.Y. A survey of content-based image retrieval with high-level
semantics. Pattern recognition 40 (1) (2007) 262-282.
[6] Smith, J.R. and Chang, S.F. Automated binary texture feature sets for image retrieval. IEEE International
Conference on Acoustics, Speech, and Signal Processing, 1996, 2239-2242.
[7] Laine, A. and Fan, J. Texture classification by wavelet packet signatures. IEEE Transactions on pattern
analysis and machine intelligence 15 (11) (1993) 1186-1191.
[8] Loupias, E. and Sebe, N. Wavelet-based salient points: Applications to image retrieval using color and
texture features. International Conference on Advances in Visual Information Systems. Springer, Berlin,
Heidelberg, 2000, 223-232.
[9] Moghaddam, H.A., Khajoie, T.T. and Rouhi, A.H. A new algorithm for image indexing and retrieval using
wavelet correlogram. IEEE International Conference on Image Processing, 2003, 3-497.
[10] Ahmadian, A. and Mostafa, A. An efficient texture classification algorithm using Gabor wavelet. IEEE
25th Annual International Conference of the Engineering in Medicine and Biology Society, 2003, 930-933.
[11] Ojala, T., Pietikäinen, M. and Harwood, D. A comparative study of texture measures with classification
based on featured distributions. Pattern recognition 29 (1) (1996) 51-59.
[12] Ojala, T., Pietikainen, M. and Maenpaa, T. Multi resolution gray-scale and rotation invariant texture
classification with local binary patterns. IEEE Transactions on pattern analysis and machine intelligence
24 (7) (2002) 971-987.
[13] Ahonen, T., Hadid, A. and Pietikainen, M. Face description with local binary patterns: Application to face
recognition. IEEE transactions on pattern analysis and machine intelligence 28 (12) (2006) 2037-2041.
[14] Shan, C., Gong, S. and McOwan, P.W. Facial expression recognition based on local binary patterns: A
comprehensive study. Image and Vision Computing 27 (6) (2009) 803-816.
[15] Ning, J., Zhang, L., Zhang, D. and Wu, C. Robust object tracking using joint color-texture histogram.
International Journal of Pattern Recognition and Artificial Intelligence 23 (7) (2009) 1245-1263.
[16] Heikkilä, M., Pietikäinen, M. and Schmid, C. Description of interest regions with local binary
patterns. Pattern recognition 42 (3) (2009) 425-436.
[17] Guo, Z., Zhang, L. and Zhang, D. A completed modeling of local binary pattern operator for texture
classification. IEEE Transactions on Image Processing 19 (6) (2010) 1657-1663.
[18] Liao, S., Law, M.W. and Chung, A.C. Dominant local binary patterns for texture classification. IEEE
transactions on image processing 18 (5) (2009) 1107-1118.
[19] Yao, C.H. and Chen, S.Y. Retrieval of translated, rotated and scaled color textures. Pattern Recognition 36
(4) (2003) 913-929.
[20] Tan, X. and Triggs, B. Enhanced local texture feature sets for face recognition under difficult lighting
conditions. IEEE transactions on image processing 19 (6) (2010) 1635-1650.
[21] Qian, X., Hua, X.S., Chen, P. and Ke, L. PLBP: An effective local binary patterns texture descriptor with
pyramid representation. Pattern Recognition 44 (10-11) (2011) 2502-2515.


[22] Murala, S., Maheshwari, R.P. and Balasubramanian, R. Directional local extrema patterns: a new descriptor
for content based image retrieval. International journal of multimedia information retrieval 1 (3) (2012)
191-203.
[23] Ul Hussain, S. and Triggs, B. Visual recognition using local quantized patterns. Computer Vision–ECCV.
Springer, Berlin, Heidelberg, 2012, 716-729.
[24] Takala, V., Ahonen, T. and Pietikäinen, M. Block-based methods for image retrieval using local binary
patterns. Scandinavian Conference on Image Analysis. Springer, Berlin, Heidelberg, 2005, 882-891.
[25] Murala, S., Maheshwari, R.P. and Balasubramanian, R. Local tetra patterns: a new feature descriptor for
content-based image retrieval. IEEE Transactions on Image Processing 21 (5) (2012) 2874-2886.
[26] Subrahmanyam, M., Maheshwari, R.P. and Balasubramanian, R. Local maximum edge binary patterns: a
new descriptor for image retrieval and object tracking. Signal Processing 92 (6) (2012) 1467-1479.
[27] Murala, S. and Wu, Q.J. Local mesh patterns versus local binary patterns: biomedical image indexing and
retrieval. IEEE Journal of Biomedical and Health Informatics 18 (3) (2014) 929-938.
[28] Murala, S., Maheshwari, R.P. and Balasubramanian, R. Directional binary wavelet patterns for biomedical
image indexing and retrieval. Journal of Medical Systems 36 (5) (2012) 2865-2879.
[29] Murala, S. and Wu, Q.J. Local ternary co-occurrence patterns: a new feature descriptor for MRI and CT
image retrieval. Neurocomputing 119 (2013) 399-412.
[30] Verma, M., Raman, B. and Murala, S. Local extrema co-occurrence pattern for color and texture image
retrieval. Neurocomputing 165 (2015) 255-269.
[31] Vipparthi, S.K. and Nagar, S.K. Local extreme complete trio pattern for multimedia image retrieval
system. International Journal of Automation and Computing 13 (5) (2016) 457-467.
[32] Corel 10k Database. Available online: http://www.ci.gxnu.edu.cn/cbir/.
[33] STex, Salzburg Texture Image Database (STex), 2009. http://wavelab.at/sources/STex/.
[34] AT&T Laboratories Cambridge, The AT&T Database of Faces, 2002.
http://www.cl.cam.ac.uk/Research/DTG/attarchive/pub/data/att_faces.tar.Z
