Journal of King Saud University – Computer and Information Sciences 34 (2022) 3285–3293


Machine learning approach in melanoma cancer stage detection


Rashmi Patil *, Sreepathi Bellary
Department of Computer Science and Engineering, RYMEC, VTU, Bellary, Karnataka, India

* Corresponding author. E-mail addresses: rashmiashtagi@gmail.com (R. Patil), sreepathib@gmail.com (S. Bellary).

Article history: Received 11 May 2020; Revised 24 July 2020; Accepted 4 September 2020; Available online 9 September 2020.

Keywords: Classification; Neural network; Skin cancer; Thickness.

Abstract

Melanoma is a dangerous skin cancer that spreads very fast, which makes it the deadliest form of skin cancer and the cause of most skin-cancer deaths. Classifying the cancer stage is a tedious but essential task once a patient is diagnosed, because treatment at the time of surgery depends mainly on the stage of the cancer, i.e. on tumor thickness. In this paper, two methods are designed to classify melanoma cancer stages. The first system classifies melanoma as stage 1 or stage 2; the second classifies melanoma as stage 1, stage 2 or stage 3. The proposed system uses a convolutional neural network (CNN) with the Similarity Measure for Text Processing (SMTP) as the loss function. Experimental results with different loss functions are reported and compared with the proposed SMTP loss function. The proposed algorithm is more efficient than several other loss functions designed specifically for the classification problem.

© 2020 The Authors. Published by Elsevier B.V. on behalf of King Saud University. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). https://doi.org/10.1016/j.jksuci.2020.09.002

1. Introduction

Melanoma is a very deadly and rapidly increasing skin cancer worldwide. It is a particularly dangerous form of skin cancer because it spreads to the lymph nodes quickly, often before the cancer has even been identified. In 2016, the anticipated number of new melanoma cases in the U.S. was 76,380 and the anticipated number of deaths was 10,130, which amounts to more than one life claimed every hour in the U.S. The occurrence of melanoma is thus increasing around the globe (Pehamberger et al., 1987). Early detection and diagnosis may cure melanoma completely. The fundamental risk factor for melanoma, as for all skin cancers, is exposure to natural and artificial ultraviolet (UV) light; skin cancer can develop through tanning or through heredity. Melanoma metastasis is the most significant cause of death from the disease.

Dermoscopy images of lesions are inspected carefully. Pigmented and non-pigmented lesions must be distinguished first. If a lesion is pigmented, it must then be checked whether it is melanocytic or non-melanocytic. If it is recognized as a melanocytic lesion, it is further examined to determine whether it is benign or melanoma. If the lesion is identified as melanoma, the physician should begin treatment at the earliest possible time, since delayed diagnosis is what causes most deaths. The stage of melanoma plays a very important role in treating a melanoma patient: the decision at the time of surgical treatment depends mainly on the stage of the cancer, i.e. on tumor thickness, so the size and stage of the tumor are critical parameters.

Stages of melanoma skin cancer: under pathological inspection, the Clark scale and the Breslow index are the methods used to measure the thickness and depth of the tumor. These methods can be applied only after performing an incisional or excisional biopsy of the suspect lesion. The Clark scale measures the depth of melanoma growth and the skin levels that are affected; it has five levels of classification, with severity increasing with the level. The Breslow index (Breslow, 1970) also has five stages and helps to determine the width of the surgical excision margins.

SMTP (Chim and Deng, 2010) is a similarity measure in which the presence or absence of a feature is considered more significant than the difference between the values of a feature that is present in both items. Similarity increases as the difference between two values of a present feature decreases, and the contribution of that difference is normally scaled. Similarity decreases as the number of presence/absence mismatches increases, and a feature missing from both items contributes nothing to the similarity.

The key contribution of the proposed algorithm is to classify melanoma cancer stages based on tumor thickness without any invasive method.

Lesion segmentation is a basic pattern recognition step for detecting melanoma skin cancer in patients at the earliest stage; in later stages melanoma becomes one of the deadliest illnesses and its death rate is extremely high. Hence, a precise melanoma stage detection scheme is presented, based on the SMTP loss function, which classifies dermoscopic images by melanoma stage using a CNN with SMTP.

Two classification systems are proposed, both working with the same algorithm. Based on the thickness of the melanoma skin cancer, the stages are classified as given below (a minimal code sketch of this thickness-to-stage mapping is given at the end of this section).

Table 1 shows the identification of melanoma stages based on thickness for the two-stage system. It classifies melanoma into two categories: tumor thickness < 0.76 mm as stage 1 and tumor thickness ≥ 0.76 mm as stage 2.

Table 1
Classification in 2 stages of melanoma cancer based on thickness.

Stage       Thickness (mm)
Stage I     < 0.76
Stage II    ≥ 0.76

Table 2 shows the identification of melanoma stages based on thickness for the three-stage system. The first system classifies melanoma into two categories, tumor thickness < 0.76 mm as stage 1 and tumor thickness ≥ 0.76 mm as stage 2. The second system classifies melanoma into three categories or stages: tumor thickness < 0.76 mm as stage 1, tumor thickness ≥ 0.76 mm and < 1.5 mm as stage 2, and tumor thickness > 1.5 mm as stage 3. For this classification, a novel algorithm is proposed: stage classification using CNN with SMTP.

Table 2
Classification in 3 stages of melanoma cancer based on thickness.

Stage        Thickness (mm)
Stage I      < 0.76
Stage II     ≥ 0.76 and < 1.5
Stage III    > 1.5

Many methods are available that separate melanoma from benign lesions using dermoscopic images, but it is also essential to identify the stage of melanoma by estimating its thickness from dermoscopic images. Very little work has been carried out on identifying the stage or type of melanoma with a non-invasive method, yet the stage of the cancer is very important for the prognosis of the patient. In Sáez and Sánchez-Monedero (2016), the authors worked on a non-invasive method to detect the cancer stage based on tumor thickness. The framework proposed here also detects the cancer stage based on tumor thickness and yields improved results over existing systems. The main advantage of the proposed system is that it classifies the stage of cancer with higher accuracy: the proposed SMTP loss function produces far less loss than all the other loss functions considered, which improves sensitivity, specificity, and accuracy.
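As a concrete illustration of these thresholds, the following is a minimal Python sketch (added here, not part of the original system) that maps a Breslow thickness in millimetres to the stage labels of Tables 1 and 2; the function names and the handling of the exact 1.5 mm boundary are assumptions, since the paper lists stage II as ≥ 0.76 and < 1.5 mm and stage III as > 1.5 mm.

    def stage_two_level(thickness_mm: float) -> str:
        """Two-stage scheme (Table 1): split at 0.76 mm Breslow thickness."""
        return "Stage I" if thickness_mm < 0.76 else "Stage II"

    def stage_three_level(thickness_mm: float) -> str:
        """Three-stage scheme (Table 2): thresholds at 0.76 mm and 1.5 mm.
        A thickness of exactly 1.5 mm is assigned to Stage II here (assumption)."""
        if thickness_mm < 0.76:
            return "Stage I"
        if thickness_mm <= 1.5:
            return "Stage II"
        return "Stage III"

    if __name__ == "__main__":
        for t in (0.4, 0.9, 2.1):
            print(t, stage_two_level(t), stage_three_level(t))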
2. Literature survey

There are many methods to identify melanoma. Melanoma needs to be classified at an early stage so that treatment of the patient can start at the earliest. Many methods and techniques classify melanoma and benign skin lesions (Sangve and Patil, 2014), and many automatic systems exist that identify melanoma and benign skin lesions from dermoscopy images using features extracted from those images. There is, however, very little work on classifying melanoma by stage based on tumor thickness, even though tumor thickness is critical for starting the prognosis of a melanoma patient. Invasive methods are available, in which a pathologist examines the skin lesion after an incisional or excisional biopsy; a non-invasive automatic method is required to recognize the stage or thickness of a suspected lesion (Patil and Bellary, 2017).

Barata et al. (2014) proposed two distinct systems to detect melanoma in dermoscopic images: the first uses global characteristics and the second uses local characteristics. These systems classify whether the lesion is benign or melanoma, using mainly color and texture features for feature extraction and classification. The dataset used is PH2 (Hospital Pedro Hispano), and the authors concluded that color characteristics achieve better results than texture characteristics. Only color and texture features are used, and the global method may fail when the object is complex.

Ma and Tavares (2016) proposed a system to segment the infected skin lesion in dermoscopic images based on a deformable model. The algorithm combines the information contained in the dermoscopic image and defines a speed function based on lightness, saturation and color information, so that the evolving curve is guided to stop at the lesion boundary. The database used is PH2. Deformable models are, however, semi-automatic.

Abuzaghleh et al. (2015) proposed two significant components of a non-invasive, real-time computerized skin lesion analysis framework for the early identification and prevention of melanoma. The first component is a real-time alert that helps users avoid sunburn caused by sunlight; a new equation that calculates the time for skin to burn is presented for this purpose. The second component is an automatic image analysis module, which comprises image acquisition, hair detection and removal, lesion segmentation, feature extraction, and classification. The proposed framework uses the PH2 image database.

Sáez and Sánchez-Monedero (2016) proposed an automated framework to assess melanoma thickness from dermoscopic images. Two supervised classification schemes were proposed. The first scheme, with two classes, classifies melanomas as thin or thick: the binary classification distinguishes between thin melanomas (thickness < 0.76 mm) and thick ones (thickness ≥ 0.76 mm). The second scheme, with three classes, classifies melanomas as thin, intermediate and thick, considering three depth ranges: thin (thickness < 0.76 mm), intermediate (thickness 0.76–1.5 mm) and thick (thickness > 1.5 mm). Logistic regression using initial variables and product units (LIPU), a blend of logistic regression and neural networks, is employed and maintains an impressive degree of precision. The results show that the LIPU model obtains accurate outcomes for both the two-class and the three-class variants of the problem.

Jaworek-Korjakowska et al. (2019) proposed a system to identify the stage of melanoma cancer based on tumor thickness. Transfer learning with a VGG-19 CNN is used, and the system classifies melanoma into three stages with 87.2% accuracy (see Table 3).

Reshma and Shan (2017) proposed a system based on the total dermoscopic score to identify the stage of melanoma. The image is pre-processed using rgb2gray conversion and a median filter to reduce noise, and the Sobel edge detection algorithm is used. Asymmetry, border irregularity, color variation and differential structure are the features used to calculate the total dermoscopic score.

Rubegni et al. (2010) proposed a system that classifies tumors into two classes based on melanoma thickness, thin and thick melanomas. Digital dermoscopy analysis uses computerized exploration of digital images and allows the investigation of morphological features of lesions through integration with the software. Melanoma images were assessed for 49 different properties, such as color, structure, texture and combinations of these. 141 melanoma images were used and 86.5% accuracy was obtained: 97 out of 108 thin melanomas and 25 out of 33 thick melanomas were correctly identified.

Table 3
Literature survey.

Reference                            Methodology                                              Limitation                        Advantage of proposed system (how the limitation is overcome)
Sáez and Sánchez-Monedero (2016)     Logistic regression using initial variables and          Accuracy is comparatively low.    Higher accuracy.
                                     product units.
Jaworek-Korjakowska et al. (2019)    Transfer learning with a VGG-19 CNN.                     Accuracy is comparatively low.    CNN with SMTP is more efficient in terms of accuracy,
                                                                                                                                sensitivity, specificity, recall, precision and F-measure.
Reshma and Shan (2017)               rgb2gray pre-processing, Sobel edge detection, and       Not efficient.                    Efficient.
                                     total dermoscopic score.
Rubegni et al. (2010)                Computerized examination of digital images and           Identifies only thin and          Identifies 2 stages as well as 3 stages more accurately.
                                     parametric investigation of morphological features       thick melanoma.
                                     of pigmented skin lesions.

Gong et al. (2020) proposed a decision fusion technique based on several pre-trained CNNs, which mainly addresses the generalization problem of CNNs. StyleGANs are trained on the ISIC 2019 dataset, and this improves the classification accuracy. Wang et al. (2020) presented a bi-directional dermoscopic feature learning scheme for feature extraction; the image parsing ability is improved by manipulating the feature propagation, and a multi-scale consistent decision fusion scheme is also proposed to enhance consistency and reliability. Wei et al. (2020) proposed detection and segmentation of skin cancer in dermoscopic images based on a lightweight deep learning network; the technique extracts discriminative lesion features and improves the identification performance of the model. Zhang et al. (2020) proposed a scheme that uses an improved whale optimization algorithm to optimize the CNN; the optimized algorithm selects the weights and biases of the network so as to reduce the error between the network output and the desired outcome.

3. System architecture

Fig. 1 shows the architecture of the melanoma stage identification system. Initially, the system is trained using a training dataset and then testing is performed. The training dataset is the data file used for learning the system. A test dataset is a dataset that is independent of the training dataset but follows the same probability distribution as the training dataset. The CNN classification algorithm is then applied to classify the data and detect the patient's melanoma stage. A CNN is used for classification because the dataset consists of a large number of records, which would otherwise take too much time to classify; applying a CNN makes it possible to classify a large dataset in less time.

Classifiers assign every object to a class. This assignment is normally not ideal and objects may be assigned to an incorrect class. To evaluate a classifier, the correct class of the objects must be known; the class assigned by the classifier is compared with the correct class, which allows objects to be categorized as follows:

(1) TP: classifier correctly predicts the positive class.
(2) TN: classifier correctly predicts the negative class.
(3) FP: classifier wrongly predicts the positive class.
(4) FN: classifier wrongly predicts the negative class.

4. Proposed algorithm

4.1. Stage classification using CNN with SMTP algorithm

The proposed algorithm, CNN with SMTP, is built with the following architecture. The layers in the architecture are:

(1) Input
(2) Convolutional
(3) Rectified Linear Unit (ReLU)
(4) Pooling
(5) ReLU fully connected
(6) Softmax fully connected
(7) Loss function

Step 1: Input layer

The input x consists of n entries. Every entry is a d-dimensional dense vector, so x is represented as a feature map of dimensionality d × n. Convolution is then used for representation learning from w-grams. For an input sequence with n entries x_1, x_2, ..., x_n, n is the total number of features in the dataset and x_i is a feature.

Step 2: Convolution operation

The first operation applied is convolution. In this step the focus is on feature detectors, which essentially serve as the neural network's filters. The convolution of two functions f and g is defined as

(f * g)(t) \overset{\mathrm{def}}{=} \int_{-\infty}^{\infty} f(s)\, g(t - s)\, ds    (1)

The output of a linear operation such as convolution is passed through a nonlinear activation function. The most widely used nonlinear activation function is ReLU, which computes

f(x) = max(0, x)

A small discrete sketch of this convolution-plus-ReLU step follows.
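The following is a minimal NumPy sketch (an illustration added here, not the authors' code) of the discrete analogue of Eq. (1) followed by the ReLU activation; the signal and filter values are arbitrary.

    import numpy as np

    def relu(x):
        # f(x) = max(0, x), applied element-wise
        return np.maximum(0.0, x)

    # A toy 1-D input (e.g. one row of extracted features) and a small filter.
    signal = np.array([0.2, -1.0, 0.5, 3.0, -0.7, 1.2])
    kernel = np.array([0.25, 0.5, 0.25])

    # Discrete convolution: (f * g)[t] = sum_s f[s] * g[t - s]
    conv_out = np.convolve(signal, kernel, mode="valid")

    # Nonlinear activation applied to the convolution output
    activated = relu(conv_out)
    print(conv_out, activated)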


Fig. 1. System architecture.

Step 3: ReLU layer

Next, the Rectified Linear Unit is used as the activation function; it is essentially applied in the hidden layers of the neural network.

A(x) = max(0, x). The output is x if x is positive and 0 otherwise.

Uses: ReLU is computationally cheaper than sigmoid and tanh because it involves simpler mathematical operations. At any time only a few neurons are activated, which makes the network sparse and efficient to compute.

Step 4: Pooling

The pooling operation slides a two-dimensional filter over each channel of the feature map and summarizes the features lying within the region covered by the filter.

Step 5: Full connection

At this stage everything covered so far is merged together. The two processes described before, convolution and pooling, can be thought of as a feature extractor; these features, usually reshaped into a single row vector, are then passed further into the network, for instance into a multi-layer perceptron trained for classification.

Step 6: Softmax

Softmax is a sigmoid-like function that is convenient for handling classification problems. The standard (unit) softmax function is

\sigma(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \quad i = 1, \ldots, K, \quad z = (z_1, \ldots, z_K) \in \mathbb{R}^K    (2)

Uses: softmax is used when there are more than two classes. It squashes the output for each class into the range 0 to 1 and divides by the sum of the outputs.
Output: softmax is used in the output layer of the CNN classifier, where the probabilities that specify the class of every input are computed.
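As a quick numerical illustration of Eq. (2) (added here as a sketch, not taken from the paper), a numerically stable softmax can be written as follows.

    import numpy as np

    def softmax(z):
        # Subtracting the max makes the exponentials numerically stable
        # without changing the result of Eq. (2).
        e = np.exp(z - np.max(z))
        return e / e.sum()

    logits = np.array([2.0, 1.0, 0.1])   # raw scores for three stages
    probs = softmax(logits)
    print(probs, probs.sum())            # probabilities sum to 1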


Step 7: Loss function

The loss function measures the difference between the real value and the predicted value. Here, the proposed SMTP measure is used. The characteristics of SMTP are as follows:

- The presence or absence of a feature is more significant than the difference between the values of a feature that is present.
- The similarity should increase as the difference between two non-zero values of a particular feature decreases.
- The similarity should decrease as the number of presence/absence mismatches increases.

The function F is given as

F(d_1, d_2) = \frac{\sum_{j=1}^{m} N_{*}(d_{1j}, d_{2j})}{\sum_{j=1}^{m} N_{\cup}(d_{1j}, d_{2j})}    (3)

N_{*}(d_{1j}, d_{2j}) =
\begin{cases}
0.5\left(1 + \exp\left(-\left(\frac{d_{1j} - d_{2j}}{\sigma_j}\right)^2\right)\right) & \text{if } d_{1j} \cdot d_{2j} > 0 \\
0 & \text{if } d_{1j} = 0 \text{ and } d_{2j} = 0 \\
-\lambda & \text{otherwise}
\end{cases}    (4)

N_{\cup}(d_{1j}, d_{2j}) =
\begin{cases}
0 & \text{if } d_{1j} = 0 \text{ and } d_{2j} = 0 \\
1 & \text{otherwise}
\end{cases}    (5)

S_{SMTP}(d_1, d_2) = \frac{F(d_1, d_2) + \lambda}{1 + \lambda}    (6)

where d_1 is the vector of true-label probabilities, d_2 is the vector of predicted-label probabilities, j indexes the features (m features in total), \sigma_j is the standard deviation of all non-zero values of feature j, and \lambda is a small constant between 0.0001 and 0.01.
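The following NumPy sketch (an added illustration, not the authors' implementation) evaluates Eqs. (3)–(6) for two feature vectors; treating 1 − S_SMTP as the loss is an assumption made here for illustration, since the paper only states that SMTP is used as the loss function.

    import numpy as np

    def smtp_similarity(d1, d2, sigma, lam=0.001):
        """S_SMTP of Eq. (6) for vectors d1 (true) and d2 (predicted).

        sigma[j] is the standard deviation of the non-zero values of feature j,
        lam is the lambda constant (the paper suggests 0.0001 to 0.01).
        """
        d1, d2, sigma = map(np.asarray, (d1, d2, sigma))
        both_present = (d1 * d2) > 0
        both_absent = (d1 == 0) & (d2 == 0)

        # Eq. (4): N_*
        n_star = np.where(
            both_present,
            0.5 * (1.0 + np.exp(-((d1 - d2) / sigma) ** 2)),
            np.where(both_absent, 0.0, -lam),
        )
        # Eq. (5): N_union
        n_union = np.where(both_absent, 0.0, 1.0)

        f = n_star.sum() / n_union.sum()      # Eq. (3)
        return (f + lam) / (1.0 + lam)        # Eq. (6)

    def smtp_loss(d1, d2, sigma, lam=0.001):
        # Assumed loss form: higher similarity -> lower loss.
        return 1.0 - smtp_similarity(d1, d2, sigma, lam)

    y_true = [0.0, 1.0, 0.0]    # one-hot true label
    y_pred = [0.1, 0.8, 0.1]    # softmax output
    sigma = [0.2, 0.2, 0.2]     # per-feature std of non-zero values (illustrative)
    print(smtp_loss(y_true, y_pred, sigma))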

4.2. Pseudo code: stage classification using CNN with SMTP algorithm

The pseudo code below follows the TensorFlow 1.x style of the original listing; smtp_loss is a placeholder for the custom SMTP loss of Section 4.1, not a built-in TensorFlow function.

(1) Read the input melanoma dataset into an array.

    data = read_data('melanoma.csv')
    labels = data['labels'].values
    num_classes = len(set(labels))   # number of distinct stage labels

(2) Create the placeholders used to feed data into the network.

    x = tf.placeholder(tf.float32, [None, num_features])   # num_features = 81 extracted features (Section 5.1)
    y = tf.placeholder(tf.float32, [None, num_classes])

(3) Divide the data into training and testing sets.

    x_train, y_train, x_test, y_test = split(data)

(4) Create the network. 1D-CNN layers are added because the input is a feature vector rather than an image (a 2D matrix [W, H]); a 1D convolution applies the filter along one direction and returns a 1D feature map. Max-pooling is added after the convolution.

    conv_1 = tf.nn.conv1d(x, filters, stride, padding)
    pool_1 = tf.layers.max_pooling1d(conv_1, pool_size, strides, padding)

(5) Apply the loss function, compute the cost, and apply the optimizer to minimize it.

    loss = smtp_loss(y, pred)   # pred: softmax output of the network; custom SMTP loss (Section 4.1)
    optimizer = tf.train.AdamOptimizer(learning_rate).minimize(loss)
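For readers who prefer a runnable, self-contained version, the following Keras (TensorFlow 2) sketch wires the layer stack of Section 4.1 to a custom loss; the layer sizes, the use of 1 minus a mean SMTP-style similarity as the loss, and the sigma/lambda values are assumptions for illustration, not the authors' exact configuration.

    import numpy as np
    import tensorflow as tf

    NUM_FEATURES, NUM_CLASSES = 81, 3
    LAM = tf.constant(0.001)     # lambda, assumed within the paper's 0.0001-0.01 range
    SIGMA = tf.constant(0.2)     # per-feature std of non-zero values, assumed constant here

    def smtp_loss(y_true, y_pred):
        """1 - SMTP-style similarity between one-hot labels and softmax outputs (assumed loss form)."""
        both_present = tf.cast(y_true * y_pred > 0, tf.float32)
        both_absent = tf.cast((y_true == 0) & (y_pred == 0), tf.float32)
        n_star = both_present * 0.5 * (1.0 + tf.exp(-tf.square((y_true - y_pred) / SIGMA))) \
                 + (1.0 - both_present - both_absent) * (-LAM)
        n_union = 1.0 - both_absent
        f = tf.reduce_sum(n_star, axis=-1) / tf.reduce_sum(n_union, axis=-1)
        return 1.0 - (f + LAM) / (1.0 + LAM)

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(NUM_FEATURES, 1)),                # (1) input: 81 features as a 1-D signal
        tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),  # (2)-(3) convolution + ReLU
        tf.keras.layers.MaxPooling1D(pool_size=2),                     # (4) pooling
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),                  # (5) fully connected ReLU
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),      # (6) fully connected softmax
    ])
    model.compile(optimizer="adam", loss=smtp_loss, metrics=["accuracy"])  # (7) loss function

    # Tiny random example only to show the shapes; real training uses the melanoma features.
    x = np.random.rand(16, NUM_FEATURES, 1).astype("float32")
    y = tf.keras.utils.to_categorical(np.random.randint(0, NUM_CLASSES, 16), NUM_CLASSES)
    model.fit(x, y, epochs=1, verbose=0)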
5. Results and discussion

5.1. Dataset description

Experiments are performed on the melanoma dataset downloaded from https://www.uco.es/grupos/ayrna/ieeetmi2015 (Sáez and Sánchez-Monedero, 2016). The dataset is organized into binary and multi-class versions with 81 attributes (features). There are 250 melanoma images in total: 167 melanomas < 0.76 mm, 54 melanomas between 0.76 and 1.5 mm, and 29 melanomas > 1.5 mm. We use the 81 features extracted from these images.

5.2. Experimental setup

The experiments are implemented in Java in conjunction with the IntelliJ IDEA Community Edition, using the classification strategy with the different loss function approaches, and executed on a machine with an Intel Core i5-6200U CPU at 2.30 GHz, 8 GB of RAM, and Windows 10 (64 bit).

Fig. 2. Melanoma 1 and 2 stages error parameters comparison graph.

5.3. Evaluation indicators

The performance parameters are:

Accuracy = \frac{TP + TN}{TP + TN + FN + FP}    (7)

Sensitivity = True Positive Rate    (8)

Specificity = True Negative Rate    (9)

Precision = \frac{TP}{TP + FP}    (10)

Recall = \frac{TP}{TP + FN}    (11)

F1\ Score = \frac{2 \cdot (Recall \cdot Precision)}{Recall + Precision}    (12)

Mean Squared Error: MSE = \frac{1}{n} \sum_{i=1}^{n} \left( Y_i - \hat{Y}_i \right)^2    (13)

Root Mean Squared Error: RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( Y_i - \hat{Y}_i \right)^2}    (14)

Relative Squared Error: RSE = \frac{\sum_{i=1}^{n} \left( Y_i - \hat{Y}_i \right)^2}{\sum_{i=1}^{n} \left( Z_i - \hat{Z}_i \right)^2}, \quad \hat{Z}_i = \frac{1}{n} \sum_{i=1}^{n} Z_i    (15)

where n is the number of training samples, i denotes the i-th training sample, Y_i is the actual class, and \hat{Y}_i is the predicted class.
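A small Python sketch (added for illustration; the function and variable names are not from the paper) showing how these indicators can be computed from predicted and actual stage labels:

    import numpy as np

    def binary_metrics(y_true, y_pred, positive=1):
        """Eqs. (7)-(12) from TP/TN/FP/FN counts for one positive class."""
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        tp = np.sum((y_pred == positive) & (y_true == positive))
        tn = np.sum((y_pred != positive) & (y_true != positive))
        fp = np.sum((y_pred == positive) & (y_true != positive))
        fn = np.sum((y_pred != positive) & (y_true == positive))
        accuracy = (tp + tn) / (tp + tn + fp + fn)
        sensitivity = recall = tp / (tp + fn)        # true positive rate
        specificity = tn / (tn + fp)                 # true negative rate
        precision = tp / (tp + fp)
        f1 = 2 * recall * precision / (recall + precision)
        return accuracy, sensitivity, specificity, precision, recall, f1

    def error_metrics(y_true, y_pred):
        """Eqs. (13)-(15): MSE, RMSE and RSE on numeric class labels."""
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        sq_err = (y_true - y_pred) ** 2
        mse = sq_err.mean()
        rmse = np.sqrt(mse)
        # Denominator of Eq. (15): deviations of the actual labels from their mean.
        rse = sq_err.sum() / ((y_true - y_true.mean()) ** 2).sum()
        return mse, rmse, rse

    y_true = [1, 2, 1, 2, 2, 1]
    y_pred = [1, 2, 1, 1, 2, 1]
    print(binary_metrics(y_true, y_pred, positive=2))
    print(error_metrics(y_true, y_pred))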


5.4. Analysis

The algorithm is trained and tested on the melanoma 10-fold dataset, which contains multiple train and test files, and is compared with CNN trained with different loss functions: Loss Hinge, Loss KL, Loss MSE, Loss Cosine, Loss Cross Entropy (MCXENT) and the proposed Loss SMTP. The results are shown in Figs. 2–4 and 7–9. Figs. 2 and 7 show the error performance parameter comparisons of the 2-stage and 3-stage systems. Figs. 3 and 8 show the sensitivity and specificity comparisons of the 2-stage and 3-stage systems. Figs. 4 and 9 show the precision, recall, F-measure and accuracy comparisons of the 2-stage and 3-stage systems. Figs. 5, 6, 10 and 11 show performance parameter comparisons of different algorithms with the proposed algorithm. The proposed loss function is better than the other loss functions, with higher accuracy and lower MSE, RMSE and RSE errors.

Table 4 shows the error performance parameter readings used to generate the graph in Fig. 2. Table 5 shows the melanoma stage 1 and 2 specificity and sensitivity readings used to generate the graph in Fig. 3. Table 6 shows the melanoma stage 1 and 2 precision, recall, F-measure and accuracy readings used to generate the graph in Fig. 4. Table 7 shows the melanoma stage 1 and 2 performance comparison of the algorithms used to generate the graph in Fig. 5.

Fig. 3. Melanoma stage 1 and stage 2 specificity and sensitivity comparison graph.
Fig. 4. Melanoma stage 1 and stage 2 precision, recall, F-measure and accuracy comparison graph.
Fig. 5. Melanoma stage 1 and stage 2 performance parameters comparison graph w.r.t. algorithms.
Fig. 6. Melanoma stage 1 and 2 specificity and sensitivity comparison graph w.r.t. algorithms.


Fig. 7. Melanoma 1, 2 and 3 stages error performance parameters comparison graph.
Fig. 8. Melanoma 1, 2 and 3 stages sensitivity and specificity comparison graph.
Fig. 9. Melanoma 1, 2 and 3 stages precision, recall, F-measure and accuracy comparison graph.
Fig. 10. Melanoma 1, 2 and 3 stages precision, recall, F-measure and accuracy comparison graph for the SVM, CNN and CNN + SMTP algorithms.

Table 8 shows the melanoma stage 1 and 2 sensitivity and specificity readings for the SVM, CNN and CNN + SMTP algorithms, used to generate the graph in Fig. 6. Table 9 shows the 3-stage error performance parameter readings used to generate the graph in Fig. 7. Table 10 shows the melanoma stage 1, 2 and 3 sensitivity and specificity readings for the various loss functions, used to generate the graph in Fig. 8. Table 11 shows the melanoma stage 1, 2 and 3 precision, recall, F-measure and accuracy readings for the various loss functions, used to generate the graph in Fig. 9. Table 12 shows the melanoma stage 1, 2 and 3 precision, recall, F-measure and accuracy readings for the various algorithms, used to generate the graph in Fig. 10. Table 13 shows the melanoma stage 1, 2 and 3 sensitivity and specificity readings for the various algorithms, used to generate the graph in Fig. 11.


Table 4
Error performance parameters comparison (2-stage system).

Loss function   MSE      RMSE     RSE
HINGE           0.2236   0.4728   1.0277
KL              0.1855   0.4307   0.8526
MSE             0.1713   0.4139   0.7874
MCXENT          0.161    0.4013   0.7403
COSINE          0.125    0.3536   0.5746
SMTP            0.0854   0.2858   0.5124

Table 5
Melanoma 1 and 2 stages specificity and sensitivity comparison.

Loss function   Sensitivity (%)   Specificity (%)
HINGE           62.5              81.35
KL              75                82.35
MSE             62.5              84.11
MCXENT          75                88
COSINE          87.5              88.23
SMTP            86.66             91.66

Table 6
Melanoma stage 1 and stage 2 precision, recall, F-measure and accuracy comparison.

Loss function   Precision (%)   Recall (%)   F1 measure (%)   Accuracy (%)
HINGE           72.43           72.43        62.5             76
KL              77.08           78.68        70.59            80
MSE             83.77           78.31        71.43            84
MCXENT          81.62           81.62        75               84
COSINE          85.76           87.87        82.35            88
SMTP            93.6            92           92.2             92

Table 7
Melanoma stage 1 and stage 2 performance parameters comparison.

Algorithm    Precision (%)   Recall (%)   F-measure (%)   Accuracy (%)
SVM          83.9            84           83.3            84
CNN          81.62           81.62        75              84
CNN + SMTP   93.6            92           92.2            92

Table 8
Melanoma stage 1 and 2 specificity and sensitivity comparison with the SVM, CNN and CNN + SMTP algorithms.

Algorithm    Sensitivity (%)   Specificity (%)
SVM          84.2              83.33
CNN          84.7              90
CNN + SMTP   96.03             96.33

Table 9
Melanoma 1, 2 and 3 stages error performance parameters comparison with various loss functions.

Loss function   MSE     RMSE     RSE
HINGE           0.12    0.3464   0.5514
KL              0.08    0.2828   0.3676
MSE             0.085   0.2926   0.3934
MCXENT          0.108   0.3291   0.4979
COSINE          0.085   0.2923   0.3928
SMTP            0.035   0.1871   0.161

Table 10
Melanoma stage 1, 2 and 3 specificity and sensitivity comparison with various loss functions.

Loss function   Sensitivity (%)   Specificity (%)
HINGE           53.33             84.16
KL              64.44             88.33
MSE             75.55             87.5
MCXENT          84.7              90
COSINE          86.66             91.66
SMTP            96.03             98.33

Table 11
Melanoma stage 1, 2 and 3 precision, recall, F-measure and accuracy comparison with various loss functions.

Loss function   Precision (%)   Recall (%)   F-measure (%)   Accuracy (%)
HINGE           72.5            53.33        75.95           80
KL              83.16           64.44        68.15           84
MSE             95              75.56        82.3            88
MCXENT          87.96           84.71        86.03           88
COSINE          96.49           86.67        89.81           92
SMTP            94.44           98.04        95.96           96

Table 12
Melanoma stage 1, 2 and 3 precision, recall, F-measure and accuracy comparison with various algorithms.

Algorithm    Precision (%)   Recall (%)   F-measure (%)   Accuracy (%)
SVM          82.5            76           70.4            76
CNN          87.96           84.71        86.03           88
CNN + SMTP   94.44           98.04        95.96           96

Table 13
Melanoma stage 1, 2 and 3 sensitivity and specificity comparison with various algorithms.

Algorithm    Sensitivity (%)   Specificity (%)
SVM          61.03             77.66
CNN          84.7              90
CNN + SMTP   96.03             96.33

Fig. 11. Melanoma 1, 2 and 3 stages sensitivity and specificity comparison graph for the SVM, CNN and CNN + SMTP algorithms.


6. Conclusion

Melanoma skin cancer is very dangerous and metastasizes very fast, so the stage of the cancer must be identified before treatment starts. Classifying the stage from dermoscopic images is considered a challenging task; it is tedious and very important when the patient's treatment is being planned. This paper proposed a non-invasive stage classification system for melanoma skin cancer. Two systems are introduced: a two-stage classification system, which classifies melanoma as stage 1 or stage 2, and a three-stage classification system, which classifies melanoma into three different stages. We use an improved CNN architecture with SMTP as the loss function; the model contains many local filters and comprises a convolution layer, activation function, max-pooling layer, batch normalization layer, sigmoid layer, and loss function, in order to achieve acceptable feature learning and test-set results.

We used the evaluation plan of the melanoma 10-fold dataset to assess our algorithm, compared the results with the ground truth, and computed several parameters such as specificity, sensitivity, accuracy, MSE, RMSE and RSE. The proposed strategy achieved higher accuracy, specificity and sensitivity than existing frameworks, and it also has lower error percentages than other similar methods (loss functions). Compared with SVM and plain CNN, the proposed method gives better performance.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

Abuzaghleh, O., Barkana, B.D., Faezipour, M., 2015. Noninvasive real-time automated skin lesion analysis system for melanoma early detection and prevention. IEEE J. Transl. Eng. Health Med. 3, 4300212, 1–12. https://doi.org/10.1109/JTEHM.2015.2419612.
Barata, C., Ruela, M., Francisco, M., Mendonça, T., Marques, J.S., 2014. Two systems for the detection of melanomas in dermoscopy images using texture and color features. IEEE Syst. J., 965–979.
Breslow, A., 1970. Thickness, cross-sectional areas and depth of invasion in the prognosis of cutaneous melanoma. Ann. Surg. 172 (5), 902–908.
Chim, H., Deng, X., 2010. Efficient phrase-based document similarity for clustering. IEEE Trans. Knowl. Data Eng. 20 (9), 1217–1229.
Gong, A., Yao, X., Lin, W., 2020. Dermoscopy image classification based on StyleGANs and decision fusion. IEEE Access 8, 70640–70650. https://doi.org/10.1109/ACCESS.2020.2986916.
Jaworek-Korjakowska, J., Kleczek, P., Gorgon, M., 2019. Melanoma thickness prediction based on convolutional neural network with VGG-19 model transfer learning. In: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Long Beach, CA, USA, pp. 2748–2756. https://doi.org/10.1109/CVPRW.2019.00333.
Ma, Z., Tavares, J.M.R.S., 2016. A novel approach to segment skin lesions in dermoscopic images based on a deformable model. IEEE J. Biomed. Health Inform. 20 (2), 615–623.
Patil, R.R., Bellary, S., 2017. Review: melanoma detection & classification based on thickness using dermascopic images. IJCTA 10 (8), 821–825.
Pehamberger, H., Steiner, A., Wolff, K., 1987. In vivo epiluminescence microscopy of pigmented skin lesions. Pattern analysis of pigmented skin lesions. J. Am. Acad. Dermatol. 17 (4), 571–583.
Reshma, M., Shan, B.P., 2017. Two methodologies for identification of stages and different types of melanoma detection. In: 2017 Conference on Emerging Devices and Smart Systems (ICEDSS), Tiruchengode, pp. 257–259. https://doi.org/10.1109/ICEDSS.2017.8073689.
Rubegni, P., et al., 2010. Evaluation of cutaneous melanoma thickness by digital dermoscopy analysis: a retrospective study. Melanoma Res. 20, 212–217.
Sáez, A., Sánchez-Monedero, J., Gutiérrez, P.A., Hervás-Martínez, C., 2016. Machine learning methods for binary and multiclass classification of melanoma thickness from dermoscopic images. IEEE Trans. Med. Imaging 35, 1036–1045.
Sangve, S.M., Patil, R.R., 2014. Competitive analysis for the detection of melanomas in dermoscopy images. IJERT 3 (6), 351–354.
Wang, X., Jiang, X., Ding, H., Liu, J., 2020. Bi-directional dermoscopic feature learning and multi-scale consistent decision fusion for skin lesion segmentation. IEEE Trans. Image Process. 29, 3039–3051. https://doi.org/10.1109/TIP.2019.2955297.
Wei, L., Ding, K., Hu, H., 2020. Automatic skin cancer detection in dermoscopy images based on ensemble lightweight deep learning network. IEEE Access 8, 99633–99647. https://doi.org/10.1109/ACCESS.2020.2997710.
Zhang, N., Cai, Y.-X., Wang, Y.-Y., Tian, Y.-T., Wang, X.-L., Badami, B., 2020. Skin cancer diagnosis based on optimized convolutional neural network. Artif. Intell. Med. 102, 101756. https://doi.org/10.1016/j.artmed.2019.101756.

