
INTERNATIONAL JOURNAL ON SMART SENSING AND INTELLIGENT SYSTEMS

Article | DOI: 10.2478/ijssis-2022-0007 Issue 1 | Vol. 15 (2022)

Efficient way to ensure the data security in cloud computing

Anil Kumar Pallikonda1,*, Kavitha Chaduvula2, Baburao Markapudi3, Ch Rathna Jyothi4 and D N V S L S Indira2

1 Department of Computer Science and Engineering, PVP Siddhartha Institute of Technology, Kanuru, Vijayawada, Andhra Pradesh, India.
2 Department of Information Technology, S R Gudlavalleru Engineering College, Gudlavalleru, Andhra Pradesh, India.
3 Department of Computer Science and Engineering, S R Gudlavalleru Engineering College, Gudlavalleru, Andhra Pradesh, India.
4 Department of Computer Science and Engineering, Andhra Loyola Institute of Engineering and Technology, Vijayawada, Andhra Pradesh, India.

*E-mail: anilkumar.pallikonda@gmail.com

This paper was edited by Subhas Chandra Mukhopadhyay.
Received for publication September 14, 2021.

Abstract

This manuscript addresses the security of cloud data storage, which has always been an important aspect of Quality of Service (QoS). An effectual and flexible distributed scheme (DS) with Explicit Dynamic Data Support (EDDS) is proposed to ensure the correctness of user data in the cloud. By using a homomorphic token with distributed verification of erasure-coded data, the proposed scheme achieves both storage-correctness assurance and data-error localization. The scheme also supports secure and efficient dynamic operations on data blocks, such as data update, delete and append. The performance analysis shows that the proposed scheme is highly efficient.

Keywords
Localization, Data, Security, Cloud computing.

Introduction

Many trends are ushering in the era of cloud computing, which builds on web-based development and mainframe technology. Increasingly affordable and powerful processors, coupled with the software-as-a-service (SaaS) model, are converting data centres into large-scale computing suites. Growing network bandwidth and dependable, flexible network connections make it possible for consumers to subscribe to high-quality services for data and software that reside in remote data centres, without the complexity of managing the hardware directly. Today, Amazon Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2) (Kumar and Kumar, 2021) are well known as pioneers of cloud computing.

Although these web-based online services provide enormous amounts of storage and customizable computing, this shift in computing platform removes the responsibility for maintaining the data from the users' local machines. As a consequence, users depend on their cloud service provider for the availability and integrity of their data, as illustrated by the recent Amazon S3 outage (Bello et al., 2021; Jayashri and Kalaiselvi, 2021; Ogwel et al., 2021; Aissaoui et al., 2022; Koushik and Patil, 2022). From the standpoint of service quality and data security, cloud computing unavoidably faces new and challenging security threats.


First, conventional cryptographic primitives cannot be adopted directly, because users in cloud computing no longer hold their data locally. The verification of correct data storage in the cloud must therefore be carried out without explicit knowledge of the entire data. Considering the various kinds of data each user stores in the cloud, and the demand for a long-term continuous guarantee of data safety, verifying the correctness of data storage in the cloud becomes even more difficult. Second, cloud computing is not just a third-party data warehouse: the data stored in the cloud may be frequently updated by the users, including insertion, deletion, modification, appending, reordering and so on. Ensuring storage correctness under dynamic data updates is therefore of the utmost importance. However, this dynamic feature also makes traditional integrity-assurance techniques futile and demands new solutions.

Cloud computing deployments are powered by data centres running in a cooperative and distributed manner, and each user's data are stored redundantly at multiple physical locations to reduce threats to data integrity (Babu and Senthilkumar, 2021). Distributed protocols for ensuring storage correctness are therefore of the utmost significance in achieving robust and secure cloud data storage in the real world. The importance of ensuring the integrity of remote data has recently been highlighted by several research papers (Mythili et al., 2020; Rajesh and Shajin, 2020; Shajin and Rajesh, 2020; Thota et al., 2020; Mishra et al., 2021). While such techniques can help to ensure storage correctness without users possessing their data, they do not address all the security threats of cloud data storage, and the distributed protocols proposed for storage correctness across multiple servers rely on assumptions that may drastically restrict their applicability to cloud data storage (Hu et al., 2021).

These problems motivate the development of a new methodology for securing the cloud computing process (Alashhab et al., 2021). To enhance data security, this manuscript develops a novel methodology that raises the security level, which is evaluated by introducing Byzantine flaws, malicious data alteration attacks and server collusion attacks. The key contributions of this work are as follows:

• To enhance the security of data in cloud computing, a new login process is introduced that improves the security of the data files.
• Since safety and secrecy are noteworthy issues in cloud technology, the work explores technological developments and practices that can meet this challenge.
• After presenting the concerned attacks and threats of cloud computing, it investigates schemes that are well organized as well as resistant against Byzantine flaws, malicious data alteration attacks and server collusion attacks.
• Compared to numerous predecessors that only give binary outcomes on the storage status of dispersed servers, the challenge-response protocol used here also gives data error localization.
• Unlike most previous work on ensuring remote data, the proposed plan supports secure and well-organized dynamic operations on data blocks, namely data update, delete and append.
• To analyze the safety and secrecy issues of cloud and mobile cloud systems, this manuscript explores the related attacks and the corresponding countermeasures to safeguard the systems.
• A widespread safety and efficiency evaluation shows that the proposed system is extremely well organized as well as resistant against Byzantine flaws, malicious data alteration attacks and server collusion attacks.
• Finally, this manuscript concludes that cloud and mobile cloud computing platforms are appropriate for hosting and analyzing big data. Several existing and emerging security attacks threaten cloud and mobile cloud computing environments, and securing big data on these platforms calls for new, proficient countermeasures.

The rest of this manuscript is organized as follows: the section "Literature review" describes related recent studies; the section "Proposed methodology" illustrates the implementation process of the proposed model; the section "Results and discussion" presents the results and discussion; and the final section concludes the manuscript.

Literature review

Some recent literature on security in cloud computing is discussed below.
along with server collusion attacks. Blowfish were combined to perform a hybrid approach


Chinnasamy et al. (2021) have presented a hybrid approach that combines ECC and Blowfish for securing data in cloud computing. Cloud providers typically have trouble keeping files protected, since security is the largest issue in handling and transferring data, which may be retrieved, misused and destroyed in its original form. Cloud security is a huge concern in the cloud computing environment, and numerous investigations have been proposed to protect it. To overcome the security problem and accomplish the CIA properties (confidentiality, integrity and availability), cryptography is utilized, but traditional symmetric and asymmetric schemes have certain restrictions. ECC and Blowfish were therefore combined into a hybrid approach that overcomes the drawbacks of the individual schemes.

Abdullayeva (2021) has presented an autoencoder-based deep learning methodology for Advanced Persistent Threat (APT) attack identification. The advantage of the presented method is that it achieves a strong classification result when recognizing composite relationships among features in the database. In addition, the method shortens the procedure of categorizing a huge amount of data by decreasing the data size. Initially, the autoencoder neural network is used to extract revealing characteristics from network traffic data in an unsupervised manner.

Velliangiri et al. (2021) have suggested a deep learning-based classifier to detect DDoS attacks. Significant characteristics are chosen from the log file for classification using the Bhattacharyya distance measure to diminish classifier training time. A Taylor-Elephant Herd Optimization based Deep Belief Network is evolved by adjusting Elephant Herd Optimization (EHO) with the Taylor series, and the resulting algorithm is used to train a Deep Belief Network (DBN) for the detection of DDoS attacks.

Shaikh and Meshram (2021) have presented the service and deployment models as well as the essential features of cloud computing, together with the security and privacy issues in the cloud. The cloud computing services are analyzed, such as software as a service, platform as a service and infrastructure as a service, and the several vulnerabilities, attacks and protection mechanisms for securing the cloud environment are given.

Su et al. (2021) presented a decentralized self-auditing scheme for multi-cloud storage, known as DSAS. First, using a symmetric balanced incomplete block design, DSAS accomplishes integrity validation through cloud server connections, and the audit costs are shared among the cloud servers. Second, DSAS can locate misbehaving cloud servers at low computational cost and can resist denial-of-service attacks. Third, DSAS can recover despoiled data without retrieving the whole data. Finally, the security analysis and functional assessment show that DSAS provides comprehensive security and functionality, and the experimental outcomes show that DSAS is efficient.

Existing system

From the point of view of data security, quality of service has always been a significant factor, and cloud computing unsurprisingly presents new and challenging security threats for a number of reasons.

1. First, typical cryptographic primitives for protecting data security cannot be adopted directly by users in cloud computing. So, validation of exact data storage must be carried out without explicit knowledge of the entire data. Numerous types of data are stored in the cloud for every user, together with the demand for long-term continuous assurance of their data safety.
2. Second, cloud computing is not just a third-party data warehouse. The data stored in the cloud may be frequently updated by the users, including insertion, modification, deletion and reordering. Therefore, ensuring correct storage under dynamic data updates is of utmost significance.

Although these existing techniques are useful for ensuring storage quality without users possessing their data, they do not handle all the security threats, because they mostly focus on a single-server environment and most do not consider dynamic data operations.

Proposed methodology

In this manuscript, an effectual and flexible Distributed Scheme with Explicit Dynamic Data Support is proposed to ensure the correctness of user data in the cloud. The scheme relies on erasure-coded file distribution to provide redundancy and ensure data reliability, a construction that dramatically lessens the communication and storage overhead compared with conventional replication-based file distribution systems. By using the homomorphic token with distributed verification of erasure-coded data, the proposed system attains both storage-correctness assurance and data-error localization whenever data corruption is noticed during storage correctness verification (a sketch of this verification flow is given after the list below).

1. Compared to numerous predecessors that only deliver binary results on the storage status of distributed servers, the challenge-response protocol also gives data error localization.
2. Unlike most previous work on ensuring remote data integrity, the proposed scheme facilitates secure and well-organized dynamic operations on data blocks.
3. The performance analysis shows that the proposed system is more effectual and resilient against Byzantine flaws and malicious data modification attacks.
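The manuscript does not give an implementation of the token computation, so the following is only a minimal sketch, under assumptions, of how precomputed verification tokens and a challenge-response check could localize a misbehaving server. The class and method names are hypothetical, and an HMAC over sampled block positions is used here as a simplified stand-in for the homomorphic token of the scheme.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.*;

/** Hypothetical sketch of challenge-response verification with data error localization. */
public class TokenVerifier {

    private final byte[] key;

    public TokenVerifier(byte[] key) { this.key = key; }

    /** Precompute one keyed token per server over the sampled block indices. */
    public Map<Integer, byte[]> precomputeTokens(List<byte[][]> serverBlocks, int[] sampledIndices) throws Exception {
        Map<Integer, byte[]> tokens = new HashMap<>();
        for (int s = 0; s < serverBlocks.size(); s++) {
            tokens.put(s, tokenFor(serverBlocks.get(s), sampledIndices));
        }
        return tokens;
    }

    /** Challenge every server and report which ones return a token that does not match. */
    public List<Integer> localizeErrors(Map<Integer, byte[]> expected, List<byte[][]> currentBlocks, int[] sampledIndices) throws Exception {
        List<Integer> misbehaving = new ArrayList<>();
        for (Map.Entry<Integer, byte[]> e : expected.entrySet()) {
            byte[] response = tokenFor(currentBlocks.get(e.getKey()), sampledIndices); // the server's answer
            if (!Arrays.equals(e.getValue(), response)) {
                misbehaving.add(e.getKey());   // the data error is localized to this server
            }
        }
        return misbehaving;
    }

    /** HMAC-SHA256 over the selected blocks; a simplified stand-in for the homomorphic token. */
    private byte[] tokenFor(byte[][] blocks, int[] sampledIndices) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        for (int idx : sampledIndices) {
            mac.update(blocks[idx]);
        }
        return mac.doFinal();
    }

    public static void main(String[] args) throws Exception {
        TokenVerifier verifier = new TokenVerifier("demo-key-not-for-production".getBytes(StandardCharsets.UTF_8));

        // Three servers, each holding four blocks of one (erasure-coded) file.
        List<byte[][]> servers = new ArrayList<>();
        for (int s = 0; s < 3; s++) {
            byte[][] blocks = new byte[4][];
            for (int b = 0; b < 4; b++) blocks[b] = ("server" + s + "-block" + b).getBytes(StandardCharsets.UTF_8);
            servers.add(blocks);
        }
        int[] sampled = {0, 2};
        Map<Integer, byte[]> tokens = verifier.precomputeTokens(servers, sampled);

        // Simulate corruption on server 1 and localize it.
        servers.get(1)[2] = "tampered".getBytes(StandardCharsets.UTF_8);
        System.out.println("Misbehaving servers: " + verifier.localizeErrors(tokens, servers, sampled));
    }
}
```

In the example, tampering with one block on server 1 changes that server's challenge response, so only server 1 is reported, which is the error-localization behaviour the scheme aims at.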
Modules

Client module


The client sends a query to the server and, in response, the server sends the related file back to the client. At the server side, the server verifies the client's name and password as a security check. If the check is fulfilled, the server accepts the customer's query, searches for the related files in the database, identifies the file and forwards it to the client. If the server identifies an intruder, it sets an alternative path for that intruder. The process of the client module is depicted in Figure 1.

Figure 1: Client module.
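One way this client-module flow could look in code is sketched below. The class name, the in-memory credential and file stores, and the "alternative path" response are illustrative assumptions, not the paper's actual implementation.

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative sketch of the client-module flow: verify credentials, look up the file, divert intruders. */
public class FileQueryServer {

    private final Map<String, String> credentials = new HashMap<>();   // client name -> password
    private final Map<String, String> fileDatabase = new HashMap<>();  // query -> file content

    public FileQueryServer() {
        credentials.put("alice", "alice-password");                    // demo data only
        fileDatabase.put("report-2021", "contents of report-2021");
    }

    /** Handle one client query: authenticate first, then search and return the related file. */
    public String handleQuery(String clientName, String password, String query) {
        if (!password.equals(credentials.get(clientName))) {
            // Intruder detected: route the request to an alternative (decoy) path instead of the real data.
            return "REDIRECT:/alternative-path";
        }
        String file = fileDatabase.get(query);
        return file != null ? file : "NOT_FOUND";
    }

    public static void main(String[] args) {
        FileQueryServer server = new FileQueryServer();
        System.out.println(server.handleQuery("alice", "alice-password", "report-2021")); // legitimate client
        System.out.println(server.handleQuery("mallory", "guess", "report-2021"));        // intruder is diverted
    }
}
```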
System module

The network structure for cloud data storage is depicted in Figure 2.

Figure 2: System architecture.

Here, three different network entities can be recognized:

• User: users who have data to be stored in the cloud and rely on the cloud for data computation; they include both individual consumers and organizations.
• Cloud Service Provider (CSP): the CSP has significant resources and expertise in building and operating distributed cloud storage, and owns and operates the cloud computing infrastructure. The CSP may ask a customer to commit to a reserved capacity; through capacity reservation, the cloud service provider shares risk with its cloud service customers and thus reduces the risk of its initial investment in cloud infrastructure.
• Third Party Auditor (TPA): the user is troubled about the integrity of data stored in the cloud, as the user's data may be attacked or altered by an external attacker. For this reason, a concept named data auditing is suggested, which verifies the integrity of the data using an entity called the TPA. The TPA is trusted for its experience and expertise, and is believed to evaluate and report on the reliability of cloud storage services on behalf of the clients.

Cloud data storage module

Here, the consumer stores its data across the set of cloud servers, which run concurrently, and the user communicates with the cloud servers through the CSP to access or retrieve data. Users may need to perform block-level operations on their data. Users should have security measures in place to ensure the continuing correctness of their stored data, even in the absence of local copies. If consumers do not have the time, feasibility or resources to check their data, they may hand this task over to a preferred, trusted TPA of their own choice. It is assumed that the end-to-end communication channels between each cloud server and the consumer are authenticated and reliable, which can be achieved in practice with small overhead.
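The block-level operations mentioned above (update, delete and append, which EDDS is designed to support) can be illustrated with a minimal in-memory block store. This is only a sketch under assumptions: in the actual scheme each modified block would also trigger re-computation of the corresponding parity and verification tokens, which is indicated here only by comments.

```java
import java.util.ArrayList;
import java.util.List;

/** Minimal sketch of the dynamic block operations (update, delete, append) supported by EDDS. */
public class BlockStore {

    private final List<byte[]> blocks = new ArrayList<>();

    public void append(byte[] block) {
        blocks.add(block);
        // In the full scheme: re-encode parity and refresh the verification tokens for the new block.
    }

    public void update(int index, byte[] block) {
        blocks.set(index, block);
        // In the full scheme: the updated block invalidates old tokens, so they are recomputed.
    }

    public void delete(int index) {
        blocks.remove(index);
        // In the full scheme: remaining blocks may be re-indexed and the parity adjusted.
    }

    public int size() { return blocks.size(); }

    public static void main(String[] args) {
        BlockStore store = new BlockStore();
        store.append("block-0".getBytes());
        store.append("block-1".getBytes());
        store.update(0, "block-0-v2".getBytes());
        store.delete(1);
        System.out.println("Blocks held: " + store.size());  // prints 1
    }
}
```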

Cloud authentication server

The authentication server (AS) works like a usual customer authentication protocol with some extra behaviour added. The first addition is that the customer verification information is sent to the masked router. The AS in this model acts as a ticket authority, controlling permissions on the application network. The other, optional function of the AS is to keep the customer list up to date, removing a customer from the valid list either on request or when the authorization time expires.
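A minimal sketch of such a ticket-issuing authentication server is given below, assuming a fixed authorization window and an in-memory customer list; the class, the 30-minute window and the UUID tickets are assumptions made only for illustration.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

/** Sketch of an authentication server that issues time-limited tickets and drops expired ones. */
public class AuthenticationServer {

    private static final Duration AUTHORIZATION_TIME = Duration.ofMinutes(30); // assumed validity window

    private final Map<String, String> registeredCustomers = new HashMap<>(); // customer -> password
    private final Map<String, Instant> ticketExpiry = new HashMap<>();       // ticket -> expiry time

    public AuthenticationServer() {
        registeredCustomers.put("customer-1", "secret"); // demo entry only
    }

    /** Verify the customer and return a ticket, or null if the credentials are not on the valid list. */
    public String issueTicket(String customer, String password) {
        if (!password.equals(registeredCustomers.get(customer))) return null;
        String ticket = UUID.randomUUID().toString();
        ticketExpiry.put(ticket, Instant.now().plus(AUTHORIZATION_TIME));
        return ticket;
    }

    /** A ticket grants access only while its authorization time has not elapsed. */
    public boolean isTicketValid(String ticket) {
        Instant expiry = ticketExpiry.get(ticket);
        if (expiry == null || Instant.now().isAfter(expiry)) {
            ticketExpiry.remove(ticket);  // prune expired or unknown tickets from the list
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        AuthenticationServer as = new AuthenticationServer();
        String ticket = as.issueTicket("customer-1", "secret");
        System.out.println("Ticket valid: " + as.isTicketValid(ticket));
        System.out.println("Bad login: " + as.issueTicket("customer-1", "wrong"));
    }
}
```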

4
INTERNATIONAL JOURNAL ON SMART SENSING AND INTELLIGENT SYSTEMS

Unauthorized data modification and corruption

The main problem is to effectively detect any unauthorized data modification and corruption, possibly caused by server compromise or random Byzantine failures. Besides, in the distributed case, once such inconsistencies are successfully detected, it is also very important to locate on which server the data error lies.

Adversary module

Security threats faced by cloud data storage (CDS) can come from two sources. On the one hand, the CSP may be self-interested, untrustworthy and possibly malicious: it may move data that is rarely or never accessed to a lower tier of storage than agreed for monetary reasons, or attempt to hide an incident of data loss caused by management errors, Byzantine flaws and so on.

On the other hand, there may also be an economically motivated adversary with the ability to compromise a number of cloud data storage servers over different time intervals, and thereby modify or delete users' data without being detected by the CSP for a specified period of time. In particular, two sorts of adversary with different skill levels are assumed in this document:

Weak Adversary: this adversary is interested in corrupting the user's data files. It may contaminate the original data files by modifying or inserting its own fraudulent data to prevent the user from recovering the original data.

Strong Adversary: this is the worst case; it is assumed that the adversary can compromise all the storage servers, so that data files can be deliberately modified internally as long as they remain mutually consistent. This applies to the case where all servers cooperate to cover up a data loss or corruption incident.

System architecture

The proposed model's procedure is detailed in this section and is elaborated in the DFD client architecture given in Figure 3. Both new and existing users must enter their particular username and password to identify their data. If the password is correct, the server is connected with the client; otherwise the request is automatically rejected. The user can then log in to the particular data using the procedures detailed in the use case diagram in Figure 4. Here, the user and the administrator log in to the data and then access the resources, which reveal the IP address of the user. An unwanted user is then unable to access the files because it is blocked by the administrator, while the original user can find the files and access the data without any interruption.

Figure 3: System architecture in DFD.

Figure 4: Use case diagram.
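The administrator-side blocking described in the DFD, where an unwanted user's IP is blocked while the original user keeps uninterrupted access, could be realized by a simple access filter such as the sketch below; the class and its structure are assumptions made only for illustration.

```java
import java.util.HashSet;
import java.util.Set;

/** Sketch of the login/access filter from the DFD: valid credentials connect, blocked IPs are rejected. */
public class AccessFilter {

    private final Set<String> blockedIps = new HashSet<>();

    /** The administrator blocks the IP address of an unwanted user. */
    public void blockIp(String ip) {
        blockedIps.add(ip);
    }

    /** A request succeeds only if the caller's IP is not blocked and the password matches. */
    public boolean allowAccess(String ip, String password, String expectedPassword) {
        if (blockedIps.contains(ip)) return false;       // unwanted user: rejected by the administrator
        return expectedPassword.equals(password);        // otherwise the normal credential check decides
    }

    public static void main(String[] args) {
        AccessFilter filter = new AccessFilter();
        filter.blockIp("203.0.113.7");
        System.out.println(filter.allowAccess("198.51.100.2", "pw", "pw"));   // original user: true
        System.out.println(filter.allowAccess("203.0.113.7", "pw", "pw"));    // blocked intruder: false
    }
}
```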
Security analysis

The analysis of the proposed DS with EDDS model for data security during the journey towards the cloud computing paradigm shows the steps at which data may be very susceptible to threats such as data leakage, modification, violation of user privacy and discretion, and so on. The proposed DS with EDDS model is intended to address all of these security problems proficiently.


Unauthorized server

As data must be conveyed over the network to the cloud, a frequent attack is for an attacker simply to break into the Internet-based network and pose as the cloud server that owns the data, thereby causing data loss. To avoid data loss in this situation, EDDS is used in this model together with certificates issued by certification authorities (CAs), which act as the letters of recommendation of the online world. The cloud server first sends its identifying information to the owner. The owner confirms the certificate and sends a message, to which the server replies with a digitally signed acknowledgment, allowing encrypted data transfer between browser and server. Furthermore, data and keywords are stored in the cloud in encrypted form.
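As an illustration only (not the paper's actual cipher or key-management design), encrypting a data block on the client before it is uploaded might look like the following minimal AES-GCM sketch; the key here is generated locally, whereas in the described setting it would be derived from the certificate-based exchange.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;

/** Illustrative encryption of a data block before it is uploaded to the cloud server. */
public class UploadEncryptor {

    public static void main(String[] args) throws Exception {
        // Generate a per-user AES key; in practice this would come from the certificate-based handshake.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal("data block and keywords".getBytes(StandardCharsets.UTF_8));

        // Only the ciphertext (plus the IV) ever leaves the client; the cloud stores it as-is.
        System.out.println("Stored in cloud: " + Base64.getEncoder().encodeToString(ciphertext));
    }
}
```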
Byzantine failure

Data security is a main problem when consumers rely on third-party services, because of the potential for Byzantine cloud faults. Byzantine faults are considered more hazardous than non-latent faults in a cloud computing environment: system components may suffer malicious failures and generate arbitrary results. Such faults are named Byzantine faults, and they are hard to notice before they cause system harm. This work recognizes Byzantine flaws in the cloud computing environment by means of DS with EDDS, to ensure the robustness of the multi-cloud environment. This attack is detected at the client side by EDDS and removed for secure communication.

Malicious data modification attack

The data modification attack is an active attack based on interfering with the exchanged data. The data may be adapted or removed to change the meaning of the message and prevent the information from reaching the recipients, e.g. in the event of an accident or traffic congestion. Modification is defined as an attack on the integrity of the original data, meaning that unauthorized parties not only gain access to the data but also deceive with the data by inducing denial-of-service conditions, such as modifying forwarded data packets or flooding the network with fake data. The proposed model effectively detects such fake data on the server side, which is tested through queries issued by the user. The login ID and password must be entered at the client side; if they are correct, the data are transmitted, otherwise the request is rejected.

Even server colluding attacks

Here, the proposed scheme detects even server collusion attacks in the temporal and spatial dimensions of the distributed setting. The temporal detection refers to sudden changes in the correlation map of a node. For instance, at a particular time, certain nodes can pass information to the opponent and then continue to function; at different times, the opponent can obtain information from different nodes. The correlation may change somewhat, and these changes can be monitored by certain safety protocols. Also, a node may appear to work appropriately and communicate with its neighbors, confirming itself to other nodes, while at any given time the node can pass extra information.

The proposed DS with EDDS model is designed to identify attacks such as Byzantine failure, the malicious data modification attack and even server colluding attacks. Here, the attack agent is denoted by k, and the data held by every attack agent is given in eqn (1):

$z_k^i = \{ z_{k1}^i, z_{k2}^i, \ldots, z_{kn}^i \}$  (1)

where k denotes the attack agent at iteration i and n represents the total collection of data items. Additionally, the random variable of the model for each attack agent k at iteration i is defined in eqn (2):

$s_k^i = \begin{cases} z_k^i, & \text{agent } k \text{ is non-faulty} \\ g_k^i, & \text{agent } k \text{ is faulty} \end{cases}$  (2)

where $g_k^i$ denotes the arbitrary d-dimensional random variable reported by a faulty attack agent. The random variable for each iteration is given in eqn (3):

$s^i = \{ s_k^i,\; k = 1, 2, \ldots, n \}$  (3)

If the proposed model identifies an attack agent in the data, it avoids that node while transmitting the information. Therefore, the proposed DS with EDDS model is effectual and resistant to Byzantine flaws, malicious data modification and even server collusion.
that means the unauthorized parties not only get
access to data but also deceive the data by inducing Results and discussion
denial-of-service attacks, like modifying forward data
packets or flooding the network with fake data. Thus, The implementation of proposed DS with EDDS
the proposed model has effectively detects the fake method is done by NetBeans, which is the open
data in server side, which are tested by asking quires source and the integrated development environment
by the user. The login ID and password should be (IDE) supports the language Java. The proposed
enter by the client side if it is correct means the data method simulations are run in PC through Intel
will be transmitted otherwise it rejected. Core, 2.50 GHz CPU, 8 GB of RAM. Therefore, it


Implementation can be regarded as the most critical stage in achieving a successful new system and in giving users confidence that it will work. The execution phase comprises careful planning, investigation of the existing system and its limitations, design of the approaches used to accomplish the change, and evaluation of those approaches. Thus, the outcomes for the cloud server login, client-side login, admin login and successful login are obtained.

The cloud computing structure has various types of configurable distributed systems with various forms of connectivity and usage. Owing to its cost-effectiveness, scalability, reliability and flexibility, organizations are adapting to cloud networks at a rapid pace. Even though the merits of cloud computing are dependable, cloud networks are susceptible to numerous categories of network attacks as well as secrecy issues. Initially, the authenticated person tries to log in to the server with the personal user ID and password already registered at the server. The process of logging in to the cloud server is detailed in Figure 5.

Figure 5: Cloud server login.

Here, the user can log in to the data and then access the resources, which reveal the user's IP address. An unwanted user cannot access the files at that point because it is blocked by the administrator, while the original user can find the files and access the data without any interruption. The process of the client-side login is shown in Figure 6.

Figure 6: Client side login.

The process of admin login is simple: users input their credentials on the website's login form. This information is forwarded to the authentication server, where it is compared with every consumer credential on file. The system validates consumers and grants them access to their accounts when a match is found. The admin login is represented in Figure 7, and the successful login is represented in Figure 8.

Figure 7: Admin login.

Figure 8: Successful login.


Cloud storage is defined as a cloud computing model that stores data on the Internet through a cloud computing provider that manages and operates the data storage as a service. It is delivered on demand, with just-in-time capacity and cost, and eliminates the need to buy and manage one's own data storage infrastructure. In this work, data are secured in cloud computing by relying on a distributed system: after reaching the cloud, the data can be stored randomly on one or more servers and, according to the characteristics of the storage mode, every server can be abstracted as a storage node of the distributed system. The data storage details are shown in Figure 9(a–c).

Figure 9: (a–c) Data storage details.
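How a file's blocks could be distributed over such storage nodes is illustrated by the toy sketch below, which stripes the blocks round-robin and keeps a single XOR parity block. This is only a simplified stand-in, under assumptions, for the erasure coding used by the scheme; the class and the single-parity layout are not the paper's implementation.

```java
import java.util.ArrayList;
import java.util.List;

/** Toy sketch: stripe data blocks across storage nodes and keep one XOR parity block for redundancy. */
public class StripedStorage {

    /** Distribute the blocks round-robin over the data nodes; the last node receives the XOR parity. */
    public static List<List<byte[]>> distribute(byte[][] blocks, int dataNodes) {
        List<List<byte[]>> nodes = new ArrayList<>();
        for (int n = 0; n <= dataNodes; n++) nodes.add(new ArrayList<>());

        byte[] parity = new byte[blocks[0].length];
        for (int b = 0; b < blocks.length; b++) {
            nodes.get(b % dataNodes).add(blocks[b]);
            for (int i = 0; i < parity.length; i++) parity[i] ^= blocks[b][i];  // running XOR parity
        }
        nodes.get(dataNodes).add(parity);  // the parity node allows recovery of any single lost block
        return nodes;
    }

    public static void main(String[] args) {
        byte[][] blocks = {
            "AAAA".getBytes(), "BBBB".getBytes(), "CCCC".getBytes(), "DDDD".getBytes()
        };
        List<List<byte[]>> placement = distribute(blocks, 2);
        for (int n = 0; n < placement.size(); n++) {
            System.out.println("Node " + n + " holds " + placement.get(n).size() + " block(s)");
        }
    }
}
```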
The intention of this method is to simultaneously attain certain targeted performance goals that can assure the data security needed by specific data users, such as financial practitioners and auditing professionals. Protection against internal threats: the method aims to achieve high-level safety of the stored data by separating the data across different cloud servers, so that internal threats cannot misuse the data or retrieve information from the data saved on any single server; during transmission, the data must be encrypted. Higher-proficiency data processing: the system avoids high communication and computation overhead in order to lessen the latency.

Table 1 shows the validation and verification results of the proposed model's resilience to the failures and attacks. The proposed approach achieves good performance in terms of attack level rate, security level rate, attack detection rate and classification accuracy.

Performance analysis

The performance metrics, such as attack level rate, security level rate and classification accuracy for Byzantine failure and the malicious data modification attack, are calculated for the proposed approach.


Table 1. Validation and verification results of the proposed model of resilience to the failures and attacks.

Performance metrics            Byzantine failure   Malicious data modification attack   Even server colluding attacks
Attack level rate              5.6                 7.5                                   6.3
Security level rate            91.5                89.6                                  90.8
Classification accuracy (%)    95.7                94                                    97.6
Attack detection rate          0.91                0.896                                 0.935

The efficiency of the proposed model is compared with existing models, namely ECC with Blowfish (ECC-BF) (Chinnasamy et al., 2021), the Autoencoder with Softmax Regression Algorithm (AE-SRA) (Abdullayeva, 2021), the Optimization-based Deep Network (ODN) (Velliangiri et al., 2021), SDE with SPIaaS (Shaikh and Meshram, 2021) and the Decentralized Self-auditing Scheme with Errors Localization (DS-EL) (Su et al., 2021).

Classification accuracy

The classification accuracy of the proposed method is calculated using eqn (4):

$Acc = \dfrac{T_p + T_n}{T_p + T_n + F_p + F_n}$  (4)

where $T_p$ is the count of attacks classified as attacks, $T_n$ is the count of normal samples classified as normal, $F_p$ is the count of normal samples classified as attacks and $F_n$ is the count of attacks classified as normal.

Attack level rate

The attack level rate is defined as the proportion of malicious nodes or devices present in the network relative to the total number of nodes in the network, calculated using eqn (5):

$AL = \dfrac{M_n}{T_n - M_n} \times 100$  (5)

where $M_n$ is the total number of malicious nodes present in the network and $T_n$ is the total number of nodes present in the network.

Security level rate

The security level rate is the rate of attack detection by the proposed method, defined as the ratio of correctly detected attacks in the network and calculated using eqn (6):

$Sl = \dfrac{T_p}{T_p + F_n}$  (6)

Comparative analysis of performance metrics

The performance metrics, namely classification accuracy, attack level rate and security level rate, of the proposed model are compared with those of the existing ECC-BF, AE-SRA, ODN, SDE-SPIaaS and DS-EL models. The performance analyses of the proposed model against the existing approaches are detailed in Table 2.

Here, the attack level rate of the proposed DS with EDDS method is 65%, 53.5%, 62%, 51.4% and 63.5% lower than the existing ECC-BF, AE-SRA, ODN, SDE-SPIaaS and DS-EL methods for Byzantine failure. Also, the attack level rate of the proposed DS with EDDS method is 53.5%, 48%, 68.4%, 45.6% and 63.6% lower than the existing methods for the malicious data modification attack. Moreover, the attack level rate of the proposed DS with EDDS method is 61%, 52%, 65%, 55% and 63.6% lower than the existing methods for even server colluding attacks.

Table 2. Performance analysis. ECC-BF (Chinnasamy et al., 2021), AE-SRA (Abdullayeva, 2021), ODN (Velliangiri et al., 2021), SDE with SPIaaS (Shaikh and Meshram, 2021), DS-EL (Su et al., 2021), DS with EDDS (proposed).

Performance metrics           Attack types                          ECC-BF   AE-SRA   ODN    SDE with SPIaaS   DS-EL   DS with EDDS
Attack level rate             Byzantine failure                     25       16.5     23.6   19                23      5.6
                              Malicious data modification attack    23.6     18.95    26.7   21.5              25      7.5
                              Even server colluding attacks         19.6     15.8     25.7   18.7              21.8    6.3
Security level rate           Byzantine failure                     73.5     68.4     55.6   58.9              76.8    91.5
                              Malicious data modification attack    78.6     75.4     59.3   55.7              68.04   89.6
                              Even server colluding attacks         71.65    71.54    63.6   53.9              73.8    90.8
Classification accuracy (%)   Byzantine failure                     73       85.6     60.8   78                83      95.7
                              Malicious data modification attack    75.7     81.5     70.2   76.5              82      94
                              Even server colluding attacks         65.4     78.7     73.5   68.5              79      97.6
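As an illustration only of how the metrics defined in eqns (4)–(6) are obtained from raw detection counts, a short sketch follows; the counts in the example are made-up demo values, not the results reported in Table 2.

```java
/** Computes the evaluation metrics of eqns (4)-(6) from raw detection counts. */
public class EvaluationMetrics {

    /** Eqn (4): classification accuracy, expressed here as a percentage. */
    static double accuracy(int tp, int tn, int fp, int fn) {
        return 100.0 * (tp + tn) / (tp + tn + fp + fn);
    }

    /** Eqn (5): attack level rate, from the malicious node count Mn and the total node count Tn. */
    static double attackLevelRate(int maliciousNodes, int totalNodes) {
        return 100.0 * maliciousNodes / (totalNodes - maliciousNodes);
    }

    /** Eqn (6): security level rate (detection rate of the attacks), scaled here to a percentage. */
    static double securityLevelRate(int tp, int fn) {
        return 100.0 * tp / (tp + fn);
    }

    public static void main(String[] args) {
        // Made-up demo counts, only to show how the formulas are applied.
        System.out.printf("Accuracy            = %.1f%%%n", accuracy(90, 95, 5, 10));
        System.out.printf("Attack level rate   = %.1f%%%n", attackLevelRate(6, 106));
        System.out.printf("Security level rate = %.1f%%%n", securityLevelRate(90, 10));
    }
}
```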

Subsequently, the security level rate of the proposed DS with EDDS method is 26%, 33%, 48%, 45% and 32% higher than the existing ECC-BF, AE-SRA, ODN, SDE-SPIaaS and DS-EL methods for Byzantine failure. Also, the security level rate of the proposed DS with EDDS method is 29%, 18%, 47.5%, 54% and 32.8% higher than the existing methods for the malicious data modification attack. Moreover, the security level rate of the proposed DS with EDDS method is 28%, 33%, 17.5%, 43% and 23.8% higher than the existing methods for even server colluding attacks.

Additionally, the classification accuracy of the proposed DS with EDDS system is 44%, 10%, 53%, 32% and 12.4% higher than the existing ECC-BF, AE-SRA, ODN, SDE-SPIaaS and DS-EL methods for Byzantine failure. Also, the classification accuracy of the proposed DS with EDDS method is 32%, 15%, 33.4%, 28% and 13.7% higher than the existing methods for the malicious data modification attack. Moreover, the classification accuracy of the proposed DS with EDDS method is 45.7%, 32%, 38.6%, 42.4% and 30% higher than the existing methods for even server colluding attacks.

Conclusion

In this manuscript, the issue of data safety in cloud data storage, which is essentially distributed storage, is studied. To ensure the correctness of user data in cloud data storage, a well-organized and flexible distributed scheme with explicit dynamic data support, including block update, delete and append, is introduced. An erasure-correcting code is used in the file distribution preparation to provide redundancy parity vectors and ensure data reliability. By using the homomorphic token with distributed verification of erasure-coded data, the scheme achieves the integration of storage correctness insurance and data error localization, so that misbehaving servers can be identified whenever data corruption is noticed during the storage correctness verification. A comprehensive safety and performance analysis shows that the system is extremely well organized and resistant to Byzantine flaws, malicious data modification attacks and even server collusion attacks. The security of data storage in cloud computing is an enormously significant and challenging area that is still at an early stage, and many research problems remain to be recognized. Several possible directions for future investigation exist in this area. Public verifiability allows a TPA to check cloud data storage when the users lack the time, feasibility or resources to do so; an attractive question is whether a scheme can be constructed that fulfils both public verifiability and dynamic data storage with the right guarantees. In addition, data error localization for dynamic cloud data programs deserves further investigation.

Literature Cited

Abdullayeva, F. J. 2021. Advanced Persistent Threat attack detection method in cloud computing based on autoencoder and softmax regression algorithm. Array 10: 100067.

Aissaoui, K., Amane, M., Berrada, M. and Madani, M. A. 2022. A new framework to secure cloud based e-learning systems. In Bennani, S., Lakhrissi, Y., Khaissidi, G., Mansouri, A. and Khamlichi, Y. (Eds), WITS 2020. Springer, Singapore, pp. 65–75.

Alashhab, Z. R., Anbar, M., Singh, M. M., Leau, Y. B., Al-Sai, Z. A. and Alhayja'a, S. A. 2021. Impact of coronavirus pandemic crisis on technologies and cloud computing applications. Journal of Electronic Science and Technology 19(1): 100059.

Babu, M. C. and Senthilkumar, K. 2021. Machine learning based strategies for secure cloud. Materials Today: Proceedings.

Bello, S. A., Oyedele, L. O., Akinade, O. O., Bilal, M., Delgado, J. M. D., Akanbi, L. A., Ajayi, A. O. and Owolabi, H. A. 2021. Cloud computing in construction industry: use cases, benefits and challenges. Automation in Construction 122: 103441.

Chinnasamy, P., Padmavathi, S., Swathy, R. and Rakesh, S. 2021. Efficient data security using hybrid cryptography on cloud computing. In Ranganathan, G., Chen, J. and Rocha, Á. (Eds), Inventive Communication and Computational Technologies. Springer, Singapore, pp. 537–547.

Hu, Q., Wu, F., Wong, R. K., Millham, R. C. and Fiaidhi, J. 2021. A novel indoor localization system using machine learning based on bluetooth low energy with cloud computing. Computing 2021: 1–27.

Jayashri, N. and Kalaiselvi, K. 2021. Collaborative approaches for security of cloud and knowledge management systems: benefits and risks. In Bhardwaj, A. and Sapra, V. (Eds), Security Incidents & Response Against Cyber Attacks. Springer, Cham, pp. 57–64.

Koushik, S. and Patil, A. P. 2022. Exploring live cloud migration on Amazon S3 instance using inter cloud framework. In Gandhi, T. K., Konar, D., Sen, B. and Sharma, K. (Eds), Advanced Computational Paradigms and Hybrid Intelligent Computing. Springer, Singapore, pp. 221–229.

Kumar, L. and Kumar, P. 2021. Amazon EC2 (Elastic Compute Cloud) overview. In Singh Mer, K. K., Semwal, V. B., Bijalwan, V. and Crespo, R. G. (Eds), Proceedings of Integrated Intelligence Enable Networks and Computing. Springer, Singapore, pp. 543–552.


Mishra, P., Aggarwal, P., Vidyarthi, A., Singh, P., Khan, B., Alhelou, H. H. and Siano, P. 2021. VM Shield: memory introspection-based malware detection to secure cloud-based services against stealthy attacks. IEEE Transactions on Industrial Informatics 17(10): 6754–6764.

Mythili, S., Thiyagarajah, K., Rajesh, P. and Shajin, F. H. 2020. Ideal position and size selection of unified power flow controllers (UPFCs) to upgrade the dynamic stability of systems: an antlion optimiser and invasive weed optimisation algorithm. HKIE Transactions 27(1): 25–37.

Ogwel, B., Odhiambo-Otieno, G., Otieno, G., Abila, J. and Omore, R. 2021. Leveraging cloud computing for improved health service delivery: findings from public health facilities in Kisumu County, Western Kenya, 2019. Learning Health Systems, e10276.

Rajesh, P. and Shajin, F. A. 2020. Multi-objective hybrid algorithm for planning electrical distribution system. European Journal of Electrical Engineering 22: 224–509.

Shaikh, A. H. and Meshram, B. B. 2021. Security issues in cloud computing. In Balas, V. E., Semwal, V. B., Khandare, A. and Patil, M. (Eds), Intelligent Computing and Networking. Springer, Singapore, pp. 63–77.

Shajin, F. H. and Rajesh, P. 2020. Trusted secure geographic routing protocol: outsider attack detection in mobile ad hoc networks by adopting trusted secure geographic routing protocol. International Journal of Pervasive Computing and Communications 1(1): 1.

Su, Y., Li, Y., Yang, B. and Ding, Y. 2021. Decentralized self-auditing scheme with errors localization for multi-cloud storage. IEEE Transactions on Dependable and Secure Computing 1(1): 1.

Thota, M. K., Shajin, F. H. and Rajesh, P. 2020. Survey on software defect prediction techniques. International Journal of Applied Science and Engineering 17(4): 331–344.

Velliangiri, S., Karthikeyan, P. and Vinoth Kumar, V. 2021. Detection of distributed denial of service attack in cloud computing using the optimization-based deep networks. Journal of Experimental & Theoretical Artificial Intelligence 33(3): 405–424.
