VISVESVARAYA TECHNOLOGICAL UNIVERSITY
“Jnana Sangama”, Belagavi-590018, Karnataka

Technical Seminar Report
on
“Edge Computing: Classification, Applications, and Challenges”
Submitted in partial fulfillment of the requirements for the award of the degree of

Bachelor of Engineering
in
ELECTRONICS AND COMMUNICATION ENGINEERING
For the Academic Year 2023-24
Submitted By
KUSUMA M : 4AD20EC032

Under the Guidance of


Mr. Chandra Shekar P
Asst. Professor
Department of ECE

ATME College of Engineering, Mysuru
13th Kilometer, Mysuru – Kanakapura – Bangalore Road,
Mysuru – 570 028, Karnataka
Phone: +91-821-25 93 335

DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING

CERTIFICATE
Certified that the Technical Seminar entitled “Edge Computing: Classification,
Applications, and Challenges” has been carried out by KUSUMA M [4AD20EC032], a bonafide
student of ATME College of Engineering, Mysuru, in partial fulfillment for the award
of the degree of Bachelor of Engineering in Electronics and Communication Engineering of the
Visvesvaraya Technological University, Belagavi, during the year 2023-24. It is
certified that all corrections/suggestions indicated for Internal Assessment have been
incorporated in the report deposited in the departmental library.
The seminar report has been approved as it satisfies the academic requirements in respect
of the seminar work prescribed for the said degree.

Name & Signature of Guide                    Name & Signature of HOD
Seminar Viva-Voce

Name of the Examiners Signature with date


1.
2.
DEPARTMENT VISION AND MISSION

Vision
• To develop highly skilled and globally competent professionals in the field of
Electronics and Communication Engineering to meet industrial and social
requirements with ethical responsibility.

Mission
• To provide state-of-the-art technical education in Electronics and Communication at
undergraduate and postgraduate levels, to meet the needs of the profession and
society and to achieve excellence in teaching-learning and research.
• To develop talented and committed human resources by providing opportunities
for innovation, creativity and entrepreneurial leadership with high standards of
professional ethics, transparency and accountability.
• To function collaboratively with technical institutes, universities and industries,
offer opportunities for interaction among faculty and students, and promote networking
with alumni, industries and other stakeholders.
Program Specific Outcomes (PSOs)

At the end of graduation, the student will be able to:

• Understand and adopt technological advancements, using modern tools to analyze and
design embedded systems or processes for a variety of applications.
• Work effectively in a group as an independent visionary, team member and leader,
having the ability to understand requirements and develop feasible solutions, so as to
emerge as a potential core or electronics engineer.
ACKNOWLEDGEMENT
We would like to express our immense gratitude to Dr. L Basavaraj, Principal,
ATMECE, Mysuru, for his timely help and inspiration during the tenure of the course.

We would like to express our deep gratitude to Dr. L Basavaraj, Professor and
Head, Department of Electronics and Communication Engineering, ATMECE, Mysuru, for
his timely co-operation while carrying out this work. His friendliness made us learn
more.

We would like to express our sincere thanks to the guide, Mr. Chandra
Shekar P, Assistant Professor, Department of Electronics and Communication
Engineering, ATMECE, Mysuru, for his guidance, encouragement and suggestions that
helped us a lot in the completion of this seminar.

We also extend our sincere thanks to the Technical Seminar coordinators, Dr.
Prakash Kuravatti and Mr. Girish M, and all the faculty members of the Department of
Electronics and Communication Engineering, ATMECE, Mysuru, who have encouraged us
throughout the course.

Last but not least, we express our heartfelt gratitude to the Almighty and our parents
for their love and blessings that helped us complete this work successfully.
ABSTRACT
Edge computing is a relatively recent development in the computing world that brings
cloud computing services closer to the end user and is distinguished by fast processing and
short application response times. This leads to many advantages, such as faster and more
efficient data processing, improved safety, and reduced load on existing networks. Moreover,
delay-sensitive applications may benefit from the Edge computing paradigm's low latency,
agility, and location awareness. Significant research has been conducted in the field of Edge
computing; it is reviewed here in terms of recent technologies such as Mobile Edge
Computing, Cloudlet, and Fog computing, allowing a better understanding of current
and potential solutions.
TABLE OF CONTENTS

Chapter No.   Title                             Page No.

Chapter 1     INTRODUCTION
              1.1 Overview                      04
              1.2 Related Work                  06

Chapter 2     LITERATURE REVIEW
              2.1 Survey Papers                 07

Chapter 3     METHODOLOGY
              3.1 Architecture                  09
              3.2 Classification                10
              3.3 Challenges                    14

Chapter 4     CASE STUDIES
              4.1 Application                   16
              4.2 Advantages                    17
              4.3 Disadvantages                 18

              CONCLUSION                        19
              REFERENCES                        20
LIST OF FIGURES
Fig no. Description Page no.
1.1 Edge Computing Paradigm 04
3.1 Architecture of Edge Computing 09
3.2 Classification of Edge Computing 10

CHAPTER 1
INTRODUCTION
With the escalation of the Internet of Things (IoT), cloud computing, which is a collection of
networks, has tremendously changed the way we live and work. Traditional, centralized cloud
computing is facing severe challenges, such as high latency in real-time applications, low
spectral efficiency (SE), and non-adaptive machine-type communication [1]. To address these
difficulties, a new technology is driving a pattern that moves the capacity of centralized cloud
computing to the edge devices of the network. Edge computing is also recognized as edge
processing. It is a network communication technique that distributes the load on the system by
placing a large number of servers near users and devices; in simple words, "putting servers at
the edge of the network, closer to the device". In other words, as an alternative to storing and
processing all the data in the cloud, a fragment of it is handled by an administering platform
near the terminal. This makes it possible to greatly reduce traffic on the Internet and to cut
communication delays. There are various types of network techniques; in addition to edge
computing, there are "cloud computing" and "fog computing", and all of these computing
techniques process data differently, so let us see how they differ from edge computing [1].
Cloud computing is named a "cloud" because it remotely processes data in a location away
from the user via the Internet. As shown in Fig. 1.1, in cloud computing, services such as
servers, storage, databases, and applications are all located in the cloud, so you can access
them on any device if you have the Internet [2]. Besides, there is no need to prepare a server
or use a data center, which can significantly reduce costs. It is possible to store a huge amount
of information in the cloud, but on the other hand, there is a concern that data processing will
take time.
1.1 Overview
Edge computing is a relatively recent development in the computing world that brings cloud
computing services closer to the end user and is distinguished by fast processing and short
application response times. This leads to many advantages, such as faster and more efficient
data processing, improved safety, and reduced load on existing networks. Moreover, delay-
sensitive applications may benefit from the Edge computing paradigm's low latency, agility,
and location awareness. Significant research has been conducted in the field of Edge
computing, which is reviewed in terms of recent technologies such as Mobile Edge Computing,
Cloudlet, and Fog computing, allowing researchers to gain a better understanding of current
and potential solutions. This article summarizes the edge computing paradigm's classification,
applications, and various challenges in detail.
One of the primary advantages of edge computing is its ability to reduce latency. By
processing data closer to its source, edge computing minimizes the time it takes for data to
travel back and forth between devices and centralized data centres. This results in faster
response times, making it ideal for applications that require instantaneous feedback, such as
autonomous vehicles, remote surgery, and industrial automation.
Furthermore, edge computing enhances bandwidth efficiency by reducing the amount of data
that needs to be transmitted over networks. Instead of sending raw data to centralized servers
for processing, edge devices can preprocess and filter data locally, sending only relevant
information to the cloud. This not only conserves bandwidth but also reduces network
congestion and costs, particularly in environments with limited network connectivity.
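To make the filtering idea concrete, here is a minimal Python sketch of edge-side preprocessing; the sensor readings, threshold value, and send_to_cloud upload stub are hypothetical placeholders and not part of any particular platform.

```python
# Minimal sketch of edge-side filtering: preprocess locally, upload only
# the relevant fraction.  All values and names below are assumptions.

TEMPERATURE_THRESHOLD = 75.0  # degrees Celsius; application-specific assumption

def preprocess(readings):
    """Keep only readings that are worth sending upstream."""
    return [r for r in readings if r["temperature"] > TEMPERATURE_THRESHOLD]

def send_to_cloud(batch):
    # Placeholder for a real HTTPS/MQTT upload to a central service.
    print(f"uploading {len(batch)} filtered readings")

raw = [
    {"sensor": "s1", "temperature": 71.2},
    {"sensor": "s2", "temperature": 78.9},
    {"sensor": "s3", "temperature": 80.1},
]

relevant = preprocess(raw)   # runs on the edge node, close to the sensors
if relevant:                 # only the relevant data crosses the network
    send_to_cloud(relevant)
```

In this sketch only two of the three readings leave the edge node, which is the bandwidth-conserving behaviour described above.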
The distributed architecture of edge computing enables scalability and flexibility.
Organizations can deploy edge nodes at various locations to meet the specific requirements of
their applications, whether it's in a factory, a retail store, or a smart city environment. This
scalability allows for the deployment of additional edge nodes as needed, accommodating
changing workloads and expanding computational capabilities on-demand.

Fig 1.1: Edge Computing Paradigm

1.2 Related work


This section reviews edge computing based on the various issues and challenges it faces
across diversified applications. The main aim of this work is to summarize edge computing
and to create awareness of its challenges in the context of modern technologies. Weisong Shi
et al. (2016) explained the definition of edge computing, which addresses concerns of
response time and latency, resource constraints such as battery life, bandwidth cost saving,
as well as data safety and privacy [2]. Fang Liu et al. (2019) summarized the existing edge
computing systems and related tools. The authors divided the paper into two parts: a system
view and an application view. In the system view, open-source edge computing projects and
edge computing systems and tools are discussed, while in the application view, deep learning
optimization at the edge is discussed [3]. Yuan Ai et al. (2018) presented the three major edge
computing technologies: mobile edge computing, cloudlets, and fog computing. The authors
explained application areas, architectures, and standardization efforts for mobile edge
computing, cloudlets, and fog computing [4]. S. Bhattacharyya (2016) described edge
computing as processing the data gathered from end devices at the edge of the network. By
covering a large range of technologies, edge computing addresses concerns such as battery-
life constraints, bandwidth usage, latency, data security, and data privacy. The need for edge
computing (push from cloud services and pull from the Internet of Things) is also discussed
[6]. S. Taherizadeh and V. Stankovski (2017) explained auto-scaling of applications in edge
computing, which maintains online services at decentralized locations. They broadly covered
two aspects: in the first part, they mainly focused on the different types of edge computing
applications (IoT applications, microservice applications, and time-critical applications); for
these applications, the auto-scaling challenges that arise when the workload changes
dynamically are then discussed [7].


CHAPTER 2
LITERATURE REVIEW
2.1 Survey Papers
[1] The paper "Edge Computing: Classification, Applications, and Challenges" presented at
the 2021 2nd International Conference on Intelligent Engineering and Management (ICIEM)
delves into the classification, applications, and challenges of edge computing. The authors
likely provide an overview of different classifications of edge computing architectures, such
as fog computing, mobile edge computing, and cloudlet-based architectures. They may
discuss how these architectures vary in terms of proximity to the end-users or devices,
resource allocation, and scalability.
[2] Shi, W., Cao, J., Zhang, Q., Li, Y., & Xu, L. (2016). Edge Computing: Vision and
Challenges. IEEE Internet of Things Journal, 3(5), 637-646. The paper identifies key
challenges in realizing the vision of edge computing, such as resource constraints, scalability
issues, and security concerns. It highlights the importance of efficient resource allocation
and management at the edge, considering the diverse and dynamic nature of edge
environments. Additionally, the authors discuss the implications of edge computing on
network architectures, emphasizing the need for new networking protocols and standards to
support distributed computing at the edge.
[3] P. Singh, A. Kaur, G. S. Aujla, R. S. Batth and S. Kanhere's paper, "DaaS: Dew Computing
as a Service for Intelligent Intrusion Detection in Edge-of-Things Ecosystem," published in
the IEEE Internet of Things Journal, proposes an innovative approach to intrusion detection in edge-of-things (EoT)
ecosystems using Dew Computing as a Service (DaaS). The authors introduce the concept of
Dew Computing, which extends the capabilities of edge computing by leveraging not only
the edge devices but also the intermediate nodes or "dew nodes" within the network. They
highlight how Dew Computing can enhance the efficiency and reliability of computing
resources in EoT environments, particularly for applications like intrusion detection.
[4] In the paper titled "Edge Computing Technologies for Internet of Things: A Primer,"
published in the journal Digital Communications and Networks in 2018, Ai, Peng, and
Zhang provide an insightful overview of edge computing technologies in the context of the
Internet of Things (IoT). Here's a review of their work: The authors begin by introducing the
concept of edge computing and its significance in addressing the challenges posed by IoT
applications, such as latency, bandwidth constraints, and data privacy concerns. They
highlight how edge computing decentralizes data processing and moves computational tasks
closer to IoT devices, enabling real-time analysis and decision-making.
[5] In the paper titled "Challenges and Opportunities in Edge Computing," presented at the
2016 IEEE International Conference on Smart Cloud, Varghese, Wang, Barbhuiya,
Kilpatrick, and Nikolopoulos provide a comprehensive examination of the key challenges
and opportunities in the emerging field of edge computing. Here's a review of their work:
The authors commence by establishing the context for edge computing, emphasizing its
transformative potential in addressing the limitations of centralized cloud architectures,
particularly in scenarios where low latency, real-time processing, and efficient use of
network resources are critical.
[6] "Auto-scaling Applications in Edge Computing: Taxonomy and Challenges" by S.
Taherizadeh and V. Stankovski provides a structured examination of auto-scaling techniques
tailored to the unique characteristics of edge computing environments. The paper develops
a taxonomy that categorizes auto-scaling approaches and discusses the challenges inherent
in dynamically adjusting resources at the edge. It offers insightful analysis into the
complexities of auto-scaling in edge environments, addressing factors like workload
dynamics, resource constraints, and coordination overhead. The paper's practical
implications and future research directions provide valuable guidance for researchers and
practitioners seeking to optimize auto-scaling mechanisms for edge computing systems.
Overall, the paper serves as a comprehensive resource for understanding and addressing the
challenges of auto-scaling in the context of edge computing.
[7] The paper "Private and Scalable Personal Data Analytics Cloud Deep Learning" by C.
Feature and E. Deep, published in 2018, introduces an innovative approach to personal data
analytics using deep learning within a cloud computing framework. The authors address
privacy concerns while emphasizing scalability. Their focus on leveraging cloud resources
for efficient and secure data analysis presents promising implications for personalized
analytics solutions. However, further details on the methodologies employed and empirical
results would enhance the paper's comprehensiveness and practical applicability.


CHAPTER 3
METHODOLOGY
3.1 Architecture
The general architecture of edge computing is a 3-tier architecture consisting of terminal,
edge, and cloud layers. The terminal layer deals with the end devices and nodes, which act as
both data consumers and data providers. In the terminal layer, devices such as smartphones,
sensors, and wearable devices gather data and provide it to the upper layer for computation.
The second layer, the edge layer, plays an important role as the processing unit of the system.
This layer contains routers, switches, gateways, and access points, which store and compute
the data provided by the end devices. High-performance servers are placed in the third layer,
the cloud layer, which is widely used in globally applicable situations and by organizations
that require large-scale centralized processing. The Edge Computing Industry Alliance (ECC)
released its reference framework 3.0.
The structure of cloud-edge collaboration is generally divided into the terminal layer, the edge
layer, and the cloud computing layer. The following is a brief introduction to the composition
and functions of each layer of the edge computing architecture.
DEVICE LAYER
The device layer consists of all types of devices connected to the edge network, including
mobile terminals and many Internet of Things devices (such as sensors, smartphones, smart
cars, cameras, etc.). In the device layer, the device is not only a data consumer, but also a
data provider. In order to reduce the terminal service delay, only the perception of the various
terminal devices is considered, not the computing power. As a result, hundreds of millions
of devices in the terminal layer collect all kinds of raw data and upload it to the upper layer,
where it is stored and calculated.
EDGE LAYER
The edge layer is the core of the three-tier architecture. It is located at the edge of the network
and consists of edge nodes widely distributed between terminal devices and the cloud. It
usually includes base stations, access points, routers, switches, gateways, etc. The edge layer
supports the access of terminal devices downward, and it stores and computes the data
uploaded by terminal devices; it also connects with the cloud and uploads the processed data
to the cloud [7]. Since the edge layer is close to the user, transmitting data to the edge layer is
better suited to real-time data analysis and intelligent processing, and it is more efficient and
secure than cloud computing.


CLOUD LAYER
Among the federated services of cloud-edge computing, cloud computing is still the most
powerful data processing center. The cloud computing layer consists of a number of high-
performance servers and storage devices with powerful computing and storage capabilities,
and it plays an important role in areas requiring large amounts of data analysis, such as regular
maintenance and business decision support. The cloud computing center can permanently
store the data reported by the edge computing layer, and it can also complete analysis tasks
that the edge computing layer cannot handle, as well as processing tasks that integrate global
information. In addition, the cloud module can dynamically adjust the deployment strategy
and algorithms of the edge computing layer according to the control policy.
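As a rough illustration of the three-layer flow just described (terminal devices produce raw data, the edge layer stores and processes it near the source, and the cloud archives the results), the Python sketch below uses made-up class names and a simple averaging rule; it is not a standard edge-computing API.

```python
# Illustrative sketch of the terminal -> edge -> cloud flow; names and the
# aggregation rule are assumptions made only for this example.

class TerminalDevice:
    """Terminal layer: produces raw readings (data provider)."""
    def sense(self):
        return {"device": "sensor-1", "value": 42.0}

class EdgeNode:
    """Edge layer: stores and processes data close to its source."""
    def __init__(self):
        self.buffer = []
    def ingest(self, reading):
        self.buffer.append(reading)
    def aggregate(self):
        values = [r["value"] for r in self.buffer]
        return {"count": len(values), "mean": sum(values) / len(values)}

class CloudLayer:
    """Cloud layer: permanently stores results and runs global analytics."""
    def archive(self, summary):
        print("permanently storing summary:", summary)

edge, cloud = EdgeNode(), CloudLayer()
for _ in range(3):
    edge.ingest(TerminalDevice().sense())   # terminal devices upload raw data
cloud.archive(edge.aggregate())             # edge forwards processed results
```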

Fig 3.1: Architecture of Edge Computing


3.2 Classification of Edge Computing
Edge computing encompasses various architectures that bring computing closer to the data
source, reducing latency and bandwidth usage. The divisions between mobile edge, cloudlet,
and fog computing are based on their proximity to the end devices and the scale of their
deployment.

These divisions are not always rigid, and there can be overlaps between them. The choice of
architecture depends on factors such as the specific use case, scalability requirements,
resource constraints, and the desired trade-offs between latency, bandwidth, and processing
capabilities.


Fig 3.2: Classification of Edge Computing

3.2.1 Mobile Edge Computing (MEC)


MEC is a distributed computing paradigm that brings computational resources and services
closer to mobile users, typically within the Radio Access Network (RAN) or at the edge of
cellular networks. It aims to address the increasing demand for low-latency, high-bandwidth
services and applications by leveraging edge computing capabilities in close proximity to
mobile devices. Here's an overview of Mobile Edge Computing:
Proximity to Mobile Users: MEC places computing resources, such as servers, storage, and
networking equipment, at the edge of cellular networks, typically within base stations or
access points. This proximity reduces the distance data needs to travel, resulting in lower
latency and improved responsiveness for mobile applications.
Real-time Processing: By moving computation closer to mobile users, MEC enables real-
time processing of data generated by mobile devices. This allows for faster response times
and enhanced user experiences in applications requiring low latency, such as online gaming,
augmented reality, and video streaming.
Network Offloading: MEC can offload computation-intensive tasks from mobile devices
to edge servers, reducing the processing burden on the devices themselves and conserving
battery life. This is particularly beneficial for resource-constrained mobile devices, such as
smartphones and IoT devices.
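The offloading decision can be thought of as comparing the estimated completion time on the device with the transfer-plus-compute time on a MEC server. The sketch below uses assumed, illustrative figures for CPU speeds, task size, and uplink rate; real MEC platforms would also weigh energy use, queuing, and server load.

```python
# Hedged sketch of an offloading decision: run the task locally or on a MEC
# server, whichever is estimated to finish sooner.  All numbers are assumed.

def local_time(cycles, device_speed_hz):
    """Estimated completion time if the task runs on the mobile device."""
    return cycles / device_speed_hz

def edge_time(cycles, server_speed_hz, data_bits, uplink_bps):
    """Estimated completion time if the input is shipped to a MEC server."""
    return data_bits / uplink_bps + cycles / server_speed_hz

task_cycles = 2e9   # CPU cycles the task needs (assumed)
data_bits   = 8e6   # input data to transfer to the edge server (assumed)
device_hz   = 1e9   # 1 GHz handset CPU (assumed)
server_hz   = 8e9   # faster edge server (assumed)
uplink_bps  = 50e6  # 50 Mbit/s radio uplink (assumed)

t_local = local_time(task_cycles, device_hz)
t_edge  = edge_time(task_cycles, server_hz, data_bits, uplink_bps)
print("offload to MEC" if t_edge < t_local else "run locally",
      f"(local {t_local:.2f} s vs edge {t_edge:.2f} s)")
```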
Resource Sharing and Optimization: MEC facilitates resource sharing and optimization
across multiple mobile users and applications. Edge servers can dynamically allocate
resources based on demand, ensuring efficient utilization of computing resources and
improving overall network performance.


Service Placement and Orchestration: MEC platforms enable intelligent service
placement and orchestration, allowing applications to be deployed and scaled dynamically
based on user location, network conditions, and application requirements. This flexibility
enhances the scalability and agility of mobile services.
Enhanced Security and Privacy: MEC provides opportunities for implementing security
and privacy mechanisms closer to mobile users, such as encryption, access control, and data
anonymization. This helps protect sensitive data and mitigate security risks in mobile
applications.
Use Cases: MEC enables a wide range of use cases across various industries, including:
• Augmented Reality (AR) and Virtual Reality (VR): MEC enhances AR/VR experiences by
reducing latency and improving content delivery.
• Internet of Things (IoT): MEC supports IoT applications by enabling real-time data
processing and analysis at the network edge.
• Smart Cities: MEC facilitates smart city initiatives by enabling real-time monitoring,
analytics, and control of city infrastructure and services.
Overall, Mobile Edge Computing (MEC) plays a crucial role in enabling innovative mobile
services and applications, offering low-latency, high-performance computing capabilities at
the edge of cellular networks. As mobile networks continue to evolve towards 5G and
beyond, MEC is expected to become increasingly integral to delivering enhanced mobile
experiences and supporting emerging use cases.

3.2.2 Cloudlet

A cloudlet is a small-scale cloud data center or server located at the network edge, typically
within close proximity to end-users or devices. It serves as an intermediary between mobile
devices or IoT devices and centralized cloud data centers, providing computing resources
and services closer to where they are needed. Here's an overview of cloudlets:
Proximity to End-users: Cloudlets are deployed at the network edge, such as within local
access points, base stations, or data centers near end-users. This proximity reduces latency
and improves response times for applications and services accessed by nearby devices.
Computing Resources: Cloudlets typically consist of servers, storage, and networking
equipment capable of hosting virtualized or containerized applications. They offer
computing resources comparable to larger-scale cloud data centers but on a smaller scale,
catering to localized demand.


Resource Sharing: Cloudlets support resource sharing and multi-tenancy, allowing multiple
users or applications to utilize the same computing infrastructure concurrently. This enables
efficient utilization of resources and scalability to accommodate varying workloads.
Service Offloading: Mobile or IoT devices can offload computation-intensive tasks to
nearby cloudlets, reducing the processing burden on the devices themselves. This is
particularly beneficial for applications requiring real-time processing or data analytics, such
as augmented reality, video streaming, and edge AI.
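A cloudlet is typically chosen by proximity; one simple and purely illustrative selection rule is to probe the round-trip time to each candidate and offload to the closest one. The cloudlet names and the random latency probe below are stand-ins, not a real discovery protocol.

```python
# Sketch of choosing the nearest cloudlet by measured round-trip time.
# The cloudlet list and the probe are placeholders for illustration only.

import random

cloudlets = ["cloudlet-mall", "cloudlet-station", "cloudlet-campus"]

def probe_rtt_ms(name):
    # Stand-in for a real ping/handshake measurement to the cloudlet.
    return random.uniform(2, 30)

measurements = {name: probe_rtt_ms(name) for name in cloudlets}
best = min(measurements, key=measurements.get)
print(f"offloading to {best} ({measurements[best]:.1f} ms round trip)")
```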
Edge Orchestration: Cloudlets can be orchestrated to dynamically deploy and manage
applications based on factors such as device location, network conditions, and user
preferences. This flexibility enables adaptive and responsive computing environments
tailored to specific use cases and requirements
3.2.3 Fog Computing
Fog computing is a distributed computing paradigm that extends cloud computing
capabilities to the edge of the network, closer to where data is generated and consumed. It
aims to address the limitations of centralized cloud architectures by bringing computing
resources, storage, and applications closer to end-users and devices. Here's an overview of
fog computing:
Proximity to End-users and Devices: Fog computing places computing resources at the
network edge, typically within the proximity of end-users or IoT devices. This proximity
reduces latency and improves response times for applications and services accessed by
nearby devices.
Hierarchical Architecture: Fog computing often adopts a hierarchical architecture, with
multiple layers of computing resources distributed across the network. This includes edge
devices, fog nodes (intermediate computing devices), and centralized cloud data centers.
Data and processing tasks can be distributed across these layers based on factors such as
latency requirements, resource availability, and application demands.
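One way to picture this hierarchical placement decision is a simple rule that maps each task's latency budget to a layer. The thresholds and task list below are illustrative assumptions, not values taken from any standard.

```python
# Sketch of hierarchical placement: latency-critical tasks stay on the edge
# device, moderate ones go to a fog node, the rest to the cloud.

def place(task):
    """Map a task's latency budget (ms) to a layer of the hierarchy."""
    if task["max_latency_ms"] <= 10:
        return "edge device"
    if task["max_latency_ms"] <= 100:
        return "fog node"
    return "cloud data center"

tasks = [
    {"name": "brake-control",   "max_latency_ms": 5},
    {"name": "video-analytics", "max_latency_ms": 80},
    {"name": "monthly-report",  "max_latency_ms": 60000},
]
for t in tasks:
    print(f"{t['name']:>15} -> {place(t)}")
```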
Real-time Processing: Fog computing enables real-time processing and analysis of data at
the edge of the network, allowing for faster decision-making and response to events. This is
particularly important for latency-sensitive applications such as industrial automation, smart
cities, and autonomous vehicles.
Scalability and Flexibility: Fog computing supports scalable and flexible deployment
models, allowing computing resources to be dynamically provisioned and scaled based on
demand. This enables efficient resource utilization and accommodates varying workloads
across distributed environments.


Resource Sharing and Optimization: Fog computing facilitates resource sharing and
optimization across multiple edge devices and fog nodes. Tasks can be offloaded from
resource-constrained devices to more powerful fog nodes, improving overall system
performance and efficiency.
Security and Privacy: Fog computing introduces new security and privacy challenges, such
as securing distributed edge nodes, ensuring data integrity, and managing access control.
However, it also enables localized security measures and privacy-enhancing techniques,
such as data encryption and anonymization, closer to where data is generated.
3.3 Challenges of Edge Computing
Edge computing is still in its infancy. The major challenges in Edge Computing have been
classified as:
Bandwidth: The more data there is at the network edge, the more bandwidth is required, and
the available bandwidth must be balanced across the network. From the latency point of view,
high bandwidth can reduce network transmission time, especially for huge amounts of data.
In the cloud model, higher bandwidth is provided at the data centers and lower bandwidth at
the edge, so providing higher bandwidth to end devices is a challenge for the edge network.
If the workload can be handled at the edge, the delay can be improved and bandwidth will
also be saved [6].
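To put rough numbers on this point, the sketch below compares the hourly upload time for raw versus edge-filtered data over an assumed backhaul link; the data volumes and link speed are illustrative assumptions only.

```python
# Rough illustration of the bandwidth argument: filtering at the edge shrinks
# both the transmission time and the traffic placed on the backhaul link.

raw_bytes      = 500e6   # raw data produced per hour (assumed)
filtered_bytes = 5e6     # data worth sending after edge filtering (assumed)
uplink_bps     = 20e6    # 20 Mbit/s backhaul uplink (assumed)

def transfer_seconds(n_bytes, bps):
    return n_bytes * 8 / bps

print(f"raw upload:      {transfer_seconds(raw_bytes, uplink_bps):6.1f} s per hour")
print(f"filtered upload: {transfer_seconds(filtered_bytes, uplink_bps):6.1f} s per hour")
```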
Multi-domain Management and Cooperation: Multiple types of resources are involved in
edge computing, and these resources belong to various owners. How to access all the
resources required by an application and allocate them according to the needs of the user's
applications and services, particularly in emergencies, is a problem that mobile edge
computing systems need to consider. In the initial phase, systems considered functionality
rather than implementation issues such as the pricing model. As edge computing matures,
these deployment concerns ought to be resolved before all of its advantages can be enjoyed.
Security and Privacy: Security plays an important role in every field, as every user is
concerned about the privacy of their confidential information. While working in a network,
various attacks can be performed on the edge network. The following are different types of
security threats:
• Phishing - An attempt to extract sensitive or confidential information from the user of
cloud connected IoT equipment. Cybercriminals use this method to distribute electronic
content to a wide range of victims, prompting them to take specific actions, such as clicking
a link or sending a reply to an email.


• Malware - Malicious software specifically designed to harm an edge device, system, or data.
It includes several types of tools, including adware, ransomware, Trojans, viruses, and worms.
• Rootkit - Malware that enables secretly accessing parts of an edge device, software, or
system. It often modifies the host operating system in such a way that it remains hidden
from the user.
• Trojan - A form of non-replicating malware that contains hidden functionality. A Trojan
usually does not try to propagate or inject itself into other files.
• Backdoor - Malware targeting remote entry into a system, device, edge device, or software,
or bypassing the traditional security measures installed in them.
• Spyware - Malware that spies on an edge device user, intercepting keystrokes, emails, and
documents, and even turning on a video camera without the user's knowledge.
Uncompromising Quality of Service (Latency): Locating compute at the edge is one of the
parameters that provides QoS. In edge computing, the focus is on performing the computation
as close as possible to where the data is collected. For checking network performance, latency
is usually the vital QoS metric, and it is not determined by computation time alone. The
computation of the workload should be performed at the closer layer, and this layer must also
have enough computation capability to do the calculations at the edge of the network.
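The point that latency is not determined by computation time alone can be sketched as the sum of round-trip network delay and processing time at whichever layer runs the workload. The delays and processor speeds below are assumptions chosen only to illustrate the trade-off.

```python
# End-to-end latency = round-trip network delay + processing time at the
# chosen layer.  All figures here are illustrative assumptions.

workload_ops = 1e8   # operations the workload needs (assumed)

layers = {
    # layer: (one-way network delay in ms, effective compute speed in ops/s)
    "edge":  (2,  1e9),
    "cloud": (60, 20e9),
}

for name, (net_ms, ops_per_s) in layers.items():
    total_ms = 2 * net_ms + workload_ops / ops_per_s * 1000
    print(f"{name:>5}: {total_ms:6.1f} ms end-to-end")

# For this small workload the edge wins; for much heavier workloads the
# cloud's faster processors can outweigh its network delay, which is why the
# edge layer must have enough computation capability of its own.
```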
Partition and Offloading of Tasks (Resource Allocation): Workload allocation is not an easy
task. Distributed computing should consider location as an additional aspect of computation.
Resource mobility defines how to dynamically discover resources in the network and manage
all the required available resources, including long-term and short-term resources. Also, in
some situations a few edge devices or end devices are damaged, and the challenge then is how
the system can be resumed as soon as possible with the best replacements [6]. At the edge,
the resources are constrained, having smaller processors and a limited power budget; they are
also heterogeneous, with processors of different architectures (data flow, control, and tenancy),
and the workload needs to change dynamically.
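A very simplified view of partitioning and offloading is a greedy assignment of tasks to whichever edge node still has spare capacity, falling back to the cloud when nothing fits. The node capacities and task demands below are invented for illustration; practical schedulers also weigh latency, mobility, and energy budgets.

```python
# Greedy first-fit sketch of allocating tasks to heterogeneous, resource-
# constrained edge nodes.  Capacities and demands are made-up values.

nodes = {"gateway": 4.0, "router": 2.0, "micro-server": 8.0}   # free compute units
tasks = [("decode", 3.0), ("detect", 5.0), ("log", 1.0), ("encrypt", 2.5)]

assignment = {}
for name, demand in sorted(tasks, key=lambda t: -t[1]):        # largest first
    # candidate nodes that still have enough spare capacity for this task
    candidates = [n for n, free in nodes.items() if free >= demand]
    if not candidates:
        assignment[name] = "offload to cloud"                  # no edge node fits
        continue
    chosen = max(candidates, key=nodes.get)                    # most spare capacity
    nodes[chosen] -= demand
    assignment[name] = chosen

print(assignment)
```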


CHAPTER 4
CASE STUDIES
4.1 Application
It is proposed to bring some cloud services closer to the level at which the data is found, as
well as to use multiple levels according to the needs of different applications. Due to the
multiple benefits of edge computing, such as speed, security, scalability, versatility, and
reliability, edge computing is used in many real-time applications. Basically, edge computing
pushes applications and data away from centralized hubs to the logical extremes of a network.
We use edge computing in real-time applications because better responsiveness and
robustness make such systems more effective. Edge computing is a distributed kind of system
in which large volumes of data are processed and compiled. This distributed nature enables
features such as supply-chain tracking, point-of-sale systems, and distributed artificial
intelligence processing.
Edge computing and smart home devices: Increasingly common appliances, such as
thermostats and smart speakers, use edge computing technology to respond quickly to
changing conditions and controls. Smart cities, private homes, and augmented reality can also
benefit from fast, localized data processing. The same principle works for any connected
device, with the processing center either directly on the device or in a nearby location.
Hypothetically, a system can use both the cloud, for less time-sensitive processing, and the
edge, for more urgent requirements. Edge computing also needs dedicated hardware for edge
nodes, and these edge nodes ultimately provide the services.
Edge computing in business: An example of a business use is a retail store with a collection
of networked security cameras. Using cloud-based technology, each camera sends data to a
server, which may be thousands of kilometers away, that identifies movement and saves the
relevant clips to the store's data. With edge computing, the processing unit is part of the
camera itself, deciding which clips should be saved and which should not, and transferring
only the relevant clips to the server. This procedure reduces the load on the store's computing
power, allows more efficient operation, and protects other company data from possible
security attacks. Edge computing can be used in a variety of industries beyond retail,
including manufacturing, energy, utilities, transportation, defense, healthcare, and the media.
Edge computing for 5G technology: Edge computing has a big role to play in 5G wireless
networks, which will be able to host several times the number of connected devices in use at
any one time. 5G networks will be able to transmit large amounts of data hundreds of times
faster than 4G, will incorporate several times the number of cell stations, and will reduce
latency. Telecommunications providers deploying 5G will use local data centers on or near
5G towers, giving the network an extra boost.
Edge computing for smart technology: Experts consider edge computing to be the next
great thing in technology, one that will transform the way we live. Several types of equipment
fall under the edge computing umbrella, including 5G cell stations, data center infrastructure,
and applications that analyze data faster. The number of sensors associated with cities, homes,
and smart vehicles is expected to increase exponentially in the coming years. 5G wireless
networks, supported by edge computing, will be able to manage all the data generated by
these devices in ways that 4G cannot.
4.2 Advantages
Edge computing offers several advantages over traditional centralized computing models,
including:
Low Latency: By processing data closer to where it is generated, edge computing reduces
the distance data needs to travel, resulting in lower latency. This is crucial for applications
that require real-time processing, such as autonomous vehicles, industrial automation, and
augmented reality.
Improved Performance: Edge computing enhances the performance of applications and
services by reducing the reliance on distant data centers. Tasks can be executed locally,
leading to faster response times and improved user experiences.
Bandwidth Optimization: Edge computing minimizes the need to transmit large volumes
of data over the network to centralized data centers. By processing data locally and
transmitting only relevant information, edge computing reduces bandwidth usage and
network congestion.
Enhanced Privacy and Security: Edge computing allows sensitive data to be processed
and stored locally, rather than transmitting it over the network to centralized servers. This
enhances privacy and security by reducing the risk of data breaches and unauthorized access
during transmission.
Offline Operation: Edge computing enables devices to continue functioning even when
disconnected from the network or in areas with limited connectivity. Local processing
capabilities allow devices to perform critical tasks autonomously, ensuring uninterrupted
operation.
Scalability: Edge computing architectures are highly scalable, allowing for the deployment
of additional edge nodes as needed to accommodate increasing workloads or changing
requirements. This scalability enables organizations to efficiently manage resources and
adapt to evolving demands.
Redundancy and Reliability: Edge computing distributes computing resources across
multiple edge nodes, reducing the risk of single points of failure and improving system
reliability. Redundancy measures can be implemented at the edge to ensure continuous
operation and fault tolerance.
Real-time Insights: Edge computing enables real-time analytics and decision-making by
processing data immediately as it is generated. This allows organizations to extract valuable
insights from data streams in real-time, facilitating faster decision-making and response to
events.
Cost Efficiency: Edge computing can lead to cost savings by reducing the need for
expensive network bandwidth and centralized infrastructure. Local processing and storage
capabilities reduce reliance on cloud resources, resulting in lower operational costs for data
transmission and storage.
Overall, edge computing offers numerous advantages that make it well-suited for a wide
range of applications and industries. By bringing computation closer to where data is
generated, edge computing enables faster, more efficient, and more secure processing of
data, driving innovation and enabling new use cases in the digital landscape.
4.3 Disadvantages
While edge computing offers numerous benefits, it also comes with several disadvantages
and challenges:
Limited Processing Power: Edge devices often have limited processing capabilities
compared to centralized data centers. This can restrict the complexity and scale of
applications that can be deployed at the edge, leading to performance limitations for certain
workloads.
Network Connectivity Issues: Edge devices rely on network connectivity to communicate
with each other and with centralized data centers. Poor network connectivity, intermittent
connections, or network outages can disrupt communication and data transfer, affecting the
reliability and availability of edge services.
Security Risks: Edge computing introduces new security challenges, including an increased
attack surface, distributed security management, and potential vulnerabilities in edge devices
and networks. Securing distributed edge environments requires robust authentication,
encryption, access control, and intrusion detection mechanisms.
Data Privacy Concerns: Edge computing involves processing sensitive data locally on edge
devices or edge servers. This raises concerns about data privacy and compliance with data
protection regulations, as data may be exposed to unauthorized access or breaches if
adequate security measures are not implemented.
Management Complexity: Managing a distributed edge infrastructure with a large number
of heterogeneous devices and nodes can be complex and challenging. Tasks such as
provisioning, configuration, monitoring, and maintenance require centralized management
tools and automated processes to ensure consistency and efficiency.
Cost Considerations: Deploying and maintaining edge computing infrastructure can be
costly, particularly for organizations with distributed or remote locations. Investments in
hardware, software, networking, and management tools are required, along with ongoing
operational expenses for maintenance and support.


CONCLUSION

In conclusion, edge computing represents a paradigm shift in the way data is processed and
managed in modern computing environments. By bringing computation closer to where data
is generated, edge computing offers numerous advantages, including reduced latency,
improved performance, bandwidth optimization, enhanced privacy and security, offline
operation, scalability, and real-time insights. These benefits make edge computing well-
suited for a wide range of applications and industries, including IoT, healthcare,
manufacturing, smart cities, and retail. However, edge computing also presents several
challenges and disadvantages, including limited processing power, resource constraints,
network connectivity issues, security risks, data privacy concerns, management complexity,
interoperability issues, cost considerations, and data consistency challenges. Addressing
these challenges requires careful planning, investment in technology and infrastructure, and
collaboration between stakeholders to ensure the successful deployment and operation of
edge computing solutions. Despite these challenges, the potential of edge computing to
enable new use cases, improve efficiency, and drive innovation in the digital landscape
cannot be overstated. As edge computing continues to evolve and mature, it is expected to
play an increasingly important role in shaping the future of computing infrastructure and
applications. By leveraging the benefits of edge computing while addressing its challenges,
organizations can unlock new opportunities for growth, differentiation, and value creation
in the digital economy.


REFERENCES
[1] P. Singh, A. Kaur, G. S. Aujla, R. S. Batth, and S. Kanhere, "DaaS: Dew Computing as a
Service for Intelligent Intrusion Detection in Edge-of-Things Ecosystem," IEEE Internet of
Things Journal, 2020.
[2] W. Shi, J. Cao, Q. Zhang, Y. Li, and L. Xu, "Edge Computing: Vision and Challenges,"
IEEE Internet of Things Journal, vol. 3, no. 5, pp. 637–646, 2016.
[3] F. Liu, G. Tang, Y. Li, Z. Cai, X. Zhang, and T. Zhou, "A Survey on Edge Computing
Systems and Tools," Proceedings of the IEEE, pp. 1–24, 2019.
[4] Y. Ai, M. Peng, and K. Zhang, "Edge computing technologies for Internet of Things: a
primer," Digital Communications and Networks, vol. 4, no. 2, pp. 77–86, 2018.
[5] B. Varghese, N. Wang, S. Barbhuiya, P. Kilpatrick, and D. S. Nikolopoulos, "Challenges
and Opportunities in Edge Computing," Proc. 2016 IEEE International Conference on Smart
Cloud (SmartCloud 2016), pp. 20–26, 2016.
[6] S. Bhattacharyya, "Research on Edge Computing: A Detailed Study," Academia.edu,
vol. 2, no. 6, pp. 9–13, 2016.
[7] S. Taherizadeh and V. Stankovski, "Auto-scaling applications in edge computing:
Taxonomy and challenges," ACM International Conference Proceeding Series, pp. 158–163,
2017.
[8] C. Feature and E. Deep, "Private and Scalable Personal Data Analytics Cloud Deep
Learning," 2018.
