
Article
Mobility-Aware Data Caching to Improve D2D
Communications in Heterogeneous Networks
Muhammad Sheraz 1, *, Shahryar Shafique 1 , Sohail Imran 1 , Muhammad Asif 2, *, Rizwan Ullah 3, *,
Muhammad Ibrar 4 , Andrzej Bartoszewicz 5 and Saleh Mobayen 6, *

1 Department of Electrical Engineering, Iqra National University, Peshawar 25124, Pakistan


2 Department of Electrical Engineering, Main Campus, University of Science & Technology, Township Bannu,
Bannu 28100, Pakistan
3 Wireless Communication Ecosystem Research Unit, Department of Electrical Engineering,
Chulalongkorn University, Bangkok 10330, Thailand
4 Department of Physics, Islamia College Peshawar, Peshawar 25000, Pakistan
5 Institute of Automatic Control, Lodz University of Technology, 18 Stefanowskiego St., 90-537 Lodz, Poland
6 Future Technology Research Center, National Yunlin University of Science and Technology,
123 University Road, Section 3, Douliou, Yunlin 64002, Taiwan
* Correspondence: mshpk2@gmail.com (M.S.); masifeed@ustb.edu.pk (M.A.);
eageleyes_2009@yahoo.com (R.U.); mobayens@yuntech.edu.tw (S.M.)

Abstract: User Equipment (UE) is equipped with limited cache resources that can be utilized to offload data traffic through device-to-device (D2D) communications. Data caching at the UE level has the potential to significantly alleviate the data traffic burden on the backhaul link. Moreover, in wireless networks, users exhibit mobility that poses serious challenges to successful data transmission via D2D communications due to intermittent connectivity among users. Users' mobility can be exploited to efficiently cache contents by observing connectivity patterns among users. Therefore, it is crucial to develop an efficient data caching mechanism for UE while taking into account users' mobility patterns. In this work, we propose a mobility-aware data caching approach to enhance data offloading via D2D communication. First, we model users' connectivity patterns. Then, contents are cached in UE cache resources based on users' data preferences. In addition, we also take into account the signal-to-interference and noise ratio (SINR) requirements of the users. Hence, our proposed caching mechanism exploits the connectivity patterns of users to perform data placement based on the demands of users themselves and of neighboring users, to enhance data offloading via cache resources. We performed extensive simulations to investigate the performance of our proposed mobility-aware data caching mechanism. The performance of our proposed caching mechanism is compared to the most widely deployed data caching mechanisms, while taking into account the dynamic nature of the wireless channel and the interference experienced by the users. From the obtained results, it is evident that our proposed approach achieves 14%, 16%, and 11% higher data offloading gain than the least frequently used, the Zipf-based probabilistic, and the random caching schemes in the case of an increasing number of users, cache capacity, and number of contents, respectively. Moreover, we also analyzed cache hit rates, and our proposed scheme achieves 8% and 5% higher cache hit rate than the least frequently used, the Zipf-based probabilistic, and the random caching schemes in the case of an increasing number of contents and cache capacity, respectively. Hence, our proposed caching mechanism brings significant improvement in data sharing via D2D communications.

Keywords: caching; data offloading; D2D; mobility; cache hits


1. Introduction

An unprecedented growth has occurred in mobile data traffic due to the growing number of users and their ever-increasing data demands. It is anticipated that data traffic will reach 77.5 exabytes per month in 2022, which is seven times more than in 2017 [1]. To address this challenge, ultra-dense wireless networks are proposed to improve network
capacity [2]. However, ultra-dense network architecture may overburden constrained
backhaul links with huge data traffic and adversely affect network performance. A solution
to the aforementioned problem can be data caching at the network edge, e.g., base stations
(BSs) or user equipment (UE). This approach has the potential to alleviate redundant data
transmission burden from the backhaul links [3]. In addition, users can access their desired
content from serving BS or neighboring UE through D2D links, which can enhance users’
quality of experience (QoE).
Caching at the UE level has the potential to relieve data traffic burden by data offload-
ing via UE cache resources. In [4], the authors proposed a probabilistic content caching
scheme for optimal data caching to improve data retrieval via D2D communications. In [5], the authors exploited trilateral cooperation among users, BSs, and core networks to op-
timize selection of users for data caching and content placement. However, users have
distinct and diverse data demands. Therefore, in [6], the authors exploited users’ individual
content preferences by clustering the network to benefit cache-enabled D2D communica-
tions. The users were grouped based on their data preferences for an optimal data caching
process to enhance the probability of successful data discovery. The cache-enabled D2D-
assisted wireless networks can greatly reduce data traffic strain and successfully address
the issue of content placement and content delivery. Therefore, in [7], the authors presented
joint data delivery and content placement to improve D2D communications. A support
vector machine was devised to determine data popularity profile. Moreover, a Hungarian
algorithm was used to address the issue of optimal content distribution in an effort to raise
the level of satisfaction. The obtained results demonstrated a significant increase in the
cache hit ratio. Consequently, power conservation and reduction in latency was achieved.
In [8], the authors focused on joint optimization of data caching and channel allocation.
The authors proposed a two-step heuristic scheme to perform channel allocation in the first
stage and data caching in the second stage. The obtained results demonstrate a significant
advance in successful data delivery. In [9], the authors focused on maximizing data offload-
ing via D2D communications. The authors proposed an efficient low complexity heuristic
data caching mechanism. The obtained results showed an improvement in data access via
cache-enabled D2D communications.
However, users have an intrinsic characteristic of mobility. The users’ locations vary
with respect to time, which results in dynamic connectivity patterns among users. This
dynamic connectivity among users can be exploited to improve the probability of data
sharing among users. However, users exist within D2D communication range of each
other for a small time period. This brings a serious challenge to optimize the mobility-
aware data caching process for successful data transmission within a limited time period.
In [10], the authors proposed an inter-contact model to characterize users’ mobility patterns
to improve D2D communications. The users’ contact and inter-contact duration follow
stochastic distributions. In [11], the authors proposed a cost-effective data caching scheme
utilizing users’ mobility patterns. The mobile users have the potential to improve data
sharing via UE cache resources. Moreover, in [12], the authors exploited pervasive user
movements and the uneven distribution of data popularity profile for data placement in
UE cache resources to improve D2D communications. In light of the above mentioned facts,
it is necessary to utilize realistic users’ mobility patterns to design a mobility-aware data
caching mechanism for UE. In Table 1, we provide a comparison of the above-discussed
caching approaches.

Table 1. A comparison of existing caching approaches.

| Ref. | Techniques | Performance Metrics | Outcomes |
|------|------------|---------------------|----------|
| [4]  | Scalable video coding is devised for multilayer content coding | Data rate | Optimized caching and power allocation |
| [5]  | Attention-weighted federated deep reinforcement learning model using federated learning | Delay | Optimized caching and power allocation |
| [6]  | Individual preference-aware caching policies | Hit rate | Users' data-preference-based data caching improves D2D communications |
| [7]  | Support vector machine devised to predict data popularity profile | Cache hits | Data delivery approach minimizes power consumption and delay |
| [8]  | Low-complexity heuristic approach for channel allocation and caching | Delivery probability | Improved caching and power allocation |
| [9]  | Low-complexity heuristic caching strategy for users' data-preference-based caching | Data offloading | Optimized data placement |
| [10] | Analytical model proposed to characterize the dependence between the individual inter-contact distributions and the aggregate distribution | Inter-contact times | Users' contact and inter-contact durations follow stochastic distributions |
| [11] | Mobility-aware user-by-user algorithm to optimize data caching | Cost | A balance is maintained between accuracy and complexity |
| [12] | Long short-term memory incorporated in the caching replacement algorithm | Cache hit rate | Large cache size and suitable rewards improve cache hits |

Motivated by the abovementioned works, in this paper our objective is to design a mobility-aware data caching mechanism for UE while considering practical scenarios such
as interference, users’ intermittent connectivity, and diverse data demands. To model the
connectivity patterns among users, the timeline of a pair of mobile users is partitioned
into contact and inter-contact duration. Both the contact and inter-contact duration follow
exponential distributions. The contact duration represents the time duration of two mobile
users existing within D2D communication range of each other. Moreover, an inter-contact
duration represents the time interval between two contact durations. In addition, we
take into account a realistic dynamic channel condition and users’ data demand. We
compared our proposed caching mechanism with the most widely deployed Zipf-based
probabilistic, random, and least frequently used caching schemes. In [13], the authors
proposed random caching to optimize data caching to enhance successful data transmission.
The authors exploited random caching in a mobility environment, and the obtained results
demonstrated a significant gain in successful data transmission in a limited time period.
In [14], the authors proposed least frequently used data caching to improve cache hits. The least frequently used caching approach caches content based on the request frequency of each content, and contents with low request frequency are replaced by highly demanded contents. The obtained results depict a significant gain in cache hits.
In [15], the authors exploited Zipf-based probabilistic caching to improve data offloading
via cache resources. The authors modeled stochastic geometry to derive probability of
data requester association, data provider, data provider traffic load, and successful data
offloading. The obtained results demonstrated significant accuracy of data placement and
gain in data offloading via cache resources. In contrast, our approach exploits users' positioning to optimize data caching based on the data preferences of users themselves and of neighboring users within D2D communication range, which yields a significant improvement in data offloading through cache resources. The main contributions are summarized as follows:
• First, we formulate a data caching mechanism for UE considering a realistic network scenario with interference, users' mobility, and diverse data demands. These factors make it challenging to establish a realistic D2D communications environment.
• We maximize data offloading via cache resources by formulating the data offloading problem in D2D communications as a submodular optimization problem over a matroid constraint. Then, we propose a greedy algorithm that guarantees (1 − 1/e)-optimal performance. Our problem formulation is constrained to achieve high data offloading via cache resources, taking into account limited cache resources, interference, and diverse data preferences.
• Extensive simulations are conducted by devising a real-world mobility data set. The
obtained results demonstrate a significant improvement in D2D communications due
to efficient data placement based on users’ connectivity patterns and data preferences.

Organization
The rest of this paper is organized in the following manner. In Section 2, we present
the mobility model, the data caching model, and problem formulation. In Section 3, we
formulate our optimization problem of data offloading via UE cache resources. In Section 4,
a mobility-aware data caching scheme is proposed. In Section 5, we present simulation
results and their discussion. Finally, Section 6 concludes this paper. Furthermore, Table 2
lists the key notations.

Table 2. Notations.

Parameter        Description
U                Set of mobile users
d                D2D user
C                Set of users operating in cellular mode
R_D              D2D communication range
d_{i,j}          D2D link between users i and j
γ_{i,j}          D2D contact probability of users i and j
W                Wireless channel bandwidth
R_{i,j}          Data transmission rate of a D2D pair i and j
α                Path-loss exponent
l_{i,j}          Distance between users i and j
|h_{i,j}|^2      Channel fading coefficient of the channel between the users of D2D pair d_{i,j}
|h_{c_i,j}|^2    Channel fading coefficient of the channel between interfering cellular user c_i and the receiver of D2D pair d_j
Γ                Signal-to-interference-plus-noise ratio (SINR)
µ_{d_{i,j}}      Intra-cell interference threshold value
F                Number of contents
H                Size of a content
ψ                Popularity skewness parameter
p_{i,f}          Content popularity distribution
Q                Cache capacity of each user
x_{i,f}          Binary variable indicating whether content f is cached by user i
t_d              Content download deadline time
ρ_i(t)           Position of user i at time t
λ_{i,j}          Contact rate between users i and j

2. System Model
We consider a heterogeneous wireless network (HetNet) consisting of a small base
station (SBS) and multiple mobile users U = {u1 , u2 , ..., un } as illustrated in Figure 1. Mobile
users can access data from the core network through serving SBS or from proximity users by
exploiting device-to-device (D2D) communication. The cellular users and D2D users will
co-exist in a cell. The users operating in cellular mode and D2D mode are represented by
the sets C = {c1 , c2 , ..., cn } and D = {d1 , d2 , ..., dn }, respectively. For D2D communications,
the cellular users will share their spectrum resources S = {s1 , s2 , ..., sn } with the D2D pairs.
When two users i and j are within D2D communication range R D of each other, they can
establish a D2D link and can perform data sharing, and a D2D pair is represented by di,j .
However, users are mobile and exhibit intermittent connectivity with each other.
Therefore, it is of paramount significance that complete data delivery through the D2D link
should be achieved within a certain time period td . After td , the content will be fetched
from the core network through serving SBS to meet the demand of the content-requesting
user, while posing a burden over the backhaul. The contact between users i and j in order
to establish a D2D pair d_{i,j} is represented by a random variable γ_{i,j}. If ∑_{i=1}^{n} γ_{i,j} ≤ 1, then
users i and j may establish a D2D link by utilizing the up-link channel resource of a cellular
user ci ∈ C .

Figure 1. An illustration of cache-enabled D2D communications.

We assume an orthogonal frequency-division multiplexing (OFDM) system with c_n orthogonal channels, where each channel bandwidth is W. Moreover, D2D users operate
in an underlay mode by spectrum sharing with cellular users. To ensure fairness, each
D2D user can occupy only one subchannel to perform data communications [16]. The
background noise follows an independent and identically distributed (i.i.d.) circularly sym-
metric complex Gaussian distribution with zero mean, and variance is equal to σ2 at each
receiving user. For different frequency bands, the fading coefficients of the transmission
links are i.i.d. Furthermore, we assume that fading coefficients do not vary within the
duration of one time block To .
We consider a Rayleigh fading channel and utilize the propagation path-loss model [17]. The data transmission rate of a D2D pair is evaluated as follows:

$$R_{i,j} = W \log_2\!\left(1 + \frac{p_i\, |h_{i,j}|^2\, l_{i,j}^{-\alpha}}{\sigma^2 + p_{c_i}\, |h_{c_i,j}|^2\, l_{c_i,j}^{-\alpha}}\right),\qquad (1)$$

where W represents the channel bandwidth, p_i is the transmit power of the D2D pair d_{i,j}, α represents the path-loss exponent, and the noise power is represented by σ^2. l_{i,j} represents the distance between the two users i and j of d_{i,j}, and l_{c_i,j} represents the distance between an interfering cellular user c_i and the content-receiving user j of d_{i,j}. |h_{i,j}|^2 and |h_{c_i,j}|^2 represent the channel fading coefficients of the channel between the D2D pair d_{i,j} and of the channel between the interfering cellular user c_i and the receiver of the D2D pair d_j, respectively [18].
The utilization of spectral resources of a cellular user ci by di,j is represented by a
random variable β ci ,di,j ∈ {0, 1}. If β ci ,di,j = 1, it indicates the resource block of cellular
user ci is given to the D2D pair di,j ; otherwise, it is equal to 0. In our system model, we
consider that one D2D pair can utilize the spectrum resource of only one cellular user.
Therefore, we have a constraint of ∑ci ∈C β ci ,di,j ≤ 1, ∀ di,j ∈ D . We take into consideration
that inter-cell interference can be managed by deploying efficient resource scheduling
and power control mechanisms [19]. Therefore, we only focus on intra-cell interference of
cellular users and D2D pairs. As mentioned earlier, when two users i and j exist within R D ,
they can directly share data through D2D communication. However, there is a constraint of receiving the complete content within t_d; otherwise, the D2D link fails, and the content will be fetched from the core network through the serving SBS. Hence, to achieve successful data transmission, the data transmission rate should satisfy R_{i,j} ≥ R_min. The evaluation of R_min depends on the distance and on the signal-to-interference-plus-noise ratio (SINR)

$$\Gamma = \frac{p_i\, |h_{i,j}|^2\, l_{i,j}^{-\alpha}}{\sigma^2 + p_{c_i}\, |h_{c_i,j}|^2\, l_{c_i,j}^{-\alpha}}.$$

For successful data transmission, the intra-cell interference should satisfy Γ ≥ µ_{d_{i,j}}, where µ_{d_{i,j}} is the threshold value.
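
As an illustration of Equation (1) and the SINR admission check, the following Python sketch evaluates the rate of a single D2D link that reuses one cellular uplink under intra-cell interference only. It is a minimal sketch, not the simulation code of Section 5: the Rayleigh gains are drawn as unit-mean exponentials, and the path-loss exponent, noise floor, and the value used for the threshold µ_{d_{i,j}} are assumptions.

```python
import math
import random

def d2d_rate_bps(p_tx, p_cell, d_link, d_interferer, bandwidth_hz=20e6,
                 alpha=3.5, noise_w=1e-13):
    """Illustrative evaluation of Eq. (1): Shannon rate of a D2D link reusing the
    uplink resource of one cellular user (intra-cell interference only).
    The Rayleigh fading gains |h|^2 are drawn as unit-mean exponentials."""
    g_link = random.expovariate(1.0)         # |h_{i,j}|^2
    g_inter = random.expovariate(1.0)        # |h_{c_i,j}|^2
    signal = p_tx * g_link * d_link ** (-alpha)
    interference = p_cell * g_inter * d_interferer ** (-alpha)
    sinr = signal / (noise_w + interference)
    return sinr, bandwidth_hz * math.log2(1.0 + sinr)

# Example: 23 dBm (~0.2 W) D2D transmitter at 15 m, 30 dBm (1 W) interferer at 120 m
# (powers follow Table 3; distances, alpha, and the noise floor are illustrative).
sinr, rate = d2d_rate_bps(p_tx=0.2, p_cell=1.0, d_link=15.0, d_interferer=120.0)
sinr_threshold = 5.0                          # assumed value of mu_{d_{i,j}}
link_ok = sinr >= sinr_threshold              # Gamma >= mu check before scheduling
print(f"SINR = {sinr:.2f}, rate = {rate/1e6:.1f} Mb/s, admissible = {link_ok}")
```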

Data Caching
We consider a data library containing a total of F contents. The index set of the
data library is represented by F = {1, 2, ..., F }, and the size of each content is equal to H.
The contents are arranged in an ascending order based on the content popularity. The
content popularity distribution follows a Zipf distribution with parameter ψ [20], which is
evaluated as follows:
$$p_{i,f} = \frac{1/f^{\psi}}{\sum_{f=1}^{F} 1/f^{\psi}}.\qquad (2)$$

The cache capacity of each user is Q < F. Each content is cached either completely or
not at all. The content cache placement in the cache of user i is expressed as:

$$x_{i,f} = \begin{cases} 1, & \text{if user } i \text{ caches content } f,\\ 0, & \text{otherwise}, \end{cases}\qquad (3)$$

where i ∈ D and f ∈ F . When a user needs content, it first checks its own cache to fulfill
its content demand. If the content is not cached by itself, then it finds the desired content
in the cache of proximity users. If content is cached by another proximity user, then D2D
communication is exploited to access the desired content. However, there is a deadline t_d during which the complete content should be obtained by the receiver through D2D links.
The deadline time is greater than the time needed to completely download a content, i.e.,
td > H/Ri,j . After finishing td , the desired content will be fetched from the core network
through serving SBS by posing a traffic burden on the network system.
Only one communication channel is used for the transmission of a requested content f,
and multiple channels cannot be occupied for the transmission of f . Moreover, a D2D
transmitter cannot transmit content f to more than one receiver requesting f . After a
successful transmission of the desired content, the corresponding content transmitter stays
silent during the remaining time block. Then, the channel resource is allocated to other
content demands in the next time blocks.
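
To make the caching model concrete, the short Python sketch below computes the Zipf popularity of Equation (2) and builds a toy placement matrix x_{i,f} that respects the per-user capacity (each UE holds at most Q/H complete contents). The content count, cache capacity, and the naive most-popular placement are assumptions for illustration only.

```python
def zipf_popularity(num_contents, psi=1.0):
    """Content request probabilities per Eq. (2); content 1 is the most popular."""
    weights = [1.0 / (f ** psi) for f in range(1, num_contents + 1)]
    total = sum(weights)
    return [w / total for w in weights]

# Example: F = 100 contents, skewness psi = 1 (as in Table 3).
popularity = zipf_popularity(100, psi=1.0)
print([round(p, 3) for p in popularity[:3]])    # the three most requested contents

# A cache placement x[i][f] in {0, 1} per Eq. (3); with Q = 0.4 GB and H = 50 MB,
# each user can hold at most Q // H = 8 complete contents.
Q_MB, H_MB, num_users = 400, 50, 10
slots_per_user = Q_MB // H_MB
x = [[0] * 100 for _ in range(num_users)]
for user in range(num_users):
    for f in range(slots_per_user):             # naive placement: most popular contents
        x[user][f] = 1
assert all(H_MB * sum(row) <= Q_MB for row in x)    # every user respects its capacity
```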

3. Problem Formulation
There is an intermittent connectivity between mobile users. If γ_{i,j} = 1, two users i and
j exist within R D , and a D2D pair is created. The time when two users create a D2D link is
known as contact time, and contents can be shared. Moreover, when users i and j move
out of R_D, they can only share content at the next contact, or the data-demanding user can obtain it after t_d from the core network through the serving SBS. The time taken by two users i
and j to come back into R_D is known as the inter-contact time (ICT_{i,j}), which is evaluated as:

$$ICT_{i,j} = \min\left\{\, t - t_o : \left\| \rho_i(t) - \rho_j(t) \right\| \leq R_D,\ t > t_o \,\right\},\qquad (4)$$

where ρ_i(t) and ρ_j(t) are the positions of users i and j at time t, and ‖·‖ denotes the distance between the two users i and j of a D2D pair d_{i,j}. ICT_{i,j} is a time-dependent random variable that can be modeled as an exponential distribution with a heterogeneous coefficient.
If the complete content is transmitted within t_d, then the content transmission is considered successful; otherwise, the content is fetched from the core network. Moreover, the contact rate between two users i and j of a pair d_{i,j} is represented by λ_{i,j} = 1/E[ICT_{i,j}]. The contact process follows a Poisson distribution with mean arrival rate λ_{i,j}, which varies for each content-receiving user. The maximization of the successful content transmission ratio within t_d is equivalent to maximizing the probability of successful content transmission through D2D links. The total amount of requested content f received through D2D communications during t_d is represented by $\sum_{i \in \mathcal{D}\setminus j} w_i^f \leq H$.
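
The following Python sketch illustrates, under the exponential inter-contact model above, how the contact rate λ_{i,j} = 1/E[ICT_{i,j}] can be estimated from observed inter-contact samples and how the probability of a contact occurring before the deadline t_d follows from it. The trace values and deadline are assumed for illustration.

```python
import random

def estimate_contact_rate(inter_contact_times):
    """lambda_{i,j} = 1 / E[ICT_{i,j}], estimated from observed inter-contact samples."""
    return len(inter_contact_times) / sum(inter_contact_times)

def prob_contact_within(deadline, rate, trials=100_000):
    """Monte Carlo check that an exponential ICT falls below the deadline t_d;
    analytically this equals 1 - exp(-rate * deadline)."""
    hits = sum(random.expovariate(rate) <= deadline for _ in range(trials))
    return hits / trials

# Example with assumed trace values: ICT samples in seconds for one user pair.
ict_samples = [120.0, 340.0, 80.0, 200.0, 260.0]
lam = estimate_contact_rate(ict_samples)        # contacts per second
t_d = 300.0                                     # assumed download deadline (s)
print(f"lambda = {lam:.4f} 1/s, P(contact within t_d) ~= {prob_contact_within(t_d, lam):.3f}")
```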
There can be two cases of the D2D link between two users i and j at the time of data offloading: user i either knows that user j is in contact or it does not, due to which there are two different cases for evaluating the probability of successful transmission of content f, represented by P_i(w_i^f). We set the time at which data transmission is initiated by user i for the data-requesting user j through the D2D link d_{i,j}, and the time remaining until t_d is represented by τ.
• The time when users i and j come into D2D range of each other is represented by t_1. If content f requested by user j is cached by user i (i.e., x_{i,f} = 1), then a spectral resource is allocated to d_{i,j}, and the transmission of content f to user j is initiated by user i. P_i(w_i^f) under the condition t_1 = t for 0 ≤ t < τ − t_{w_i^f} is:

$$P_i\!\left(w_i^f \mid t_{i,1} = t\right) = \int_{0}^{\tau - t - t_{w_i^f}} \lambda_i e^{-\lambda_i y}\, dy = 1 - e^{-\lambda_i \left(\tau - t - t_{w_i^f}\right)}.\qquad (5)$$

For t ≥ τ − t_{w_i^f}, we get P_i(w_i^f | t_{i,1} = t) = 0. Thus, P_i(w_i^f) is derived as:

$$P_i\!\left(w_i^f\right) = \int_{0}^{\tau - t_{w_i^f}} P_i\!\left(w_i^f \mid t_{i,1} = t\right) \lambda_i e^{-\lambda_i t}\, dt \qquad (6)$$
$$= 1 - \left(\lambda_i\!\left(\tau - t_{w_i^f}\right) + 1\right) e^{-\lambda_i \left(\tau - t_{w_i^f}\right)}.\qquad (7)$$

Equation (7) holds if λ_i(τ − t_{w_i^f}) = η_i(w_i^f) > 0 and w_i^f is greater than zero as well, i.e.,

$$P_i\!\left(w_i^f\right) = \begin{cases} 1, & \eta_i\!\left(w_i^f\right) = \lambda_i \tau,\\ 1 - \left(\eta_i\!\left(w_i^f\right) + 1\right) e^{-\eta_i\left(w_i^f\right)}, & 0 < \eta_i\!\left(w_i^f\right) < \lambda_i \tau,\\ 0, & \eta_i\!\left(w_i^f\right) \leq 0. \end{cases}\qquad (8)$$

• If both users i and j already exist within D2D communication range of each other, then we have

$$P_i\!\left(w_i^f\right) = \int_{0}^{\tau - t_{w_i^f}} \lambda_i e^{-\lambda_i y}\, dy = 1 - e^{-\lambda_i \left(\tau - t_{w_i^f}\right)},\qquad (9)$$

for λ_i(τ − t_{w_i^f}) > 0 and λ_i(τ − t_{w_i^f}) = η_i(w_i^f); therefore,

$$P_i\!\left(w_i^f\right) = \begin{cases} 1, & \eta_i\!\left(w_i^f\right) = \lambda_i \tau,\\ 1 - e^{-\eta_i\left(w_i^f\right)}, & 0 < \eta_i\!\left(w_i^f\right) < \lambda_i \tau,\\ 0, & \eta_i\!\left(w_i^f\right) \leq 0. \end{cases}\qquad (10)$$

• Both cases can be represented by an array A = {A_1(t), A_2(t), ..., A_D(t)} defined as:

$$A_i(t) = \begin{cases} 0, & i \text{ meets } j \text{ at time } t,\\ 1, & \text{otherwise}. \end{cases}\qquad (11)$$

We obtain

$$P_i\!\left(w_i^f\right) = \begin{cases} 1, & \eta_i = \lambda_i \tau,\\ 1 - e^{-\eta_i} - A_i(t)\, \eta_i\, e^{-\eta_i}, & 0 < \eta_i < \lambda_i \tau,\\ 0, & \eta_i \leq 0. \end{cases}\qquad (12)$$

Thus, the success probability of receiving content f through D2D communication at time t can be expressed as:

$$Y_i^f = \prod_{i=1}^{N} P_i^f(H \mid t).\qquad (13)$$
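
A compact way to read Equations (8), (10), and (12) is as a single piecewise function of η_i, with the indicator A_i(t) of Equation (11) switching between the two contact cases. The Python sketch below evaluates it under the assumption that the transmission time t_{w_i^f} equals w_i^f / R_{i,j}; all numeric values are illustrative.

```python
import math

def success_probability(lam, tau, w_bits, rate_bps, in_contact):
    """Piecewise success probability of Eq. (12).
    lam        : contact rate lambda_i of the candidate provider
    tau        : time remaining before the deadline t_d
    w_bits     : amount of content f expected from this provider (w_i^f)
    rate_bps   : D2D transmission rate R_{i,j}
    in_contact : True if the pair is already within D2D range (A_i(t) = 0)"""
    t_w = w_bits / rate_bps                 # assumed transmission time of w_i^f
    eta = lam * (tau - t_w)                 # eta_i(w_i^f)
    a_i = 0 if in_contact else 1            # indicator of Eq. (11)
    if eta <= 0:
        return 0.0
    if eta >= lam * tau:                    # degenerate case: nothing left to transmit
        return 1.0
    return 1.0 - math.exp(-eta) - a_i * eta * math.exp(-eta)

# Example with assumed numbers: a 50 MB content share over a 40 Mb/s link.
p = success_probability(lam=0.01, tau=300.0, w_bits=50e6 * 8, rate_bps=40e6, in_contact=False)
print(f"P_i(w_i^f) ~= {p:.3f}")
```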

We formulate the data caching and resource allocation problem to optimize the success probability of data transmission through D2D communications. The problem is formulated in the following manner:

$$V = \max\ \frac{1}{N} \sum_{c_i \in \mathcal{C}} \sum_{i \in \mathcal{D}} \sum_{f \in \mathcal{F}} p_{i,f}\left[x_{i,f} + \left(1 - x_{i,f}\right) Y_i^f\right] \beta_{c_i, d_{i,j}}\qquad (14)$$

subject to:

$$H \sum_{f \in \mathcal{F}} x_{i,f} \leq Q,\qquad (15)$$
$$x_{i,f} \in \{0,1\},\ \forall i \in \mathcal{D},\ \forall f \in \mathcal{F},\qquad (16)$$
$$\sum_{i=1}^{N} w_i^f \leq H,\ w_i^f > 0,\qquad (17)$$
$$\sum_{i=1}^{N} \gamma_i \leq 1,\qquad (18)$$
$$\Gamma \geq \mu_{d_i},\qquad (19)$$
$$r_{d_i, c_i} \geq r_{\min},\qquad (20)$$
$$\beta_{c_i, d_{i,j}} \in \{0,1\},\ \forall c_i \in \mathcal{C},\ \forall d_i \in \mathcal{D},\qquad (21)$$
$$\sum_{c_i \in \mathcal{C}} \beta_{c_i, d_{i,j}} \leq 1,\ \forall d_i \in \mathcal{D}.\qquad (22)$$

Our problem is a 0/1 knapsack problem, and it is hard to find an optimal solution.
Therefore, we deploy a heuristic algorithm to find a sub-optimal solution.

4. Proposed Solution
Submodular optimization has a vast range of applications such as max-k-cover and
max-cut problems [21]. Submodular optimization over a matroid constraint approach
is deployed in a wireless network to minimize network delay [22]. In the following,
we will discuss necessary definitions and properties of submodular set function and
matroid constraint.
Matroids are structures for expressing a linear independence property for general sets. We require a finite ground set X. A matroid is then a way to label subsets of X as independent.

Definition 1. A matroid H is a tuple H = (X, I), where X represents a finite ground set and I ⊆ 2^X (the power set of X) is a collection of independent sets with the following properties:
• I is nonempty; in particular, ∅ ∈ I.
• I is downward closed: if A ∈ I and B ⊆ A, then B ∈ I.
• If A, B ∈ I and |B| < |A|, then there exists z ∈ A \ B such that B ∪ {z} ∈ I.

4.1. Problem Reformulation


We reformulate our problem in terms of a submodular optimization problem over a matroid constraint. We define the ground set X = {y_{i,f} | i ∈ D and f ∈ F}. The cache placement is denoted as a subset of X, such as Y ⊆ X, y_{i,f} ∈ Y. Here, y_{i,f} ∈ Y means user i caches content f (i.e., x_{i,f} = 1), and y_{i,f} ∉ Y means user i does not cache content f (i.e., x_{i,f} = 0). Accordingly, our problem can be rewritten as:

$$V(Y) = \frac{1}{N} \sum_{c_i \in \mathcal{C}} \sum_{i \in \mathcal{D}} \sum_{f \in \mathcal{F}} p_{i,f}\left[\mathbb{1}\!\left(y_{i,f} \in Y\right) + \mathbb{1}\!\left(y_{i,f} \notin Y\right) Y_i^f\right].\qquad (23)$$

We rewrite the constraint in problem (14) as a matroid constraint.


Lemma 1. Let X_i = {y_{i,f} | f ∈ F} include the cached contents at user i. The constraint in problem (14) becomes equivalent to the matroid constraint Y ∈ I, where I is as follows:

$$\mathcal{I} = \left\{\, Y \subseteq \mathcal{X} \;:\; |Y \cap \mathcal{X}_i| \leq \frac{Q}{H},\ \forall i \in \mathcal{D} \,\right\}.\qquad (24)$$
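
As a toy illustration of Lemma 1 (an example added here for clarity, not part of the original formulation), consider two users and two contents with Q/H = 1, so that each user may cache at most one complete content:

$$\mathcal{X} = \{y_{1,1},\, y_{1,2},\, y_{2,1},\, y_{2,2}\}, \qquad \mathcal{X}_1 = \{y_{1,1}, y_{1,2}\}, \qquad \mathcal{X}_2 = \{y_{2,1}, y_{2,2}\},$$
$$\mathcal{I} = \left\{\, Y \subseteq \mathcal{X} : |Y \cap \mathcal{X}_1| \leq 1,\ |Y \cap \mathcal{X}_2| \leq 1 \,\right\}.$$

Here, a set such as {y_{1,1}, y_{2,2}} is independent, whereas {y_{1,1}, y_{1,2}} is not, since it would exceed the cache of user 1; this is exactly the partition structure over which the greedy algorithm of Section 4.2 attains its (1 − 1/e) guarantee [23].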
Constraint (24) is a typical partitioned matroid [23]. Therefore, we reformulate our problem as a submodular optimization problem over a matroid constraint as follows:

$$\max_{Y \in \mathcal{I}} V(Y) = \frac{1}{N} \sum_{c_i \in \mathcal{C}} \sum_{i \in \mathcal{D}} \sum_{f \in \mathcal{F}} p_{i,f}\left[\mathbb{1}\!\left(y_{i,f} \in Y\right) + \mathbb{1}\!\left(y_{i,f} \notin Y\right) Y_i^f\right].\qquad (25)$$
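
For illustration, a minimal Python sketch of evaluating V(Y) in the spirit of Equations (23) and (25) is given below. It assumes that the popularities p_{i,f} and the D2D success probabilities Y_i^f have been precomputed, and for brevity it folds the sum over cellular users and the channel-assignment indicator β into those precomputed values; the toy numbers are assumptions.

```python
def objective_value(placement, popularity, success_prob, num_pairs):
    """Evaluate V(Y) in the spirit of Eqs. (23) and (25).
    placement    : set of (user, content) pairs currently cached (the set Y)
    popularity   : popularity[(user, content)] = p_{i,f}
    success_prob : success_prob[(user, content)] = Y_i^f, the D2D success probability
    num_pairs    : normalization constant N"""
    total = 0.0
    for key, p in popularity.items():
        cached = 1.0 if key in placement else 0.0          # indicator 1(y_{i,f} in Y)
        total += p * (cached + (1.0 - cached) * success_prob[key])
    return total / num_pairs

# Tiny assumed instance: two users and two contents.
pop = {(0, 0): 0.67, (0, 1): 0.33, (1, 0): 0.67, (1, 1): 0.33}
succ = {(0, 0): 0.6, (0, 1): 0.4, (1, 0): 0.5, (1, 1): 0.3}
print(objective_value({(0, 0)}, pop, succ, num_pairs=2))   # caching (user 0, content 0)
```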

4.2. Proposed Algorithm


The pseudo code of our proposed algorithm is given in Algorithm 1. Our proposed algorithm performs data caching based on data demands. In the beginning, a cache index set is
initialized. We represent the set of elements that can be cached and added to the cache
placement set Y by X r . Then, following users’ data demands from neighboring users within
D2D range, contents are selected for caching. The priority value of every element yi, f ∈ X r
defines the gain of including yi, f into Y as:
$$g_{i,f} = V\!\left(Y \cup \{y_{i,f}\}\right) - V(Y).\qquad (26)$$

The main complexity of the proposed approach is determining the priority values
by taking into consideration the data requests of users. In our proposed algorithm, first
we initialize Y as an empty set (i.e., Y = ∅), which means no content is cached. Then, in
each iteration, an element yi0 , f 0 of highest priority value is selected and placed in set Y with
respect to the constraints (19)–(22). Moreover, yi0 , f 0 is removed from the set X r . When the
cache capacity of user i0 is full, then all elements of set Xi0 are removed from the set X r .
This process continues until all contents are placed in the cache resources of users, or cache
capacity of all the users is full. In this manner, contents are cached following users’ own
and neighboring users’ data demands.

Algorithm 1 Cache placement algorithm.

1: Set Y = ∅ ⇔ x_{i,f} = 0, ∀ i ∈ S and f ∈ F
2: X^r = X
3: Initialize the priority values g_{i,f} = V(Y ∪ {y_{i,f}}) − V(Y), ∀ i ∈ S and f ∈ F
4: while |Y| < |U| · Q/H do
5:     y_{i',f'} = argmax_{y_{i,f} ∈ X^r} g_{i,f} w.r.t. (19)–(22)
6:     Set Y = Y ∪ {y_{i',f'}} ⇔ x_{i',f'} = 1
7:     X^r = X^r − {y_{i',f'}}
8:     if |Y ∩ X_{i'}| + 1 > Q/H then
9:         X^r = X^r − X_{i'}
10:    end if
11:    Update the priority values {g_{i,f'} | i ∈ S, y_{i,f'} ∈ X^r}
12: end while
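
The following Python sketch mirrors the greedy structure of Algorithm 1 under simplifying assumptions: the marginal gain g_{i,f} is supplied as a callback (for example, built from an evaluation of V(Y) as sketched above), and the SINR and resource-allocation checks of constraints (19)–(22) are abstracted into a feasibility predicate. It is illustrative only, not the authors' implementation.

```python
def greedy_cache_placement(users, contents, slots_per_user, gain,
                           feasible=lambda element: True):
    """Greedy sketch of Algorithm 1.
    users, contents : iterables of user and content identifiers
    slots_per_user  : Q // H, the number of complete contents a UE can hold
    gain(Y, e)      : marginal gain g_{i,f} = V(Y + {e}) - V(Y) for e = (i, f)
    feasible(e)     : stand-in for the SINR/resource checks of (19)-(22)"""
    placement = set()                                        # Y = empty set
    remaining = {(i, f) for i in users for f in contents}    # X^r
    while remaining:
        candidates = [e for e in remaining if feasible(e)]
        if not candidates:
            break
        best = max(candidates, key=lambda e: gain(placement, e))
        placement.add(best)                                  # Y = Y + {y_{i',f'}}
        remaining.discard(best)
        user = best[0]
        if sum(1 for i, _ in placement if i == user) >= slots_per_user:
            remaining = {e for e in remaining if e[0] != user}   # cache of i' is full
    return placement

# Toy usage with an assumed gain that simply favors popular contents.
toy_popularity = {0: 0.67, 1: 0.33}
print(greedy_cache_placement(users=[0, 1], contents=[0, 1], slots_per_user=1,
                             gain=lambda Y, e: toy_popularity[e[1]]))
```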

5. Performance Evaluation
In this section, we provide detailed discussion of our obtained results in comparison
to the three state-of-the-art schemes. All simulation parameters are provided in Table 3. We
have devised a realistic human walk mobility data set that is produced by Korea Advanced
Institute of Science and Technology (KAIST), Korea. In simulation settings, we consider
a small cell with area 250 m × 250 m that is governed by a small base station having
transmission power of 30 dBm. Users are considered mobile, and their transmission power
is set to 23 dBm. In our work, we considered D2D communication. Therefore, when two
users are within a 20 m range of each other, then both the users can establish a D2D link.
Unless otherwise stated, we consider the number of contents equal to 100 and capacity of
each cache resource equal to 50 MB. Since we model the data popularity profile through a Zipf distribution, we set the popularity skewness parameter equal to one. All simulations
are performed with MATLAB R2018a. For performance evaluation, we have utilized three
schemes as our baselines that are briefly discussed as follows:
• Random Caching: This scheme enables data caching without any information in a
random fashion. All users cache contents with the same probability [13].
• Least Frequently Used: Contents are cached based on users' request frequency for each content. The most frequently requested or used contents are cached in UE [14].
• Zipf-based probabilistic: Users cache contents based on the data request probability following a Zipf distribution [15].
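
For reference, the sketch below shows one plausible Python rendering of how the three baselines fill a UE cache. It is only meant to highlight how they differ from the proposed mobility-aware placement; the request history and cache size are assumed toy values.

```python
import random

def random_caching(contents, slots):
    """Random: every content is equally likely to be cached [13]."""
    return set(random.sample(list(contents), slots))

def least_frequently_used(request_counts, slots):
    """LFU-style: keep the most frequently requested contents [14]."""
    ranked = sorted(request_counts, key=request_counts.get, reverse=True)
    return set(ranked[:slots])

def zipf_probabilistic(popularity, slots):
    """Zipf-based probabilistic: sample contents with probability proportional to p_f [15]."""
    contents = list(range(len(popularity)))
    cache = set()
    while len(cache) < slots:
        cache.add(random.choices(contents, weights=popularity, k=1)[0])
    return cache

# Example with assumed inputs: 10 contents, 3 cache slots per UE.
weights = [1.0 / (f + 1) for f in range(10)]
pop = [w / sum(weights) for w in weights]
counts = {f: random.randint(0, 50) for f in range(10)}     # assumed request history
print(random_caching(range(10), 3), least_frequently_used(counts, 3), zipf_probabilistic(pop, 3))
```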

Table 3. System parameters.

Parameter Value
Area of small cell 250 m × 250 m
Channel bandwidth 20 MHz
SBS transmission power 30 dBm
UE transmission power 23 dBm
Background noise −174 dBm/Hz
D2D communication range 20 m
Popularity skewness 1
Noise figure 7 dB
Receiver sensitivity −70 dBm

In Figure 2, we investigated the influence of an increasing number of users on data offloading through D2D communications. The obtained results demonstrate that more contents are offloaded through D2D communications as the number of users increases. This rise in data offloading through D2D communications
is due to the increasing probability of users’ connectivity and data sharing via D2D links.
Hence, this approach facilitates high data rate and low latency in data access because con-
tents can be obtained from the neighboring UE. Our proposed scheme achieves 14%, 26%,
and 32% better performance than the least frequently used, the Zipf-based probabilistic,
and the random caching schemes, respectively. It is evident that the proposed scheme
exploits users’ mobility patterns effectively to improve data caching in UE, which results in
a successful probability of data sharing via D2D links while taking into account limited
duration of users’ connectivity among themselves.

Figure 2. Data offloading ratio versus users with F = 100, H = 50 MB, and Q = 0.4 GB.

In Figure 3, the impact of increasing the cache capacity of UE on the performance of data offloading via D2D communications is analyzed. It is evident from the obtained results that
there is a significant growth in the data offloading via D2D links. This rise in data offloading
is due to the fact that large cache resources can accommodate a large number of diverse
contents, which can fulfill users’ data demands through the cache resources rather than
data fetching from the core network. Our proposed caching scheme achieves 16%, 37%,
and 47% higher data offloading than the least frequently used, the Zipf-based probabilistic,
and the random caching schemes, respectively. From obtained results, it is evident that
large cache capacity improves the opportunity of data access through D2D links.

Figure 3. Data offloading ratio versus cache capacity with F = 100, H = 50 MB.

In Figure 4, the impact of users' participation in D2D communications on the network performance is observed. It is evident from the obtained results that data offloading
increases with increasing number of D2D pairs because more users are available to par-
ticipate in D2D communications. Initially, data offloading increases due to growing D2D
links. However, the trend stabilizes at a certain stage due to growing data demands with
an increasing number of users. This causes data fetching from the core network, which
halts data offloading gain.

Figure 4. Data offloading ratio versus D2D users with F = 100, H = 50 MB, and Q = 0.4 GB.

In Figure 5, we evaluate the impact of an increasing number of contents on data offloading via D2D communications. It is evident from the obtained results that the data offloading ratio degrades for a large number of contents. This is because the cache capacity of each UE is limited, and all contents cannot be stored. Therefore, content demands are
fulfilled by data access from the core network. However, our scheme utilizes users’ mobility
patterns and their data demands for data caching, which increase the probability of data
access via UE cache resources. As a result, our proposed scheme achieves 11%, 24%, and 48% higher performance than the least frequently used, the Zipf-based probabilistic, and the
random caching schemes, respectively. This result demonstrates that despite limited cache
capacity of UE, our proposed scheme maintains a higher data offloading ratio by handling
growing data library size.

Figure 5. Data offloading ratio versus contents with U = 50, H = 50 MB, and Q = 1 GB.

The effect of growing data library size on the obtained cache hit rate is shown in
Figure 6. The obtained results demonstrate that the cache hit rates of all the caching schemes trend downward as the data library grows. Our proposed caching scheme outperforms the three baseline schemes despite this decline because it performs data caching based on users' connections and data demands. For instance,
our suggested caching scheme achieves 8%, 15%, and 20% higher cache hits than the least
frequently used, the Zipf-based probabilistic, and the random caching schemes, respectively,
when the data library size is equal to 3000. This substantial gain achieved by our proposed
caching scheme is due to exploitation of mobility patterns and optimal data placement.

Figure 6. Cache hit rate versus contents with U = 50, H = 50 MB, and Q = 1 GB.

The effect of increasing cache capacity on the cache hit rate is shown in Figure 7. The
obtained results show an increase in cache hits as cache capacity rises. This increase in
cache hits is due to availability of a large number of contents in the cache resources to
meet users' data demands via cache-enabled D2D communications. For instance, our proposed caching technique achieves 5%, 11.4%, and 15.2% higher cache hit rates than the least frequently used, the Zipf-based probabilistic, and the random caching schemes, respectively, when the cache capacity is equal to 0.6 GB. Hence, in comparison to the baseline schemes, our proposed caching scheme is capable of precisely capturing users' data preferences when placing contents.
Figure 7. Cache hit rate versus cache capacity with F = 100, H = 50 MB.

6. Conclusions
We proposed data caching in UE based on users’ mobility patterns to improve data
offloading through UE cache resources. The intermittent connectivity characteristic of mobile users was exploited to perform data sharing among users via D2D links. However,
mobility also brings the challenge of data sharing within a limited time period. There-
fore, we designed a mobility-aware data caching scheme while taking into account users’
evolving connectivity patterns and diverse data demands. Our proposed approach also
considered the interference experienced by mobile users from each other. From the simulation results, the efficiency of our proposed approach was evident, as it achieves a significant performance gain by exploiting real mobility traces of UE to optimize data caching in UE. In our future
research work, we will focus on minimization of communication overhead by exploiting
data caching in wireless networks. Moreover, we will devise deep reinforcement learning
to optimize data caching at UE level based on feature extraction of data contents.

Author Contributions: Conceptualization and methodology, M.S., S.I. and M.A.; supervision, S.S.,
R.U., S.M., M.I. and M.A.; software, M.S. and S.I.; writing, M.S., S.I., A.B. and M.A.; review and
editing, M.S., S.I., R.U., M.I., S.M., A.B. and M.A. All authors have read and agreed to the published
version of the manuscript.
Funding: This research received no external funding.
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Cisco Visual Networking Index. Global Mobile Data Traffic Forecast Update, 2018–2023 White Paper. Available online: http://goo.gl/ylTuVx (accessed on 1 August 2022).
2. Lyu, L.; Chen, C.; Zhu, S.; Cheng, N.; Tang, Y.; Guan, X.; Shen, X. Dynamics-aware and beamforming-assisted transmission for
wireless control scheduling. IEEE Trans. Wirel. Commun. 2018, 17, 7690–7767. [CrossRef]
3. Sun, R.; Wang, Y.; Cheng, N.; Lyu, L.; Zhang, S.; Zhou, H.; Shen, X. QoE-Driven transmission-aware cache placement and
cooperative beamforming design in Cloud-RANs. IEEE Trans. Veh. Technol. 2020, 69, 636–650. [CrossRef]
4. Ma, J.; Liu, L.; Song, H.; Shafin, R.; Shang, B.; Fan, P. Scalable Video Transmission in Cache-Aided Device-to-Device Networks.
IEEE Trans. Wirel. Commun. 2020, 19, 4247–4261. [CrossRef]
5. Wang, X.; Li, R.; Wang, C.; Li, X.; Taleb, T.; Leung, V.C.M. Attention-Weighted Federated Deep Reinforcement Learning for
Device-to-Device Assisted Heterogeneous Collaborative Edge Caching. IEEE J. Sel. Areas Commun. 2021, 39, 154–169. [CrossRef]
6. Lee, M.-C.; Molisch, A.F. Individual Preference Aware Caching Policy Design in Wireless D2D Networks. IEEE Trans. Wirel.
Commun. 2020, 19, 5589–5604. [CrossRef]
7. Ahmad, F.; Ahmad, A.; Hussain, I.; Muhammad, G.; Uddin, Z.; AlQahtani, S.A. Proactive Caching in D2D Assisted Multitier
Cellular Network. Sensors 2022, 22, 5078. [CrossRef] [PubMed]
8. Jaafar, W.; Mseddi, A.; Ajib, W.; Elbiaze, H. Content Caching and Channel Allocation in D2D-Assisted Wireless HetNets. IEEE
Access 2021, 9, 112502–112515. [CrossRef]
9. Abdolkhani, N.; Eslami, M.; Haghighat, J.; Hamouda, W. Optimal Caching Policy for D2D Assisted Cellular Networks with
Different Cache Size Devices. IEEE Access 2022, 10, 99353–99360. [CrossRef]
10. Passarella, A.; Conti, M. Analysis of individual pair and aggregate intercontact times in heterogeneous opportunistic networks.
IEEE Trans. Mob. Comput. 2013, 12, 2483–2495. [CrossRef]
11. Deng, T.; Ahani, G.; Fan, P.; Yuan, D. Cost-optimal caching for D2D networks with user mobility: Modeling, analysis, and
computational approaches. IEEE Trans. Wireless Commun. 2018, 17, 3082–3094. [CrossRef]
12. Cai, Y.; Chen, Y.; Ding, M.; Cheng, P.; Li, J. Mobility Prediction-Based Wireless Edge Caching Using Deep Reinforcement Learning.
In Proceedings of the IEEE/CIC International Conference on Communications in China (ICCC), Xiamen, China, 28–30 July 2021;
pp. 1036–1041.
13. Chen, M.; Hao, Y.; Hu, L.; Huang, K.; Lau, V.K.N. Green and Mobility-Aware Caching in 5G Networks. IEEE Trans. Wirel.
Commun. 2017, 16, 8347–8361. [CrossRef]
14. Hasslinger, G.; Heikkinen, J.; Ntougias, K.; Hasslinger, F.; Hohlfeld, O. Optimum caching versus LRU and LFU: Comparison and
combined limited look-ahead strategies. In Proceedings of the 16th International Symposium on Modeling and Optimization in
Mobile, Ad Hoc, and Wireless Networks (WiOpt), Shanghai, China, 7–11 May 2018; pp. 1–6.
15. Chen, Z.; Lee, J.; Quek, T.Q.S.; Kountouris, M. Cooperative caching and transmission design in cluster-centric small cell networks.
IEEE Trans. Wirel. Commun. 2017, 16, 3401–3415. [CrossRef]
16. Zhang, H.; Liao, Y.; Song, L. Device-to-device communications underlaying cellular networks in unlicensed bands. In Proceedings
of the IEEE International Conference on Communications (ICC), Paris, France, 21–25 May 2017; pp. 1–6.
17. Ahmed, M.; Shi, H.; Chen, X.; Li, Y.; Waqas, M.; Jin, D. Socially Aware Secrecy-Ensured Resource Allocation in D2D Underlay
Communication: An Overlapping Coalitional Game Scheme. IEEE Trans. Wirel. Commun. 2018, 17, 4118–4133. [CrossRef]
18. Zhao, Y.; Li, Y.; Cao, Y.; Jiang, T.; Ge, N. Social-Aware Resource Allocation for Device-to-Device Communications Underlaying
Cellular Networks. IEEE Trans. Wirel. Commun. 2013, 14, 6621–6634. [CrossRef]
19. Lei, L.; Zhong, Z.; Lin, C.; Shen, X. Operator controlled device-to-device communications in LTE-advanced networks. IEEE Wirel.
Commun. 2012, 19, 96–104. [CrossRef]
20. Niesen, U.; Maddah-Ali, M.A. Coded caching with nonuniform demands. In Proceedings of the IEEE Conference on Computer
Communications Workshops (INFOCOM WKSHPS), Toronto, ON, Canada, 27 April–2 May 2014; pp. 221–226.
21. Sun, R.; Wang, Y.; Lyu, L.; Cheng, N.; Zhang, S.; Yang, T.; Shen, X. Delay-Oriented Caching Strategies in D2D Mobile Networks.
IEEE Trans. Veh. Technol. 2020, 69, 8529–8541. [CrossRef]
22. Liu, J.; Bai, B.; Zhang, J.; Letaief, K.B. Cache Placement in Fog-RANs: From Centralized to Distributed Algorithms. IEEE Trans.
Wirel. Commun. 2017, 16, 7039–7051. [CrossRef]
23. Nemhauser, G.; Wolsey, L.; Fisher, M.L. An analysis of approximations for maximizing submodular set functions—I. Math.
Program. 1978, 14, 265–294. [CrossRef]
