Optimality Conditions For Nonsmooth Fuzzy Optimization Models Under The gH-Weak Subdifferentiability
https://doi.org/10.1007/s40314-023-02522-4
Abstract
This paper is concerned with optimality conditions for a class of nonsmooth fuzzy optimization problems based on gH-weak subdifferentiability. To this end, we first define the gH-weak subdifferential of fuzzy-valued functions and derive some of its useful properties, such as convexity and closedness. The relationship between gH-weak subdifferentiability and the gH-lower Lipschitz property is then discussed. Finally, a necessary and sufficient condition for the existence of weak efficient solutions to a class of nonsmooth fuzzy optimization problems is established under gH-weak subdifferentiability, and two further necessary optimality conditions are deduced. Moreover, numerical examples are presented to show the effectiveness of the theoretical results.
B Wei Liu
liuwhhu@163.com
Fangfang Shi
15298387799@163.com
Guoju Ye
yegj@hhu.edu.cn
Dafang Zhao
dafangzhao@163.com
1 School of Mathematics, Hohai University, Nanjing 210098, Jiangsu, China
2 School of Mathematics and Statistics, Hubei Normal University, Huangshi 435002, China
10 Page 2 of 18 F. Shi et al.
1 Introduction
Optimization problems occur widely across engineering, where one must find minimum or maximum points of objective functions. The data collection and quantification involved in building an optimization model often contain considerable uncertainty due to estimation errors, prediction errors, or lack of information. Representing such data with fuzzy numbers is an effective way to handle this uncertainty. With the development of fuzzy number theory, fuzzy optimization has been studied extensively. Interested readers can refer to Anshika et al. (2023), Ghosh et al. (2019, 2020a, b, 2022a, b), Ghosh (2022), Shi et al. (2023), Fu (2008), Tung and Tam (2022), Treanţǎ (2022a, b, c) and Guo et al. (2022) for more details.
The differentiability of the objective function plays a crucial role in deriving optimality conditions for fuzzy optimization models. Chalco-Cano et al. (2015) extended Newton's method to find non-dominated solutions under gH-differentiability. Qiu and Ouyang (2022) presented optimality conditions for weak LU-minimum solutions using g-differentiability and directed g-differentiability. Zhang et al. (2022) established Karush–Kuhn–Tucker type optimality conditions for relative optimal solutions based on gr-differentiability. In practice, however, objective functions are often non-differentiable, and solving such nonsmooth optimization problems is currently an active research topic. Zhang et al. (2005) defined the subdifferential of fuzzy-valued functions and discussed optimality conditions and duality theory for fuzzy optimization problems. This is extremely meaningful work, but the definition presupposes that the H-difference exists, whereas the H-difference of two fuzzy numbers does not always exist. Moreover, the subdifferential exists only when the function is convex, so subdifferential methods cannot handle nonconvex optimization problems. The investigation of optimality conditions for nonsmooth nonconvex fuzzy optimization models is therefore the motivation for this research.
It is worth noting that the weak subdifferential is an important tool for addressing optimal solutions of nonconvex and nonsmooth problems in classical optimization; it was first introduced by Azimov and Gasimov (1999, 2002) through a special superlinear function and cone construction. This construction dispenses with any form of convexity in minimization problems, and its application to nonconvex problems has yielded many excellent results. Kasimbeyli and Mammadov (2011) provided a necessary and sufficient optimality condition for nonsmooth convex optimization using variational inequalities and extended it to the nonconvex case via weak subdifferentials. Yalcin and Kasimbeyli (2021) proposed a nonconvex optimization approach based on the weak subgradient. For more applications of the weak subdifferential in optimization, readers are referred to Son and Wen (2020), Kasimbeyli and Mammadov (2009) and Küçük et al. (2015). The basis of all these generalizations for obtaining optimality conditions in nonconvex problems is the weak subdifferential. On the other hand, the gH-difference is the most widely used difference in fuzzy number space. The primary aim of this paper is therefore to study the gH-weak subdifferential of fuzzy-valued functions under the gH-difference and to apply it to a class of nonsmooth, nonconvex fuzzy optimization models.
The remainder of this article is organized as follows. Preliminaries are introduced in Sect. 2. Section 3 defines the gH-weak subdifferential of fuzzy-valued functions and proves some of its basic properties. As an application, Sect. 4 investigates a class of nonsmooth nonconvex composite fuzzy optimization models and establishes optimality conditions for the existence of weak efficient solutions. In Sect. 5, conclusions are drawn.
2 Preliminaries
For any [h̲, h̄] ∈ ℝ_I, len([h̲, h̄]) = h̄ − h̲ is called the length of [h̲, h̄].
A fuzzy set m : ℝⁿ → [0, 1] is a mapping on ℝⁿ; its ϑ-cut set is [m]_ϑ = {λ ∈ ℝⁿ : m(λ) ≥ ϑ} for any ϑ ∈ (0, 1], and its 0-cut set is [m]_0 = cl{λ ∈ ℝⁿ : m(λ) > 0}, where cl X denotes the closure of X ⊆ ℝⁿ.
F_C denotes the set of all fuzzy numbers. For m ∈ F_C, the ϑ-cut sets of m are given by [m]_ϑ = [m̲_ϑ, m̄_ϑ] ∈ ℝ_I, where m̲_ϑ, m̄_ϑ ∈ ℝ for any ϑ ∈ [0, 1]. Note that every h ∈ ℝ can be written as h̃ ∈ F_C, i.e., [h̃]_ϑ = [h, h] for all ϑ ∈ [0, 1].
For example, for a triangular fuzzy number m = ⟨h₁, h₂, h₃⟩ ∈ F_C with h₁, h₂, h₃ ∈ ℝ and h₁ ≤ h₂ ≤ h₃, the ϑ-cut set is [m]_ϑ = [h₁ + (h₂ − h₁)ϑ, h₃ − (h₃ − h₂)ϑ], ϑ ∈ [0, 1].
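To make the ϑ-cut formula concrete, here is a minimal numerical sketch (our own illustration, not part of the paper; the function name is ours):

```python
def tri_cut(h1, h2, h3, theta):
    """ϑ-cut [h1 + (h2 - h1)ϑ, h3 - (h3 - h2)ϑ] of the triangular
    fuzzy number ⟨h1, h2, h3⟩ with h1 <= h2 <= h3."""
    assert h1 <= h2 <= h3 and 0.0 <= theta <= 1.0
    return (h1 + (h2 - h1) * theta, h3 - (h3 - h2) * theta)

# ⟨1, 2, 3⟩: the 0-cut is the support [1, 3], the 1-cut is the core {2}.
print(tri_cut(1, 2, 3, 0.0), tri_cut(1, 2, 3, 1.0))  # (1.0, 3.0) (2.0, 2.0)
```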
Given m, l ∈ F_C, the distance between m and l is
D(m, l) = sup_{ϑ∈[0,1]} max{|m̲_ϑ − l̲_ϑ|, |m̄_ϑ − l̄_ϑ|},
and ‖m‖_{F_C} = D(m, 0̃) is a norm on F_C.
For any m, l ∈ F_C with ϑ-cuts [m̲_ϑ, m̄_ϑ] and [l̲_ϑ, l̄_ϑ], and for every λ ∈ ℝ, addition and scalar multiplication are defined levelwise: [m ⊕ l]_ϑ = [m̲_ϑ + l̲_ϑ, m̄_ϑ + l̄_ϑ] and [λm]_ϑ = λ[m]_ϑ for all ϑ ∈ [0, 1].
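For triangular fuzzy numbers the endpoints are affine in ϑ, so the supremum defining D can be evaluated on a grid; a small sketch (our own code and names, assuming the triangular ϑ-cut formula above):

```python
def tri_cut(t, theta):
    h1, h2, h3 = t
    return (h1 + (h2 - h1) * theta, h3 - (h3 - h2) * theta)

def D(m, l, grid=101):
    # D(m, l) = sup over ϑ of max{|m̲ϑ − l̲ϑ|, |m̄ϑ − l̄ϑ|}; a grid suffices
    # here because triangular endpoints are affine in ϑ (the sup is
    # attained at ϑ = 0 or ϑ = 1).
    best = 0.0
    for k in range(grid):
        theta = k / (grid - 1)
        (ml, mu), (ll, lu) = tri_cut(m, theta), tri_cut(l, theta)
        best = max(best, abs(ml - ll), abs(mu - lu))
    return best

print(D((1, 2, 3), (2, 3, 5)))  # 2.0 (the upper-endpoint gap at ϑ = 0)
```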
For M = (m₁, m₂, …, m_n)ᵀ and L = (l₁, l₂, …, l_n)ᵀ in Fⁿ_C, the partial orders are defined componentwise:
M ⪯ L ⇔ m_i ⪯ l_i, ∀ i = 1, 2, …, n,
M ≺ L ⇔ m_i ≺ l_i, ∀ i = 1, 2, …, n,
and ‖M‖_{Fⁿ_C} = Σ_{i=1}^n ‖m_i‖_{F_C} is a norm on Fⁿ_C.
3 The gH-weak subdifferential of fuzzy-valued functions

This section focuses on the gH-weak subdifferential of fuzzy-valued functions under the gH-difference.
The endpoint functions
P̲_ϑ(λ) = { (1 + ϑ)λ, if 0 ≤ λ ≤ 1,
           (3 − ϑ)λ, if −1 ≤ λ ≤ 0,
and
P̄_ϑ(λ) = { (3 − ϑ)λ, if 0 ≤ λ ≤ 1,
           (1 + ϑ)λ, if −1 ≤ λ ≤ 0,
are depicted in Fig. 1 in blue and red, respectively. We compute the gH-weak subdifferential of P at 0:
∂ʷP(0) = {(Q; κ) ∈ F_C × ℝ₊ : λQ ⊖_gH κ‖λ‖_ℝ ⪯ ⟨1, 2, 3⟩|λ|, λ ∈ [−1, 1]}
= {(Q; κ) ∈ F_C × ℝ₊ : λ[Q̲_ϑ, Q̄_ϑ] ⊖_gH κ‖λ‖_ℝ ⪯_LU [1 + ϑ, 3 − ϑ]|λ| for all ϑ ∈ [0, 1] and λ ∈ [−1, 1]}.
Next, we discuss λ in the following two cases.
Case I. When λ ∈ [0, 1], we have
∂ʷP(0) = {(Q; κ) ∈ F_C × ℝ₊ : λ[Q̲_ϑ, Q̄_ϑ] ⊖_gH κ‖λ‖_ℝ ⪯_LU [1 + ϑ, 3 − ϑ]|λ|, ϑ ∈ [0, 1], λ ∈ [0, 1]}
= {(Q; κ) ∈ F_C × ℝ₊ : λQ̲_ϑ − κλ ≤ (1 + ϑ)λ and λQ̄_ϑ − κλ ≤ (3 − ϑ)λ, ϑ ∈ [0, 1], λ ∈ [0, 1]}
= {(Q; κ) ∈ F_C × ℝ₊ : Q̲_ϑ − κ ≤ 1 + ϑ and Q̄_ϑ − κ ≤ 3 − ϑ, ϑ ∈ [0, 1]}
= {(Q; κ) ∈ F_C × ℝ₊ : Q ⪯ ⟨1 + κ, 2 + κ, 3 + κ⟩}.
Case II. When λ ∈ [−1, 0], similarly,
∂ʷP(0) = {(Q; κ) ∈ F_C × ℝ₊ : ⟨−3 + κ, −2 + κ, −1 + κ⟩ ⪯ Q}.
Hence, we obtain
∂ʷP(0) = {(Q; κ) ∈ F_C × ℝ₊ : ⟨−3 + κ, −2 + κ, −1 + κ⟩ ⪯ Q ⪯ ⟨1 + κ, 2 + κ, 3 + κ⟩}.
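The characterization of ∂ʷP(0) can be spot-checked numerically by testing the defining levelwise inequality on a grid (a sketch under our own reading of ⪯_LU as endpointwise ≤; all names are ours):

```python
def in_weak_subdiff_at_0(Q, kappa, n=41):
    """Grid check that (Q; κ) with triangular Q = ⟨q1, q2, q3⟩ satisfies
    λ·[Q]ϑ ⊖_gH κ|λ| ⪯_LU [1 + ϑ, 3 − ϑ]|λ| for ϑ ∈ [0,1], λ ∈ [−1,1]."""
    q1, q2, q3 = Q
    for i in range(n):
        th = i / (n - 1)
        ql, qu = q1 + (q2 - q1) * th, q3 - (q3 - q2) * th
        for j in range(n):
            lam = -1 + 2 * j / (n - 1)
            # λ·[ql, qu] (endpoints swap for λ < 0), then subtract κ|λ|
            lo, hi = min(lam * ql, lam * qu), max(lam * ql, lam * qu)
            if lo - kappa * abs(lam) > (1 + th) * abs(lam) + 1e-12:
                return False
            if hi - kappa * abs(lam) > (3 - th) * abs(lam) + 1e-12:
                return False
    return True

# 0̃ with κ = 0 lies in ∂ʷP(0); the boundary ⟨1+κ, 2+κ, 3+κ⟩ with κ = 1 does too.
print(in_weak_subdiff_at_0((0, 0, 0), 0.0), in_weak_subdiff_at_0((2, 3, 4), 1.0))
```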
Proof Suppose that (Q; κ) ∈ ∂ʷ(σP)(λ̆) for some σ > 0. For every λ ∈ X,
(λ − λ̆)ᵀQ ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯ σP(λ) ⊖_gH σP(λ̆)
⇒ (λ − λ̆)ᵀQ ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯ σ(P(λ) ⊖_gH P(λ̆))
⇒ (λ − λ̆)ᵀ(Q/σ) ⊖_gH (κ/σ)‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆).
Therefore, (Q/σ; κ/σ) ∈ ∂ʷP(λ̆), i.e., (Q; κ) ∈ σ∂ʷP(λ̆).
We can similarly obtain σ∂ʷP(λ̆) ⊆ ∂ʷ(σP)(λ̆). That is, σ∂ʷP(λ̆) = ∂ʷ(σP)(λ̆).
Rearrange the components of λ − λ̆ so that the first r components are non-negative and the others are negative. Thus,
Σ_{i=1}^r (λ_i − λ̆_i)Q_{1i} + Σ_{j=r+1}^n (λ_j − λ̆_j)Q_{1j} ⊖_gH κ₁‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆),
and
Σ_{i=1}^r (λ_i − λ̆_i)Q_{2i} + Σ_{j=r+1}^n (λ_j − λ̆_j)Q_{2j} ⊖_gH κ₂‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆).
Multiplying the first inequality by σ ∈ [0, 1] and the second by 1 − σ gives
Σ_{i=1}^r σ(λ_i − λ̆_i)Q_{1i} + Σ_{j=r+1}^n σ(λ_j − λ̆_j)Q_{1j} ⊖_gH σκ₁‖λ − λ̆‖_{ℝⁿ} ⪯ σ(P(λ) ⊖_gH P(λ̆)),   (2)
and
Σ_{i=1}^r (1 − σ)(λ_i − λ̆_i)Q_{2i} + Σ_{j=r+1}^n (1 − σ)(λ_j − λ̆_j)Q_{2j} ⊖_gH (1 − σ)κ₂‖λ − λ̆‖_{ℝⁿ} ⪯ (1 − σ)(P(λ) ⊖_gH P(λ̆)).   (3)
By adding (2) to (3), we get
Σ_{i=1}^r (λ_i − λ̆_i)(σQ_{1i} + (1 − σ)Q_{2i}) + Σ_{j=r+1}^n (λ_j − λ̆_j)(σQ_{1j} + (1 − σ)Q_{2j}) ⊖_gH (σκ₁ + (1 − σ)κ₂)‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆).
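The convexity just proved can be sanity-checked on the earlier example P(λ) = ⟨1, 2, 3⟩|λ| at λ̆ = 0 (our own grid test; the membership predicate simply mirrors the defining levelwise inequality):

```python
def member(Q, kappa, n=41):
    # (Q; κ) ∈ ∂ʷP(0) for P(λ) = ⟨1, 2, 3⟩|λ|, tested on a ϑ × λ grid
    q1, q2, q3 = Q
    for i in range(n):
        th = i / (n - 1)
        ql, qu = q1 + (q2 - q1) * th, q3 - (q3 - q2) * th
        for j in range(n):
            lam = -1 + 2 * j / (n - 1)
            lo, hi = min(lam * ql, lam * qu), max(lam * ql, lam * qu)
            if lo - kappa * abs(lam) > (1 + th) * abs(lam) + 1e-12 \
               or hi - kappa * abs(lam) > (3 - th) * abs(lam) + 1e-12:
                return False
    return True

# convex combination σ(Q1; κ1) + (1 − σ)(Q2; κ2) of two weak subgradients
s, (Q1, k1), (Q2, k2) = 0.3, ((0, 0, 0), 0.0), ((2, 3, 4), 1.0)
Q = tuple(s * a + (1 - s) * b for a, b in zip(Q1, Q2))
k = s * k1 + (1 - s) * k2
print(member(Q1, k1), member(Q2, k2), member(Q, k))  # True True True
```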
Proof Let {(Yₛ; κₛ)} be any sequence in ∂ʷP(λ̆) converging to (Q; κ) ∈ Fⁿ_C × ℝ₊, where Yₛ = (Y_{s1}, Y_{s2}, …, Y_{sn})ᵀ and Q = (Q₁, Q₂, …, Qₙ)ᵀ. Since (Yₛ; κₛ) ∈ ∂ʷP(λ̆), for any λ ∈ X we have
(λ − λ̆)ᵀYₛ ⊖_gH κₛ‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆),
which implies
Σ_{i=1}^n (λ_i − λ̆_i)Y_{si} ⊖_gH κₛ‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆).
Rearrange the components of λ − λ̆ so that the first r components are non-negative and the others are negative. Thus,
Σ_{i=1}^r (λ_i − λ̆_i)Y_{si} + Σ_{j=r+1}^n (λ_j − λ̆_j)Y_{sj} ⊖_gH κₛ‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆).
Therefore, for every ϑ ∈ [0, 1],
Σ_{i=1}^r (λ_i − λ̆_i)Y̲_{si,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Ȳ_{sj,ϑ} − κₛ‖λ − λ̆‖_{ℝⁿ} ≤ min{P̲_ϑ(λ) − P̲_ϑ(λ̆), P̄_ϑ(λ) − P̄_ϑ(λ̆)},
Σ_{i=1}^r (λ_i − λ̆_i)Ȳ_{si,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Y̲_{sj,ϑ} − κₛ‖λ − λ̆‖_{ℝⁿ} ≤ max{P̲_ϑ(λ) − P̲_ϑ(λ̆), P̄_ϑ(λ) − P̄_ϑ(λ̆)}.
The fact that {(Yₛ; κₛ)} converges to (Q; κ) indicates that for any ϑ ∈ [0, 1] and all i, the sequences {Y̲_{si,ϑ}} and {Ȳ_{si,ϑ}} converge to Q̲_{i,ϑ} and Q̄_{i,ϑ}, respectively, and κₛ → κ. Therefore,
Σ_{i=1}^r (λ_i − λ̆_i)Y̲_{si,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Ȳ_{sj,ϑ} − κₛ‖λ − λ̆‖_{ℝⁿ}
→ Σ_{i=1}^r (λ_i − λ̆_i)Q̲_{i,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Q̄_{j,ϑ} − κ‖λ − λ̆‖_{ℝⁿ}
≤ min{P̲_ϑ(λ) − P̲_ϑ(λ̆), P̄_ϑ(λ) − P̄_ϑ(λ̆)},
and
Σ_{i=1}^r (λ_i − λ̆_i)Ȳ_{si,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Y̲_{sj,ϑ} − κₛ‖λ − λ̆‖_{ℝⁿ}
→ Σ_{i=1}^r (λ_i − λ̆_i)Q̄_{i,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Q̲_{j,ϑ} − κ‖λ − λ̆‖_{ℝⁿ}
≤ max{P̲_ϑ(λ) − P̲_ϑ(λ̆), P̄_ϑ(λ) − P̄_ϑ(λ̆)}.
Hence,
[Σ_{i=1}^r (λ_i − λ̆_i)Q̲_{i,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Q̄_{j,ϑ}, Σ_{i=1}^r (λ_i − λ̆_i)Q̄_{i,ϑ} + Σ_{j=r+1}^n (λ_j − λ̆_j)Q̲_{j,ϑ}] ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯_LU [P(λ)]_ϑ ⊖_gH [P(λ̆)]_ϑ
⇒ Σ_{i=1}^r (λ_i − λ̆_i)[Q̲_{i,ϑ}, Q̄_{i,ϑ}] + Σ_{j=r+1}^n (λ_j − λ̆_j)[Q̲_{j,ϑ}, Q̄_{j,ϑ}] ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯_LU [P(λ)]_ϑ ⊖_gH [P(λ̆)]_ϑ
⇒ Σ_{i=1}^r (λ_i − λ̆_i)[Q_i]_ϑ + Σ_{j=r+1}^n (λ_j − λ̆_j)[Q_j]_ϑ ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯_LU [P(λ)]_ϑ ⊖_gH [P(λ̆)]_ϑ.
That is, (λ − λ̆)ᵀQ ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯ P(λ) ⊖_gH P(λ̆), i.e., (Q; κ) ∈ ∂ʷP(λ̆). Hence ∂ʷP(λ̆) is closed.
Furthermore, for any λ ∈ ℝⁿ and M = (m₁, m₂, …, mₙ)ᵀ ∈ Fⁿ_C,
(−‖λ‖_{ℝⁿ}‖M‖_{Fⁿ_C})~ ⪯ λᵀM.
Indeed, let l = λ₁m₁ ⊕ λ₂m₂ ⊕ … ⊕ λₙmₙ. Then
‖l‖_{F_C} = ‖λ₁m₁ ⊕ λ₂m₂ ⊕ … ⊕ λₙmₙ‖_{F_C}
≤ ‖λ₁m₁‖_{F_C} + ‖λ₂m₂‖_{F_C} + … + ‖λₙmₙ‖_{F_C}
= |λ₁|‖m₁‖_{F_C} + |λ₂|‖m₂‖_{F_C} + … + |λₙ|‖mₙ‖_{F_C}
≤ ‖λ‖_{ℝⁿ} Σ_{i=1}^n ‖m_i‖_{F_C}
= ‖λ‖_{ℝⁿ}‖M‖_{Fⁿ_C}.
Since ‖l‖_{F_C} = sup_{ϑ∈[0,1]} max{|l̲_ϑ|, |l̄_ϑ|}, for all ϑ ∈ [0, 1] we have
|l̲_ϑ| ≤ ‖λ‖_{ℝⁿ}‖M‖_{Fⁿ_C} and |l̄_ϑ| ≤ ‖λ‖_{ℝⁿ}‖M‖_{Fⁿ_C}
⇒ −‖λ‖_{ℝⁿ}‖M‖_{Fⁿ_C} ≤ l̲_ϑ and −‖λ‖_{ℝⁿ}‖M‖_{Fⁿ_C} ≤ l̄_ϑ for every ϑ ∈ [0, 1],
which is exactly the claimed lower bound.
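The triangle-inequality chain above can be illustrated numerically for triangular fuzzy numbers (our own sketch; scalar multiplication swaps endpoints for negative scalars):

```python
import math

def fuzzy_norm(t, n=101):
    # ‖m‖_{F_C} = sup_ϑ max{|m̲ϑ|, |m̄ϑ|}, scanned on a ϑ grid
    h1, h2, h3 = t
    return max(
        max(abs(h1 + (h2 - h1) * th), abs(h3 - (h3 - h2) * th))
        for th in (k / (n - 1) for k in range(n))
    )

def scale(lam, t):
    # λ⟨h1, h2, h3⟩ = ⟨λh1, λh2, λh3⟩, endpoints reversed when λ < 0
    a, b, c = (lam * x for x in t)
    return (a, b, c) if lam >= 0 else (c, b, a)

def add(t, s):
    # levelwise addition of triangular fuzzy numbers
    return tuple(x + y for x, y in zip(t, s))

# l = λ1·m1 ⊕ λ2·m2 with λ = (1, −2), m1 = ⟨1, 2, 3⟩, m2 = ⟨−1, 0, 2⟩
l = add(scale(1, (1, 2, 3)), scale(-2, (-1, 0, 2)))
bound = math.hypot(1, -2) * (fuzzy_norm((1, 2, 3)) + fuzzy_norm((-1, 0, 2)))
print(l, fuzzy_norm(l) <= bound)  # (-3, 2, 5) True
```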
This implies that P is gH-lower Lipschitz on N_δ(λ̆) with L = h₃. From the arbitrariness of λ and λ̆, we conclude that P is gH-lower Lipschitz on [1, +∞).
Theorem 3 Let P : X → F_C be a fuzzy-valued function and λ̆ ∈ X. If P is gH-lower Lipschitz at λ̆, then P is gH-lower locally Lipschitz at λ̆ and there exist K ≥ 0 and G ∈ F_C such that for any λ ∈ X,
G ⊖_gH K‖λ‖_{ℝⁿ} ⪯ P(λ).   (7)
Proof Since P is gH-lower Lipschitz at λ̆, there exists L ≥ 0 such that
(−L‖λ − λ̆‖_{ℝⁿ})~ ⪯ P(λ) ⊖_gH P(λ̆).
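A grid check of the gH-lower Lipschitz inequality for a hypothetical example in the spirit of the one above — P(λ) = ⟨1, 2, 3⟩·√λ on [1, 4], for which L = h₃ = 3 works since |√λ − √λ̆| ≤ |λ − λ̆| on [1, +∞) (all names and the concrete P are our own):

```python
import math

def P_cut(lam, th):
    # hypothetical fuzzy-valued function P(λ) = ⟨1, 2, 3⟩·√λ, λ ≥ 1
    return ((1 + th) * math.sqrt(lam), (3 - th) * math.sqrt(lam))

def gh_lower_lipschitz(L, lams, n=21):
    # check −L|λ − λ̆| ≤ min{P̲ϑ(λ) − P̲ϑ(λ̆), P̄ϑ(λ) − P̄ϑ(λ̆)} on a grid,
    # i.e. the lower endpoint of P(λ) ⊖_gH P(λ̆) is bounded below
    for lam in lams:
        for lam0 in lams:
            for k in range(n):
                th = k / (n - 1)
                pl, pu = P_cut(lam, th)
                ql, qu = P_cut(lam0, th)
                if min(pl - ql, pu - qu) < -L * abs(lam - lam0) - 1e-9:
                    return False
    return True

grid = [1 + 0.25 * i for i in range(13)]  # λ ∈ [1, 4]
print(gh_lower_lipschitz(3.0, grid), gh_lower_lipschitz(0.1, grid))  # True False
```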
4 Optimality conditions for nonsmooth fuzzy optimization models

In this section, let us consider the following fuzzy optimization model (FOM for short):

∂ʷP₁(λ̆) ⊂ ∂ʷP₂(λ̆).
Proof The gH-weak subdifferentiability of P₁ at λ̆ implies that there exists (Q; κ) ∈ Fⁿ_C × ℝ₊ such that
(λ − λ̆)ᵀQ ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯ P₁(λ) ⊖_gH P₁(λ̆) for any λ ∈ X.
Since P₂ ⊖_gH P₁ attains the weak efficient value 0̃ at λ̆, for any λ ∈ X,
0̃ ⪯ (P₂ ⊖_gH P₁)(λ)
⇒ 0̃ ⪯ P₂(λ) ⊖_gH P₁(λ)
⇒ P₁(λ) ⪯ P₂(λ)   (11)
⇒ P₁(λ) ⊖_gH P₁(λ̆) ⪯ P₂(λ) ⊖_gH P₂(λ̆).
Consequently, (11) implies that
(λ − λ̆)ᵀQ ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯ P₂(λ) ⊖_gH P₂(λ̆).
Hence (Q; κ) ∈ ∂ʷP₂(λ̆), and therefore ∂ʷP₁(λ̆) ⊂ ∂ʷP₂(λ̆).
For any λ ∈ [−1, 1], we have 0̃ ⪯ P₂(λ) ⊖_gH P₁(λ) and P₁(0) = P₂(0) = 0̃.
Let (Q; κ) ∈ ∂ʷP₁(0). Then
λQ ⊖_gH κ‖λ‖_ℝ ⪯ ⟨1, 2, 3⟩λ², ∀λ ∈ [−1, 1].
Since ⟨1, 2, 3⟩λ² ⪯ ⟨2, 3, 5⟩|λ| for any λ ∈ [−1, 1], it follows that
λQ ⊖_gH κ‖λ‖_ℝ ⪯ ⟨2, 3, 5⟩|λ|.
Hence, (Q; κ) ∈ ∂ʷP₂(0), i.e., ∂ʷP₁(0) ⊂ ∂ʷP₂(0).
We can easily obtain that λ̆ = 0 is an efficient solution of P₂(λ) ⊖_gH P₁(λ), but P₂(0) ≠ P₁(0). The results are also presented in Fig. 2.
By calculation, it can be obtained that
and
∂ʷP₂(0) = {(Q; κ₂) ∈ F_C × ℝ₊ : ⟨−5, −2, −1⟩(1 − λ₁) − κ₂ ⪯ Q ⪯ ⟨1, 2, 3⟩(1 + λ₁) + κ₂, λ ∈ [−1, 0) ∪ (0, 1]}.
Hence, ∂ʷP₁(0) ⊄ ∂ʷP₂(0). Therefore, P₁(0) = P₂(0) is an essential condition.
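The key step ⟨1, 2, 3⟩λ² ⪯ ⟨2, 3, 5⟩|λ| on [−1, 1] can be confirmed levelwise on a grid (our own check, reading ⪯ as endpointwise ≤ on each ϑ-cut):

```python
def dominated(n=201, m=21):
    # check (1 + ϑ)λ² ≤ (2 + ϑ)|λ| and (3 − ϑ)λ² ≤ (5 − 2ϑ)|λ|
    # for λ ∈ [−1, 1] and ϑ ∈ [0, 1]; the ϑ-cut of ⟨2, 3, 5⟩ is [2 + ϑ, 5 − 2ϑ]
    for i in range(n):
        lam = -1 + 2 * i / (n - 1)
        for k in range(m):
            th = k / (m - 1)
            if (1 + th) * lam * lam > (2 + th) * abs(lam) + 1e-12:
                return False
            if (3 - th) * lam * lam > (5 - 2 * th) * abs(lam) + 1e-12:
                return False
    return True

print(dominated())  # True
```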
Since the condition P₁(λ̆) = P₂(λ̆) is very strict, the following theorem gives more flexible conditions under which the conclusion of Theorem 6 still holds.
Theorem 7 Let P₁, P₂ : X → F_C be gH-weak subdifferentiable at λ̆ ∈ X, where λ̆ is a weak efficient solution of P₂ ⊖_gH P₁. Then,
∂ʷP₁(λ̆) ⊂ ∂ʷP₂(λ̆),
provided that len([P₁(λ)]_ϑ) ≥ len([P₂(λ)]_ϑ) or len([P₁(λ)]_ϑ) ≤ len([P₂(λ)]_ϑ) for any λ ∈ X and every ϑ ∈ [0, 1].
Proof The gH-weak subdifferentiability of P₁ at λ̆ implies that there exists (Q; κ) ∈ Fⁿ_C × ℝ₊ such that for any λ ∈ X,
(λ − λ̆)ᵀQ ⊖_gH κ‖λ − λ̆‖_{ℝⁿ} ⪯ P₁(λ) ⊖_gH P₁(λ̆).   (12)
Since λ̆ is a weak efficient solution of P₂ ⊖_gH P₁, for any λ ∈ X,
That is,
and
P̲₁,ϑ(λ) − P̲₁,ϑ(λ̆) ≤ max{P̲₂,ϑ(λ) − P̲₂,ϑ(λ̆), P̄₂,ϑ(λ) − P̄₂,ϑ(λ̆)}.
That is,
If len([P₁(λ)]_ϑ) ≤ len([P₂(λ)]_ϑ) for any λ ∈ X and every ϑ ∈ [0, 1], in a similar manner as above we have
Then,
and
We can easily obtain that λ̆ = 0 is an efficient solution of P₂(λ) ⊖_gH P₁(λ) and P₂(0) ≠ P₁(0), but len([P₁(λ)]_ϑ) ≤ len([P₂(λ)]_ϑ) for any λ ∈ X and every ϑ ∈ [0, 1]. The results are also presented in Figs. 3 and 4.
By calculation, it can be obtained that
and
5 Conclusions
This paper investigated the gH-weak subdifferentiability of fuzzy-valued functions and discussed some of its interesting properties. As an application of gH-weak subdifferentiability, a necessary and sufficient condition and two necessary conditions for finding weak efficient solutions of nonsmooth composite optimization models were established. In future work, we intend to continue studying related algorithms for solving fuzzy optimization problems based on gH-weak subdifferentiability.
Acknowledgements The authors are grateful to the referees for their suggestions, which greatly improved
the readability of the article.
Funding This work is supported by the Postgraduate Research & Practice Innovation Program of Jiangsu
Province (KYCX23_0668).
Declarations
Conflict of interest The authors declare that they have no conflict of interest.
References
Anshika A, Ghosh D, Mesiar R, Yao HR, Chauhan RS (2023) Generalized–Hukuhara subdifferential analysis
and its application in nonconvex composite interval optimization problems. Inform Sci 622:771–793
Azimov A, Gasimov R (1999) On weak conjugacy, weak subdifferentials and duality with zero gap in nonconvex optimization. Int J Appl Math 1:171–192
Azimov A, Gasimov R (2002) Stability and duality of nonconvex problems via augmented Lagrangian. Cybernet Syst Anal 3:403–417
Chalco-Cano Y, Silva GN, Rufián-Lizana A (2015) On the Newton method for solving fuzzy optimization
problems. Fuzzy Set Syst 272:60–69
Fu G (2008) A fuzzy optimization method for multicriteria decision making: an application to reservoir flood
control operation. Expert Syst Appl 34(1):145–149
Ghosh D (2022) Interval-valued value function and its application in interval optimization problems. Comput
Appl Math 41:137
Ghosh D, Singh A, Shukla KK, Manchanda K (2019) Extended Karush–Kuhn–Tucker condition for constrained interval optimization problems and its application in support vector machines. Inform Sci 504:276–292
Ghosh D, Chauhan RS, Mesiar R, Debnath AK (2020a) Generalized Hukuhara Gâteaux and Fréchet derivatives
of interval-valued functions and their application in optimization with interval-valued functions. Inform
Sci 510:317–340
Ghosh D, Debnath AK, Pedrycz W (2020b) A variable and a fixed ordering of intervals and their application
in optimization with interval-valued functions. Int J Approx Reason 121:187–205
Ghosh D, Debnath AK, Chauhan RS, Castillo O (2022a) Generalized Hukuhara-gradient efficient-direction
method to solve optimization problems with interval-valued functions and its application in least squares
problems. Int J Fuzzy Syst 24:1275–1300
Ghosh D, Debnath AK, Mesiar R, Chauhan RS (2022b) Generalized Hukuhara subgradient and its application
in optimization problem with interval-valued functions. Sādhanā 47(2):1–16
Guo Y, Ye G, Liu W, Zhao D, Treanţǎ S (2022) On symmetric gH-derivative applications to dual interval-valued
optimization problems. Chaos Solitons Fractals 158:112068
Kasimbeyli R, Mammadov M (2009) On weak subdifferentials, directional derivatives, and radial epiderivatives
for nonconvex functions. SIAM J Optim 20(2):841–855
Kasimbeyli R, Mammadov M (2011) Optimality conditions in nonconvex optimization via weak subdifferen-
tials. Nonlinear Anal Theory 24(7):2534–2547
Küçük M, Urbański R, Grzybowski J, Küçük Y, Güvenç RA, Tozkan D, Soyertem M (2015) Weak subdifferential/superdifferential, weak exhausters and optimality conditions. Optimization 64(10):2199–2212
Nehi HM, Daryab A (2012) Saddle point optimality conditions in fuzzy optimization problems. Int J Fuzzy
Syst 14(1):11–21
Qiu D, Ouyang C (2022) Optimality conditions for fuzzy optimization in several variables under generalized
differentiability. Fuzzy Sets Syst 434:1–19
Shi F, Ye G, Liu W, Zhao D (2023) A class of nonconvex fuzzy optimization problems under granular
differentiability concept. Math Comput Simulat 211:430–444
Son T, Wen C (2020) Weak-subdifferentials for vector functions and applications to multiobjective semi-infinite
optimization problems. Appl Anal 99(5):840–855
Stefanini L (2010) A generalization of Hukuhara difference and division for interval and fuzzy arithmetic.
Fuzzy Set Syst 161(11):1564–1584
Treanţǎ S (2022a) LU-optimality conditions in optimization problems with mechanical work objective functionals. IEEE Trans Neur Net Lear 33(9):4971–4978
Treanţǎ S (2022b) Characterization results of solutions in interval-valued optimization problems with mixed
constraints. J Global Optim 82(4):951–964
Treanţǎ S (2022c) Saddle-point optimality criteria involving (ρ, b, d)-invexity and (ρ, b, d)-pseudoinvexity
in interval-valued optimization problems. Int J Control 95(4):1042–1050
Tung LT, Tam DH (2022) Optimality conditions and duality for continuous-time programming with multiple
interval-valued objective functions. Comp Appl Math 41:347
Wu H (2007) Duality theory in fuzzy optimization problems formulated by the Wolfe’s primal and dual pair.
Fuzzy Optim Decis Mak 6:179–198
Yalcin GD, Kasimbeyli R (2021) Weak subgradient method for solving nonsmooth nonconvex optimization
problems. Optimization 70(7):1513–1553
Zadeh LA (1965) Fuzzy sets. Inf Control 8:338–353
Zhang C, Yuan X, Lee ES (2005) Duality theory in fuzzy mathematical programming problems with fuzzy
coefficients. Comput Math Appl 49(11–12):1709–1730
Zhang J, Chen X, Li L, Ma X (2022) Optimality conditions for fuzzy optimization problems under granular
convexity concept. Fuzzy Sets Syst 447:54–75