
Reliability Engineering and System Safety 234 (2023) 109172

Contents lists available at ScienceDirect

Reliability Engineering and System Safety


journal homepage: www.elsevier.com/locate/ress

Multi-objective robust optimization for enhanced safety in large-diameter


tunnel construction with interactive and explainable AI
Penghui Lin a, Limao Zhang b, c, *, Robert L.K. Tiong a
a
School of Civil and Environmental Engineering, Nanyang Technological University, 50 Nanyang Avenue 639798, Singapore
b
National Center of Technology Innovation for Digital Construction, Huazhong University of Science and Technology, Wuhan, Hubei, China
c
School of Civil and Hydraulic Engineering, Huazhong University of Science and Technology, Wuhan, Hubei, China

A R T I C L E   I N F O

Keywords:
Multi-objective optimization
Robust optimization
Tunnel construction
Building information modeling
Explainable AI

A B S T R A C T

Robust optimization is an ideal solution for enhancing safety in tunnel construction in the presence of unpredictable soil conditions, especially in large-diameter tunnel construction, since it requires the least amount of information about uncertainties. However, the application of robust optimization to real-world projects is greatly hampered by its dependence on mathematical models. To address this issue, this study builds a pipeline machine learning model that forecasts tunnel-induced damage with high accuracy and can be solved by the robust optimization (RO) algorithm. The optimization process is integrated into a building information modeling (BIM) platform and analyzed using the Shapley Additive ExPlanations (SHAP) technique, allowing the designer to understand and interact with the algorithm. The average improvement of testing samples using an ellipsoidal uncertainty set with a size of 0.05 is 23.8% and 4.9% on the two selected criteria, which is more conservative than using deterministic optimization (DO) and stochastic optimization (SO). This study establishes an interactive and explainable optimization platform that enables designers to make judgments under the most unfavorable soil conditions with the least amount of accessible information about the uncertainties during tunneling.

1. Introduction

Along with developments in shield tunneling technology, more and more large-diameter tunnels are being constructed to accommodate the rising demand for subterranean space. Conventionally, a tunnel is regarded as large-diameter if its diameter exceeds 8 m [1], which can be expanded to 15 m in some extreme cases [2,3]. Current large-diameter tunnel construction, on the other hand, is rarely designed to pass beneath existing tunnels, in which case, in addition to the supporting pressure on the excavation face [4] and ground surface settlement [5], damage to existing tunnels plays a significant role in the decision-making process and further complicates the system [6]. Given that this situation is becoming inescapable due to the complexity of the urban metro network, appropriate decisions must be made to mitigate the damage caused by large-diameter tunnel excavation beneath existing tunnels.

There have been many successful applications of multi-objective optimization (MOO) techniques in tunnel excavation problems, with the elitist non-dominated sorting genetic algorithm (NSGA-II) [7] being one of the most frequently used techniques. In order to improve optimization performance, researchers frequently integrate machine learning techniques to create hybrid artificial intelligence (AI) approaches [8,9]. Although AI-based optimization approaches achieve high accuracy and efficiency, as AI becomes more sophisticated, designers face the risk of blindly making unreliable decisions due to a lack of interaction with, and knowledge alienation from, the AI system, particularly in the construction industry [10]. How can we make the AI system more understandable? How shall we interact with AI? These are the two most important issues to be solved in order to make AI techniques more applicable to engineers.

Abbreviations: MOO, multi-objective optimization; NSGA-II, elitist non-dominated sorting genetic algorithm; AI, artificial intelligence; XAI, explainable artificial intelligence; SHAP, Shapley Additive ExPlanations; LIME, local interpretable model-agnostic explanations; BIM, building information modeling; SA, sensitivity analysis; SO, stochastic optimization; RO, robust optimization; MORO, multi-objective robust optimization; FEM, finite element method; DO, deterministic optimization; ANN, artificial neural network; SVM, support vector machine; GP, Gaussian process; RF, random forest; XGBoost, extreme gradient boosting; RNN, recurrent neural network; LSTM, long short-term memory network.
* Corresponding author.
E-mail address: zlm@hust.edu.cn (L. Zhang).

https://doi.org/10.1016/j.ress.2023.109172
Received 30 August 2022; Received in revised form 7 December 2022; Accepted 16 February 2023
Available online 18 February 2023
0951-8320/© 2023 Published by Elsevier Ltd.

The earliest response to the first question dates back to 2004 when Van et al. [11] officially introduced the concept of the explainable AI (XAI) system. The XAI system is characterized as a system that "presents the designer with an easily understood chain of reasoning from the designer's order, through the system's knowledge and inference, to the resulting behavior". If the AI approach is completely opaque, external procedures known as "post-hoc explainability techniques" [12] must be used to facilitate understanding of the AI process. In recent years, as one of the representatives of post-hoc methods, the Shapley Additive ExPlanations (SHAP) [13] method was developed to assess the effect of each feature in the model by calculating additive feature importance values [14]. Similarly, other techniques such as the Local Interpretable Model-Agnostic Explanations (LIME) [15] and sensitivity analysis (SA) are also commonly used to explain black-box models. For instance, Panati and Wagner [16] compared the performance of SHAP and LIME on the explanation of a CNN classification model, indicating that SHAP has a clearer interpretation than LIME. However, regardless of the type of XAI technique used, the majority of current research focuses on explaining the machine learning model, with very few studies attempting to explain systems such as the AI-based optimization process.

In response to the second question, building information modeling (BIM) has been combined with machine learning techniques in many studies. BIM aims to achieve full automation over the service life of a building as its ultimate goal. However, BIM techniques are not yet sufficiently "intelligent" to enable full automation due to current technical limitations or practical realities. Many researchers integrate machine learning techniques into BIM models to make them capable of observing, identifying, monitoring and detecting tasks [17]. The AZURE ML [18] platform, which is composed of decision trees, logistic regression, and neural networks, was developed to differentiate room functions. To avoid construction delays, Roh et al. [19] integrated as-built project data extracted from site photos into BIM models to monitor the construction process. Similarly, Golparvar-Fard et al. [20] developed a classification model for determining whether construction is in progress based on photos depicting various construction statuses. All of these studies demonstrate the compatibility of the two methodologies by utilizing AI as a tool to enhance the intelligence of BIM. Nevertheless, BIM models and AI techniques can be used in the opposite manner, with BIM serving as a platform for AI modification and visualization. This can be another right answer to the second question, but there is still a large gap in this field.

Uncertainties in the soil are an unavoidable aspect of the optimization of the tunnel construction process. Practically, measured soil parameters can be erroneous due to improper investigation and natural spatial fluctuations. In light of this, it is prudent to develop an approach to make decisions based on a thorough evaluation of all potential risks. With the identification of the distribution types, mean values, and coefficients of variance, it has been proved that the variation of soil properties can be characterized by probability distributions [21]. The description of probability distributions supports researchers in conducting stochastic optimization (SO) [22] to control the damages. However, this approach is still impractical as it is unlikely to obtain sufficient data for determining the probability distribution in real-world projects. As an alternative, the robust optimization (RO) method does not necessitate a comprehensive understanding of the uncertain parameter, and its utility has been proved in some tunneling studies [23,24]. However, the current use of RO largely relies on mathematical models that are impossible to implement in real projects. Recalling the hybrid AI approaches mentioned above, if RO can be used to substitute frequently used algorithms like NSGA-II to solve machine learning models, the decision-making process will become significantly more reliable and practical.

Consequently, the purpose of this study is to develop an explainable and interactive AI-based method to assist decision-making in tunnel location planning that takes into account uncertainties in soil properties, where the designer does not need a comprehensive understanding of the uncertainty. The method is developed using a multi-objective robust optimization (MORO) algorithm implemented in BIM software and explained using XAI techniques. The optimization objective is simulated using pipeline models based on data from Finite Element Method (FEM) simulations, and the SHAP technique is used to explain the optimization outcome. As a crucial component, the influence of uncertainty sets on the optimization result is addressed using various types and sizes. Grasshopper, a Rhino plug-in, serves as the platform for designers to modify the AI process and view the output. The novelty of this study can be summed up in two aspects: the implementation of MORO using machine learning models in tunnel problems, and the enhancement of interaction with and understanding of the optimization process using BIM and SHAP.

The remaining sections are arranged as follows: Section 2 reviews related studies on RO techniques and their applications. Section 3 explains the methodology employed, including data acquisition in FEM analysis, pipeline model training, and the MORO process together with its explanation and its integration in BIM. Section 4 is a case study on the large-diameter tunnel in the Shanghai Beiheng Passageway project. Section 5 discusses the effects of different optimization techniques on the optimization result, including deterministic optimization (DO), RO, and SO. Section 6 finishes with a summary of the study's findings, its contribution, and the work that remains to be completed.

2. Literature review

2.1. Machine learning in tunneling problems

Machine learning techniques have been widely employed in tunneling problems in recent years. Based on the target of the study, related applications include tunnel boring machine (TBM) automation [25], tunnel condition prediction [26], abnormality detection [27], tunnel profile assessment [28], resilience evaluation [29], structural imperfection identification [30], tunnel face stability [31], rock burst prediction [32], intelligent BIM models [33], and others. To give a more systematic review of these works, three main categories of application of machine learning techniques are reviewed, including damage control, TBM operation, and construction management.

As a typical underground construction work, tunnel excavation may lead to severe damages from ground settlement [34], rock burst [32], water inrush [35], and unbalanced pressure [36], among which ground settlement is the most prevalent target to be predicted and controlled, as it ultimately reflects the deformation of the tunnel wall, the settlement of the ground surface and the damage to adjacent buildings at the same time [37]. Traditional machine learning techniques like the artificial neural network (ANN) and support vector machine (SVM) have been widely used for settlement prediction [38]. For instance, Ocak and Seker [39] contrasted ANN, SVM, and Gaussian processes (GP) for forecasting surface settlement produced by earth pressure balance devices, where SVM was found to outperform the others. In recent years, the superiority of ensemble learning methods, such as random forest (RF) and extreme gradient boosting (XGBoost), in settlement prediction was proved by a number of studies [40–42].

Machine learning has also been extensively applied to machine operating data, particularly in TBM tunneling, including the advance rate, cutting force, thrust load, cutter torque, and penetration rate. For example, Benardos and Kaliampakos [43] developed a model for forecasting the TBM advance rate using an ANN to determine the impact of factors on TBM performance. Mahdevari et al. [44] developed a TBM penetration rate forecasting model based on the SVM algorithm, with projected values that closely match measured values. Moreover, deep learning algorithms have been used to forecast the position of tunneling equipment in order to improve construction quality and update as-built data. Gao et al. [45] used three types of recurrent neural networks (RNNs) (conventional RNN, long short-term memory network (LSTM), and gated recurrent unit network (GRU)) to forecast the real-time operating parameters


(torque, velocity, thrust, and chamber pressure) using in-situ operating data. Zhou et al. [46] suggested an integrated deep learning model comprised of the wavelet transform, convolutional neural network (CNN), and LSTM for forecasting the shield machine's attitude and location. Mahmoodzadeh et al. [47] developed a hybrid LSTM model enhanced by grey wolf optimization to forecast the penetration rate.

As for construction management, related studies mainly focus on time planning [48], cost management [49], and energy optimization [50]. For instance, Mahmoodzadeh and Zare [51] developed a probabilistic prediction method of ground conditions and construction time and costs in tunnels based on Monte-Carlo simulation. Mahmoodzadeh et al. [52] conducted a dynamic optimization method to control the construction time and costs considering geological uncertainties. Acaroglu et al. [53] developed a fuzzy logic model to estimate energy consumption to describe TBM performance. All these applications consider the uncontrollable aspect of time, costs, or energy from different sources of uncertainties, serving as valuable guidance on construction management in tunnel excavation works.

2.2. Robust optimization

As previously mentioned, instead of a probabilistic approach, RO measures the uncertainties by defining an uncertainty set containing possible realizations based on only partial knowledge of the uncertain parameter (i.e. the mean and the variation). Therefore, RO aims to seek the optimal solution that is still feasible under the most unfavorable realization in the uncertainty set by deriving tractable formulations of uncertain constraints. The first publication of RO dates all the way back to the 1970s [54]. Improvements have been made based on the fundamental approach to deal with different situations, forming branches including adjustable robust optimization (ARO) [55], distributionally robust optimization (DRO) [56], and RO with GP constraints [57].

The basic RO is applicable when decisions must be made before a comprehensive understanding of the uncertain parameters is available. However, in some cases, the uncertain parameters are not related to decision-making and are capable of self-tuning, so decisions can be postponed until the information on the uncertain parameters is revealed [58]. That is where ARO contributes its value. As for DRO, the designer may have access to certain statistical attributes and be able to ambiguously capture the probabilistic distribution in some cases, where it is a waste of resources to discard the information, yet the estimated probability distribution is still imprecise due to ambiguity. To solve this problem, DRO finds solutions that remain feasible under the worst probability distribution among a collection of ambiguous probability distributions. Therefore, some researchers also refer to DRO as ambiguous stochastic optimization, an ambiguous version of SO [59]. In recent years, DRO has been applied in a variety of fields, including energy planning [60], supply network design [61], and economic dispatch [62], along with some extensions combining DRO and ARO to perform a two-stage distributionally robust optimization [63,64]. As for RO with GP constraints, Wiebe et al. [57] managed to define the uncertainty set from trained GP models and extended it to applications of wrapped GP models. However, due to the nature of the logic in RO, which transforms uncertain functions into solvable deterministic functions, black-box models are not applicable when using the RO method [65]. This considerably limits the applications of RO in catching up with the trend of modern industry, where AI plays a significant role. Therefore, in this study, we innovatively design a pipeline to generate machine learning models with high prediction performance and concave uncertainty parameters that can be tractably reformulated and solved by RO, which is a significant research gap filled by this study.

Specifically focusing on MORO, Pazouki et al. [66] applied a fuzzy MORO method to optimize the strategy for building energy retrofit, which involves uncertainties in economic criteria using mathematical modeling. Lara-Molina and Dumur [67] adopted MORO to enhance the performance of a parallel manipulator system, considering uncertainties in the kinematic positioning of end-effectors. Bhattacharjya and Chakraborty [68] developed an improved feature-importance-based MORO to optimize the behavior and cost of a vibrating platform. Boindala et al. [69] employed MORO to minimize the construction cost and maximize the resilience index of a water distribution network, regarding the nodal demands as the uncertain parameter. However, the application of MORO to enhance safety in tunnel construction is still rare, especially for large-diameter tunnels, which becomes the major novelty of this paper.

3. Methodology

The MORO approach assisted with interactive and explainable AI for enhanced safety in tunnel excavation consists of three phases. As presented in Fig. 1, the three steps include: (a) data acquisition from FEM analysis and pipeline model training; (b) multi-objective robust optimization; and (c) establishment of the interactive and explainable AI system. In the first step, repetitive FEM simulations will be conducted to simulate tunnel-excavation-induced damages under various variable values. The simulated data will then be processed to train pipeline models predicting the tunnel-excavation-induced damages. Next, the models will be set as objectives and constraints of the MORO algorithm, where the Pareto fronts and corresponding optimal solutions can be obtained. Finally, the optimization result will be explained by the SHAP technique to figure out the sensitivity of damage improvement to different features. Additionally, the algorithms will be integrated into BIM software for the designer to interact with and visualize the results. Details of each step will be introduced in the following sections.

3.1. Data acquisition and pipeline model training

In 2019, Hu et al. [70] pointed out that measured historical settlement data is always inadequate in either its scale or dimension to support deep learning algorithms. As an alternative, FEM can be employed for data acquisition, which is a physics-informed machine learning [71] technique where physics-based laws or techniques are used during data acquisition. Besides using commercial software, similar approaches include numerical methods using physical laws, such as partial differential equations (PDEs) [72] and ordinary differential equations (ODEs) [73], and empirical methods where data are generated from experiments [74]. However, using numerical or empirical methods requires that the physical rules or experiments can accurately match the studied circumstance. Considering the case of a large-diameter tunnel being built beneath existing double-track tunnels, it is unlikely to accurately simulate the situation using numerical or empirical methods, which subsequently fails to provide reliable and efficient guidance in the design stage. As the most prevalent method used in civil engineering, FEM has shown its reliability in plenty of projects over the past decades, which is the main reason for using it to solve the lack of data.

Specifically, Ansys is chosen as the medium to generate 500 samples considering various tunnel locations and geological conditions. Compared with other FEM software, Ansys shows the highest compatibility with parametric modeling and design of experiments (DOE), which achieves high efficiency in generating repetitive samples. The DOE file can be imported into the software and the outputs can be automatically computed. In this study, we focus on three important criteria to be improved in tunnel construction, including the excavation face pressure (EFP), existing tunnel displacement (ETD), and ground surface settlement (GSS). Higher EFP leads to more hazards in the excavation process, which causes higher risks and costs, and is extremely unbeneficial in soft and alluvial soils [75]. GSS, as underlined in many studies [76,77], also needs to be controlled as it may cause considerable damage to adjacent structures [78]. In a particular case where excavation occurs underneath existing tunnels, optimization of ETD is as essential as GSS to guarantee the stability of underground structures. Therefore, an improvement in these criteria enhances the safety of


Fig. 1. Flowchart of the interactive and explainable MORO process.
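As a companion to the DOE-driven data acquisition described in Section 3.1, the sampling of FEM scenarios can be sketched as below. The variable names and ranges are hypothetical placeholders (not the study's actual tunnel and soil parameters), and a Latin hypercube design stands in for whatever DOE scheme was used in Ansys.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical design variables; names and ranges are illustrative only.
names = ["tunnel_depth_m", "horizontal_offset_m", "soil_modulus_MPa", "cohesion_kPa"]
lower = np.array([15.0, -10.0, 10.0, 5.0])
upper = np.array([35.0, 10.0, 60.0, 40.0])

# Latin hypercube design with 500 runs, matching the 500 FEM samples.
sampler = qmc.LatinHypercube(d=len(names), seed=0)
doe = qmc.scale(sampler.random(n=500), lower, upper)

print(doe.shape)  # (500, 4); each row is one FEM scenario to simulate
```

Each row of `doe` would be exported as one line of the DOE file that the FEM software consumes; the space-filling property of the Latin hypercube helps cover the variable ranges with a limited simulation budget.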

tunnel construction from the stabilization of soil structures and less damage to adjacent buildings and tunnels. Throughout the simulation process, the most unfavorable conditions observed throughout the excavation will be recorded.

Fig. 2 depicts the user-designer interface of Ansys. Material properties including soil and tunnel properties are stored in 'Engineering Data'. The geometry of the model is built in 'Geometry'. Material assignment, element type, and meshing are set in 'Model'. Boundary conditions, loadings, and time steps are controlled by the 'Setup' section, and the results can be viewed in 'Results'. Section 4.1 provides further information on the model's settings.

The next stage, after collecting sufficient data from the FEM analysis, is to develop machine learning models for each criterion. However, the model cannot be a black box to be reformulated due to the requirements on the objective and constraint functions that they should be concave at least in the uncertain parameters [65]. Therefore, to fulfill these requirements and guarantee the model accuracy, a pipeline model as presented in Fig. 3 is established. The pipeline starts with data processing by min-max normalization, which is further followed by feature construction and feature selection, and ends with elastic net regression with cross-validation (ElasticNetCV) [79] model training. As ElasticNetCV is an extension of linear regression, data normalization is essential to keep the input variables on the same scale to eliminate abnormal influence on the results. Denoting x a sample in the training data, the min-max normalized sample can be computed as:

x̄ = (x − min(x)) / (max(x) − min(x))    (1)

where x̄ represents the normalized data, and min(x) and max(x) stand for the minimum and maximum values of x in the training dataset.

As for feature construction, the method of polynomial features is employed. For every two features {xi, xj}, a new set of features {1, xi, xj, xi·xj, xi², xj²} from a 2-power polynomial will be generated. This expansion of features can largely enhance the model accuracy; however, as the function must be concave in the uncertain parameters, all non-linear

Fig. 2. Settings of FEM analysis in Ansys.
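The min-max normalization of Eq. (1) can be sketched in a few lines; the toy matrix below is illustrative, and in practice scikit-learn's `MinMaxScaler` performs the same computation.

```python
import numpy as np

def fit_minmax(X_train):
    # statistics come from the training set only and are reused at test time
    return X_train.min(axis=0), X_train.max(axis=0)

def transform_minmax(X, lo, hi):
    return (X - lo) / (hi - lo)          # Eq. (1), applied column-wise

X_train = np.array([[1.0, 10.0],
                    [3.0, 30.0],
                    [5.0, 50.0]])
lo, hi = fit_minmax(X_train)
X_bar = transform_minmax(X_train, lo, hi)
print(X_bar)  # every column is rescaled to [0, 1]
```

Keeping `lo` and `hi` from the training split and reusing them on new samples avoids leaking test-set statistics into the model.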


Fig. 3. Illustration of the proposed pipeline model.

terms of the uncertain parameters are excluded. Therefore, feature selection is used to exclude all higher-order terms of uncertain parameters, while allowing combinations of an uncertain parameter and a deterministic parameter.

As the core of the pipeline, ElasticNetCV combines the regularization terms in ridge regression [80] and the least absolute shrinkage and selection operator (Lasso) regression [81] to build the objective function. As a linear regression method, ElasticNetCV guarantees that the trained model can be reformulated by the RO algorithm. Compared with other basic linear regression methods, ElasticNetCV performs variable selection and regularization at the same time and is most appropriate for dealing with high-dimensional problems, which perfectly matches our case. The objective function can be presented as below:

J(θ) = (1/2) Σᵢ₌₁..m (hθ(x⁽ⁱ⁾) − y⁽ⁱ⁾)² + λ (p Σⱼ₌₁..n |θⱼ| + (1 − p) Σⱼ₌₁..n θⱼ²)    (2)

where the first term (1/2) Σᵢ₌₁..m (hθ(x⁽ⁱ⁾) − y⁽ⁱ⁾)² represents nothing but the loss function obtained from the ordinary least squares (OLS) method, θⱼ denotes the j-th model coefficient, the second term is the regularization term controlled by hyperparameters λ and p, and m and n describe the dimension of the input data, where m measures the number of samples and n counts the number of features.

3.2. Multi-objective robust optimization

Depending on the understanding of the uncertain parameters, SO, DRO and RO are three possible solutions. In ideal cases where an accurate probabilistic distribution can be obtained, SO is an applicable technique. As a classic non-deterministic optimization algorithm, SO makes use of the probability distributions to obtain the corresponding distribution of the objective [82] or to construct probability constraints [6]. However, the main shortcoming of SO is that it totally depends on a precise understanding of the uncertainty inherent in the probability distribution, which strongly affects the result if assumptions are made, and such knowledge is not always available in real projects. At the second level, where there is adequate data to inexactly estimate the probabilistic distribution, DRO is useful to find the optimal solution satisfying a collection of ambiguous probability distributions generated to represent the estimated one. Adequate historical data to estimate the distribution is important; otherwise, DRO becomes equivalent to RO when all possible probabilistic distributions are considered without a reference estimation [59]. When there is no information about the probability distributions of the uncertain parameters, RO stands out as it does not rely on understanding the probabilistic distributions. Compared with SO, which is always used for reliability assessment and risk measurement, RO focuses more on conservativeness and risk control, and DRO serves as a moderate solution between them [83]. In tunneling problems, as geological data are obtained from boreholes, which are too discrete and sparse to provide adequate information about the probabilistic distributions, RO is the most suitable solution as no assumptions are needed and the result will be the most reliable.

Consider a generic optimization problem under uncertainty:

min f(x, ξ)
s.t. h(x, ξ) ≤ 0, ∀ξ ∈ U    (3)

where x is the vector of decision variables, ξ is the vector of uncertain parameters, U is the corresponding uncertainty set of ξ, and f(x, ξ) and h(x, ξ) are the objective function and constraint equation under uncertainty, respectively. Without losing generality, problem (3) can be rewritten into an equivalent form:

min F
s.t. max f(x, ξ) ≤ F, ∀ξ ∈ U
h(x, ξ) ≤ 0, ∀ξ ∈ U    (4)

where the original objective function is transformed into an uncertain constraint, in accordance with the theory that robust optimization optimizes the target in worst-case scenarios, and the new objective function F is completely deterministic.

Nevertheless, the new constraint is difficult to solve directly, necessitating the usage of a reformulation approach according to the uncertainty set's specification. In this study, three basic types of uncertainty sets will be considered [84]:

U∞ = {ξ : ‖ξ‖∞ ≤ τ} = {ξ : |ξi| ≤ τ}    (5a)
U₂ = {ξ : ‖ξ‖₂ ≤ Ω} = {ξ : √(Σ ξi²) ≤ Ω}    (5b)
U₁ = {ξ : ‖ξ‖₁ ≤ Γ, |ξ| ≤ e} = {ξ : Σ |ξi| ≤ Γ, |ξ| ≤ e}    (5c)

where Eqs. (5a), (5b), and (5c) represent the box uncertainty set, the ellipsoid uncertainty set, and the polyhedron uncertainty set, respectively. Defining the vector of uncertain parameters as C, the different types of uncertainty sets can be summarized in the form displayed below [65]:

Up = {C̃i = Ci + Ĉi·ξi | ‖ξ‖p ≤ r}    (6)

where Ci is the nominal value of an uncertain parameter C̃i, Ĉi is the maximum deviation, r controls the degree of the uncertainty, which can be referred to as the size of the uncertainty set, and p controls the type of uncertainty. The types of uncertainty set defined in Eqs. (5a), (5b), and (5c) correspond to the cases p = ∞, p = 2, and p = 1. Based on specific settings of the uncertainty set, the uncertain parameters are selected from values varying around their nominal value in a particular range. The distance between the uncertain parameter and its nominal value is Ĉi·ξi, which depends on the maximum deviation of the parameter and the type and size of the uncertainty set.

Reformulation aims to transform the uncertain constraint into equivalent deterministic functions with strong duality. As discussed in Section 3.1, the model is trained to guarantee the linearity of all uncertain parameters. The corresponding deterministic constraint stated in Eq. (4) can be expanded into:

max D + Σi C̃i·ai ≤ F, ∀C̃i ∈ Up    (7)

where D is the sum of all deterministic terms and ai is the coefficient of each uncertain parameter, which can be a function of the deterministic variables. An equivalent deterministic constraint can subsequently be rewritten as:

D + Σi Ci·ai ≤ F − r·λi    (8)

where λi is a dual term depending on the type of uncertainty set
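For a constraint that is linear in the uncertain parameters, the reformulation in Eqs. (6)–(8) reduces to adding a protection term r·‖Ĉ∘a‖q to the nominal constraint, where q is the dual norm of p (box p = ∞ gives q = 1, ellipsoid p = 2 gives q = 2, polyhedron p = 1 gives q = ∞; the auxiliary interval bound in Eq. (5c) is ignored here for brevity). A minimal numerical sketch with made-up coefficients:

```python
import numpy as np

def robust_lhs(D, C_nom, C_hat, a, r, p):
    # worst-case left-hand side of  D + sum_i (C_i + C_hat_i * xi_i) * a_i
    # over the uncertainty set ||xi||_p <= r, via the dual norm q of p
    q = {np.inf: 1, 2: 2, 1: np.inf}[p]
    nominal = D + C_nom @ a
    protection = r * np.linalg.norm(C_hat * a, ord=q)
    return nominal + protection          # robust feasibility requires this <= F

D = 1.0
C_nom = np.array([0.5, -0.2])            # nominal values of uncertain coefficients
C_hat = np.array([0.1, 0.05])            # maximum deviations
a = np.array([2.0, 3.0])                 # coefficients multiplying the uncertain terms

for p in (np.inf, 2, 1):
    print(p, robust_lhs(D, C_nom, C_hat, a, r=0.05, p=p))
```

With the same size r, the box set produces the largest protection term and the polyhedron the smallest, which mirrors the conservativeness ordering discussed for the different uncertainty-set types.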


selected, which is summarized in Table 1. Algorithm 1


Given that robust optimization is solvable after reformulation, the next step is to obtain the Pareto optimal solutions for multiple objectives. In this study, the augmented ε-constraint method developed by Mavrotas [85] is used. Starting from the conventional ε-constraint method, consider a MOO problem defined as follows:

min (f1(x), f2(x), …, fn(x))  s.t. x ∈ S    (9)

where f1(x) to fn(x) are n objective functions and S is the feasible region of the variable x. The basic principle of the ε-constraint method is to optimize one objective function while treating the others as constraints, transferring problem (9) into:

min f1(x)  s.t. f2(x) ≤ e2, f3(x) ≤ e3, …, fn(x) ≤ en, x ∈ S    (10)

where ei is the optimal result obtained from a single-objective optimization (SOO) problem merely targeting fi(x). Clearly, the conventional ε-constraint method aims to transfer a MOO problem into an SOO problem, guaranteeing the definition of the Pareto front, where one objective cannot be improved without sacrificing the others [6]. However, the standard ε-constraint technique is likely to yield dominated solutions near the boundary. To solve this problem, the augmented ε-constraint method further revises problem (10) as:

min (f1(x) − eps × (s2 + s3 + … + sn))
s.t. f2(x) + s2 ≤ e2, f3(x) + s3 ≤ e3, …, fn(x) + sn ≤ en, x ∈ S    (11)

where si and eps are parameters to screen out the dominated solutions, and the value of eps should be adequately small (between 10−3 and 10−6). The rationale of (11) will not be proved in this study to avoid lengthiness, but the pseudocode of the augmented ε-constraint method to solve a 2-objective optimization problem is presented in Algorithm 1.

Algorithm 1. ε-constraint method for a minimizing MOO problem with 2 objectives.
  Input: Original objectives (f1(x), f2(x)) and constraint functions
  Minimize f1(x) (SOO with f2(x) deactivated)
  f2,max = f2(x*) | x* : min f1(x)
  Minimize f2(x) (SOO with f1(x) deactivated)
  f2,min = min f2(x)
  Define n (the number of steps)
  Initialize eps = 10−4, s ∈ R+, F1 = F2 = ∅
  For i = 1 to n:
    ε = f2,min + i × (f2,max − f2,min)/n
    x* = x | x : min (f1(x) − eps × s), s.t. f2(x) + s ≤ ε
    F1 = F1 ∪ {f1(x*)}
    F2 = F2 ∪ {f2(x*)}
  Output: Pareto optimal solutions minimizing f1(x) and f2(x).

3.3. Interactive and explainable AI system

The interactive and explainable AI system consists of two segments: the explainability from SHAP and the interactivity from BIM. Between the two segments, the explainable AI segment is directly related to the RO algorithm, as it is used to explain the importance of each feature to the improvement of the optimized objectives. As for the interactive segment, the BIM software serves as a platform embedded with the prediction, RO, and SHAP algorithms, enabling the designer to directly adjust the algorithms and view the results from the UI.

To achieve interaction, we employed Grasshopper, a Rhino plug-in, as a platform for integrating prediction, optimization, explanation, and automatic modeling. As shown in Fig. 4, the core features of Grasshopper consist of five functional clusters of visual programming batteries. The first component, "Inputs & Settings", comprises all input features and setting-up parameters for the prediction and optimization algorithms, which are then imported into those algorithms. The fourth component, "3D Modelling", collects all geometric inputs and the optimization result to automatically build a self-adjusting 3D model. The last section, "UI Design", integrates all previous sections into a designer interface, allowing the designer to modify inputs and algorithm settings and to view the prediction, optimization, and explanation results without any coding operations. A presentation of this UI is plotted in Fig. 13 in Section 4.2.

As indicated in Section 3.1, the pipeline model is comprised of a linear regression technique that yields a polynomial function. Therefore, no explanation of the prediction model is necessary, as the model is completely transparent. However, the optimization algorithm is still a black box, where designers always lack an understanding of the process. If the contribution of each feature to the potential of improvement after optimization can be figured out, decisions can be made more effectively by concentrating on the most crucial features. In this study, the SHAP method is applied due to its power in analyzing various types of models. When dealing with a complex model, SHAP uses the additive feature attribution method [86] to simplify the model into a linear form, as presented in Eq. (12) and Fig. 5.

g(x′) = φ0 + ∑i=1..K φi·x′i    (12)

where K is the total number of features, x′i is the simplified form of the i-th feature, which can be transferred back to its original value through a transformation function x = hx(x′), φi is the Shapley value [87] of the corresponding feature, which describes the importance of that feature, and φ0, computed from the mean of the estimated values, is the baseline of the function.

In the simplification process, the transformation function transfers each feature into a value x′ ∈ {0, 1}K. If x′i = 0, the corresponding feature has no impact on the output, implying that φi = 0; under this circumstance, a random feature value will be selected from the dataset. If x′i = 1, the value will be directly extracted from the sample. In more general cases, the SHAP value can be computed as follows:

φi = ∑ S⊆N∖{i} [|S|!(K − 1 − |S|)!/K!]·[fx(S ∪ {i}) − fx(S)]    (13)

where S is a set containing all non-zero entries, N is the set of all features, and fx(S) stands for the outcome of the original model, with fx(S) = E[f(x)|xS].
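Eq. (13) can be reproduced on a toy model by enumerating all feature subsets. The brute-force sketch below is illustrative only (the SHAP library uses far more efficient estimators), and it crudely replaces absent features with fixed baseline values as a stand-in for the conditional expectation E[f(x)|xS]:

```python
from itertools import combinations
from math import factorial

def exact_shapley(f, x, baseline):
    """Brute-force Shapley values (Eq. (13)) for a model f over K features.
    Features absent from subset S are fixed at `baseline` values, a crude
    stand-in for the conditional expectation E[f(x)|x_S]."""
    K = len(x)
    phi = [0.0] * K
    for i in range(K):
        others = [j for j in range(K) if j != i]
        for size in range(K):
            for S in combinations(others, size):
                # Weight |S|!(K - |S| - 1)!/K! from Eq. (13)
                w = factorial(len(S)) * factorial(K - len(S) - 1) / factorial(K)
                with_i = [x[j] if (j in S or j == i) else baseline[j] for j in range(K)]
                without_i = [x[j] if j in S else baseline[j] for j in range(K)]
                phi[i] += w * (f(with_i) - f(without_i))
    return phi
```

For a linear model f(x) = ∑ wi·xi, the routine recovers φi = wi·(xi − baselinei), and the values sum to f(x) − f(baseline), which is the additivity property that Eq. (12) relies on.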
Table 1
Reformulation for the box, ellipsoid, and polyhedron uncertainty sets.

Type of uncertainty set   p   λi
Box                       ∞   ∑ Ĉi·|ai|
Ellipsoid                 2   √(∑ Ĉi²·ai²)
Polyhedron                1   max Ĉi·|ai|

4. Case study

4.1. Case background

The case study is based on the Shanghai Beiheng Passageway project [88], in which a large-diameter tunnel is proposed to pass beneath a


Fig. 4. Development of the interactive and explainable AI system in Grasshopper.

Fig. 5. Working scenario of the SHAP algorithm.

double-track tunnel in Metro Line 11. As shown in Fig. 6, the angle between the existing tunnel and the large-diameter tunnel is 68°. The depth of the tunnels, measured from the tunnel center, as well as their sizes, is summarized in Table 2. The large-diameter tunnel uses C60 grade concrete with a 36.5 GPa elastic modulus in its lining. The concrete used in the grouting material reaches an elastic modulus of 14.75 MPa after 28 days. In the case of the existing tunnel, the lining is made of C55 concrete with a 35.5 GPa elastic modulus, and the grouting material is omitted due to a lack of information.

At the intersection of the tunnels, the underlying conditions are extremely complex, with 10 soil strata stretching 99 m deep. In the FEM simulation, the last five layers of soil are averaged based on their thickness to simplify the model and allow the new tunnel to move within one layer only. The properties of the simplified soil model are summarized in Table 3, with the large-diameter tunnel located in the sixth layer and the existing tunnels in the fifth layer. Detailed simulation steps in FEM are introduced below.

(1) A 180 m × 100 m × 99 m block consisting of 6 soil layers is initialized in Ansys. Considering the boundary effects, the existing tunnels are designed to rotate instead of the large-diameter tunnel, and the length of the soil block is determined by 6D extending from the center of the large-diameter tunnel to both sides. However, there is no direct way to determine the width, as the existing tunnel is rotating. Therefore, we ran several trials and found that a 100 m width is adequate for a reliable result. As for the boundary conditions,


Fig. 6. (a) Map of Beiheng Passageway and Metro line 11, and (b) Layout of Shanghai Beiheng Passageway project (Reproduced from [88]).

Table 2
Physical properties of tunnels.

Tunnel                  Feature               Value   Units
Large-diameter tunnel   Outer diameter        15.56   m
                        Inner diameter        13.7    m
                        Lining thickness      0.65    m
                        Tunnel ring width     2       m
                        Tunnel depth          39      m
Existing tunnels        Outer diameter        6.2     m
                        Inner diameter        5.5     m
                        Lining thickness      0.35    m
                        Clear track spacing   7.2     m
                        Tunnel depth          21      m

settings proposed by Alagha and Chapman [89] are used, where the boundaries (top surface excluded) are all restricted in the direction normal to the faces for any translation. Soil parameters are set strictly following Table 3.

(2) The lining and grouting of the tunnels and the shield of the TBM machine are modeled as shell elements. To initialize the soil, all lining, grouting, and shield elements are deactivated in the first time step, followed by removing the soil elements at the location of the existing tunnels and activating the lining elements of the existing tunnel. The soil consolidates again to reach its steady state before the large-diameter tunnel is excavated. The displacement after this step will be used as the initial case in the case combination procedure later, which will be subtracted from the following load steps simulating the large-diameter tunnel excavation.

(3) The excavation is simulated every 2 lining rings (4 m), occurring from step 3 to step 28. In step 3, the first soil segment to be excavated is deactivated and the first shield shell segment is activated. The shield shell segment is then deactivated, and the first lining and grouting elements of the large-diameter tunnel are activated. Simultaneously, to simulate the advancement of the TBM machine, the next soil and shield shell elements are deactivated and activated accordingly. This excavation sequence is repeated to the end of the excavation.

(4) The EFP, ETD, and GSS in each step are recorded for further data extraction. As for EFP, the maximum pressure can be directly used as the EFP of the current sample. As for ETD and GSS, the initial


Table 3
Geological properties of different soil layers.

Soil layer            Thickness (m)   Unit weight (kN/m³)   Young's modulus (MPa)   Poisson's ratio   Cohesion (kPa)   Friction angle (°)
1 - Landfill          2.0             18.3                  15.0                    0.40              15.0             18.0
2 - Clay              1.0             18.2                  24.2                    0.33              16.0             14.5
3 - Muddy clay        6.0             17.2                  15.6                    0.38              11.0             14.0
4 - Muddy clay        8.4             16.8                  10.5                    0.38              10.0             10.8
5 - Silty clay        10.0            17.8                  14.2                    0.33              15.0             15.4
6 - Sand (averaged)   71.6            18.9                  17.5                    0.35              5.5              28.6

settlements are subtracted from the result in each excavation step by case combinations. The maximum EFP, ETD, and GSS in the basic case are plotted in Fig. 7.

The maximum EFP, ETD, and GSS under the introduced setup over the excavation process are plotted in Fig. 7. The largest ETD and GSS both occur in the 19th time step (the 17th excavation step), with maximum values of 21.805 mm and 28.304 mm, respectively. The most unfavorable EFP happens in the 8th time step (the 6th excavation step), with a maximum value of 94.161 kPa. To examine the reliability of the FEM simulation result, the simulated maximum deformation of the double-track tunnel is compared with the measured data. As the purpose of the FEM simulation is to guide the design, it is favorable to have more conservative results, as in Fig. 8, which are more reliable for safety concerns. As for the validation of the simulation results, the deformation in the FEM analysis follows the same trend as the measured data. For instance, the maximum deformation always occurs in the middle of the tunnels, and there is a slight upward deformation before the severe downward deformation. Additionally, the maximum deformation in tunnel 1 is larger than that in tunnel 2 in both the simulation and the measured data. Most importantly, the deformation at both ends of the tunnels is very close to 0, which strongly proves that the boundary effects on the existing tunnels are well eliminated.

4.2. Model development

As described in previous sections, the system is complicated, and a large number of variables can be selected as inputs for the machine learning model. Considering the aim of this study, only 16 of them are selected as input features, which can be classified into geometric parameters and geological parameters. As this study aims to optimize the layout of the newly built tunnel, the tunnel depth (D) and the relative angle (θ) between tunnels are two significant variables to be selected, while the size and material of the tunnels are assumed constant. As for the soil parameters, as the settlement is highly dependent on the Young's modulus (E) and the Poisson's ratio (v), these two parameters of each soil layer are taken into account. On the other hand, as the EFP primarily resists the passive lateral soil pressure, the friction angle (φ) and the cohesion strength (c) of the corresponding layer will also be significantly important. Therefore, as summarized in Table 4, there are 16 variables in total, including 2 geometric variables and 14 geological variables.

Considering the efficiency of data acquisition and the accuracy of the prediction result, 500 samples are generated following the optimal

Fig. 7. FEM analysis results of (a) maximum EFP over the excavation process, (b) maximum ETD over the excavation process, and (c) maximum GSS over the
excavation process.


Fig. 8. Comparison of ETD between FEM simulation and measured data.

Table 4
Selected variables for machine learning model training.

Feature                            Variable   Minimum   Mean   Maximum   Units
Tunnel depth (D)                   x1         37        63     89        m
Relative angle (θ)                 x2         0         45     90        °
Young's modulus (1st layer) (E1)   x3         5         15     25        MPa
Poisson's ratio (1st layer) (v1)   x4         0.2       0.35   0.5       –
Young's modulus (2nd layer) (E2)   x5         9.3       24.2   39.1      MPa
Poisson's ratio (2nd layer) (v2)   x6         0.2       0.35   0.5       –
Young's modulus (3rd layer) (E3)   x7         5.9       15.6   25.3      MPa
Poisson's ratio (3rd layer) (v3)   x8         0.2       0.35   0.5       –
Young's modulus (4th layer) (E4)   x9         4         10.5   17        MPa
Poisson's ratio (4th layer) (v4)   x10        0.2       0.35   0.5       –
Young's modulus (5th layer) (E5)   x11        5.4       14.2   23        MPa
Poisson's ratio (5th layer) (v5)   x12        0.2       0.35   0.5       –
Young's modulus (6th layer) (E6)   x13        6.7       17.5   28.3      MPa
Poisson's ratio (6th layer) (v6)   x14        0.2       0.35   0.5       –
Friction angle (φ)                 x15        10.9      28.6   46.3      °
Cohesion strength (c)              x16        2         5.5    9         kPa

spacing filling (OSF) technique in the DOE. In a high-dimensional system with 16 variables, OSF produces samples in the most efficient way, such that all samples are independent and uniformly allocated in the design space. The Pearson correlation matrix of the independent variables is plotted in Fig. 9. It can be seen that the highest correlation is only 0.26, which indicates that there is no high correlation between any two variables and hence proves the independence of the variables. As for the dependent variables, a violin plot of the simulated EFP, ETD, and GSS is presented in Fig. 10, where the three outputs have mean values of 125.3 kPa, 19.2 mm, and 24.3 mm, respectively. To examine the validity of 500 samples, a cross-validation test under different sample sizes is conducted. Ten levels of sample size, ranging from 10% to 100% of the 500 samples, are tested, and the cross-validation is set up with 5 folds (20% as the validation set). The mean squared error (MSE) is set as the scoring criterion. As reflected in Fig. 11, both the training score and the validation score reach stability using 80% of the 500 samples, which implies that 500 samples are adequate for all models to converge and avoid overfitting.

80% (400) of the samples are set as the training dataset in the model training process, while the remaining 20% serve as the testing dataset. To evaluate the performance of the model, we use the root mean squared error (RMSE), the variance accounted for (VAF), the a20-index, and the R2 to obtain a comprehensive understanding of model performance [90–92]. The computation of the four criteria is shown below:

RMSE = √[(1/n) ∑i=1..n (ŷi − yi)²]    (14)

VAF = [1 − var(y − ŷ)/var(y)] × 100    (15)

a20-index = m20/M    (16)

R² = 1 − ∑i=1..n (ŷi − yi)² / ∑i=1..n (ȳ − yi)²    (17)

where n is the number of samples, ŷ and y are the predicted and experimental values, ȳ is the mean of all samples, m20 is the number of samples whose predicted values do not exceed a deviation of ±20% compared with the experimental values, and M is the total number of samples. From Table 5 and Fig. 12, it can be observed that even though the core of the pipeline is a simple regression, the performance of the model is quite satisfactory, reaching an R² of 0.97, 0.95, and 0.95 on the testing data, which is adequately accurate for optimization use.

Fig. 13 presents the UI of the interactive and explainable AI system in Grasshopper and the corresponding 3D model generated. The UI contains two parts, including the "Settings" section and the "Results" section. In the "Settings" section, designers are allowed to set up the initialization of all input features and adjust the parameters of the optimization algorithm, including the type and size of the uncertainty set. The "Results" section is further subdivided into three elements, including the prediction results of EFP, ETD, and GSS, the MORO result and the corresponding optimal layout, and the SHAP analysis result. Detailed analyses of the MORO and SHAP results will be discussed in the next section.


Fig. 9. Pearson correlation matrix of independent variables.
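The independence screening in Fig. 9 amounts to inspecting the off-diagonal entries of a plain Pearson correlation matrix; a minimal sketch (hypothetical helper name):

```python
import numpy as np

def max_offdiag_correlation(samples):
    """Largest absolute off-diagonal Pearson correlation among the
    variables (columns of `samples`), i.e. the single number summarizing
    a matrix like Fig. 9 (0.26 in the paper's dataset)."""
    corr = np.corrcoef(samples, rowvar=False)
    off = corr - np.eye(corr.shape[0])
    return float(np.max(np.abs(off)))
```

A value close to 0 supports treating the sampled variables as independent, which is the argument made for the OSF design above.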

Fig. 10. Violin plot of dependent variables.

4.3. Analysis of results

In the previous section, we simulated three criteria, including the EFP, ETD, and GSS. As both ETD and GSS assess settlement and are highly relevant, only ETD is selected as an optimization target, together with the EFP. Nevertheless, GSS still serves as a constraint. Specifically, the upper limits of EFP, ETD, and GSS are set as 200 kPa, 30 mm, and 40 mm, respectively, and no uplift is allowable on the ground surface. The new tunnel is allowed to be constructed at any depth in the last soil layer, and its angle with the existing tunnel varies from 0° to 90°. Among all 14 geological parameters, only the last 4 (x13 to x16), describing the 6th layer, are considered uncertain. Such uncertainties include the natural anomalies of the soil as well as anomalies from human influence, like improper operations. As both sources are hard to describe by exact models or functions, it is reasonable to simulate them using uncertainty sets. Mathematically, the problem can be defined as follows:

min (EFP(x, u), ETD(x, u))
s.t. EFP(x, u) ≤ 200, ∀u ∈ U
     0 ≤ ETD(x, u) ≤ 30, ∀u ∈ U
     0 ≤ GSS(x) ≤ 40
     37 ≤ x1 ≤ 89
     0 ≤ x2 ≤ 90    (18)

where x1 and x2 denote the depth and angle of the new tunnel, respectively, and u stands for the uncertain soil parameters. The robustness of GSS is not considered as a constraint.

As for the uncertainty sets, recalling the definition in Eq. (6), we assume that the deviation equals the nominal value of the parameter (Ĉi = Ci). The three types of uncertainty sets can be defined following the constraints below:

|(ui − xi)/xi| ≤ r    (19a)

√[∑ ((ui − xi)/xi)²] ≤ r    (19b)

∑ |(ui − xi)/xi| ≤ r    (19c)

where Eqs. (19a), (19b), and (19c) represent the box uncertainty set, the ellipsoid uncertainty set, and the polyhedron uncertainty set, respectively, and r is the uncertainty set size. The baseline of our study is conducted under an ellipsoid uncertainty set with r = 0.5.

The Pareto front provides a set of non-dominated solutions, from which the designer needs to further select the most desirable solution according to practical needs. The selection process can be totally manually controlled, but in this study, to set up a uniform standard, we adopt the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method to evaluate the improvement and make proper selections. For a Pareto front containing n candidates and m objectives, each candidate is evaluated with a score defined as follows:

Si = D−i / (D+i + D−i)    (20)
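The TOPSIS score of Eq. (20), together with the distance and normalization definitions that follow in Eqs. (21)–(22), can be combined into a short routine. The sketch below is illustrative only and assumes the candidate values are oriented so that larger normalized values are ideal:

```python
import numpy as np

def topsis_scores(X):
    """TOPSIS scores per Eqs. (20)-(22) for a matrix X of shape
    (n candidates, m objectives), treating the column-wise maximum of the
    normalized values as the ideal point."""
    X = np.asarray(X, float)
    Z = X / np.sqrt((X ** 2).sum(axis=0))              # Eq. (22) normalization
    z_max, z_min = Z.max(axis=0), Z.min(axis=0)        # ideal / anti-ideal points
    d_plus = np.sqrt(((z_max - Z) ** 2).sum(axis=1))   # Eq. (21a)
    d_minus = np.sqrt(((z_min - Z) ** 2).sum(axis=1))  # Eq. (21b)
    return d_minus / (d_plus + d_minus)                # Eq. (20)
```

The candidate with the highest score is the one closest to the ideal point and farthest from the anti-ideal point, which is how a single solution is picked from the Pareto front in this study.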


Fig. 11. Cross-validation test using different sample sizes training model of (a) EFP, (b) ETD, and (c) GSS.
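A learning-curve check like the one in Fig. 11 can be sketched with a plain k-fold loop. The snippet below uses an ordinary least-squares fit as a stand-in for the paper's ElasticNetCV pipeline and is illustrative only:

```python
import numpy as np

def cv_mse_curve(X, y, fractions=(0.2, 0.4, 0.6, 0.8, 1.0), k=5, seed=0):
    """k-fold cross-validated MSE of a least-squares fit, evaluated at
    increasing sample fractions (cf. the sample-size test of Fig. 11)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    scores = []
    for frac in fractions:
        sub = idx[: max(k, int(frac * len(y)))]   # subset of the dataset
        folds = np.array_split(sub, k)
        mses = []
        for f in range(k):
            test = folds[f]
            train = np.concatenate([folds[j] for j in range(k) if j != f])
            w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
            mses.append(np.mean((X[test] @ w - y[test]) ** 2))
        scores.append(float(np.mean(mses)))
    return scores
```

A curve that flattens before the full sample size is reached indicates, as argued above, that the dataset is large enough for the model to converge.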

Table 5
Training results of EFP, ETD, and GSS.

         Training data                             Testing data
Output   RMSE   VAF      a20-index   R²     RMSE   VAF      a20-index   R²
EFP      4.51   99.00%   1.00        0.99   5.39   97.38%   1.00        0.97
ETD      0.91   96.76%   0.97        0.97   1.10   94.52%   0.94        0.95
GSS      0.84   95.83%   0.98        0.96   1.19   94.51%   0.97        0.95
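The scores reported in Table 5 follow Eqs. (14)–(17), which translate directly into code; an illustrative helper (not the authors' implementation):

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """RMSE, VAF (%), a20-index, and R^2 per Eqs. (14)-(17)."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    err = y_pred - y_true
    rmse = np.sqrt(np.mean(err ** 2))
    vaf = (1.0 - np.var(y_true - y_pred) / np.var(y_true)) * 100.0
    # a20: share of predictions within +/-20% of the experimental value
    a20 = float(np.mean(np.abs(err / y_true) <= 0.20))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return {"RMSE": rmse, "VAF": vaf, "a20": a20, "R2": r2}
```

Using several complementary criteria guards against a model that scores well on one measure (e.g., R²) while producing individually unacceptable predictions (caught by the a20-index).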

Fig. 12. Linear regression between the predicted and original value of (a) EFP, (b) ETD, and (c) GSS with a 95% prediction interval.

where D+i represents the geometric distance between the value of each candidate in the Pareto front and the maximum solution among all candidates considering all objectives, and D−i is the geometric distance between the value of each candidate in the Pareto front and the minimum solution among all candidates considering all objectives. D+i and D−i are computed as:

D+i = √[∑j=1..m (Z+j − zij)²]    (21a)

D−i = √[∑j=1..m (Z−j − zij)²]    (21b)

where Z+j is the maximum solution considering the j-th objective, Z−j is the minimum solution considering the j-th objective, and zij is normalized from the i-th candidate in the j-th objective, defined as:

zij = xij / √(∑i=1..n xij²)    (22)

where xij is the solution of the i-th candidate considering the j-th objective.

The optimization solver is Gurobi 9.5.0. The coding environment is Python 3.8.8. All algorithms are implemented on a Windows 11 system with an AMD 3600X CPU (6 cores and 12 processing threads) and 16 GB RAM. The Pareto fronts for various types and sizes of the uncertainty set are presented in Fig. 14. Fig. 15 illustrates the evolution of the optimal solution in relation to the size of


Fig. 13. (a) UI of the interactive and explainable AI system (b) 3D view of the optimized model.

Fig. 14. Pareto front of EFP v.s. ETD under (a) different types of uncertainty set with r = 0.5 and (b) Ellipsoidal uncertainty set as r ranges from 0 to 1.

Fig. 15. Optimization results under different sizes of Ellipsoidal uncertainty set optimizing (a) EFP and (b) ETD.


the uncertainty set. Fig. 16 illustrates the final designer interface after optimization with several types of uncertainty sets. In Fig. 17, we use a SHAP analysis to determine the sensitivity of the improvement to each feature, and hence which feature is more worthy of adjustment. The following conclusions are drawn from the results:

(1) The box uncertainty set is the most conservative of the three types of uncertainty sets, followed by the ellipsoidal and polyhedral uncertainty sets. As seen in Fig. 14(a), the Pareto front generated by a box uncertainty set is more in the rear, whereas the polyhedral uncertainty set generates a "real" front. This does not indicate, however, that the polyhedral uncertainty set is preferable because it gives a more optimal solution. In accordance with the definitions in Eq. (19), the restriction on the uncertain parameters becomes more stringent as the uncertainty set becomes polyhedral, which can be proved by a simple mathematical derivation. Nonetheless, it should be noted that a tighter constraint implies higher confidence in the uncertain parameters, which is not always the case in practice. Additionally, since a box uncertainty set only limits the maximum deviation, no interactions between the uncertain parameters will occur. When the designer is uncertain about all parameters, a box uncertainty set is more tractable and apparent to utilize. As a result, it is critical that the designer selects the appropriate form of uncertainty set for the given circumstance.

(2) Regardless of the kind of uncertainty set, the optimization process becomes more conservative as the size of the uncertainty set rises, while various forms of uncertainty sets respond to size changes at different rates. As illustrated in Fig. 14(b), the Pareto front steadily retreats as the size of the uncertainty set rises. Referring back to Fig. 15, the optimal solution from a box uncertainty set responds fastest to the size of the uncertainty set, followed by the ellipsoidal and polyhedral uncertainty sets. In light of our findings in (1), it can be concluded that an uncertainty set with looser constraints is more sensitive to changes in the size of the uncertainty set. Additionally, once the size approaches a sufficiently small level (about 0.01), the distinction between different forms of uncertainty sets becomes imperceptible. In other words, as long as the designer is sufficiently confident in the parameters, the choice of uncertainty set is not critical. However, if the uncertainty level is set too low, the optimization process will tend to be deterministic.

(3) The improvement potential of EFP is more reliant on the tunnel depth, whereas that of ETD is more dependent on the tunnel angle. In terms of EFP, it is evident from Fig. 17(a) that both depth and angle contribute significantly to the improvement percentage, between which the depth contributes more. This indicates that it is more efficient to change the depth for better EFP performance. Additionally, as a high feature value of x1 corresponds to a greater SHAP value of the improvement, a deeper buried tunnel can be more easily altered to improve the EFP, demonstrating also that a deeper excavation results in a greater EFP. However, as illustrated in Fig. 17(b), the picture is quite different for ETD: the improvement potential is more sensitive to the tunnel's angle, and as the angle increases, achieving a higher performance becomes more difficult. When it is unlikely that the best solution can be followed directly due to practical constraints, the SHAP analysis gives an extremely useful reference for the designer to select the most effective feature to alter. Meanwhile, it helps avoid wasting effort on adjusting inefficient features.

5. Discussions

In this section, the performances of RO, DO, and SO are compared. Among these algorithms, DO considers no uncertainty, and RO is implemented under a moderate setting using an ellipsoidal uncertainty set with r = 0.5 and 0.05. As for SO, we employ an approach that replaces the original constraint with a probabilistic constraint, which transfers a DO problem into:

min f(x)
s.t. p(x) ≥ λ    (23)

where p(x) is the probability that the original constraint is satisfied, estimated by counting a large number of samples generated from a Monte-Carlo simulation, and λ is the threshold of the probabilistic constraint. We assume that all uncertain parameters follow normal distributions with the

Fig. 16. Optimal solution plotted in the UI using (a) ellipsoid uncertainty set, (b) polyhedron uncertainty set, and (c) box uncertainty set.


Fig. 17. SHAP analysis result of the improvement percentage on target (a) EFP (b) ETD.

mean equal to the sample value and a coefficient of variance (COV) of 0.1. The threshold λ is set as 0.95, which means that a solution passes the constraint only when 95% of its Monte-Carlo samples satisfy the design limits. The number of Monte-Carlo samples is set as 1000.

Under the settings mentioned above, the corresponding Pareto fronts obtained from RO, DO, and SO are plotted in Fig. 18. The optimal solutions in the UI are presented in Fig. 19. Fig. 20 summarizes the improvement percentages of EFP and ETD from 100 testing samples, whose averages are computed in Table 6. From these results, the following conclusions can be drawn.

(1) The Pareto front and the corresponding optimal solution created by RO are notably different from those generated by DO and SO. Recalling the conclusion of the previous section, the conservatism of optimum solutions produced from various types and sizes of uncertainty sets distinguishes them from one another. As indicated in Fig. 19, the ideal location from DO and SO is significantly different from the optimal solution from RO, but the difference between DO and SO is not clear. Additionally, as illustrated in Fig. 18, the Pareto fronts generated by DO and SO follow the same path, whereas RO forms an entirely distinct line. Consequently, it can be inferred that the probability restrictions in SO determine whether a solution can be chosen as a Pareto optimal solution, with no effect on the outcome of eligible solutions compared with DO. However, as RO develops solutions that satisfy the most unfavorable condition, designers are forced to make decisions in a worse scenario, resulting in completely different decisions. This also explains why the distributions of improvement percentages using DO and SO in Fig. 20 are very similar to each other but largely differ from those using RO.

(2) RO yields more conservative solutions than DO and SO, but there is no discernible difference in the conservativeness of the best solutions generated by DO and SO. As illustrated in Fig. 18, regardless of the magnitude of the uncertainty set, the Pareto front of RO is more rearward. Statistically, the average improvement of testing samples using RO is 13.60% on EFP and −1.40% on ETD as r = 0.5, and 23.80% on EFP and 4.90% on ETD as r = 0.05. The numbers increase to 21.83% and 7.69% for SO, and shift to 24.11% and 6.93% for DO. The distinction between DO and SO can be explained by the way optimal solutions are chosen, where DO optimizes more in EFP, but SO optimizes more in ETD. The rationale is that the ETD constraint is more stringent than the EFP requirement, and hence solutions with a higher ETD are more likely to be filtered out by the probabilistic constraint. In terms of using RO, even though the outcome is always more conservative, the mean improvement percentage starts to approach DO when the size of the uncertainty set is adjusted to 0.05, as illustrated in Table 6. Additionally, there is a tendency for the EFP improvement to be greater than the ETD improvement, independent of the optimization strategy and parameters employed.
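The SO feasibility test described above (normal perturbations with a COV of 0.1, λ = 0.95, and 1000 Monte-Carlo samples) can be sketched as follows for a generic constraint g(x, u) ≤ 0; the helper is illustrative, not the study's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def passes_chance_constraint(g, x, u_mean, cov=0.1, threshold=0.95, n_samples=1000):
    """Monte-Carlo check of the probabilistic constraint in Eq. (23):
    x is accepted if at least `threshold` of the sampled uncertain
    parameters u ~ N(u_mean, (cov*u_mean)^2) satisfy g(x, u) <= 0."""
    u_mean = np.asarray(u_mean, float)
    u = rng.normal(u_mean, cov * np.abs(u_mean), size=(n_samples, u_mean.size))
    satisfied = np.mean([g(x, ui) <= 0.0 for ui in u])
    return satisfied >= threshold
```

Unlike the RO counterpart, which enforces the worst case over the whole set, this check only filters out designs whose violation probability exceeds 1 − λ, which is why the SO Pareto front stays close to the deterministic one.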

Based on the findings summarized above, the MORO method proposed in this paper serves as guidance for large-diameter tunnel construction in the design stage. In this method, it is essential to build the uncertainty set to simulate the uncertainties of the soil properties. With adequate data obtained, the nominal value Ci can be set as the mean of the soil samples, and the maximum deviation Ĉi can naturally be taken as the variation of these samples. However, considering cases where only one soil sample is obtained at the target spot, it is suggested to take the investigated value as Ci and set Ĉi = Ci. As for the type and size of the uncertainty set, the designer is highly recommended to examine the optimization results generated using different settings and make
Fig. 18. Pareto front of EFP v.s. ETD using RO, DO and SO.


Fig. 19. The optimal solution plotted in the UI using (a) RO with r = 0.5, (b) RO with r = 0.05, (c) DO, and (d) SO.

Fig. 20. Statistical analysis of the improvement percentage of 100 testing samples using RO, DO and SO optimizing (a) EFP and (b) ETD.

decisions according to the real condition. However, starting with an ellipsoidal set with r = 0.05 can save the time of tedious trials while giving a relatively moderate result.

Table 6
Average improvement percentage of EFP and ETD optimized by RO, DO, and SO.

Method          Average improvement   Average improvement
                percentage of EFP     percentage of ETD
RO (r = 0.5)    13.60%                −1.40%
RO (r = 0.05)   23.80%                4.90%
DO              24.11%                6.93%
SO              21.83%                7.69%

6. Conclusions

6.1. Summary of findings

This study proposes a safety enhancement approach for tunnel construction, particularly for large-diameter tunnels, by utilizing a MORO method assisted by an interactive and explainable AI system. 500 samples are simulated in the FEM analysis, considering 16 variables and 3 outputs, followed by the training of an ElasticNetCV-cored pipeline model. EFP and ETD are handled as objectives in the MORO process, and GSS is added to the list of constraints. The TOPSIS technique is used to build up an equal standard for selecting the optimal solution from the Pareto front. The contribution of each feature to the improvement potential is analyzed using the SHAP technique. The AI algorithms are assembled into Grasshopper, which serves as an interface and visualization platform and integrates prediction, optimization, explanation, and automatic modeling. The key findings from this study can be summarized below:

(1) In terms of the MORO method's conservatism, a box uncertainty set generates the most conservative solution, followed by the ellipsoidal uncertainty set and the polyhedral uncertainty set. As for the uncertainty set size, a larger uncertainty set makes the result more conservative. After the uncertainty set size reaches

16
P. Lin et al. Reliability Engineering and System Safety 234 (2023) 109172

0.01 or less, there is no significant difference between the solu­ project.


tions generated from different types of uncertainty sets.
(2) According to the SHAP analysis results, alterations to the tunnel depth have a greater impact on EFP, whereas the angle has a greater impact on ETD. In this scenario, it is more effective to change the tunnel depth to reduce EFP and the tunnel angle to reduce ETD.

(3) RO gives the most conservative solutions among RO, DO, and SO. However, if the uncertainty set size is set adequately small, the degree of improvement from RO will be close to that of DO and SO. When the uncertainty set size is set as 0.05, the average improvement of EFP and ETD is 23.8% and 4.9%, respectively. These numbers are quite close to those of DO (24.11% and 6.93%) and SO (21.83% and 7.69%).

(4) Compared with DO and SO, RO stands out due to its reasonable conservativeness, which perfectly matches the worst-case-scenario design logic in engineering problems. Designing based on DO or SO cannot eliminate the risk from uncertainties, but the designer takes no risk using MORO, as it is guaranteed that even the worst possible outcome from the uncertainties still fulfills the allowable limits. Considering the complexity of underground conditions, there is no doubt that a zero-risk method significantly enhances the safety of tunnel construction.

6.2. Potential applications and future works

The MORO approach proposed in this study fully considers the uncertainties in soil with low requirements on the detail of information. The explanation of the improvement percentage serves as a reference for the designer to adjust the variables of interest efficiently. The enhancement in interactivity eliminates the knowledge alienation from engineering to machine learning techniques, making the whole approach more implementable. Based on these characteristics, this method can potentially be applied to other civil construction areas. For instance, in other types of construction works (i.e., buildings, bridges, and roads), it is also valuable to consider the geological uncertainties in the foundation design. Additionally, the BIM-based interactive and explainable AI system can be widely used, as BIM is one of the most prevalent techniques in modern construction works. With the assistance of BIM, the AI methods can be easily implemented and understood by civil engineers and can considerably enhance the efficiency of design. As all techniques in the MORO method can be generalized to other situations, the only modifications to the methodology will be determining design targets, selecting corresponding variables, obtaining training samples, and ultimately training new models.

The method proposed in this study has produced very positive results, but there are still some potential improvements that can be achieved by further studies. First of all, this study only considers one source of uncertainty, from the soil parameters; however, there are two more sources of uncertainty existing in the system. Considering that high accuracy cannot be guaranteed in every problem, it is valuable to consider the aleatoric uncertainties in the prediction models. The reliability of the result can be measured by computing prediction intervals, and the safety can be further enhanced by constraints on the failure probability. Additionally, in civil engineering problems, the control of damage is not "the smaller the better": the designer normally only needs to keep the damage from exceeding the design limit, while extremely small damage may, on the contrary, cause much higher costs. Therefore, it is valuable to employ fuzzy logic to simulate uncertain decisions from the designer. Moreover, the current study mainly focuses on the design stage of the tunneling problem, and only features related to materials and geometry are selected. In other words, this method serves as guidance before construction. To make it applicable in the construction stage, it is necessary to consider the TBM parameters (i.e., machine properties, thrust forces, earth pressure, grouting, etc.). With full consideration of these parameters, this method will be usable throughout the whole project.

CRediT authorship contribution statement

Penghui Lin: Writing – original draft, Methodology, Visualization, Data curation, Investigation, Validation. Limao Zhang: Writing – original draft, Conceptualization, Methodology, Writing – review & editing, Funding acquisition. Robert L.K. Tiong: Supervision, Writing – review & editing.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Data availability

The data that has been used is confidential.

Acknowledgment

This work is supported in part by the National Natural Science Foundation of China (No. 72271101), the Outstanding Youth Fund of Hubei Province (No. 2022CFA062), and the Start-Up Grant at Huazhong University of Science and Technology (No. 3004242122). The 1st author is grateful to Nanyang Technological University, Singapore for his Ph.D. research scholarship.
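The conservatism ordering in finding (1) follows from the protection terms of the standard robust counterparts of a linear constraint: a box uncertainty set yields an L1-norm protection term, while an ellipsoidal set yields an L2-norm term, so the box term can never be smaller. A minimal sketch with hypothetical coefficients standing in for the nominal values Ci and maximum deviations Ĉi (not the study's actual FEM-derived model):

```python
import math

def robust_lhs(x, a_bar, a_hat, r, set_type="ellipsoidal"):
    """Worst-case value of sum_i(a_i * x_i) when each coefficient a_i can
    deviate from its nominal value a_bar[i] by at most r * a_hat[i], using
    the standard box (L1) and ellipsoidal (L2) robust counterparts."""
    nominal = sum(ab * xi for ab, xi in zip(a_bar, x))
    v = [ah * xi for ah, xi in zip(a_hat, x)]
    if set_type == "box":
        protection = r * sum(abs(vi) for vi in v)             # L1 norm: most conservative
    elif set_type == "ellipsoidal":
        protection = r * math.sqrt(sum(vi * vi for vi in v))  # L2 norm
    else:
        raise ValueError(set_type)
    return nominal + protection

# hypothetical soil-parameter coefficients (Ci) and deviations (C_hat_i)
a_bar, a_hat = [2.0, -1.5, 0.8], [0.4, 0.3, 0.2]
x = [1.0, 2.0, 3.0]
box = robust_lhs(x, a_bar, a_hat, r=0.05, set_type="box")
ell = robust_lhs(x, a_bar, a_hat, r=0.05)
# the box set protects at least as much as the ellipsoidal set,
# and both collapse to the nominal constraint as r -> 0
print(round(box, 4), round(ell, 4))
```

This also reflects the observation that, for sufficiently small r (0.01 or less), the choice of uncertainty set type makes little practical difference, since all protection terms vanish together.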
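Finding (2) rests on Shapley values, which attribute a single prediction to individual features. For a small surrogate model they can be computed exactly by enumerating feature coalitions; the sketch below uses a hypothetical two-feature linear surrogate (loosely mimicking a strong depth effect and a weak angle effect), not the trained ElasticNetCV pipeline:

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, baseline, instance):
    """Exact Shapley values for one prediction: the weighted average marginal
    contribution of each feature over all coalitions, with features outside
    the coalition held at their baseline values."""
    n = len(instance)

    def f(coalition):
        x = [instance[i] if i in coalition else baseline[i] for i in range(n)]
        return predict(x)

    phi = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        total = 0.0
        for k in range(n):
            for s in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (f(set(s) | {i}) - f(set(s)))
        phi.append(total)
    return phi

# hypothetical surrogate: feature 0 (say, depth) dominates feature 1 (say, angle)
predict = lambda x: 2.0 * x[0] + 0.5 * x[1]
phi = shapley_values(predict, baseline=[0.0, 0.0], instance=[1.0, 1.0])
print(phi)  # for an additive model, phi_i = coefficient * (x_i - baseline_i)
```

For an additive model the exact values reduce to coefficient × deviation from baseline, which is why the feature with the larger coefficient dominates the attribution, mirroring how depth dominates EFP in the SHAP results.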
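The Pareto fronts compared in Fig. 18 consist of non-dominated solutions under simultaneous minimization of EFP and ETD. A generic dominance filter, shown here with hypothetical candidate values rather than the study's optimization output:

```python
def pareto_front(points):
    """Return the non-dominated points when every objective is minimized."""
    def dominates(q, p):
        # q dominates p if it is no worse in every objective and differs somewhere
        return all(qi <= pi for qi, pi in zip(q, p)) and q != p
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical (EFP, ETD) values for four candidate designs
candidates = [(1.0, 2.0), (2.0, 1.0), (1.5, 1.5), (2.0, 2.0)]
print(pareto_front(candidates))  # (2.0, 2.0) is dominated by (1.5, 1.5)
```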
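The TOPSIS step then selects one compromise solution from the Pareto front by its relative closeness to the ideal point. A self-contained sketch, assuming both criteria are costs with equal weights and using hypothetical front values:

```python
import math

def topsis(points, weights):
    """Relative-closeness scores for candidate solutions under TOPSIS,
    treating every criterion as a cost (smaller is better, as for EFP/ETD)."""
    n = len(points[0])
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(p[j] ** 2 for p in points)) for j in range(n)]
    v = [[weights[j] * p[j] / norms[j] for j in range(n)] for p in points]
    ideal = [min(col) for col in zip(*v)]  # best achievable value per criterion
    anti = [max(col) for col in zip(*v)]   # worst value per criterion
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness in [0, 1]
    return scores

# hypothetical Pareto front of (EFP, ETD) trade-offs with equal weights
front = [(0.8, 3.0), (1.0, 2.0), (1.5, 1.2), (2.2, 1.0)]
scores = topsis(front, weights=(0.5, 0.5))
best = front[scores.index(max(scores))]
print(best)  # the balanced interior solution is preferred over the extremes
```

Because the score is a ratio of distances to the ideal and anti-ideal points, it provides the "equal standard" mentioned above: extreme solutions that sacrifice one objective entirely score poorly even if they excel on the other.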