
Computer Communications 158 (2020) 32–38

Contents lists available at ScienceDirect

Computer Communications
journal homepage: www.elsevier.com/locate/comcom

Intelligent algorithm of geotechnical test data based on Internet of Things


Yawei Ma ∗, Guihong Guo
School of Civil Engineering and Mechanics, Lanzhou University, Lanzhou 730000, China
Key Laboratory of Mechanics on Disaster and Environment in Western China, Ministry of Education, China

ARTICLE INFO

Keywords:
Internet of Things
Geotechnical tests
Intelligent algorithms
Data analysis

ABSTRACT

In the geotechnical engineering geological survey industry, geotechnical test data are the basic data for analyzing and evaluating engineering geology and for producing reports, graphics and survey documents. They play an important role in calculating the bearing capacity and deformation of the foundation soil and in describing its physical and mechanical characteristics. The purpose of this article is to address the tedious, inefficient and error-prone collection, processing and analysis of geotechnical test data in the geotechnical engineering geological survey industry. Using the BP neural network algorithm and selecting the SVM as the intelligent algorithm for the small-sample problem, an intelligent algorithm for geotechnical test data based on the Internet of Things is established. Taking the geological characteristics of the Ganjiang River Basin as an example, geotechnical test data are then analyzed to verify the feasibility of the intelligent algorithm for data analysis. The results show that the algorithm realizes automatic collection and processing of geotechnical test data, reduces the tester's workload and the influence of human factors on the test results, compensates for the fixed hardware of traditional acquisition algorithms, and solves the difficulty of simultaneous multitasking, thereby promoting the development of innovative experiments.

1. Introduction

At present, China's computer information technology industry is developing rapidly, effectively realizing communication and information exchange between people and people, people and things, and things and things [1]. The rapid rise of the Internet of Things will once again drive the continuous and rapid growth of industry data. For many data-intensive industries this is not only a more severe market challenge, but also an undoubtedly valuable development opportunity. The Internet of Things is profoundly changing the daily habits and working lifestyles of modern people [2]. The future Internet will consist of a variety of heterogeneous, interconnected physical device networks, and these devices will further connect and extend real-world network boundaries through physical network entities and other virtual network components [3]. The Internet of Things (IoT) will provide technical support for all IoT applications of the mobile Internet through these new functions. The current major revolution of the Internet, mobile and intelligent machine-to-machine (M2M) information technology, can be seen as the first new phase of the Internet of Things [4]. It also plays an important role in the analysis and calculation of geological bearing capacity and deformation strength and in the description of the physical and mechanical geological characteristics of building foundation soils [5]. Many features of the Internet of Things, including the mobile Internet of Things with its large user scale, the high heterogeneity of devices and application networks, and the large number of network events generated spontaneously and automatically by these things, will promote future diversified network applications and make the design and development of network services a difficult technical task [6]. In general, middleware can simplify the development process by integrating heterogeneous computing and communication devices and supporting interoperability across applications and services [7].

In many geotechnical engineering and geotechnical test survey industries, the analysis of geotechnical test measurement data is extremely important: it provides the basic technical data with which geotechnical engineers carry out survey analysis and evaluation of geological surveys and produce geological reports, graphics and survey test reports [8]. Geotechnical measurement parameters usually show significant probabilistic uncertainty and are often treated as random variables in probability analysis; the probabilistic variation of geotechnical structural parameters is estimated and quantitatively analyzed from laboratory data [9]. Geotechnical test data are highly discrete and the number of samples is generally small, but they contain a great deal of historical and technical information that can be fully utilized [10].

∗ Corresponding author at: School of Civil Engineering and Mechanics, Lanzhou University, Lanzhou 730000, China.
E-mail addresses: mayw@lzu.edu.cn (Y. Ma), ggh7941@edu.com (G. Guo).

https://doi.org/10.1016/j.comcom.2020.04.028
Received 26 January 2020; Received in revised form 20 March 2020; Accepted 14 April 2020
Available online 18 April 2020
0140-3664/© 2020 Elsevier B.V. All rights reserved.

The estimation includes geotechnical treatment engineering and the monitoring of geological settlement of civil land, which are important components of the design of tunnel and pile foundation structures and of the calculation of pile capacity and settlement depth [11]. Determining the soil-water characteristic curve of a measured site is an indispensable step for analyzing the fluid dynamics of unsaturated soils in geotechnical engineering practice, and it can be obtained directly through various field and/or laboratory tests [12]. However, because of the high standards demanded of measurement equipment and program control and the strict restrictions imposed on the equipment by automatic test machines, this direct measurement is both expensive and time-consuming [13]. Intelligent algorithms are computational methods based on continuous learning from nature, imitating the structure and physiological habits of other organisms [14], for example artificial neural networks, genetic algorithms, simulated annealing, and intelligent algorithms built on support vector machines and swarms [15]. Swarm intelligence algorithms are usually inspired by the collective behavior of social insects and are generated by intelligent simulation analysis of different social insect populations [16].

In the literature, a combination of particle image velocimetry (PIV) and close-range stereoscopic image measurement has been developed into a set of applications for physical modeling of natural rock and soil using artificial synthetic transparent soil, and the results were compared with the modeling of natural soil [17]. Through centrifugal model tests, Luo studied the direct influence of the stress gradient and moisture content of sandy soil slopes on their deformation stability, and analyzed the centrifugal force and load of typical sandy soil slopes [18]. Magner proposed a method to determine the number of samples as a function of the coefficient of variation in order to fully characterize soil and rock units; a large number of tests reduced uncertainty and increased confidence in the design parameters [19]. Lingwanda used standard penetration tests, static cone penetration tests, dynamic probing and laboratory oedometer tests side by side to quantify the total uncertainty in predicting the constrained modulus of sandy soil from field test data [20]. Primod proposed an adaptive network fuzzy inference algorithm for engineers to evaluate the number of survey points, field tests and laboratory tests needed to correctly describe a construction site, identified the best survey points in the reference case, and determined the optimal number of field and laboratory tests [21]. Based on the analysis of nonlinear bi-level programming problems, Fan proposed a hybrid intelligent algorithm combining particle swarm optimization (PSO) and variable neighborhood search (VNS), which combines the fast search capability of PSO with global search [22]. Grubišić proposed an intelligent detection algorithm that combines motion detection, edge detection, spectrum analysis and motion shape analysis to improve the detection rate and reduce the false alarm rate [23]. Wang proposed a remote intelligent fault diagnosis algorithm based on wireless networks, analyzed a wireless data transmission method, and proposed an intelligent fault diagnosis scheme based on Bayesian networks [24]. Mazin addressed cloud security by using genetic algorithms as an intelligent firewall to provide secure cloud data; all services provided by the cloud must detect who receives and registers them in order to create a behavior-based user list [25].

The content of this article can be roughly divided into five parts. The first part introduces the relevant scientific and technical background, the purpose and significance of the work, and the current state of research in the related fields, together with a theoretical introduction to the mobile Internet of Things, geotechnical engineering, intelligent algorithms and current geotechnical experiments. The second and third parts present the proposed method: based on a comprehensive analysis of the data quality of geotechnical sample testing in the reservoir area, the SVM intelligent algorithm is analyzed and compared with the current FOAGRNN and FOBPNN algorithms to verify its high accuracy for training on small samples. The fourth part is an in-depth study of the related experiments and a comprehensive analysis of the results; through these tests we show how parameters such as maximum dry density, clay and silt content, limit moisture content and specific gravity are measured. The fifth part is the conclusion, which summarizes the results of the article and gives recommendations.

2. Proposed method

2.1. Geotechnical testing method

Geotechnical engineering construction site survey serves as the technical basis for geotechnical engineering design and construction planning. According to the current technical requirements of geotechnical engineering construction, geotechnical testing of the specific geological conditions in the key areas of a project can be used to obtain a large number of scientific and reasonable geotechnical test data reports, which provide favorable test data for the entire process of construction testing and allow corresponding solutions to be formulated and improved in time to ensure the smooth progress of geotechnical protection engineering construction in scenic areas.

The range of the soil permeability coefficient covered by the test methods is 10⁻¹ to 10⁻⁸ cm/s. The test methods for the permeability coefficient in the standard are generally divided into the constant head method and the variable (falling) head method. The design standard for the Xiajiang reservoir area requires the permeability coefficient of the drainage zone of the reservoir area to reach 10⁻⁴ cm/s.
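As an illustration only (not part of the original workflow), the permeability coefficient obtained from a variable (falling) head test follows the standard relation k = (a·L)/(A·Δt)·ln(h1/h2). The MATLAB sketch below assumes hypothetical specimen and standpipe dimensions:

    % Illustrative falling-head permeability calculation (hypothetical values).
    a  = 0.5;      % standpipe cross-sectional area (cm^2)
    A  = 30.0;     % specimen cross-sectional area (cm^2)
    L  = 4.0;      % specimen height (cm)
    h1 = 80.0;     % initial head (cm)
    h2 = 40.0;     % final head (cm)
    dt = 3600;     % elapsed time (s)
    k = (a * L) / (A * dt) * log(h1 / h2);   % permeability coefficient (cm/s)
    fprintf('k = %.3e cm/s\n', k);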
The following is a brief introduction to the on-site sampling preparation for the variable head permeability test. First prepare a 500 cm³ ring cutter, Vaseline, a pick, a soil cutter and a hammer. Apply a small amount of Vaseline to the inner wall of the 500 cm³ ring cutter; after coating, it lubricates the contact between the soil sample and the ring cutter so that the sample can enter and leave the cutter easily. Use a pick or knife to trim the ring-shaped soil sample that has occupied about 20 cm of the measuring point surface, then use a hammer with a ring-shaped cover to drive the ring cutter into the soil at a constant speed.

The compaction test is used to determine the relationship between the dry density and the water content of the soil in the project, so as to determine the maximum dry density and the optimal water content of the soil. There are two main compaction test methods: light compaction and heavy compaction; the difference between them depends on the maximum particle size of the soil in the project.

The particle size analysis test determines the percentage of the total particle mass of the dry soil that falls in each grain-size group, and thus illustrates the distribution and gradation of the particle sizes. Classification and grading curves are used to judge the engineering behavior of the soil and to select soils. In the test specifications, according to the soil particles and their gradation, the test is


divided into the sieve analysis method, the densimeter (hydrometer) method, the pipette method, and combinations of these methods.
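As a simple illustration of the sieve analysis calculation (hypothetical sieve sizes and retained masses, not data from this project), the percent passing each sieve can be computed as follows:

    % Illustrative sieve analysis: percent passing from retained masses (hypothetical data).
    sieve_mm = [2.0 1.0 0.5 0.25 0.075];          % sieve openings (mm)
    retained = [55  120 210 150  95];             % mass retained on each sieve (g)
    pan      = 70;                                % mass passing the finest sieve (g)
    total    = sum(retained) + pan;               % total dry mass (g)
    cum_ret  = cumsum(retained);                  % cumulative mass retained (g)
    passing  = 100 * (total - cum_ret) / total;   % percent passing each sieve
    disp(table(sieve_mm', passing', 'VariableNames', {'Sieve_mm', 'PercentPassing'}));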
The specific gravity of soil particles is the ratio of the mass of the soil particles, dried at 105-110 °C, to the mass of pure water of the same volume at a constant temperature.

The liquid limit is the upper bound of the water content at which a fine-grained soil as a whole is in the plastic state; the plastic limit is the lower bound of the water content at which the fine-grained soil is still plastic; and the shrinkage limit is the water content at which the fine-grained soil passes from the semi-solid to the solid state, below which further evaporation of water no longer produces shrinkage of the soil volume.

2.2. BP neural network

The back-propagation (BP) artificial neural network is a hierarchical learning algorithm for artificial neural networks. Its structure is a layered network composed of an input layer, one or more hidden (middle) layers and an output layer. The number of neurons in the input layer corresponds to the number of influencing factors of the input variables; the hidden layer often uses sigmoid (s-type) neurons; and the number of neurons in the output layer equals the number of output variables, with the output layer generally using a linear transfer function.

The generalized regression neural network (GRNN) has strong capabilities for non-linear sample mapping, a radial-basis network structure that handles samples flexibly, and a high degree of fault tolerance and robustness. The network converges to the optimal regression surface where the sample data accumulate, and even when the amount of data accumulated during the approximation is relatively small, the prediction effect is still good.
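The paper later benchmarks against FOAGRNN; purely as a minimal illustration (not the authors' implementation), a plain GRNN can be created with newgrnn in MATLAB's Neural Network Toolbox, where the spread value and the data below are assumptions:

    % Minimal GRNN sketch (assumed spread value, illustrative data only).
    P = rand(3, 25);                      % 25 training samples with 3 input attributes
    T = sum(P) + 0.05*randn(1, 25);       % corresponding target values
    spread = 0.3;                         % smoothing factor; must be tuned for real data
    net_grnn = newgrnn(P, T, spread);     % create the generalized regression network
    Y = sim(net_grnn, P(:, 1:5));         % predict the first five samples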
MATLAB includes a toolbox for BP neural networks, and the BP neural network is mainly used here for inversion data analysis together with the other artificial intelligence algorithms. The basic steps in MATLAB are essentially as follows.

(1) Extract training samples: extract the training samples X, Y from the original data, in the same way as for the SVM and FOAGRNN.

(2) Import and preprocess the normalized training data: use the normalization function to import the normalized training samples into the database and perform the preprocessing of the normalized import database.

(3) Initialize the BP feed-forward neural network: when initializing the training of a feed-forward network, the initial weights and thresholds of the network structure must be considered.

(4) Build the MATLAB neural network: the main function for building the network is the built-in newff function, whose main calling form is net = newff(p, t, s, tf, btf, blf). Briefly, p and t are the input and output matrices, that is, the x and y of the selected training samples; special attention must be paid to these two matrices, whose dimensions must match exactly, otherwise the training program cannot run. The parameter s is the number of hidden-layer nodes and tf the corresponding transfer functions; the hidden-layer size is generally chosen from accumulated experience.

(5) Train the BP neural network: after the weights and thresholds have been initialized, they are repeatedly adjusted and modified during training to reduce the error of the network performance function. Training is invoked through train, whose calling form is net = train(net, X, Y). For the simulation test the BP neural network uses sim; regardless of whether the output is one-dimensional or high-dimensional, only the corresponding matrix form needs to be changed, and the calling form is Y = sim(net, X).
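Putting steps (1)-(5) together, a minimal MATLAB sketch of the workflow might look as follows; the hidden-layer size, training settings and the illustrative data are assumptions, not values taken from the paper:

    % Minimal BP workflow sketch following steps (1)-(5); data and settings are illustrative.
    X = rand(3, 32);                      % (1) 32 training samples, 3 input attributes
    Y = sum(X) + 0.05*randn(1, 32);       %     corresponding target values
    [Xn, psX] = mapminmax(X);             % (2) normalize inputs and keep the mapping
    [Yn, psY] = mapminmax(Y);             %     normalize targets
    net = newff(Xn, Yn, 5);               % (3)-(4) feed-forward net with 5 hidden nodes (assumed)
    net.trainParam.epochs = 100;          % training settings (assumed values)
    net.trainParam.goal   = 1e-6;
    net = train(net, Xn, Yn);             % (5) train the network
    Yp  = sim(net, Xn);                   %     simulate on the (normalized) inputs
    Yp  = mapminmax('reverse', Yp, psY);  % map predictions back to the original scale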
Bionics-based intelligent algorithms are computing methods that simulate modern biological evolution and the behavior of biological communities. Their basic design idea is, inspired by the principles of bionics, to use traditional and modern computer technologies and tools to simulate biological phenomena such as immune cells, neural cell networks and the evolution of living systems, and thereby to obtain biological information for data processing and bionic calculation. Because natural creatures possess different degrees and levels of intelligence, they show strong self-selection and synergy in the way they evolve and behave, and it is through this synergy that they survive in their environment.

2.3. The theoretical basis of the SVM algorithm

Support vector machine (SVM) statistical learning is a comparatively new approach to machine learning. Its basic concepts go back to Vapnik, were created in the 1960s and 1970s, and were developed into a full theory in the mid-1990s. The statistical learning methodology of support vector machines applies this model to statistical machine learning, generalizes its computing power, and places no restriction on the data dimension. It is precisely because the SVM has these statistical inference characteristics that, when two classes are linearly classified, the classification plane is taken where the margin between the two classes is largest; when non-linear classification is required and the problem cannot be solved directly by a simple classification surface, the SVM transforms the problem into a high-dimensional space in which the non-linear classification becomes solvable. This machine learning theory addresses the problem of small samples and establishes a new theoretical system of machine learning. The rules of statistical inference under this new system not only fully consider the requirements on the performance of the learning machine, but also make full use of the limited statistical information available, so that the best results can be obtained even under these conditions.

When the SVM is used for regression fitting, the basic idea is no longer to find an optimal hyperplane that separates two classes, but to find an optimal regression surface that minimizes the error of all training samples with respect to that surface.

As mentioned above, a typical regression problem can be stated as follows. Assume that the training set is x, where xi is the input vector of a sample and yi is the corresponding output value. We can assume that in a high-dimensional feature space we can directly build a non-linear regression function:

f(x) = ωφ(x) + b                                                     (1)

where φ(x) is a non-linear mapping function. The linear ε-insensitive loss function is defined as

L(f(x), y, ε) = 0                 if |y − f(x)| ≤ ε
                |y − f(x)| − ε    if |y − f(x)| > ε                  (2)


Here f(x) is the predicted value returned by the regression function and y is the corresponding real value. As in SVM classification, if the optimal hyperplane cannot completely solve the problem, relaxation (slack) variables are introduced and the optimal hyperplane is obtained from the mathematical program

min ‖w‖²/2 + C Σ_{i=1}^{L} (ξ_i + ξ_i*)                              (3)

Solving the above formula by introducing Lagrange multipliers and converting it to its dual form finally gives the regression function

f(x) = wφ(x) + b = Σ_{i=1}^{L} (α_i − α_i*) K(x_i, x) + b*           (4)

The solution of the support vector machine is thus finally converted into a quadratic programming problem with constraints. When there are few training samples, the traditional Newton method, interior point method or conjugate gradient method can be used to solve it; when there are a large number of samples, the disadvantages of these traditional algorithms become apparent. In order to reduce the computational complexity, many support vector machine algorithms for large-scale training samples have been proposed, such as the chunking algorithm, the Osuna algorithm, the sequential minimal optimization (SMO) algorithm, and incremental learning algorithms.
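To make the connection between Eqs. (1)-(4) and practice concrete, the sketch below shows an ε-SVR fit using the libsvm MATLAB interface (svmtrain/svmpredict). The data, kernel choice and parameter values are illustrative assumptions, not the settings used in the paper; note that the libsvm MATLAB interface expects each row of the instance matrix to be one sample:

    % Illustrative epsilon-SVR fit with the libsvm MATLAB interface (assumed data and parameters).
    x = rand(30, 3);                                  % 30 samples, 3 attributes (rows = samples)
    y = x * [1.5; -0.8; 2.0] + 0.05*randn(30, 1);     % synthetic target values
    % -s 3: epsilon-SVR, -t 2: RBF kernel, -c: penalty C, -g: kernel parameter g, -p: epsilon
    model = svmtrain(y, x, '-s 3 -t 2 -c 10 -g 0.5 -p 0.01');
    [y_hat, accuracy, ~] = svmpredict(y, x, model);   % predict on the training inputs
    fprintf('Training MSE = %.4f\n', accuracy(2));    % accuracy(2) is the MSE for regression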
3. Experiments

3.1. Geological features and sampling tests

The climate of the upper reaches of the Ganjiang River is a continental subtropical humid climate in the eastern subtropical monsoon region. The main climatic characteristics of the year are as follows: more rain in spring and summer and less in autumn and winter; short spring and autumn and long winter and summer; a hot summer, cool autumn and cold winter; a short icing period, a long frost-free period and long average sunshine; high relative humidity; and clearly marked seasons. The annual rainfall in the upper reaches of the Ganjiang River is abundant: according to the statistics of the representative meteorological stations of the provincial meteorological bureau, the average precipitation in the basin over many years is between 1400 mm and 1800 mm, and its annual distribution is extremely uneven.

Rainstorms have occurred frequently over many years in the upper reaches of the Ganjiang River. According to the distribution of the major rainfall stations in the Ganjiang River Basin and the daily storm statistics measured over the years, the maximum daily rainstorms occur mostly from April to September, and the heaviest rains and typhoons are concentrated from July to September. The measured annual evaporation at stations in the Ganjiang River Basin is 1294 mm to 1765 mm, the average annual temperature is between 17.2 °C and 19.3 °C, the extreme maximum temperature is 41 °C, and the extreme minimum temperature is −14.3 °C. The relative humidity is 76%-82% with a minimum of 6%, and the long-term average extreme maximum wind speed is 1.1-2.9 m/s.

The alluvial soil on the right bank of the middle reaches of the Ganjiang River borders the hillock plain on the right bank and belongs to low mountain plains and hilly areas. The ground elevation is generally about 42-68 m and the terrain is uneven. The Gantian District of Ganzhou City is surrounded by mountains on the east, north and west, and its terrain is generally high in the east and low in the west. The site belongs to the first-level alluvial terrace on the right bank of the middle reaches of the Ganjiang River; the ground elevation is generally 42.82-47.18 m, tilted slightly from north to south. There is no adverse physical or geological effect near the groundwater borrowing area. The overlying strata in the field are mainly clay, sand and loose gravel alluvium formed by the Quaternary alluvium (al-Q4). The types of groundwater in the field are mainly pore water in the Quaternary loose deposits and fissure water in the bedrock and fault zones. The pore water is mainly present in the Quaternary loose glutenite and pebble layers, which have good permeability, abundant water, shallow depth, and are pressure-bearing in some low-lying plains. The bedrock and fault fissure water in the borrowing area is mainly stored in the fractures of the bedrock formation and its fractured zones; it is recharged by atmospheric precipitation and by surface water in the low-lying plains, migrates along the fault fracture zones in the direction of seepage, and is discharged as spring water to rivers, lakes and low-lying plains, and its water content is not abundant.

The quality grade of the soil samples taken shall meet the requirements of Table 1.

Table 1
Soil sample quality grades.

Level   Degree of disturbance      Test content
1       Undisturbed                Soil classification, moisture content, density, strength test, consolidation test
2       Slightly disturbed         Soil classification, moisture content, density
3       Significantly disturbed    Soil classification, moisture content
4       Completely disturbed       Soil classification

Holes should be cleared in time before the sampler is lowered, and the thickness of the residual floating soil at the bottom of the hole should not exceed the length of the abandonment section of the sampler. For hard soils, it is advisable to use double- or triple-tube rotary samplers for drilling and sampling; where regional experience is available, the heavy-hammer, few-blows method may be used. When Class 1 and 2 samples must be taken in silt, sand and sandy soil layers, a triple-tube sampling drill can be considered; disturbed samples can be taken from the penetrometer during soil breaking, or made directly from the core obtained during rock drilling.

3.2. Application of the algorithm

Among the many SVM toolboxes currently available, libsvm is the one most widely recognized and accepted. This software package is very effective for SVM pattern recognition and regression, and it uses convergence proofs to improve the algorithm, achieving very good results. Libsvm supports five model types: (1) C-SVC, (2) nu-SVC, (3) one-class SVM, (4) epsilon-SVR and (5) nu-SVR.

Regardless of whether the SVM is used for classification or regression, the basic steps are the following: (1) generate the training and test sets, (2) preprocess the data, (3) create and train the SVM, and (4) run simulation tests.

Generally, N sets of data are randomly selected from the available data as training data and the rest are used as test data. For regression problems, the rows represent the attributes of the input samples and the columns represent the number of input samples. If the data are stored in Excel or in a MATLAB variable, say X, the variable only needs to be placed in the current libsvm directory and the data can be read directly with the following code.
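The code listing itself is not reproduced in the extracted text. As a hedged sketch, assuming the data have been exported to a plain text file in libsvm format or kept in an Excel sheet (the file names and column layout below are assumptions), reading them might look like this:

    % Sketch of loading data for libsvm (file names and variable layout are assumptions).
    % Option 1: data already in libsvm sparse format.
    [y, x] = libsvmread('geotech_train.txt');   % y: targets, x: sparse sample matrix
    % Option 2: data kept in an Excel sheet, first column = target, rest = attributes.
    raw = xlsread('geotech_train.xlsx');
    y   = raw(:, 1);
    x   = raw(:, 2:end);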


For regression problems, the input samples may have several attributes whose values are not of the same order of magnitude. Before the training model is established, the data should therefore be normalized, so that attributes of large magnitude do not dominate the weights and distort the output variable. A commonly used normalization function is mapminmax, which comes with MATLAB: [X1, X1PS] = mapminmax(X), where X1 is the normalized variable and X1PS is the normalization structure. When writing a normalization program it is not mandatory to use the mapminmax function, and it is worth mentioning that normalization is not always necessary: according to the relevant data, satisfactory output can sometimes be obtained without it.
Creating and training the SVM: before creating an SVM, the model parameters must first be determined; this step is the key to the SVM. For regression problems, the key is to determine the optimal parameters c and g, that is, the penalty factor and the kernel function parameter. Cross validation is generally used: first determine the search intervals of c and g, then find the c and g best suited to the training data by looping over the candidate values, and substitute these parameters into the model for training. If the training error is too large or the final prediction is not satisfactory, the parameters of the model can be reset.
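A minimal sketch of such a search, assuming the libsvm MATLAB interface and reusing the y and x variables from the earlier sketch, uses libsvm's built-in k-fold cross validation (the K-CV scheme described below) as the selection criterion; the search ranges are illustrative only:

    % Grid search for c and g with libsvm's built-in 5-fold cross validation (assumed ranges).
    best_mse = Inf; best_c = 1; best_g = 1;
    for log2c = -2:4                      % candidate penalty factors c = 2^log2c
        for log2g = -4:2                  % candidate kernel parameters g = 2^log2g
            opts = sprintf('-s 3 -t 2 -v 5 -c %g -g %g', 2^log2c, 2^log2g);
            mse  = svmtrain(y, x, opts);  % with -v, svmtrain returns the CV mean squared error
            if mse < best_mse
                best_mse = mse; best_c = 2^log2c; best_g = 2^log2g;
            end
        end
    end
    model = svmtrain(y, x, sprintf('-s 3 -t 2 -c %g -g %g', best_c, best_g));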
The parameter optimization uses cross validation (CV). Common CV methods include the hold-out method, K-fold cross validation (K-CV), and leave-one-out cross validation (LOO-CV).

(1) Hold-out method: simply put, the data are divided into two groups, one used as training samples for training the model and the other used as test samples for simulation prediction. Strictly speaking, this method is not really CV, because it involves no crossing, and the simulation results depend strongly on how the original data happen to be grouped.

(2) K-CV: the basic idea is to divide the original data into K groups; each group is used in turn as the test set while the remaining K−1 groups are used as the training set, and the average performance over the K runs is the index for judging the model.

(3) LOO-CV: each single raw data point is used in turn as the test sample while the rest are used as training samples, so that there are as many models as there are raw data points, and the average is used as the performance index. Although this method is a complete cross validation, when there are many samples the amount of calculation and the calculation time become too large.

4. Discussion

4.1. Analysis of geotechnical test data

This article adopts the algorithm simulation standards commonly used by EPC equipment in Europe and the United States, using the EPC UHF radio frequency simulation band and 96-bit coded electronic tags as samples for the algorithm simulation. The maximum number of tags to be identified by the algorithm is 200, the minimum increment is 20, and the growth step of the tag transmission data is 20; the throughput rate and the amount of tag transmission data are simulated separately to confirm the feasibility of the simulation design. For the actual geotechnical design and test data, the indicators most commonly applied in practice (soil moisture content, density, specific gravity of soil particles, particle structure analysis, boundary moisture content, permeability, consolidation, triaxial compression, direct shear and more than a dozen other physical and mechanical indicators) were used as experimental data. The data of some geotechnical tests, which previously relied mainly on Excel, are shown in Table 2.

Table 2
Model calculation parameters for each soil layer.

Soil layer    μ      K      ω
Silty clay    0.35   0.87   30%
Mucky soil    0.25   0.89   45%
Clay          0.35   0.84   36%

Through the intelligent analysis of the data import workflow and the introduction of the triaxial compression intelligent algorithm, this paper designs and implements a series of methods and tools for seamlessly importing data into the geotechnical survey data results tables of a project. As shown in the software block diagram (Fig. 1), when the user selects the triaxial compression geotechnical test database and results table of a geological survey project, together with the relevant engineering information and geological survey test database of the project, and chooses intelligent import, the triaxial compression geotechnical information and experimental survey data are imported into their corresponding survey data tables. Taking the import of the triaxial compression survey data table as the main example, the project-related information can be obtained directly from the geotechnical information and experimental survey database of the project, as well as from the results table and the triaxial compression engineering test database, together with the test data and the related index constant data. The user can then choose how the corresponding survey data are imported intelligently (creating new records, overwriting index constant data, or adding index constant data) directly by importing the data table, and can thus easily complete the acquisition and import analysis of the index constant data of the project (see Fig. 1).

Fig. 1. A, B, C, D four sample processing accuracy.

This algorithm fully compensates for the inability of the current version of the geological evaluation and survey analysis software used in geotechnical engineering to import all geotechnical test analysis data of the same types and indicators in batches, and it significantly improves the accuracy and productivity of batch import of geotechnical test data. Originally, entering the geotechnical test data took at least 10 days of work; now the staff need only about 90 min in total to complete the batch, the geotechnical test data become more standardized, and the error rate is almost zero. The algorithm relieves all report analysis staff from the tedious work of inputting geotechnical test data, so that their working time, energy and long project preparation period can be devoted to something more meaningful: research and evaluation of the geological analysis software used in geotechnical engineering and preparation of the survey report analysis data.

As shown in Fig. 2, the benchmark test of minimizing the fitness at the extreme points consists of performing 10 independent experiments on each test function and recording the optimal value, the worst value, the average value and the baseline value.

Fig. 2. Comparison of test function results convergence.


The SVM algorithm is compared with the basic BP algorithm. Over the independent runs, the fitness of the algorithm on the test functions, both its average value and its optimal value, is the lowest of the algorithms compared, and its optimization effect is the best. For the unimodal function f1(x) without local step extreme points and for the discontinuous multimodal step function f2(x), neither the SVM algorithm nor the basic BP algorithm reaches the global optimum by search alone, because these functions have many local optima around the global one; they are commonly used to test an algorithm's ability to jump out of local optima. The random-traversal characteristics of the improved multimodal algorithm greatly reduce the danger of falling into local convergence, and a theoretically optimal solution is obtained for each function. The concentric-circle function is a multimodal function whose many extreme points behave as local optima, which makes it difficult to search. From the experimental results, the intelligent algorithm works well, especially for big data.

4.2. Coefficient regression analysis of intelligent algorithms based on the Internet of Things

Because the permeability coefficient training data in this paper are relatively few, the SVM model is mainly used to perform a parameter inversion of the permeability coefficient model, and the results are compared with those of the FOAGRNN and FOBPNN parameter inversion algorithms. The training samples x extracted from the comparative analysis data above are substituted into the SVM permeability coefficient model of this article. Since the training data are few, simply dividing them into two parts, one for model prediction and one for model training, would seriously affect the analysis accuracy the data can provide for the model. Therefore, the 25 sets of training data, plus some of the remaining ones, are used as model training samples, and the remaining samples, 20 sets in total, are used as model prediction samples.

Substituting each simulated training sample into the corresponding SVM model and performing a numerical comparison directly yields the predicted output value of the actual training sample. This predicted output is then compared numerically with the corresponding real value of the training sample, that is, the predicted values and the real values of the training samples are compared and analyzed to assess the accuracy of the SVM model. As shown in Fig. 3, the model obtained by substituting the training samples into the SVM has very high accuracy and can well meet our requirements.

Fig. 3. Comparison of training set predictions with real values.
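As an illustrative continuation of the earlier ε-SVR sketch (again assuming the libsvm MATLAB interface and the hypothetical y, x and model variables from that sketch), the comparison of predicted and real values behind a figure like Fig. 3 can be produced as follows:

    % Compare SVM predictions with real values for the training set (illustrative only).
    [y_fit, acc, ~] = svmpredict(y, x, model);     % model from the earlier grid-search sketch
    rel_err = abs(y_fit - y) ./ abs(y) * 100;      % relative error in percent
    fprintf('Mean relative error: %.2f %%\n', mean(rel_err));
    plot(1:numel(y), y, 'ko-', 1:numel(y), y_fit, 'r*--');
    legend('Real value', 'SVM prediction'); xlabel('Sample'); ylabel('Permeability coefficient');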
The test samples are then substituted into the SVM simulation function to obtain the output values of the prediction samples. If an original prediction sample has no corresponding true value, the value output by the model is not compared, and only an empty vector of the corresponding dimension needs to be supplied as input. For the BP neural network model, the 32 sets of original data are used as training samples and 16 sets of samples are selected as test samples to determine and analyze the calculation accuracy of the model and the accuracy of the experimental values. The BP algorithm uses the following parameter settings: the number of iterations is 100, the learning rate is 0.1, and the error goal is 10⁻⁶; the other parameters use their default values (see Fig. 4).
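A hedged sketch of these settings in MATLAB's BP toolbox notation follows; the network construction, hidden-layer size and the Xtrain/Ytrain/Xtest variables are assumptions, and only the three quoted parameter values come from the text:

    % BP parameter settings quoted in the text; network construction and data split are assumed.
    net = newff(Xtrain, Ytrain, 5);   % 5 hidden nodes is an assumption, not from the paper
    net.trainFcn = 'traingdx';        % gradient-descent training function that uses a learning rate
    net.trainParam.epochs = 100;      % number of iterations (from the text)
    net.trainParam.lr     = 0.1;      % learning rate (from the text)
    net.trainParam.goal   = 1e-6;     % error goal (from the text)
    net = train(net, Xtrain, Ytrain); % train on the 32 training sets
    Ytest_hat = sim(net, Xtest);      % evaluate on the 16 test sets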
The BP network, like the FOAGRNN algorithm, gives results for training on small samples that are not very satisfactory, which indicates that both have high requirements on the amount of training data. When the training samples are relatively few the BP algorithm is nevertheless quite stable, but its parameter settings are largely empirical, and different parameters should be set for different data.

5. Conclusions

(1) This paper introduces an effective data preprocessing technology that performs conceptual layering on discrete data and mines the association rules of the database preprocessed by the algorithm, ensuring that the association rule mining algorithm can be used locally and that, overall, more accurate strong association rules can be mined.

(2) The main methods of geotechnical engineering investigation are the geotechnical engineering survey and the geotechnical test, whose results directly determine the outcome of the investigation and the quality of the construction process. In order to solve the problems existing in geotechnical testing, the quality of the tests must be improved, so as to guarantee the quality of the geotechnical engineering investigation and the progress of the engineering work.

(3) Based on the Internet of Things and intelligent algorithms, and combined with sensors, Advantech acquisition cards and other hardware, a geotechnical test data acquisition algorithm is developed. Experimental verification shows that the algorithm is simple to operate, convenient to maintain, highly open, reliable and practical.

(4) The algorithm realizes the automatic collection and processing of geotechnical test data, reduces the tester's workload and the influence of human factors on the test results, makes up for the fixed hardware of traditional acquisition algorithms, and solves the problem of simultaneous multitasking. Research on remote control algorithms for experiments was carried out, which promoted the development of innovative experiments.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

CRediT authorship contribution statement

Yawei Ma: Writing - review & editing. Guihong Guo: Supervision.


Fig. 4. Comparison of influencing factors and real values in BP neural network.

References

[1] S. Li, L.D. Xu, S. Zhao, The internet of things: A survey, Inf. Syst. Front. 17 (2) (2015) 243–259.
[2] A. Al-Fuqaha, M. Guizani, M. Mohammadi, et al., Internet of things: A survey on enabling technologies, protocols and applications, IEEE Commun. Surv. Tutor. 17 (4) (2015) 38–42.
[3] Palade Andrei, Cabrera Christian, Li Fan, Middleware for internet of things: An evaluation in a small-scale IoT environment, J. Reliable Intell. Environ. 2 (4) (2018) 1–21.
[4] Lin Xingqin, Bergman Johan, Gunnarsson Fredrik, Positioning for the Internet of Things: A 3GPP perspective, IEEE Commun. Mag. 55 (12) (2017) 179–185.
[5] Perera Charith, Liu Chi Harold, Jayawardena Srimal, The emerging internet of things marketplace from an industrial perspective: A survey, IEEE Trans. Emerg. Top. Comput. 3 (4) (2015) 585–598.
[6] Qin Yongrui, Quan Z. Sheng, Nickolas J.G. Falkner, When things matter: A survey on data-centric Internet of Things, J. Netw. Comput. Appl. 64 (25) (2016) 137–153.
[7] Li Shancang, Tryfonas Theo, Li Honglei, The Internet of Things: A security point of view, Int. Res. 26 (2) (2016) 337–359.
[8] H. Hirata, Y. Kanda, S. Ohashi, Testing and specification of recycled materials for sustainable geotechnical construction, 270 (8) (2015) 781–784.
[9] P. Bossart, D. Jaeggi, C. Nussbaum, Experiments on thermo-hydro-mechanical behaviour of Opalinus Clay at Mont Terri rock laboratory, Switzerland, J. Rock Mech. Geotech. Eng. 9 (3) (2017) 120–128.
[10] Zhussupbekov Askar, Omarov Abdulla, Modern advances in the field geotechnical testing investigations of pile foundations, Procedia Eng. 165 (12) (2016) 88–95.
[11] C. Morin, T. Sedran, François de Larrard, et al., Development of an excavatability test for backfill materials: Numerical and experimental studies, Can. Geotech. J. 55 (1) (2018) 35–38.
[12] Zame Philémon Zo'o, Assomo Philippe Samba, Onwualu Josephine Nchekwube, Assessment of geotechnical properties of lateritic gravels from South-Cameroon road network, Int. J. Geosci. 8 (8) (2017) 949–964.
[13] Wang Lin, Cao Zi Jun, Li Dian Qing, Determination of site-specific soil-water characteristic curve from a limited number of test data: A Bayesian perspective, Earth Sci. Front. 35 (14) (2018) 55–60.
[14] Chen Lifang, Cao Dai, Liu Yuan, A new intelligent jigsaw puzzle algorithm based on mixed similarity and symbol matrix, Int. J. Pattern Recogn. Artif. Intell. 32 (2) (2017) 32–37.
[15] Wöhlecke Andreas, Werner W. Müller, Der Zugversuch an Dichtungsbahnen, Geotechnik 40 (5) (2017) 56–61.
[16] Son Changman, Intelligent rule-based sequence planning algorithm with fuzzy optimization for robot manipulation tasks in partially dynamic environments, Inform. Sci. 342 (7) (2015) 209–221.
[17] Q.I. Chang-Guang, F. Gao-Feng, C. Yun-Liang, et al., Geotechnical physical model test using artificial synthetic transparent soil, Rock Soil Mech. 42 (7) (2015) 29–35.
[18] Q. Luo, J. Zhu, R. Zhang, et al., Geotechnical centrifugal model test on sandy soil slope stability, Yanshilixue Yu Gongcheng Xuebao/Chin. J. Rock Mech. Eng. 37 (5) (2018) 1252–1259.
[19] K. Magner, N. Maerz, I. Guardiola, et al., Determining optimum number of geotechnical testing samples using Monte Carlo simulations, Arab. J. Geosci. 10 (18) (2017) 406–407.
[20] M.I. Lingwanda, Anders Prästings, S. Larsson, et al., Comparison of geotechnical uncertainties linked to different soil characterization methods, Geomech. Geoeng. 12 (2) (2016) 1–15.
[21] Jelušič Primod, Žlender Bojan, Predicting geotechnical investigation using the knowledge based system, Adv. Fuzzy Syst. 20 (1) (2016) 10–11.
[22] Cheng-li Fan, Qing-hua Xing, Qiang Fu, A hybrid intelligent algorithm by combining particle swarm optimization with variable neighborhood search for solving nonlinear bilevel programming problems, Syst. Eng.-Theory Pract. 35 (2) (2015) 473–480.
[23] Grubišić, I. Kolarić, D. Skala, et al., Intelligent algorithm for smoke extraction in autonomous forest fire detection, 25 (2) (2016) 32–36.
[24] H. Wang, S. Sha, The intelligent fault diagnosis based on Bayesian network inference algorithm solution, Appl. Mech. Mater. 72 (23) (2015) 884–887.
[25] H. Mazin, A.L.-Shaikhly Razuky, Intelligent cloud computing security using genetic algorithm as a computational tools, J. Phys. Conf. 13 (1) (2018) 1–5.

Yawei Ma was born in Xi'an, Shaanxi, P.R. China, in 1979. She is working in the School of Civil Engineering and Mechanics, Lanzhou University. Her research interests include geotechnical and civil engineering. E-mail: mayw@lzu.edu.cn.

Guihong Guo is working as a teacher in the Department of Geology Engineering, Lanzhou University. Her current research includes seismology and geology engineering. E-mail: ggh7941@edu.com.

