Outlier Generation and Anomaly Detection Based on Intelligent One-Class Techniques over a Bicomponent Mixing System

Esteban Jove1,2, José-Luis Casteleiro-Roca1, Héctor Quintián1, Juan Albino Méndez-Pérez2, and José Luis Calvo-Rolle1

1 Department of Industrial Engineering, University of A Coruña, Avda. 19 de febrero s/n, 15405 Ferrol, A Coruña, Spain
esteban.jove@udc.es
2 Department of Computer Science and System Engineering, Universidad de La Laguna, Avda. Astrof. Francisco Sánchez s/n, 38200 S/C de Tenerife, Spain
jamendez@ull.edu.es

Abstract. One of the most important ways to improve the profits of an industrial process is to achieve a good optimisation and to apply a smart maintenance plan. Under these circumstances, early anomaly detection plays an important role, and the implementation of classifiers for anomaly detection is therefore an important challenge. Since many of the anomalies that can occur in a plant have an unknown behaviour, it is necessary to generate artificial outliers to check these classifiers. This work presents different one-class intelligent techniques to perform anomaly detection in an industrial facility used to obtain the main material for the production of wind generator blades. Furthermore, artificial anomaly data are generated to check the performance of each technique. The final results achieved are successful in general terms.

Keywords: Anomaly detection · Control system · Outlier generation

1 Introduction
In general terms, a high percentage of enterprises present complex and expensive processes whose operation can be optimised [1,18]. The technological advances achieved in many different industrial fields, such as instrumentation or automation, have led to the optimisation and development of industrial processes [15]. In addition, factors like the promotion of energy efficiency policies or increased competitiveness must be taken into consideration in any plant optimisation [20,40]. Then, good operation of a specific process requires early detection of any deviation from the normal behaviour of its actuators, sensors, and so on [12,24]. This is especially important in high-cost and safety-critical processes [9,28]. From an economic point of view, avoiding the wrong performance of any
part of a facility can lead to economic savings in terms of energy consumption, raw material waste and corrective maintenance.
From a theoretical point of view, a data pattern that represents an unexpected behaviour is defined as an anomaly [7]. In a specific application, the anomaly detection task must tackle issues such as selecting the limit between normal and anomalous data, the availability of data from anomalous operation or the presence of noise [7].
Depending on the prior information available about the initial dataset, three different cases of anomaly detection can be distinguished [7,16,38]:
– Type A: the anomaly detection classifier is obtained using only data from normal operation. In this case, semi-supervised techniques are applied, since all data are pre-classified as normal. This kind of classification is known as one-class classification.
– Type B: the nature of the initial dataset is not known. Then, unsupervised techniques are used to classify data as normal or anomalous.
– Type C: the initial dataset is composed of pre-labelled normal and anomalous data, so supervised algorithms are used in this case to model the system [6,30].
The use of anomaly detection techniques is widespread in fields such as fault detection in industrial processes, credit card fraud detection, intrusion detection in surveillance systems or even medical diagnosis, where it can represent a helpful tool to, for instance, reduce the workload of the medical staff [13,22].
Given the difficulty of obtaining data from faulty situations, checking the performance of a one-class classifier may require the generation of artificial outliers. Previous works related to this task deal with outlier generation by using linear transformations between variables [29] or by setting normal data geometrically out of the target set [37].
This work presents anomaly generation and detection for an industrial plant consisting of a bicomponent mixing system used in the manufacturing process of wind generator blades. Different intelligent one-class classifiers are implemented and assessed in the anomaly detection task using real data obtained from correct plant operation. The techniques used are the Approximate Convex Hull (ACH), the Autoencoder and the Support Vector Machine (SVM). Then, the proposed classifiers are assessed and validated using anomalies artificially generated with the Boundary Value Method [37].
This paper is structured as follows. After the present section, the case study is introduced. Then, Sect. 3 explains the techniques used to generate the outliers and obtain the classifiers. Section 4 describes the experiments and results and, finally, the conclusions and future works are presented in Sect. 5.

2 Case Study
2.1 Bicomponent Mixing System
The installation used to develop the present work aims at mixing two different
fluids to obtain a bicomponent material, whose features are suitable for wind
generator blades. The primary fluids, the epoxy resin and the catalyst, are stored
in two different tanks. They are boosted by two independent pumps, supplied
by variable frequency drives, and mixed in a valve in charge of delivering the
proper amount of bicomponent at the output [17].
Figure 1 shows a simplified scheme of the system described above, with the actuators, valves, pipes and sensors of the mixing plant.

Fig. 1. Process scheme

The primary fluids to be pumped have non-Newtonian characteristics, which means that their properties vary depending on the mechanical efforts applied to them [10]. Factors like the variation of viscosity with pump speed or the pump efficiency make this system more complex and, hence, in this context, anomaly detection is a challenge.

2.2 Dataset Description


During the correct operation of this industrial plant, the values of ten variables were registered and used as the initial dataset. These variables are: the real value and the set point of the bicomponent flow rate at the output, the flow rates of fluids 1 and 2, the pressure and speed of pumps 1 and 2, and the pressure after flowmeters 1 and 2. After inspecting the initial dataset, the null measurements were removed from the initial 9511 samples and, finally, 8549 samples are considered.
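As an illustration of this cleaning step, the following sketch (Python, not the authors' code) builds a toy version of the log and discards the samples with null measurements; the column names, the toy values and the assumption that null readings were logged as zeros are hypothetical.

import numpy as np
import pandas as pd

# Toy stand-in for the recorded process log; column names are hypothetical.
df = pd.DataFrame({
    "flow_setpoint": [10.0, 10.0, np.nan, 10.0],
    "flow_real":     [9.8,  0.0,  9.9,   10.1],   # the zero stands for a null reading
    "pump1_speed":   [1450, 1450, 1440,  1455],
})

df = df.dropna()                    # drop samples with missing values
df = df[(df != 0).all(axis=1)]      # assumption: null readings were logged as zeros
print(len(df))                      # in the paper, 9511 raw samples reduce to 8549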

3 Techniques Applied to Validate the Proposed Model


This study assesses three different one-class classification techniques, which are briefly described in Subsect. 3.1. Then, to validate the anomaly detection, outliers are artificially generated according to the procedure shown in Subsect. 3.2. To assess the performance of each classifier, each one was checked using k-fold cross-validation with k = 10, repeated 3 times.

3.1 One-Class Techniques

Approximate Convex Hull. The Approximate Convex Hull is a one-class classification technique that has offered successful results in previous works [4,11]. The basic idea of this method is to approximate the limits of a dataset S ⊂ R^n using its convex hull. Since computing the convex hull of a dataset with N samples and d variables has a computational cost of O(N^((d/2)+1)) [4], the convex hull is approximated by making p random projections onto 2D planes and determining the convex limits on each plane. Then, after modelling the convex hull of the training data with p projections, when a new test sample arrives, it is considered an anomaly if it lies outside the convex hull in any of the projections.
In addition, a parameter λ can be defined as a factor that expands or reduces the convex limits from the centroid of each projection. A value of λ higher than 1 expands the limits, while a value lower than 1 contracts them. A proper value of the parameter λ depends on the nature of the dataset.
An example of an anomaly point in R^3 is shown in Fig. 2. In this case, the anomaly lies outside just one of the projections.

Fig. 2. Anomaly point in R^3
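A minimal sketch of this procedure, assuming random Gaussian projection matrices and standard SciPy/Matplotlib routines for the 2D hulls, is given below. It illustrates the idea described above and is not the implementation used in the paper.

import numpy as np
from scipy.spatial import ConvexHull
from matplotlib.path import Path

def fit_ach(X, p=50, lam=1.0, seed=0):
    """Approximate the convex hull of X with p random 2D projections."""
    rng = np.random.default_rng(seed)
    model = []
    for _ in range(p):
        P = rng.standard_normal((X.shape[1], 2))      # random projection onto a 2D plane
        Z = X @ P
        hull = Z[ConvexHull(Z).vertices]              # hull vertices on that plane
        centroid = hull.mean(axis=0)
        hull = centroid + lam * (hull - centroid)     # expand (lam > 1) or contract (lam < 1)
        model.append((P, Path(hull)))
    return model

def is_anomaly(x, model):
    # Anomalous if the projected point falls outside the hull of ANY projection
    return any(not path.contains_point(x @ P) for P, path in model)

# Toy usage: train on normal-looking data and test a far-away point
X_train = np.random.randn(1000, 10)
ach = fit_ach(X_train, p=50, lam=1.1)
print(is_anomaly(np.full(10, 8.0), ach))              # expected: True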



Artificial Neural Networks. Autoencoder. The use of Artificial Neural Networks (ANN) for one-class classification with an Autoencoder configuration has obtained very good results in different applications, such as anomaly detection or data noise reduction [14,33,36].
The most common ANN for this task is the Multilayer Perceptron (MLP) [39], which has an input layer, an output layer and a hidden layer. Each layer is composed of neurons connected by weighted links to the adjacent layers, with nonlinear activation functions [3,5,19]. The approach of the Autoencoder technique is to reconstruct the patterns from the input to the output through an intermediate nonlinear dimensionality reduction in the hidden layer. Then, the number of input and output neurons equals the number of variables in the dataset, and the hidden layer must have at least one neuron fewer.
The intermediate nonlinear reduction acts like a filter that eliminates the data that are not consistent with the dataset. Then, anomalous data must lead to a high reconstruction error, which is the difference between the input and the output estimated by the MLP.
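The reconstruction-error criterion can be sketched as follows. The paper uses MATLAB's trainAutoencoder (see Sect. 4.2); here an equivalent single-hidden-layer regressor from scikit-learn is assumed, with the anomaly threshold placed at the 99th percentile of the training reconstruction error, as described in Sect. 4.2.

import numpy as np
from sklearn.neural_network import MLPRegressor

def fit_autoencoder(X_train, hidden_neurons=5):
    """Train an MLP to reproduce its own input through a smaller hidden layer."""
    ae = MLPRegressor(hidden_layer_sizes=(hidden_neurons,),
                      activation="logistic", max_iter=2000)
    ae.fit(X_train, X_train)
    err = np.mean((ae.predict(X_train) - X_train) ** 2, axis=1)
    threshold = np.percentile(err, 99)     # 99% of training errors fall below this value
    return ae, threshold

def is_anomaly(X_test, ae, threshold):
    err = np.mean((ae.predict(X_test) - X_test) ** 2, axis=1)
    return err > threshold                 # high reconstruction error -> anomaly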

Support Vector Machine. The Support Vector Machine (SVM) is used in regression and classification tasks [8,21,23]. In one-class classification, this technique maps the dataset into a high-dimensional space by means of a kernel function. In this space, a hyper-plane that separates the data from the origin with maximum margin is obtained [32].
Once the SVM is trained, the criterion to detect whether a test sample is an anomaly is based on the signed distance of that point to the hyper-plane: if the distance is negative, the sample is inside the target class; otherwise, it is considered an anomaly.
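A brief sketch of this criterion is shown below, using scikit-learn's OneClassSVM with a Gaussian (RBF) kernel instead of the MATLAB function used in the paper. Note that in scikit-learn's convention a negative decision value marks a point outside the learned region; the sign convention may differ between implementations.

import numpy as np
from sklearn.svm import OneClassSVM

X_train = np.random.randn(1000, 10)               # stand-in for the normal-operation data

svm = OneClassSVM(kernel="rbf", nu=0.01).fit(X_train)

# decision_function returns a signed distance to the separating hyper-plane;
# here negative means outside the learned region, i.e. an anomaly.
scores = svm.decision_function(np.vstack([np.zeros(10), np.full(10, 8.0)]))
print(scores < 0)                                 # expected: [False  True]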

3.2 Artificial Outlier Generation


This paper assesses the performance of three different one-class techniques on the task of anomaly detection in an industrial plant used to obtain the material for wind generator blades. It is therefore necessary to generate artificial outliers that simulate deviations in the plant operation. For this purpose, the Boundary Value Method [37] is applied. Given a dataset with M samples and n attributes, this technique generates anomalous points by shifting samples out of the boundary of the training dataset. The process followed to generate an artificial outlier from a specific data point a ∈ R^n is described next:

1. Find the maximum and minimum of each attribute in the initial dataset and save these values in two vectors V_max, V_min ∈ R^n.
2. Randomly select two dimensions u and v of the n-dimensional dataset.
3. Replace the value of a(u) randomly by V_max(u) or V_min(u).
4. Replace the value of a(v) randomly by V_max(v) or V_min(v).

The example in Fig. 3 shows the transformation of a point inside the target class (green point) in R^3 into an anomaly. The transformation T_x replaces the x coordinate by the maximum registered value (yellow dot), and the transformation T_y changes the y value to its minimum (red dot). It can be noticed that the generated data point lies outside the initial set.

Fig. 3. Anomaly point in R^3
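A minimal sketch of these steps is given below (an illustration, not the authors' code); the 5% outlier fraction used later in Sect. 4 is taken as the default.

import numpy as np

def boundary_value_outlier(a, V_max, V_min, rng):
    """Turn one normal sample a (shape (n,)) into an artificial outlier."""
    a = a.copy()
    u, v = rng.choice(a.size, size=2, replace=False)    # two random dimensions
    a[u] = rng.choice([V_max[u], V_min[u]])              # push to a recorded extreme
    a[v] = rng.choice([V_max[v], V_min[v]])
    return a

def generate_outliers(X, fraction=0.05, seed=0):
    """Replace a fraction of the samples by artificial outliers (5% in Sect. 4)."""
    rng = np.random.default_rng(seed)
    V_max, V_min = X.max(axis=0), X.min(axis=0)
    X_out, labels = X.copy(), np.zeros(len(X), dtype=int)
    idx = rng.choice(len(X), size=int(fraction * len(X)), replace=False)
    for i in idx:
        X_out[i] = boundary_value_outlier(X[i], V_max, V_min, rng)
    labels[idx] = 1                                      # 1 marks an artificial anomaly
    return X_out, labels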

4 Experiments and Results

From the initial dataset, corresponding to the correct operation of the plant, 5% of the samples were artificially modified to transform them into anomalies, following the process described in Sect. 3.2. For each of these samples, two dimensions are randomly selected and replaced by their maximum or minimum values. The one-class techniques used to assess the anomaly generation were validated using k-fold cross-validation with k = 10, repeated 3 times. The performance of each classifier is assessed using the Area Under the Curve (AUC) parameter, which relates the true positive and false positive rates [2]. In addition, the standard deviation (SD) of the AUC obtained over the repetitions and the training time are registered.
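The evaluation protocol can be sketched as follows; the fit/anomaly_score interface is a hypothetical stand-in for the three classifiers of Sect. 3.1, and higher scores are assumed to mean more anomalous.

import numpy as np
from sklearn.model_selection import RepeatedKFold
from sklearn.metrics import roc_auc_score

def evaluate(clf_factory, X, y, k=10, repeats=3, seed=0):
    """X holds all samples; y marks the artificially generated anomalies with 1."""
    aucs = []
    for train_idx, test_idx in RepeatedKFold(n_splits=k, n_repeats=repeats,
                                             random_state=seed).split(X):
        clf = clf_factory()
        clf.fit(X[train_idx][y[train_idx] == 0])    # train on normal samples only
        scores = clf.anomaly_score(X[test_idx])     # hypothetical scoring method
        aucs.append(roc_auc_score(y[test_idx], scores))
    return np.mean(aucs), np.std(aucs)              # AUC and its SD, as in the tables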

4.1 Approximate Convex Hull Classifier

The performance of this technique was evaluated using 5, 10, 50, 100, 500 and 1000 2D projections with λ values of 0.9, 1 and 1.1. Given the geometric nature of this technique, the dataset was not normalised. The AUC, the SD and the training time for each configuration can be seen in Table 1. The results obtained with λ = 0.9 were not successful and are omitted from the table due to the poor performance of this configuration.

Table 1. Results obtained with ACH classifier

Parameter λ  Projections  AUC (%)  SD (%)  Time (min)
1            5            97.09    0.13    0.005
             10           98.34    0.16    0.013
             50           98.90    0.21    0.058
             100          98.74    0.32    0.109
             500          98.35    0.34    0.560
             1000         98.40    0.42    1.114
1.1          5            93.51    0.02    0.004
             10           96.19    0.00    0.007
             50           98.67    0.07    0.065
             100          99.00    0.08    0.103
             500          99.25    0.08    0.563
             1000         99.20    0.08    1.116

4.2 Artificial Neural Network Autoencoder Classifier

The MLP Autoencoder was obtained using the Matlab function trainAutoencoder [25]. As mentioned in Sect. 3.1, the number of neurons in the hidden layer must be less than the number of inputs. Then, to check the importance of this parameter, it was varied from 1 to 9 in steps of 2 neurons. Different configurations were tried regarding normalisation: first a 0 to 1 normalisation, then z-score [35], and the results without normalisation were also checked. The criterion to decide whether a test sample is an anomaly is based on the reconstruction error: if the reconstruction error is higher than the one obtained with 99% of the training set, the sample is considered an anomaly. The results in terms of AUC, SD and training cost for each configuration are shown in Table 2.

4.3 SVM Classifier

The SVM one-class classifier was obtained using the Matlab function fitcsvm [26]. The kernel function was set to Gaussian, and the outlier fraction of the training data was varied from 0 to 3%. The influence of the normalisation methods was assessed as for the Autoencoder. As mentioned in Sect. 3.1, the criterion to decide whether a test sample is an anomaly is based on its distance to the decision plane, which is evaluated with the Matlab function predict [27]. The results are presented in Table 3.

Table 2. Results obtained with autoencoder classifier

Normalisation  Number of neurons  AUC (%)  SD (%)  Time (min)
NoNorm         1                  80.82    0.60    124.050
               3                  96.13    0.32    210.016
               5                  97.20    0.85    225.571
               7                  97.49    0.25    224.424
               9                  97.56    0.25    259.265
0 to 1         1                  87.27    0.20    0.341
               3                  88.50    0.21    1.520
               5                  88.34    0.16    1.234
               7                  88.49    0.28    1.428
               9                  88.35    0.18    1.261
Z-score        1                  91.42    0.24    1.168
               3                  93.28    0.44    10.527
               5                  94.55    0.82    8.285
               7                  95.03    0.29    10.175
               9                  95.03    0.22    11.772

Table 3. Results obtained with SVM

Normalisation  Outlier frac. (%)  AUC (%)  SD (%)  Time (min)
NoNorm         0                  95.82    0.14    2.755
               1                  95.32    0.35    2.580
               2                  94.72    0.54    2.631
               3                  94.53    0.54    2.621
0 to 1         0                  95.79    0.15    2.571
               1                  95.39    0.33    2.613
               2                  94.83    0.60    2.582
               3                  94.53    0.60    2.598
Z-score        0                  95.83    0.13    2.598
               1                  95.42    0.36    2.616
               2                  94.90    0.53    2.586
               3                  94.52    0.56    2.662

5 Conclusions and Future Works


This work assesses artificial anomaly generation in an industrial plant used to obtain the material for wind generator blades, with the aim of checking fault detection methods. The proposal to generate anomalies shows very good results, and it proves to be an extremely useful method to generate different anomaly situations that help check the performance of different one-class techniques. The implementation of the proposed classification techniques in an industrial process would allow its manager to detect slight variations in it. Hence, this information can be used for different tasks, such as predictive maintenance planning or corrective maintenance management, among others.
The generated outlier data were used to validate the different one-class classifiers. The best AUC value (99.25%) is achieved using the Approximate Convex Hull with λ = 1.1 and 500 projections. However, the computational cost must be taken into consideration: this technique achieves almost the same AUC (98.90%) with λ = 1 and 50 projections at a significantly lower computational cost (0.058 min). The Autoencoder offers good AUC results in general terms; however, the kind of normalisation has a high influence on the training time. In addition, an increase in the number of neurons in the hidden layer leads to better AUC performance and a higher computational cost. Finally, the results obtained with SVM show good performance regardless of the kind of normalisation or the outlier fraction.
Taking into account the results of the three techniques, the Approximate
Convex Hull with 50 projections and λ = 1 is chosen as the optimal solution.
As future work, the use of different techniques to generate anomalies can be considered. Since the plant evolves during normal operation, a re-training process can be performed to update the classifiers. The possibility of applying clustering algorithms to the initial dataset could also be taken into consideration. To reduce the training time, the correlation between variables can be assessed using dimensionality reduction [31,34]. The proposed one-class classifiers detect anomalies in the system with good performance; however, the source of the anomaly is not identified concretely. Then, in future work, a specific one-class classifier could be implemented for each point of the facility.

References
1. Alaiz Moretón, H., Calvo Rolle, J., Garcı́a, I., Alonso Alvarez, A.: Formalization
and practical implementation of a conceptual model for PID controller tuning.
Asian J. Control 13(6), 773–784 (2011)
2. Bradley, A.P.: The use of the area under the ROC curve in the evaluation of
machine learning algorithms. Pattern Recognit. 30(7), 1145–1159 (1997)
3. Calvo-Rolle, J.L., Quintian-Pardo, H., Corchado, E., del Carmen Meizoso-López,
M., Garcı́a, R.F.: Simplified method based on an intelligent model to obtain the
extinction angle of the current for a single-phase half wave controlled rectifier with
resistive and inductive load. J. Appl. Log. 13(1), 37–47 (2015)
4. Casale, P., Pujol, O., Radeva, P.: Approximate convex hulls family for one-class
classification. In: International Workshop on Multiple Classifier Systems, pp. 106–
115. Springer (2011)
5. Casteleiro-Roca, J.L., Barragán, A.J., Segura, F., Calvo-Rolle, J.L., Andújar, J.M.:
Fuel cell output current prediction with a hybrid intelligent system. Complexity
(2019)
6. Casteleiro-Roca, J.L., Jove, E., Sánchez-Lasheras, F., Méndez-Pérez, J.A., Calvo-
Rolle, J.L., de Cos Juez, F.J.: Power cell soc modelling for intelligent virtual sensor
implementation. J. Sens. (2017)
7. Chandola, V., Banerjee, A., Kumar, V.: Anomaly detection: a survey. ACM Com-
put. Surv. (CSUR) 41(3), 15 (2009)
8. Chen, Y., Zhou, X.S., Huang, T.S.: One-class SVM for learning in image retrieval.
In: Proceedings of 2001 International Conference on Image Processing, vol. 1, pp.
34–37. IEEE (2001)
9. Chiang, L.H., Russell, E.L., Braatz, R.D.: Fault Detection and Diagnosis in Indus-
trial Systems. Springer, Heidelberg (2000)
10. Fan, H., Wong, C., Yuen, M.F.: Prediction of material properties of epoxy materials
using molecular dynamic simulation. In: 7th International Conference on Thermal,
Mechanical and Multiphysics Simulation and Experiments in Micro-Electronics
and Micro-Systems, EuroSime 2006, pp. 1–4, April 2006
11. Fernández-Francos, D., Fontenla-Romero, Ó., Alonso-Betanzos, A.: One-class con-
vex hull-based algorithm for classification in distributed environments. IEEE Trans.
Syst. Man Cybern.: Syst. 99, 1–11 (2018)
12. Garcia, R.F., Rolle, J.L.C., Castelo, J.P., Gomez, M.R.: On the monitoring task of
solar thermal fluid transfer systems using NN based models and rule based tech-
niques. Eng. Appl. Artif. Intell. 27, 129 – 136 (2014). http://www.sciencedirect.
com/science/article/pii/S0952197613001127
13. González, G., Angelo, C.D., Forchetti, D., Aligia, D.: Diagnóstico de fallas en el
convertidor del rotor en generadores de inducción con rotor bobinado. Revista
Iberoamericana de Automática e Informática industrial 15(3), 297–308 (2018).
https://polipapers.upv.es/index.php/RIAI/article/view/9042
14. Goodfellow, I., Bengio, Y., Courville, A., Bengio, Y.: Deep Learning, vol. 1. MIT
Press, Cambridge (2016)
15. Hobday, M.: Product complexity, innovation and industrial organisation. Res. Pol-
icy 26(6), 689–710 (1998)
16. Hodge, V., Austin, J.: A survey of outlier detection methodologies. Artif. Intell.
Rev. 22(2), 85–126 (2004)
17. Jove, E., Aláiz-Moretón, H., Casteleiro-Roca, J.L., Corchado, E., Calvo-Rolle, J.L.:
Modeling of bicomponent mixing system used in the manufacture of wind generator
blades. In: Corchado, E., Lozano, J.A., Quintián, H., Yin, H. (eds.) Intelligent Data
Engineering and Automated Learning - IDEAL 2014, pp. 275–285. Springer, Cham
(2014)
18. Jove, E., Alaiz-Moretón, H., Garcı́a-Rodrı́guez, I., Benavides-Cuellar, C.,
Casteleiro-Roca, J.L., Calvo-Rolle, J.L.: PID-ITS: an intelligent tutoring system
for PID tuning learning process. In: International Joint Conference SOCO 2017-
CISIS 2017-ICEUTE 2017, León, Spain, 6–8 September 2017, pp. 726–735. Springer
(2017)
19. Jove, E., Antonio Lopez-Vazquez, J., Isabel Fernandez-Ibanez, M., Casteleiro-
Roca, J.L., Luis Calvo-Rolle, J.: Hybrid intelligent system to predict the individual
academic performance of engineering students. Int. J. Eng. Educ. 34(3), 895–904
(2018)
20. Jove, E., Casteleiro-Roca, J.L., Quintián, H., Méndez-Pérez, J.A., Calvo-Rolle,
J.L.: A new approach for system malfunctioning over an industrial system control
loop based on unsupervised techniques. In: Graña, M., López-Guede, J.M., Etxaniz,
O., Herrero, Á., Sáez, J.A., Quintián, H., Corchado, E. (eds.) International Joint
Conference SOCO’18-CISIS’18-ICEUTE’18, pp. 415–425. Springer, Cham (2018)
21. Jove, E., Gonzalez-Cava, J.M., Casteleiro-Roca, J.L., Méndez-Pérez, J.A., Antonio
Reboso-Morales, J., Javier Pérez-Castelo, F., Javier de Cos Juez, F., Luis Calvo-
Rolle, J.: Modelling the hypnotic patient response in general anaesthesia using
intelligent models. Log. J. IGPL 27, 189–201 (2018)
22. Moreno-Fernandez-de Leceta, A., Lopez-Guede, J.M., Ezquerro Insagurbe, L.,
Ruiz de Arbulo, N., Graña, M.: A novel methodology for clinical semantic anno-
tations assessment. Log. J. IGPL 26(6), 569–580 (2018). https://doi.org/10.1093/
jigpal/jzy021
23. Li, K.L., Huang, H.K., Tian, S.F., Xu, W.: Improving one-class SVM for anomaly
detection. In: 2003 International Conference on Machine Learning and Cybernetics,
vol. 5, pp. 3077–3081. IEEE (2003)
24. Manuel Vilar-Martinez, X., Aurelio Montero-Sousa, J., Luis Calvo-Rolle, J., Luis
Casteleiro-Roca, J.: Expert system development to assist on the verification of
”TACAN” system performance. Dyna 89(1), 112–121 (2014)
25. MathWorks: Autoencoder. https://es.mathworks.com/help/deeplearning/ref/
trainautoencoder.html. Accessed 29 Jan 2019
26. MathWorks: fitcsvm. https://es.mathworks.com/help/stats/fitcsvm.html.
Accessed 29 Jan 2019
27. MathWorks: predict. https://es.mathworks.com/help/stats/classreg.learning.
classif.compactclassificationsvm.predict.html. Accessed 29 Jan 2019
28. Miljković, D.: Fault detection methods: a literature survey. In: 2011 Proceedings
of the 34th International Convention on MIPRO, pp. 750–755. IEEE (2011)
29. Pei, Y., Zaïane, O.: A synthetic data generator for clustering and outlier analysis. Department of Computing Science, University of Alberta, Edmonton, AB, Canada (2006)
30. Quintián, H., Casteleiro-Roca, J.L., Perez-Castelo, F.J., Calvo-Rolle, J.L., Cor-
chado, E.: Hybrid intelligent model for fault detection of a lithium iron phosphate
power cell used in electric vehicles. In: International Conference on Hybrid Artifi-
cial Intelligence Systems, pp. 751–762. Springer (2016)
31. Quintián, H., Corchado, E.: Beta scale invariant map. Eng. Appl. Artif. Intell. 59,
218–235 (2017)
32. Rebentrost, P., Mohseni, M., Lloyd, S.: Quantum support vector machine for big
data classification. Phys. Rev. Lett. 113, 130503 (2014). https://doi.org/10.1103/
PhysRevLett.113.130503
33. Sakurada, M., Yairi, T.: Anomaly detection using autoencoders with nonlinear
dimensionality reduction. In: Proceedings of the MLSDA 2014 2nd Workshop on
Machine Learning for Sensory Data Analysis, p. 4. ACM (2014)
34. Segovia, F., Górriz, J.M., Ramı́rez, J., Martinez-Murcia, F.J., Garcı́a-Pérez, M.:
Using deep neural networks along with dimensionality reduction techniques to
assist the diagnosis of neurodegenerative disorders. Log. J. IGPL 26(6), 618–628
(2018). https://doi.org/10.1093/jigpal/jzy026
35. Shalabi, L.A., Shaaban, Z.: Normalization as a preprocessing engine for data min-
ing and the approach of preference matrix. In: 2006 International Conference on
Dependability of Computer Systems, pp. 207–214, May 2006
36. Vincent, P., Larochelle, H., Lajoie, I., Bengio, Y., Manzagol, P.A.: Stacked denois-
ing autoencoders: learning useful representations in a deep network with a local
denoising criterion. J. Mach. Learn. Res. 11(Dec), 3371–3408 (2010)
37. Wang, C.K., Ting, Y., Liu, Y.H., Hariyanto, G.: A novel approach to generate
artificial outliers for support vector data description. In: IEEE International Sym-
posium on Industrial Electronics, ISIE 2009, pp. 2202–2207. IEEE (2009)
38. Wojciechowski, S.: A comparison of classification strategies in rule-based classifiers.
Log. J. IGPL 26(1), 29–46 (2018). https://doi.org/10.1093/jigpal/jzx05
39. Zeng, Z., Wang, J.: Advances in Neural Network Research and Applications, 1st edn. Springer Publishing Company, Incorporated, Heidelberg (2010)
40. Zotes, F.A., Peñas, M.S.: Heuristic optimization of interplanetary trajectories in
aerospace missions. Revista Iberoamericana de Automática e Informática Indus-
trial RIAI 14(1), 1–15 (2017). http://www.sciencedirect.com/science/article/pii/
S1697791216300486
