
Electric Power Systems Research 74 (2005) 1–7

Fault diagnosis of power transformer based on multi-layer


SVM classifier夽
L.V. Ganyun ∗ , Cheng Haozhong 1 , Zhai Haibao, Dong Lixin
Department of Electrical Engineering, Shanghai Jiaotong University, Shanghai 200030, China

Received 17 January 2004; received in revised form 13 June 2004; accepted 6 July 2004

Abstract

Support vector machine (SVM) is a novel machine learning method based on statistical learning theory (SLT). SVM is well suited to problems with small sample sizes, nonlinearity and high dimensionality. In this paper, a multi-layer SVM classifier is applied to fault diagnosis of power transformers for the first time. The contents of five diagnostic gases dissolved in oil, obtained by dissolved gas analysis (DGA), are preprocessed through a special data-processing step, and six features are extracted for the SVMs. The multi-layer SVM classifier is then trained with training samples obtained by the same data processing. Finally, four transformer fault types are identified by the trained classifier. The test results show that the classifier has an excellent performance in training speed and reliability.
© 2004 Published by Elsevier B.V.

Keywords: Fault diagnosis; Multi-layer SVM classifier; Power transformer; Reliability

1. Introduction

Power transformers, as expensive items, need to be carefully monitored throughout their operation. Their fault diagnosis is important for the safety of the device and the relevant power system. Studies over the past decades have proved that the gases dissolved in oil are closely related to internal faults. Dissolved gas analysis (DGA) has gained worldwide acceptance as a diagnostic method for the detection of transformer internal faults [1,2]. Fault gases are produced by degradation of the transformer oil and of the solid insulating materials, such as paper, pressboard and transformer board, which are all made of cellulose. The rate of cellulose and oil degradation is significantly increased in the presence of a fault inside the transformer. Over the past years, various fault diagnosis techniques have been proposed, including the conventional key gas method, the ratio method [3] and, more recently, artificial intelligence (AI) methods such as expert systems (ES) [4,5], fuzzy logic (FL) [6,7] and artificial neural networks (ANN) [8]; combinations of these methods have given promising results [9–17]. The conventional key gas and ratio methods are based on experience in fault diagnosis using DGA data, which may vary from utility to utility due to the heuristic nature of the methods and the fact that no general mathematical formulation can be utilized. The ES and FL approaches can take DGA standards and other human expertise to form a decision-making system. Information such as the influence of transformer size, manufacturer, volume of oil, gassing rates and the history of diagnosis results can be utilized. However, there are some intrinsic shortcomings, for example the difficulty of acquiring knowledge and maintaining a database; both methods need a large knowledge base that must be constructed manually. To overcome these shortcomings, one solution obtains an optimized result through an evolutionary algorithm [9], and a novel evolution-computation-enhanced method was proposed to improve the fuzzy diagnosis capabilities [10]. The traditional ANN method can directly acquire experience from

夽 The program code of this paper is freely available to interested researchers.
∗ Corresponding author. Tel.: +86 2162932405.
E-mail addresses: stmc17@sina.com (L.V. Ganyun), chenghz@online.sh.cn (C. Haozhong).
1 Tel.: +86 2154742813.
0378-7796/$ – see front matter © 2004 Published by Elsevier B.V.


doi:10.1016/j.epsr.2004.07.008

the training data and overcome some of the shortcomings of the expert system. However, it suffers from a number of weaknesses, including the need for a large number of controlling parameters, difficulty in obtaining a stable solution and the danger of over-fitting. To overcome the drawbacks of traditional neural networks, a new type of neural network based on extension theory has been proposed for incipient fault diagnosis of power transformers [11]. As the ANN, ES and FL approaches each have their advantages and disadvantages, hybrid artificial intelligence approaches are also under consideration; their disadvantages could be overcome by connecting ES, FL, ANN and evolutionary algorithms (EA) as a whole [12–17].

Recently, the support vector machine (SVM), a novel network algorithm originally developed by Vapnik and Cortes [18], has emerged as a powerful tool for data analysis. It is well suited to problems with small sample sizes, nonlinearity and high dimensionality. It provides a unique solution and is a strongly regularized method appropriate for ill-posed problems. SVM has been widely used in many applications, such as face recognition [19], time series forecasting [20], fault detection [21,22] and the modeling of nonlinear dynamic systems [23].

In this paper, SVM is introduced into fault diagnosis of power transformers for the first time, and a multi-layer SVM classifier is presented for classifying some familiar transformer fault types. The structure of this paper is as follows. Section 2 gives a short review of SVM. Fault diagnosis of power transformers based on the multi-layer SVM classifier is analyzed in Section 3. Application examples are included in Section 4. Finally, conclusions are drawn in Section 5.

2. Review of SVM

The support vector machine was originally introduced by Vapnik and co-workers in the mid-1990s. While traditional statistical theory is based on empirical risk minimization (ERM), SVM follows structural risk minimization (SRM) based on statistical learning theory (SLT), so its decision rule can still achieve a small error on independent test samples. SVM has two main classes of application, classification and regression; in this paper, classification is discussed.

The classification problem can be restricted to consideration of the two-class problem without loss of generality. The goal is to separate the two classes by a function (a classifier) that is induced from available examples and works well on unseen examples, i.e. it generalizes well. Consider the example in Fig. 1. There are many possible linear classifiers that can separate the data, but there is only one that maximizes the margin (the distance between the classifier and the nearest data point of each class). This linear classifier is termed the optimal separating hyperplane. Intuitively, we would expect this boundary to generalize well as opposed to the other possible boundaries.

Fig. 1. Optimal separating hyperplane.

In most instances, problems cannot be separated linearly; the nonlinear classification algorithm is reviewed in this section. Given a set of training data

D = {(x_1, y_1), ..., (x_i, y_i), ..., (x_l, y_l)},  x ∈ R^n, y ∈ {−1, 1}   (1)

where x_i is the training data, l is the number of training data and y_i is the class label (1 or −1) for x_i.

Firstly, a nonlinear function is employed to map the original input space R^n to an N-dimensional feature space,

ϕ(x) = (ϕ_1(x), ϕ_2(x), ..., ϕ_N(x))   (2)

Then the separating hyperplane is constructed in this high-dimensional feature space. The classifier takes the form

y(x) = sgn(w · ϕ(x) + b)   (3)

where w is the weight vector and b is a scalar.

To obtain the optimal classifier, ||w|| should be minimized under the following constraints

y_i[ϕ(x_i) · w + b] ≥ 1 − ξ_i,  i = 1, 2, ..., l   (4)

The variables ξ_i are positive slack variables, which are necessary to allow misclassification.

Thus, according to the principle of structural risk minimization, the optimization problem can be formulated as minimization of the following objective function J:

min J(w, ξ) = (1/2)||w||^2 + C Σ_{i=1}^{l} ξ_i   (5)
s.t. y_i[ϕ(x_i) · w + b] ≥ 1 − ξ_i,  ξ_i ≥ 0,  i = 1, ..., l

where C is the margin parameter.

According to the Lagrangian principle, the above problem can be transformed into its corresponding form as follows
L(w, b, ξ, a, γ) = (1/2)||w||^2 + C Σ_{i=1}^{l} ξ_i − Σ_{i=1}^{l} a_i (y_i[ϕ(x_i) · w + b] − 1 + ξ_i) − Σ_{i=1}^{l} γ_i ξ_i   (6)

where a_i ≥ 0, γ_i ≥ 0 (i = 1, 2, ..., l) are Lagrange multipliers. According to the conditions of optimality

∂L/∂w = 0,  ∂L/∂b = 0,  ∂L/∂ξ = 0   (7)

we have the following equations:

Σ_{i=1}^{l} a_i y_i = 0   (8)

w = Σ_{i=1}^{l} a_i y_i ϕ(x_i)   (9)

C − a_i − γ_i = 0   (10)

Hence, from Eqs. (6) and (8)–(10), the dual problem is

max W(a) = −(1/2) Σ_{i,j=1}^{l} a_i a_j y_i y_j (ϕ(x_i), ϕ(x_j)) + Σ_{i=1}^{l} a_i   (11)

By defining the kernel function

K(x_i, x_j) = (ϕ(x_i), ϕ(x_j))   (12)

where K(x_i, x_j) is a symmetric positive definite function in the original input space R^n satisfying the Mercer condition [24], the optimization problem can be rewritten as follows

max W(a) = −(1/2) Σ_{i,j=1}^{l} a_i a_j y_i y_j K(x_i, x_j) + Σ_{i=1}^{l} a_i   (13)
s.t. Σ_{i=1}^{l} a_i y_i = 0,  0 ≤ a_i ≤ C,  i = 1, ..., l

If a_i > 0, the corresponding x_i is called a support vector. In general, support vectors are only a small part of the training samples. Eq. (13) is a quadratic programming problem constrained on a convex set, and its solution is unique.

Finally, the optimal separating hyperplane is obtained as

Σ_SV a_i y_i K(x_i, x) + b = 0   (14)

where SV denotes the set of support vectors. The nonlinear classifier is then

y = sgn(Σ_SV a_i y_i K(x_i, x) + b)   (15)

This decision function is the so-called SVM.

From the above analysis, it can be concluded that an SVM is determined by the training samples and the kernel function. The construction and selection of the kernel function is important to SVM, but in practice the kernel function is often given directly. Some common kernel functions are as follows:

(1) The polynomial kernel function has two forms:

K(x, x′) = ⟨x, x′⟩^d   (16)

K(x, x′) = (⟨x, x′⟩ + 1)^d   (17)

where d is the degree of the polynomial.

(2) The Gaussian radial basis function is a universal kernel function:

K(x, x′) = exp(−||x − x′||^2 / (2σ^2))   (18)

(3) B-splines are another popular formulation. The kernel is defined on the interval [−1, 1] and has an attractive closed form,

K(x, x′) = B_{2N+1}(x − x′)   (19)

where B_{2N+1} is the uniform B-spline of order 2N + 1.

(4) More complicated kernels can be obtained by summing and multiplying kernels,

K(x, x′) = Σ_i K_i(x, x′)   (20)

K(x, x′) = Π_i K_i(x, x′)   (21)

where each K_i(x, x′) is a kernel function.

The mechanism of the SVM classifier is shown in Fig. 2.

Fig. 2. The mechanism of SVM classifier.

From the research results of [19–22], it can be concluded that SVM has some potential advantages, which are listed below.
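For concreteness, the kernels of Eqs. (16)–(21) can be written directly as code. The sketch below is illustrative only (the function names are ours, not from the paper); it implements the polynomial, Gaussian RBF, sum and product kernels, and checks the symmetry that any Mercer kernel must satisfy.

```python
import numpy as np

def poly_kernel(x, xp, d=2, inhomogeneous=False):
    """Polynomial kernel: Eq. (16) <x, x'>^d, or Eq. (17) (<x, x'> + 1)^d."""
    dot = np.dot(x, xp)
    return (dot + 1.0) ** d if inhomogeneous else dot ** d

def rbf_kernel(x, xp, sigma=1.0):
    """Gaussian RBF kernel, Eq. (18): exp(-||x - x'||^2 / (2 sigma^2))."""
    diff = np.asarray(x) - np.asarray(xp)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

def sum_kernel(kernels):
    """Eq. (20): a sum of kernels is again a kernel."""
    return lambda x, xp: sum(k(x, xp) for k in kernels)

def product_kernel(kernels):
    """Eq. (21): a product of kernels is again a kernel."""
    return lambda x, xp: float(np.prod([k(x, xp) for k in kernels]))

x, xp = np.array([1.0, 2.0]), np.array([0.5, -1.0])
combined = sum_kernel([rbf_kernel, poly_kernel])

print(rbf_kernel(x, x))                    # 1.0 for any point with itself
print(combined(x, xp) == combined(xp, x))  # True: symmetry (Mercer property)
```

Scikit-learn's `SVC` also accepts such callables (in Gram-matrix form) through its `kernel` parameter, so custom combinations like Eq. (20) can be used for training as well.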
(1) It has a clear concept, a solid theoretical basis and a simple structure.
(2) It is a strongly regularized method, which is appropriate for ill-posed problems.
(3) It provides a unique solution and has a high training speed.

3. Fault diagnosis of power transformer based on multi-layer SVM classifier

The procedure of transformer fault diagnosis based on the multi-layer SVM classifier includes three steps: extracting features that carry the fault information of the power transformer, training the SVMs, and identifying transformer faults with the trained classifier. Four transformer states are to be identified: normal state, thermal heating, low-energy discharge and high-energy discharge.

3.1. Extracting features

The analytical basis of the diagnosis is the diagnostic gas content obtained by DGA; this content information reflects the state of the transformer. The diagnostic gases are H2, CH4, C2H6, C2H4 and C2H2. To improve the effect of diagnosis, the contents of these diagnostic gases are preprocessed through a special data-processing step, and six features for fault diagnosis are extracted for the SVMs as follows:

(1) The relative content of the characteristic gases. In each sample, five features y_1, y_2, y_3, y_4 and y_5 are obtained as

y_k = c_k / max_{i=1,...,5}(c_i)   (22)

where k = 1, 2, ..., 5 and c_k is the absolute content of the k-th of the five diagnostic gases in one sample.

(2) Absolute information of the sample, obtained by the following formula:

y_6 = log10(max_{m=1,...,5} c_m)   (23)

where c_m is as in Eq. (22).

Thus, six features for fault diagnosis are extracted, and they compose a feature vector Y = [y_1 y_2 y_3 y_4 y_5 y_6]^T for the SVMs. It reflects the information of the four identified transformer fault types.

3.2. Training the networks of SVMs

Based on the characteristics of the different transformer fault types, three SVMs are developed to identify the four states: normal state, thermal heating, low-energy discharge and high-energy discharge. With all training samples of the four types, the first SVM (SVM1) is trained to separate the normal state from the three fault types (thermal heating, low-energy discharge and high-energy discharge): when the input is a normal-state sample, the output of SVM1 is set to −1, otherwise +1. With the samples of thermal heating and of low- and high-energy discharge, the second SVM (SVM2) is trained to separate thermal heating from the discharge fault types: when the input is a thermal heating sample, the output of SVM2 is set to −1, otherwise +1. With the samples of low- and high-energy discharge, the third SVM (SVM3) is trained to separate them: when the input is a low-energy discharge sample, the output of SVM3 is set to −1, otherwise +1. Thus, the multi-layer SVM classifier is obtained. The basic principle of power transformer fault diagnosis based on the multi-layer SVM classifier is shown in Fig. 3.

3.3. Testing with trained networks

Testing starts with constructing the testing samples, which are obtained by the above feature extraction procedure from the original DGA data. The testing samples are then input to the multi-layer SVM classifier, and a three-layer decision is made from its outputs. With the outputs of SVM1, the normal state is distinguished from the other three fault types. The testing samples of those three fault types are then input to SVM2, whose outputs distinguish thermal heating from discharge. The testing samples of low- and high-energy discharge are then input to SVM3, whose outputs distinguish low-energy discharge from high-energy discharge. Thus, the trained multi-layer SVM classifier identifies the four fault types after three identification stages.

Fig. 3. The scheme of power transformer fault diagnosis based on multi-layer SVM classifier.
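The feature extraction of Section 3.1 is compact enough to state as code. The sketch below is our reading of Eqs. (22) and (23), not the authors' released program:

```python
import numpy as np

def extract_features(gases_ppm):
    """Build the 6-feature vector Y of Section 3.1 from the five
    diagnostic gas contents [H2, CH4, C2H6, C2H4, C2H2] in ppm."""
    c = np.asarray(gases_ppm, dtype=float)
    c_max = c.max()
    y_rel = c / c_max            # Eq. (22): relative contents y1..y5
    y6 = np.log10(c_max)         # Eq. (23): absolute-level information
    return np.append(y_rel, y6)  # Y = [y1 y2 y3 y4 y5 y6]^T

# First thermal-heating record of Table 1: H2=280, CH4=1500, C2H6=150,
# C2H4=1200, C2H2=140 (ppm).
Y = extract_features([280, 1500, 150, 1200, 140])
print(Y)  # CH4 dominates, so y2 = 1.0 and y6 = log10(1500)
```

A zero maximum would make both equations undefined; the records in Tables 1 and 2 always contain at least one nonzero gas, so no guard is added in this sketch.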
4. Experimental results

Fifty historical records of a 500 kV main transformer, located at Pingguo Substation of South China Electric Power Company, are adopted for training the multi-layer SVM classifier. The data include 25 samples of thermal heating, 15 samples of high-energy discharge, 5 samples of normal state and 5 samples of low-energy discharge. The original gas-content data obtained by DGA are shown in Table 1.

All three SVMs of the classifier adopt the Gaussian radial basis function as their kernel, with the parameter σ of the function set to 1; the parameter C in Eq. (5) is set to 100. SVM1 is trained with the 50 training samples; SVM2 with the samples of thermal heating and of low- and high-energy discharge; SVM3 with the samples of low- and high-energy discharge. Thus, the three-layer SVM classifier is obtained.

Another 25 historical records of the power transformer, including 13 thermal heating samples, 2 high-energy discharge samples, 4 normal samples and 6 low-energy discharge samples, are adopted for testing the three-layer SVM classifier. It should be pointed out that the 50 training samples and 25 testing samples were collected randomly from the historical data. The original gas-content data obtained by DGA are shown in Table 2.

The SVM classifier program was run on a Pentium-3 PC (733 MHz, 128 MB memory). The testing results are as follows.

Testing samples of SVM1 = [13 thermal heating; 2 high-energy discharge; 4 normal state; 6 low-energy discharge]
Output of SVM1:
Columns 1 through 15:  1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
Columns 16 through 25: −1 −1 −1 −1 1 1 1 1 1 1

With the above output, the 4 normal samples are distinguished from the 21 samples of the other three fault types. Those 21 samples are input to SVM2.

Testing samples of SVM2 = [13 thermal heating; 2 high-energy discharge; 6 low-energy discharge]
Output of SVM2:
Columns 1 through 15:  −1 −1 −1 −1 −1 −1 −1 −1 −1 −1 −1 −1 −1 1 1
Columns 16 through 21: 1 1 1 1 1 1
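The cascade just described (three Gaussian-RBF SVMs with σ = 1 and C = 100) can be sketched with scikit-learn, whose `gamma` equals 1/(2σ²) = 0.5 under Eq. (18). The feature vectors below are synthetic stand-ins for the Table 1 records, and all helper names are ours:

```python
import numpy as np
from sklearn.svm import SVC

# Labels: 0 = normal, 1 = thermal heating, 2 = low-energy discharge,
# 3 = high-energy discharge. Synthetic 6-dimensional feature vectors.
rng = np.random.default_rng(1)
centers = np.eye(4, 6)               # four well-separated class centers
labels = np.repeat([0, 1, 2, 3], 10)
X = centers[labels] + rng.normal(0.0, 0.05, (40, 6))

def fit_svm(X, labels, neg_classes):
    """One binary SVM of the cascade: -1 for neg_classes, +1 for the rest."""
    y = np.where(np.isin(labels, neg_classes), -1, 1)
    return SVC(kernel="rbf", gamma=0.5, C=100.0).fit(X, y)

svm1 = fit_svm(X, labels, [0])              # normal vs. the three faults
m2 = labels != 0
svm2 = fit_svm(X[m2], labels[m2], [1])      # thermal vs. discharge
m3 = np.isin(labels, [2, 3])
svm3 = fit_svm(X[m3], labels[m3], [2])      # low- vs. high-energy discharge

def diagnose(x):
    """Three-layer decision of Section 3.3 for one feature vector."""
    x = np.asarray(x).reshape(1, -1)
    if svm1.predict(x)[0] == -1:
        return "normal"
    if svm2.predict(x)[0] == -1:
        return "thermal heating"
    return ("low-energy discharge" if svm3.predict(x)[0] == -1
            else "high-energy discharge")

print([diagnose(c) for c in centers])
```

Each of the three SVMs sees only the subset of training samples relevant to its decision, mirroring Section 3.2.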
Table 1
The original gas-content data of the transformer used for training the classifier

Transformer state: diagnostic gas content (ppm), one sample per bracket

High-energy discharge (15 samples):
[40.8 3 3.6 3.5 7.4] [285 36.3 75 8.4 133] [42 62 5 63 73] [260 130 29 84 92]
[23 12 12 10 61] [528 3179 320 3020 2314] [60 40 9.9 110 70] [250 63 3.8 66 120]
[57 13 0.1 11 12] [335 67 18 143 170] [240 28 6 26 85] [59 7.1 19 4.5 71]
[31 6.6 19 4.7 67] [44 12.2 3.4 3.2 17.4] [466.5 148.8 13 266 511]

Low-energy discharge (5 samples):
[650 53 34 20 0] [1565 93 34 47 0] [160 90 27 17 5.8] [35 25 0 23 22]
[0 5.2 5.12 9.58 14.6]

Thermal heating (25 samples):
[280 1500 150 1200 140] [1400 3000 560 3500 4] [17000 110000 84000 89000 16000] [249 726 278 938 0]
[228 380 82 1012 19] [47 106 28.7 242.2 6.35] [81 130 74 230 2.9] [170 330 77 340 13]
[50 90 18 260 5.9] [220 340 42 480 14] [130 440 180 730 0] [170 320 53 520 3.2]
[48 230 160 810 7] [380 190 30 280 22] [70 69 29 241 10] [11 88 83 250 8.5]
[90 160 54 330 29] [30 62 60 460 3.4] [130 440 180 730 0] [1000 4300 1100 5400 24]
[168 1353 581 3281 63] [770 1420 401 1452 3] [3606.4 1182 328.4 1604.8 6.3] [613 3240 1432 2788 0]
[577 3541 521 2928 7]

Normal state (5 samples):
[10 4 3 33 6] [8.5 7.2 4.3 3.9 3.5] [32 31 7.5 50 1.1] [13.5 1.7 1.2 0.6 0]
[9.87 2.49 0.79 4.0 64.8]

Note: the data in each bracket are the contents of H2, CH4, C2H6, C2H4 and C2H2, respectively.

Table 2
The original gas-content data of the transformer used for testing the classifier

Transformer state: diagnostic gas content (ppm), one sample per bracket

Low-energy discharge (6 samples):
[35 25 0 23 22] [160 90 27 17 5] [565 93 34 47 0] [150 53 34 20 0]
[980 73 58 12 0] [176 206 47.7 75.7 68.7]

High-energy discharge (2 samples):
[293 50 13 115 120] [443 85 9.5 103 174]

Thermal heating (13 samples):
[73 520 140 1200 6] [42 97 157 600 0] [766 993 116 665 4] [16 237 92 470 0]
[15 125 29 574 7] [120 120 33 84 0.55] [5 217 69 523 6] [0 434 226 387 0]
[2844 8517 4422 10196 39] [117 357 92 468 4] [80 153 42 276 18] [86 110 18 92 7.4]
[8 631 254 2020 39]

Normal state (4 samples):
[10 4 3 33 6] [14.7 3.8 10.5 2.7 0.2] [6.7 10 11 71 3.9] [0.33 0.26 0.04 0.27 0]

Note: the data in each bracket are the contents of H2, CH4, C2H6, C2H4 and C2H2, respectively.
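Section 4 additionally checks robustness by perturbing each extracted test feature with 5% white noise, y′_k = y_k(1 + 0.05·rand(1)). A minimal sketch of that perturbation, assuming rand(1) draws uniformly from [0, 1):

```python
import numpy as np

rng = np.random.default_rng(42)

def add_noise(Y, level=0.05):
    """Multiplicative perturbation y_k' = y_k * (1 + level * rand),
    with rand drawn uniformly from [0, 1) (our reading of rand(1))."""
    Y = np.asarray(Y, dtype=float)
    return Y * (1.0 + level * rng.random(Y.shape))

Y = np.array([0.19, 1.0, 0.10, 0.80, 0.09, 3.18])  # an example feature vector
Yn = add_noise(Y)
print(np.all(np.abs(Yn - Y) <= 0.05 * Y))  # True: each feature moves <= 5%
```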
With the above output, the 13 thermal heating samples are distinguished from the other 8 samples of discharge. These 8 samples are input to SVM3.

Testing samples of SVM3 = [2 high-energy discharge; 6 low-energy discharge]
Output of SVM3:
Columns 1 through 8: 1 1 −1 −1 −1 −1 −1 −1

With the above output, the 2 high-energy discharge samples are distinguished from the other 6 samples of low-energy discharge.

The above results show that the three-layer SVM classifier can identify all samples of the four transformer fault types. The diagnosis method is very robust and has a very high correct ratio: no mistake is made in identifying the 25 test samples.

To compensate for the shortage of testing samples, 5% white noise is artificially added to the 25 testing samples. The feature vector is changed to Y′ = [y′_1 y′_2 y′_3 y′_4 y′_5 y′_6]^T, where y′_k = y_k(1 + 0.05·rand(1)) and y_k is the feature extracted from the testing sample data, k = 1, 2, ..., 6. Thus, another 25 modified test samples are obtained. The test result shows that the trained three-layer SVM classifier also identifies all modified testing samples effectively; no mistake is found in identifying these 25 modified test samples.

A comparison with a BP neural network is made in order to evaluate the method properly. The ANN has one hidden layer, with 6 input nodes, 44 hidden nodes and 4 output nodes, and is trained using the fast back-propagation method. The training parameters are set as follows: the learning rate is 0.01 and the momentum constant is 0.9; the weights and biases are initialized randomly. The BP network is trained with the same training samples, and the same testing samples are used in testing. The target error (TE) is set to 0.05, 0.02, 0.01 and 0.005. The results of the comparison between the two methods are shown in Table 3.

Table 3
Comparison between the BP neural network and the three-layer SVM classifier

Method | TE    | Training time (s) | Error identification (%) | Disturbance identification (%)
SVMs   | 0.01  | Less than 1       | 0                        | 0
BP1    | 0.05  | About 90          | 10                       | 5
BP2    | 0.02  | About 150         | 2.5                      | 5
BP3    | 0.01  | About 240         | 2.5                      | 7.5
BP4    | 0.005 | About 390         | 7.5                      | 5

Note: "disturbance identification" refers to those results that are not very clear.

The results in Table 3 show that, compared with the BP network, fault diagnosis of power transformers based on the multi-layer SVM classifier is more robust and needs much less training time. One can also see that the BP network improves as TE is reduced: when TE drops from 0.05 to 0.01, the error goes down too. But when TE is small enough, over-fitting prevents any further improvement from reducing TE, while the training time grows quickly as TE is reduced. The multi-layer SVM classifier has no such problem.

Another comparison between the proposed method and other AI methods, namely ANN, FL, ES and a combined ANN and expert system (ANNES), is made in Table 4. The results for these methods are taken directly from the relevant papers, and the statistically best performance of each is adopted for comparison with the proposed method.

Table 4
Comparison between the proposed method and other AI methods

Method | Diagnosis accuracy (%) | Training time (s)
ANN    | 92.76                  | 81
ES     | 89.34                  | Absent
FL     | 92.32                  | 82
ANNES  | 93.54                  | 44
SVMs   | 100                    | Less than 1

The results in Table 4 show that the proposed method outperforms the other AI methods in both diagnosis accuracy and training time.

5. Conclusion

SVM is a novel machine learning method based on SLT. It is well suited to practical problems with small sample sizes, nonlinearity and high dimensionality. This paper presents a method for fault diagnosis of power transformers based on a multi-layer SVM classifier. The method adopts a three-layer SVM classifier to identify the fault types, and obtains an excellent performance. The test results show that the SVM method has three advantages over the BP network: (1) since SVM satisfies structural risk minimization, it is more robust and has a high identification correct ratio; (2) the training time of the SVM network is very short; and (3) the solution of the SVM classifier is a convex quadratic programming problem, so there is no problem with local optima. Since fault diagnosis of power transformers based on the multi-layer SVM classifier needs few samples and little training time and has good reliability, it is very suitable for online fault diagnosis of transformers. The proposed method has a large potential in practice. Since this is the first application of the multi-layer SVM classifier to fault diagnosis of power transformers, some problems remain, such as the selection of the kernel function and the optimization of parameters, which need to be studied in the future.

References

[1] M.H. Wang, A novel extension method for transformer fault diagnosis, IEEE Trans. Power Deliv. 18 (1) (2003) 164–169.
[2] T.K. Saha, Review of modern diagnostic techniques for assessing insulation condition in aged transformers, IEEE Trans. Dielectr. Electr. Insul. 10 (5) (2003) 903–917.
[3] R.R. Rogers, IEEE and IEC codes to interpret faults in transformers using gas in oil analysis, IEEE Trans. Electr. Insul. 13 (5) (1978) 349–354.
[4] C.E. Lin, J.M. Ling, C.L. Huang, An expert system for transformer fault diagnosis using dissolved gas analysis, IEEE Trans. Power Deliv. 8 (1) (1993) 231–238.
[5] P. Purkait, S. Chakravorti, An expert system for fault diagnosis in transformers during impulse tests, in: Power Engineering Society Winter Meeting, vol. 3, 23–27 January 2000, pp. 2181–2186.
[6] Q. Su, C. Mi, L.L. Lai, P. Austin, A fuzzy dissolved gas analysis method for the diagnosis of multiple incipient faults in a transformer, IEEE Trans. Power Syst. 15 (2) (2000) 593–598.
[7] S.I. Mofizul, T. Wu, G. Ledwich, A novel fuzzy logic approach to transformer fault diagnosis, IEEE Trans. Dielectr. Electr. Insul. 7 (2) (2000) 177–186.
[8] Y. Zhang, X. Ding, Y. Liu, P.J. Griffin, An artificial neural network approach to transformer fault diagnosis, IEEE Trans. Power Deliv. 11 (4) (1996) 1836–1841.
[9] Y.C. Huang, H.T. Yang, C.L. Huang, Developing a new transformer fault diagnosis system through evolutionary fuzzy logic, IEEE Trans. Power Deliv. 12 (2) (1997) 761–767.
[10] T.Y. Hong, C.L. Chiung, Adaptive fuzzy diagnosis system for dissolved gas analysis of power transformers, IEEE Trans. Power Deliv. 14 (4) (1999) 1342–1350.
[11] M.H. Wang, Extension neural network for power transformer incipient fault diagnosis, IEE Proc. Gener. Transm. Distrib. 150 (6) (2003) 679–685.
[12] C.H. Yann, A new data mining approach to dissolved gas analysis of oil-insulated power apparatus, IEEE Trans. Power Deliv. 18 (4) (2003) 1257–1261.
[13] J.J. Dukarm, Transformer oil diagnosis using fuzzy logic and neural networks, in: Canadian Conference on Electrical and Computer Engineering, vol. 1, 14–17 September 1993, pp. 329–332.
[14] Z.Y. Wang, Y.L. Liu, P.J. Griffin, A combined ANN and expert system tool for transformer fault diagnosis, IEEE Trans. Power Deliv. 13 (4) (1998) 1224–1229.
[15] K.F. Thang, R.K. Aggarwal, A.J. McGrail, D.G. Esp, Analysis of power transformer dissolved gas data using the self-organizing map, IEEE Trans. Power Deliv. 18 (4) (2003) 1241–1248.
[16] C.H. Yann, Evolving neural nets for fault diagnosis of power transformers, IEEE Trans. Power Deliv. 18 (3) (2003) 843–848.
[17] M.H. Wang, A novel extension method for transformer fault diagnosis, IEEE Trans. Power Deliv. 18 (1) (2003) 164–169.
[18] C. Cortes, V. Vapnik, Support-vector networks, Machine Learn. 20 (3) (1995) 273–295.
[19] J.W. Lu, K.N. Plataniotis, A.N. Venetsanopoulos, Face recognition using feature optimization and ν-support vector learning, in: Neural Networks for Signal Processing XI, Proceedings of the 2001 IEEE Signal Processing Society Workshop, 10–12 September 2001, pp. 373–382.
[20] F.E.H. Tay, L.J. Cao, Application of support vector machines in financial time series forecasting, Omega 29 (4) (2001) 309–317.
[21] W.W. Yan, H.H. Shao, Application of support vector machine nonlinear classifier to fault diagnoses, in: Proceedings of the 4th World Congress on Intelligent Control and Automation, Shanghai, China, 10–14 June 2002, pp. 2670–2697.
[22] L.B. Jack, A.K. Nandi, Fault detection using support vector machines and artificial neural networks: augmented by genetic algorithms, Mech. Syst. Signal Process. 16 (2–3) (2002) 373–390.
[23] W.C. Chan, C.W. Chan, K.C. Cheung, C.J. Harris, On the modeling of nonlinear dynamic systems using support vector neural networks, Eng. Appl. Artif. Intell. 14 (2001) 105–113.
[24] V.N. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, New York, 1995.

L.V. Ganyun was born in 1976 in Zhejiang, China. He received the B.S. and M.S. degrees in Automatic Control from Nanjing University of Science & Technology in 1998 and 2001. He is currently working towards his Ph.D. degree in Electrical Engineering at Shanghai Jiaotong University. His areas of scientific interest are power quality, active power filters and signal analysis in power systems.

Cheng Haozhong (1962) received the B.S., M.S. and Ph.D. degrees in Electrical Engineering from Shanghai Jiaotong University. Presently, he is working as a professor at the Department of Electrical Engineering, Shanghai Jiaotong University. His areas of research are power system harmonics and power system planning.
