Asset Management Data and Decisions
and what investment protocols yield the maximum return for
regulators, the owners and the customer. The collection and
analysis of condition data, synthesized with operational data,
has been used to identify anomalous performance and allow for
short term reactive planning and longer term strategic planning
for analytics as they have developed, and the need, within the

Fig. 1. Power Transformer Age Distribution Profile (histogram of failure count and cumulative percentage against age at failure)

There are many older units, but the age distribution does not
demonstrate those which are most likely to fail, or those which
have the highest impact if they do fail. Figure 2 shows actual
failure counts for a set of 7000 reported failures, recorded by
Doble Engineering [7]; what is missing, clearly, is the total

II. ISO ASSET MANAGEMENT AND IEEE

To assist with the gamut of targets and responsibilities facing
an organization, asset management has been used in a somewhat
ad hoc manner. With a provenance in the UK in the early 2000s,
PAS55 has been used around the world to help identify symptoms
of asset management in an auditable manner, across a range of
industries with significant investments in physical assets [11].
PAS55 is a significant input to the development of an
International Standards Organization (ISO) standard on asset
management, which is

Dr. McGrail is with the Doble Engineering Company, Watertown, MA, 02472, USA (e-mail: tmcgrail@doble.com).
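The argument being developed here — that an age-at-failure histogram cannot by itself rank failure likelihood, because each count must be normalized by the population still in service at that age — can be sketched numerically. All counts and populations below are invented for illustration; they are not the Doble failure data:

```python
# Hypothetical counts: failures observed and units in service, per age band.
failures_by_age = {10: 20, 20: 45, 30: 60, 40: 30}
population_by_age = {10: 4000, 20: 3000, 30: 1500, 40: 300}

# Raw counts peak at age 30, but the failure *rate* (failures per unit
# at risk) keeps rising with age once counts are normalized.
failure_rate = {age: failures_by_age[age] / population_by_age[age]
                for age in failures_by_age}

for age in sorted(failure_rate):
    print(f"age {age}: {failure_rate[age]:.1%}")
```

In this sketch the count peaks at age 30, yet the 40-year band has the highest rate — exactly the distinction between Figure 1's counts and the likelihood of failure.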
978-1-4577-0875-6/11/$26.00@2011 IEEE
Authorized licensed use limited to: Kwame Nkrumah Univ of Science and Technology. Downloaded on May 01,2023 at 03:03:53 UTC from IEEE Xplore. Restrictions apply.
currently in progress [12].
One of the benefits of an ISO Asset Management standard will be
the international consensus and coverage which such a standard
brings. Organizations reference the quality standard ISO 9001
frequently, and it is very much the case that PAS55 has not had
the same traction. However, it has undeniably been a catalyst
for development. The ISO standard will have the standard
sections on definitions, but will also develop features of a
good asset management program and recommendations for
implementation of good asset management systems.
The IEEE has a Working Group on Asset Management which is
currently defining its terms of reference and scope; the meeting
in Detroit, USA, July 2011, was well attended and produced much
discussion on the role of the IEEE in asset management. The
consensus was for the IEEE to act as an input to standards
development elsewhere rather than to develop a new or different
standard [13].

III. DATA AND DECISIONS

A. Data and Analysis

Common to all asset management are:
• a focus on risk as a combination of the likelihood of being 'unavailable' and the consequence of that unavailability
• intervention planning and cost justification
• measuring and acting on the effectiveness of actions

The consequence of failure to plan intervention effectively
could be significant. Figure 3 shows the remains of a breaker
which failed in service. The consequent interruption to
customers was significant.

term intervention and longer term investment. The stages of the
overall iterative process may be summarized as:
• Collect data – condition, operation, family/design, industry etc.
• Analyze data – looking for anomalies and outliers
• Identify anomaly – based on individual asset data, which may combine condition and operation data or aggregate data across several assets or asset types
• Diagnose anomalous data – find possible causes
• Identify a prognosis for anomalous data – what is likely to happen, with what consequences, and in what timescales
• Plan for intervention – replace a failed unit; repair/refurbish a unit that is failing 'gracefully'; plan longer term investment; one action plan may be to not physically intervene but plan for failure
• Monitor the situation – some decisions may be 'fuzzy' or ill defined; continue analysis and review the timeliness and appropriateness of intervention
• Iterate – further data collection and checking, and review of 'anomaly detection'.
There are many ways to visualize the process, including
waterfall and control loops.

Data may be used in several ways:
• By itself and with reference to its own history, as with the operating time on a breaker, looking for variation against 'expected' values
• In synthesis/combination with other available data – transformer load and top oil temperature, noting that a correlation may be time delayed
• In aggregation with other individual units – does one unit stand out? Does one unit have an operating time within specification but which is slower than others in the same family? Is the relationship between two parameters not one of cause and effect, but one of two effects deriving from a common cause?
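The aggregation idea above — a breaker within specification but slower than its siblings — can be sketched as a simple outlier test against the family. The breaker identifiers and operating times below are invented for illustration:

```python
import statistics

# Hypothetical trip operating times (ms); each is within a nominal spec,
# but one unit is clearly slower than the rest of its family.
operate_ms = {"CB-01": 32.1, "CB-02": 31.8, "CB-03": 32.5,
              "CB-04": 31.9, "CB-05": 45.0, "CB-06": 32.2}

mean = statistics.mean(operate_ms.values())
stdev = statistics.stdev(operate_ms.values())

# Flag any unit more than 2 standard deviations from the family mean.
outliers = [unit for unit, t in operate_ms.items()
            if abs(t - mean) > 2 * stdev]
print(outliers)
```

Here CB-05 is flagged even though no per-unit limit has been exceeded; the anomaly only appears at the aggregated, family level.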
The data for the transformer current and the temperature may be
collected and analyzed at the transformer. In general, there are
benefits to analyzing data at the point where it is collected –
timeliness of analysis being one. Where data is combined from
multiple sources, analysis at the point of combination gives the
shortest time lag to identification of an anomaly; where data is
aggregated across several units, an anomaly at the individual
unit level may appear through identification of an outlier at
the 'higher' level. Where the decision is made as to an
'intervention' is a different question. Protection/control
settings are local and autonomous. Some condition monitoring
acts on the same principle. However, where a condition has some
incipiency, a decision may be made only after human analysis.
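The time-delayed correlation between transformer load and top oil temperature, noted earlier, can be illustrated by scanning candidate lags for the strongest correlation. The hourly values below are synthetic, constructed so that temperature tracks load with a two-hour delay:

```python
import statistics

# Synthetic hourly data: per-unit load and top-oil temperature (deg C),
# where temperature follows load two samples later.
load = [0.4, 0.5, 0.7, 0.9, 1.0, 0.9, 0.7, 0.5, 0.4, 0.4, 0.5, 0.7]
temp = [55.0, 54.0, 54.0, 55.0, 57.0, 59.0, 60.0, 59.0, 57.0, 55.0, 54.0, 54.0]

def pearson(xs, ys):
    # Pearson correlation coefficient of two equal-length series.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Shift temperature back by each candidate lag; the lag with the highest
# correlation exposes the delay between cause (load) and effect (temp).
best_lag = max(range(4), key=lambda k: pearson(load[:len(load) - k], temp[k:]))
print(best_lag)
```

A naive zero-lag comparison of the same two series shows only a weak relationship; the anomaly-detection value of the synthesis appears only once the delay is accounted for.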
C. Datamining

Datamining is the process of extracting synthesized and
previously recorded knowledge from large databases [15-19]. The
term 'datamining' has come to mean many things to many people
and covers a wide range of techniques [20, 21]; it is sometimes
seen as a small step in a broader overall process called
knowledge discovery in databases. Datamining is concerned with
developing algorithms that are applied in software to a real
problem. So, what would be a 'good problem'? Using dissolved gas
analysis (DGA) of transformer oil is an interesting field which
has led to much datamining.
Most large power transformers are monitored using DGA of the
insulating oil. There are several standards available for the
interpretation of the levels of dissolved gases, both
individually and in combination. A datamining application
sponsored by National Grid in the UK was a contribution to a
CIGRE Session 2002 paper on Datamining [20, 22]. In this work, a
self-organizing neural network was used to allow several
thousand DGA results to form their own groupings in several
dimensions. These groupings were mapped to a two dimensional
plot. The groupings appear as different regions of the plot, as
shown in figure 5.

Fig. 5. Self-Organizing Visualization of Transformer DGA Results

An example of an individual transformer 'trajectory' over time,
its progress into different regions as it deteriorates, is given
– starting at the green point and moving through successive
regions to lie in the red region. This kind of data analysis is
a useful visualization technique to assist in the interpretation
of data.
Care must be taken in the use of such approaches – it is
necessary to check that the data in the 'trajectory' is not too
different to that used to develop the original mapping.
The first level of analysis is to make sure we are not 'too far'
from the original data – but what statistics may be used to
characterize that data adequately? As Anscombe noted, and
demonstrated, significantly different data sets may have very
common derived statistics [23]: the four sets of data in Figure
6 have the same mean and standard deviation to two decimal
places in both x and y values, and the four sets have the same
regression factor and trend line to two decimal places. Yet they
look remarkably different – as shown in Figure 6.

Fig. 6. Anscombe's Quartet: 4 data sets with very similar statistics
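Anscombe's observation can be checked directly. The numbers below are the published quartet data [23]; the summary statistics are computed with the standard library alone:

```python
import statistics

# Anscombe's quartet: sets 1-3 share the same x values.
x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
quartet = [
    (x123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
    (x123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
    (x123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
    ([8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
     [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
]

def slope(xs, ys):
    # Least-squares regression slope: cov(x, y) / var(x).
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# All four sets agree to two decimal places: mean x = 9, mean y = 7.50,
# regression slope = 0.50 - yet plotted, they look nothing alike.
for xs, ys in quartet:
    print(round(statistics.mean(xs), 2),
          round(statistics.mean(ys), 2),
          round(slope(xs, ys), 2))
```

The identical summaries are exactly why derived statistics alone cannot certify that new data is 'close to' the data used to build a mapping; the shape of the data matters.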
D. Heuristics

'Heuristics' refers to experience-based techniques for problem
solving, learning, and discovery, such as using a "rule of
thumb", an educated guess, an intuitive judgment, or common
sense [24]. Heuristic approaches have developed over time and
provide adequate solutions to real problems.
It is well understood, for example, that the origin of power
factor testing is based on analysis and electrical engineering
theory, and the present terminology, CH, CHL etc., relates to
the techniques developed decades ago [25]. The interpretation of
the test result, however, is still a matter for consideration
and debate: what may be a good result one day, in a given
context, may be bad the next, in a very similar context. The
test has an
analytic theory but is often heuristic in interpretation.
The problem with many datamining approaches, as described for
DGA above, and heuristic approaches, is that they are based on
historic data and available 'common sense'. When the prevailing
wisdom no longer applies, when the data is no longer an
interpolation between known and understood data points but an
extrapolation, then there may be a breakdown in standard
interpretation algorithms.

E. Example: Bushing On-Line Monitoring Data

As an example we can use bushing test tap data, which is used to
derive continuous power factor and capacitance values. A
traditional approach has been to sum the three vectors into a
single value, known as the sum current. The problem is that the
sum current approach may hide a problem – the processing has
removed information even as it displays the aggregated data in a
palatable form.
An example where a bushing shows degradation is given in figure
7, with data extracted from an on-line measurement system called
an IDD [26].

Fig. 7. On-Line Bushing Monitoring Data

On-line bushing monitoring systems record the phase angle and
magnitude of leakage current through the bushing tap to
determine the power factor displayed in figure 7.
There are similar cases of degradation being indicated through
phase angle variation. Figure 8 shows what could be interpreted
as indicating a degraded bushing, for two bushings, as they
develop over a period of days.

Fig. 8. On-line Monitoring Data for Two Bushings

In fact, the data in Figure 8 may also be interpreted as a
variation in the power system which provides the voltage at the
bushing and the current through it. This was, in fact, the case.
The bushings were in good condition but the power system was
showing significant variations. The lesson is to make sure that
interpretation of data is done in the largest context available
– using all available data.
Heuristic algorithms are often employed because they may be seen
to "work" without having been mathematically proven to meet a
given set of requirements. One common pitfall in implementing a
heuristic method to meet a requirement comes when the engineer
or designer fails to realize that the current data set does not
necessarily represent future system states, or is interpreted in
a particular way when other interpretations are available, as
with the bushings.
An example of how statistics may be misleading is from the world
of medical diagnostics [27]: if I test positive for a particular
genetic predisposition, and the test is 90% accurate, what are
the chances I have the predisposition? The question is not
answerable without a knowledge of the prevalence of the
predisposition in the general population. If that prevalence is
only 10%, then my individual test being positive means that
there is only a 50/50 chance that I have the condition. This
statistic has been used to decry the efficacy of such tests.
However, if the test was negative, the likelihood that I am, in
fact, negative, is well over 90%. The role of statistics in
misleading juries at trials, in ways exactly analogous to the
situation presented here, has been documented [27].
Consequently we must be very aware of the origins, derivations
and limits of the use of heuristic methods, even when coupled
with statistical boundaries which may, themselves, be
misleading.

F. Smart Grid

What do all the data, decisions and analysis have to do with the
smart grid? Many of the 'smarts' which are put into the
condition evaluation of substation equipment are automated
algorithms which often require 'checking by adults' [28] –
people experienced in the origins of data and its
interpretation. With an ageing workforce, such people are
becoming less common [29].
As an example, the 'flash crash of 2:45', a stock market crash
in 2010, was one which was, in theory, the result of analysis
algorithms responding to the market in unforeseen ways [30]. 9%
of the market value disappeared in seconds. A subsequent theory
that the whole event was based on erroneous data entry is
salutary. Unobserved algorithms were also the source of an event
related to Amazon: the website price for a book, "The Making of
a Fly", which was out of print, was available from two
suppliers. At one point the book, which retailed at $35, was on
sale for in excess of $1.7 million. No one bought the book, but
the price continued to rise to over $23.6 million. The cause was
a pair of algorithms the two vendors used, each of which priced
against the other vendor's price. The control loops weren't able
to recognize 'common sense' and escalated in a closed loop. A
single human intervention was all it took to correct the
situation [28].
We must ensure that, as we develop the rules and criteria of the
smart grid, we keep the human element central. The smart grid
must thus react more quickly to external events, and in such a
way that it is predictable and the algorithms employed well
understood. The day everything goes dark, a flash crash of the
voltage due to a huge number of devices simultaneously switching
in to take advantage of low electricity prices, may not be far
away.

IV. CONCLUSIONS

Asset management is a discipline which is becoming more holistic
and far reaching in terms of asset lives. Data being used to
make decisions is not necessarily representative of all
data that may be seen. Algorithms used to analyze data and make
decisions must themselves be monitored, so as to ensure that the
outcomes are not unexpected. As the work force ages,
encapsulation of tacit knowledge becomes more pressing and more
critical – to ensure that the smart grid remains smart.

[29] E. Leibold and E. Voelpel, "Managing the Ageing Workforce", Davenport, ISBN: 3895782637
[30] http://en.wikipedia.org/wiki/2010_Flash_Crash

VII. BIOGRAPHY