
Tired of “Winging It”:

6 Suggestions to Get Data Working for You

Author:
Jim Crompton, Subject Matter Expert and Sr. Solutions Architect

Just for once, I would like to see a project framing document where data was in scope and
technology was on the frame. I have seen a lot of projects and, in far too many of them, data
was out of scope and the focus was technology.

I believe the value of data is underappreciated. Good data drives better business decisions: decisions about how to operate, where to invest, when to divest, and how to run more efficiently to create significant value. The business user wants to focus on using data but expects someone else to manage it. Data should be a business asset, but it is not often treated that way.

The other part of this story is that most of the IT community does not want to focus on data
either. IT focuses on the technology to manage data, to capture and store data, and even to
visualize data. But in each case, the priority is on the technology. IT can provide amazing tools
but in most cases does not have the knowledge to be the data owner. A good DBA can tell if
valid data is in a database, but they cannot tell if it is the right data without business
knowledge.

If you take the metaphor of a pipeline, the oil and gas industry does a pretty good job of
managing the hydrocarbon molecules that travel from reservoir, to well, to pipeline, to refinery,
to the gas station where you fill up your car. But how good are we at managing the flow of data
from sensor, to process control network, to operations center, to facilities and reservoir
models, to business plans?

With modern field instrumentation, the amount of data being recorded is growing exponentially. At the other end of the pipeline, emerging technologies ranging from data appliances, to enterprise data warehouses, to data visualization architectures, to new modeling and simulation applications are being developed to handle the big data load.

But there is a problem in the middle of our information pipeline. We have a data leak
somewhere. The data we can collect is having trouble getting to the advanced analytics
applications. Users are having problems finding all of the data they need to make vital
decisions. Model builders are having trouble integrating the volumes and varieties of data
needed to fully describe the problem they are trying to solve. Managers are having problems
getting the data about their operations to know where to intervene, where to invest, and when to leave well enough alone.

This is not a new problem. Many studies point out the productivity lost when we spend a large share of our time looking for data and making sure everything is correct, complete, and in the right format. We counterbalance some of that with experience. If I cannot find all of the data to run a sophisticated analysis, I can drop back to a less technical tool and use approximations to get close enough. The fancy word is “heuristics”; the real term is “winging it.”

But what will happen when the experienced staff leaves? The new workforce is certainly
digitally literate, but can they perform well with a poor understanding of data? Can the new
engineer recognize when critical data is missing or wrong? Will they be able to recognize when
a logical conclusion is not the right one?

We also have less time to make good decisions. Current practice is to respond quickly to failure.
Field automation collects enough surveillance data to recognize when safe operating conditions
are not met and to initiate shutdown procedures. Then, operators are notified of the failure and
engineers begin the analysis to prepare the work to address the problem and restore
production.

However, that approach does not work in every situation. Simply responding to failure, whatever the type of operation, from producing oil to producing electricity, results in significant lost value. We will have to move from reacting to conditions to predicting what might happen. That means not only having access to surveillance data to understand what is happening (in the reservoir, in the wellbore, and at the facility) but also having models on hand that predict future behaviors. Just as important is having a means to match physical measurements to the predictions, correct the models, and “manage by exception.”
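
To make “manage by exception” concrete, here is a minimal sketch in Python. The tag names, tolerance, and record structure are invented for illustration, not taken from any particular vendor's system; the point is the matching step: surface only the readings that deviate from the model's prediction by more than a tolerance.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    tag: str          # sensor tag (hypothetical names below)
    value: float      # measured value from the field
    predicted: float  # value the model expected at this time

def exceptions(readings, rel_tolerance=0.05):
    """Yield only the readings that deviate from the model prediction
    by more than the given relative tolerance -- the 'exceptions' an
    operator should look at, instead of every data point."""
    for r in readings:
        deviation = abs(r.value - r.predicted) / max(abs(r.predicted), 1e-9)
        if deviation > rel_tolerance:
            yield r, deviation

# Three readings; only one deviates enough to demand attention.
data = [
    Reading("WHP-001", 1480.0, 1500.0),   # within 5% of prediction
    Reading("WHP-002", 1100.0, 1500.0),   # ~27% off -> exception
    Reading("TEMP-007", 88.0, 90.0),      # within tolerance
]
for reading, dev in exceptions(data):
    print(f"{reading.tag}: {dev:.0%} off prediction -- investigate")
```

The comparison itself is trivial; the hard part, as the rest of this paper argues, is getting trusted measurements and trusted predictions into the same place at the same time.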

To reach this state, data – many kinds of data – must become a priority. Managing data in our current, separate silos (structured, documents, transactions, field operations, models) increases the resistance to efficient data flow. Because many different people are involved in these new workflows, a solution designed for one specific discipline becomes a barrier for others. The need to see the bigger picture becomes as important as, or more important than, the need to reach a specific technical answer.

Is data the missing piece in the equation to achieve integrated asset management? Have we
ignored the importance of data long enough? How can we put the focus back on data? We are
reasonable people, trying to do the best job we can. How did the situation with the data
foundation get so bad?

There are several answers. It was not just one decision or a single action that led us down this path. The downsizing of support staffs contributed, as did mergers and acquisitions where we never quite completed all the data migration projects, and so did the absence of widely accepted industry data standards. The consequences of acquiring “best of breed” technology solutions, where data is uniquely managed in the core of many different technology tools, add to the mess.

The reality is that legacy solutions never die until the user who sponsored the tool does. Specific applications survive wave after wave of new technology development, and the portfolio of tools continues to grow. The data explosion, and the availability of powerful desktop tools that help the individual cope but create hundreds more data silos for the organization, add the final ingredients to the current sad state.

So, how do we right the ship and leave a more organized “digital file room” to the next generation, as previous generations left a well-organized paper file room for us? Again, there is no simple answer or “magic bullet” technology solution, but I would like to offer six recommendations to help us get back on the road to “trusted data, easily found.”

1) The first recommendation starts with a better understanding of our business processes and how information flows through them to facilitate decisions.

2) The second recommendation concerns data governance. If we wish to achieve the goal of
simplification, we need to make some key decisions and have the discipline to stick with those
decisions.

3) The next recommendation is around common reference and master data. In this step, a company must identify the critical master data used in its core processes and create a repository where core applications can use a single version of those references (well, field, reservoir, and equipment identifiers; producing facilities or power generation plants; units of measure; geographic coordinates; etc.). A rough sketch of such a registry appears after this list.

4) When you create a new repository for data (often called a system of record), make sure you populate it with the best data you can find. That is the easy part; the harder part is keeping that data at high quality. We need an information quality assurance process to make sure we always have the right data in the defined home for that data (simple validation rules like those sketched after this list are one starting point).

5) The fifth recommendation is to look at data with a lifecycle perspective. We need to manage data from its capture or creation through storage, access, use, and transformation, to archival and disposal. This is a full-time job, and someone has to do it.

6) The sixth and final recommendation actually involves technology. Having a data integration guide, along with the technology tools and the competence and discipline to use them, will provide the glue needed to redraw the data map and finally get us where we need to be.
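
To ground recommendations 3 and 4, here is a minimal sketch in Python. Every identifier in it – the well aliases, the canonical ID format, the field names, the range limits – is an invented illustration, not an industry standard. It pairs a tiny master well registry, which resolves the many aliases an asset accumulates to a single identifier, with simple completeness and range rules a record must pass before it enters the system of record.

```python
# Hypothetical master well registry (recommendation 3): every alias
# used by the various applications maps to one canonical identifier.
MASTER_WELLS = {
    "W-1042":       "WELL-0001042",   # engineering database alias
    "SMITH #3":     "WELL-0001042",   # land/lease name
    "42-123-45678": "WELL-0001042",   # regulatory-style number
}

def canonical_well_id(alias: str) -> str:
    """Resolve any known alias to the single master identifier."""
    try:
        return MASTER_WELLS[alias.strip().upper()]
    except KeyError:
        raise ValueError(f"Unknown well alias {alias!r}: add it to the registry")

# Simple quality rules for the system of record (recommendation 4):
# completeness and range checks applied before a record is accepted.
REQUIRED_FIELDS = ("well_id", "date", "oil_rate_bbl_d")

def validate_record(record: dict) -> list:
    """Return a list of quality problems; an empty list means the record passes."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in record]
    rate = record.get("oil_rate_bbl_d")
    if rate is not None and not (0 <= rate <= 100_000):
        problems.append(f"oil rate out of range: {rate}")
    return problems

record = {"well_id": canonical_well_id("Smith #3"),
          "date": "2015-06-01", "oil_rate_bbl_d": 850.0}
print(validate_record(record))   # [] -> clean record, safe to load
```

In practice the registry would live in a shared master data store and the rules would run in the load pipeline; the division of labor is the point: one agreed-upon identity, and checked data in one defined home.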

So there is the solution in six not-quite-so-easy steps. Are you ready to volunteer to do your
part? The business is waiting, but not very patiently anymore.

About the author:

Jim Crompton is a distinguished leader in E&P data management, spending 37 years at Chevron and stacking up a truly impressive list of
accomplishments. One of the many highlights of Jim’s work was
leading one of the largest standardization projects at Chevron. Jim
also led a study team for the IT Merger Integration Organization as part
of the Chevron & Texaco merger, and his team developed the IT
organization structure and IT strategic direction for the corporation,
for which he received a President's Award. In 2002, Jim was named a
Chevron Fellow in acknowledgement of his contributions, and he
served as the chair of the Fellows Network from 2006 to 2008.

Outside of his work for Chevron, Jim was elected chair of the general committee of Petroleum
Industry Data Exchange (PIDX), where he was able to influence the direction of the standards-setting activities towards emerging technologies, such as XML, and advanced electronic
business models to complement the established EDI practices in the industry. He was selected
to participate in the SPE Distinguished Lecturer Program for 2009-2010.

Jim resides in Colorado with his wife and enjoys writing.
