

SPWLA 55th Annual Logging Symposium, May 18-22, 2014

CURRENT STATUS OF WELL LOGGING DATA DELIVERABLES AND A VISION FORWARD

Philippe Theys, Independent Consultant, Thuy Roque, Anadarko, Monica Vik Constable, Statoil,
John Williams, BP, Martin Storey, WellDataQA

Copyright 2014, held jointly by the Society of Petrophysicists and Well Log Analysts (SPWLA) and the submitting authors.
This paper was prepared for presentation at the SPWLA 55th Annual Logging Symposium held in Abu Dhabi, United Arab Emirates, May 18-22, 2014.

ABSTRACT

Twenty years ago, at the end of 1993, the SPWLA Data Quality Topical Conference in London concluded on three priorities for well logging data deliverables: 1/ Establish standards. 2/ Establish standards. 3/ Establish standards. Since then, these priorities have become all the more critical, for reasons that include the deployment of numerous modern measurements, the "data explosion," the multiplicity of vintages, and the creation of new logging companies.

In reality, the industry has walked resolutely in the opposite direction since then, away from standards. Well logging data today is delivered in many different forms and with varying content. While most logging companies claim that they adhere to recommended practices RP 31 (header format) and RP 66 (digital format), a quick perusal of data sets reveals that the current delivery is a real lottery for the final data user: many sets are incomplete for exploitation, while some sets are huge and difficult to understand and manage.

This paper gives numerous examples of the current situation and suggests a standard delivery that applies to all forms of well logging data, regardless of conveyance, of well location in the world and of well condition. It particularly addresses Logging While Drilling (LWD) data, which has often been, and continues to be, delivered incompletely, inconsistently, and in the most reduced and untraceable ways.

In addition, the paper evaluates the potential impact of the standardization of well logging data deliverables. Standards do not mean commoditization and do not suppress innovation and competitiveness. However, they definitely improve the usability of data and facilitate much-needed quality control. They are a prerequisite to an efficient and valid log interpretation.

INTRODUCTION

The conventional interpretation process starts with loading a depth-indexed well log digital file. So why worry about standardization when, upon superficial scrutiny, these files look very similar, especially considering that the auxiliary data they contain is seldom looked at? Standardization is important because the log data acquisition process varies tremendously according to the logging company, the geographical location, the logging engineer and the local logging company management. Even for one oil company using only one logging data supplier, the successive deliverables can be very different. When two different logging companies are involved, delivery variations can be even more considerable. Log data delivery is a lottery whereby some can get a lot of valuable information while others get hardly anything of long-term value.

In the past there have been several attempts to define standards. The API RP 31 standard was introduced in 1948 and revised in 1967 [API, 1967]. Several excellent papers, [Reedy, 1993] and [Hutchinson, 1994], proposed critical practices to drastically increase the value of logging data. In 1998, a short note on reporting depth information was included in The Log Analyst [Theys, 1998]. By and large, these recommendations have been and remain disregarded or forgotten.

In the case of LWD data, the recorded information is often incomplete, and upon delivery it is often misrepresented, as illustrated below.

Why is this alarming situation not recognized and loudly decried? A main reason is that many of the oil company's initial uses of the information are broadly qualitative (e.g., chronostratigraphic correlation


between wells) and do not suffer significantly from data inaccuracies. Another related reason is that these largely qualitative uses of the data tend to happen shortly after acquisition, while the more quantitative uses, with more stringent requirements on data quality, tend to take place much later. Examples of such uses include the quantitative petrophysical characterization of a reservoir, or a time-lapse study.

For companies that require accurate data with reasonable and quantifiable uncertainties to support key business decisions, well logging data as delivered today is rarely fit-for-purpose. Particular situations when this is clearly critical include: challenging interpretation issues, the reinterpretation of vintage data, fluid saturation management, enhanced recovery, unitization and redetermination.

This paper discusses and illustrates the need for standards and the challenges their deployment would involve. Secondly it proposes two areas where these standards should be applied, namely graphical delivery and digital delivery. Examples focus on LWD data, and the last section describes additional documentation required specifically for such data.

INCREASED NEED FOR STANDARDS

Change of roles and responsibilities. Henri Doll held 70 patents and wrote seminal papers on log interpretation (e.g., The S.P. Log: Theoretical Analysis and Principles of Interpretation, [Doll, 1948]), but his very early success was to perform the first well log in 1927. One of the best-known pictures of him was taken in Baku while he is calibrating a logging tool. He could make the best use of acquired data because he knew all the details of the acquisition process. Until only recently, many petrophysicists and log analysts had been logging engineers earlier in their career. They fully understood the limitations of the acquisition process and optimized the interpretation accordingly. Implicit acquisition information was inferred by the log analysts through their past logging experience and intimate knowledge of the technology.

Nowadays, data acquisition and exploitation are usually done by different groups of people with little overlap between their respective areas of training or experience. Log interpretation is now mostly performed by people without practical logging experience. Many geoscientists use the data without having sufficient information and therefore without accounting for the shortcomings inherent to the logging acquisition process, for instance when:

- LWD data is acquired with high rates of penetration
- Density data is acquired with a measuring sub that is not stabilized
- Enhanced-resolution processing data is acquired in poor borehole conditions
- Log data is used without consideration for the associated quality-control curves
- LWD depths are inaccurate, particularly when used to estimate thicknesses.

Because of the significant evolution in roles, responsibilities, training, and competencies in the industry, acquisition deliverables must imperatively contain a complete and correct record of the acquisition to allow their valid exploitation, immediately as well as in the future.

Demanding organizational and well conditions. Delivery of quality data is even more critical now that there is increasing scrutiny on operating time and significant challenges from the harsher borehole environments we operate in - deeper, hotter, higher pressure, and with a narrower window between pore pressure and fracture pressure.

An additional complication. Early logs had few curves and included limited processing on delivery at the wellsite. In contrast, today, data processing before delivery at the wellsite has become common if not systematic. Figure 1 represents the steps of the data flow linking the measurements performed on the formation (or casing, or anything on which information needs to be collected) to the output that is delivered to the data user. The figure is simplified, as the different steps in the data flow may not be performed by all logging companies in the sequence shown. For instance, corrections may be performed before time-to-depth conversion.

Fig. 1 From a measurement of the formation to delivered information.

In any case, there is rarely a single correct way to run the acquisition sequence, and how that is done can be largely subjective (or in the worst case, left to default settings of the software). A simple look at the figure shows that the same formation may yield substantially different output and deliverables. It is therefore essential that all the decisions and selected options be documented intelligibly in the deliverables, so these can be taken into account during exploitation. Two examples of such critical acquisition sequence options are discussed next.

Difference in measurements. Different tool configurations yield different measurements. Different sensors (detector size and efficiency) yield different measurements. A pad-type NMR tool reads differently from a mandrel-type tool. A spectral sensor does not read the same as a tool with global detectors.

Even for the apparently simple gamma ray measurement, different tool configurations, as shown in Figure 2, yield different measurement values.

Fig. 2 Different gamma ray tools. Top left: one detector, eccentered. Top right: three detectors, symmetrical. Bottom left: bundle of 12 detectors, symmetrical. Bottom right: a single detector, centered in the middle of the drill collar. These different tools are bound to yield different measurement values.

Difference in time-to-depth conversion. All measurements are performed in the time domain. For wireline, the indexation to depth is relatively simple, but for LWD it can be very complicated as the motion of the bit and of the BHA is generally neither smooth nor at constant speed.

Logging companies do not have a common standard method for LWD data time-to-depth conversion. The conversion is a subjective process: different engineers may select different curves over different intervals.

Hence, the time-to-depth conversion of the acquisition data has the potential to yield significantly different sets of deliverables.

One area particularly prone to subjectivity is the handling of gaps (or overlaps) in the depth-based data after time-to-depth conversion - whether the gap is due to a recalibration of the depth-tracking hardware, a retying to the driller's tally, a computer glitch or a change in the rig-up. Some engineers may fill the gap with an arbitrary value, others may just use some form of interpolation, yet others may stretch or compress data from the adjacent interval above or below.

In personal communications, some engineers recognize that time-to-depth conversion is a time-consuming chore that needs to be done in post-processing mode by experts. A consequence of this acknowledgement is


that the depth indexing of the data done at the wellsite may frequently be inadequate.

One logging company combines the time-to-depth conversion with signal processing in order to optimize the precision of the measurement [Schneider, 1994].

Finally, another company explains that this conversion is performed deterministically, as illustrated in Figures 3 and 4, but the process is not simple.

Fig. 3 Attribution of measurements to depth intervals. Time is on the horizontal axis. Depth is on the vertical axis.

Fig. 4 Complicated but fully deterministic time-to-depth conversion. Note that the original 24 measurements (performed every 5 s) yield only six different values, A1 to A6 (for a nuclear tool) or L1 to L6 (for a resistivity tool), over a 3.5 ft interval. For details, consult [Theys, 1999].

MISREPRESENTATION OF DATA

In addition to being incomplete and not conformant to existing standards, data can be presented in ways that may mislead the log interpreter. Some examples follow.

The illusion of regularly sampled data. Figure 5 shows a partial listing from a LAS file of LWD gamma-ray-density data. Quadrant densities and final density are shown every 10 cm with up to four digits behind the decimal point. Such data seems to suggest a set of precise values sampled regularly. In fact, a look at Figure 6 indicates that at a rate of penetration of 80 to 90 m/h, only four to five samples per meter can be obtained, assuming that the data was acquired with the minimum measuring time (sample period) that can yield an acceptable density log precision. For longer measuring times, the sampling rate per meter would be even lower. From no more than four samples acquired per meter of wellbore, how is the logging company capable of delivering a value every 10 cm? Some explanations and documentation from the logging company are in order.

The illusion of continuity. It has been seen that logging companies do not have a unique processing for a given measurement. It is therefore highly unlikely that two successive runs, particularly if performed by two different logging companies as is the case in the example of Figure 7, would meet at exactly the same value, as they seem to do. The data must have been nudged somehow so that the values match exactly - which would have to be explicitly commented, at the very least.

The presentation in Figure 8, showing a discontinuity between runs, better reflects reality than the nudging observed in Figure 7.

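The sampling argument reduces to one line of arithmetic: the number of independent samples per meter is the time needed to drill one meter divided by the sample period. The small helper below (an illustrative calculation, not from the paper) reproduces the figure quoted above for a 90 m/h rate of penetration and an 8 s sample period.

```python
def samples_per_meter(rop_m_per_h, sample_period_s):
    """Independent samples acquired per meter of wellbore:
    seconds needed to drill one meter divided by the sample period."""
    seconds_per_meter = 3600.0 / rop_m_per_h
    return seconds_per_meter / sample_period_s

# At 90 m/h with an 8 s sample period, only five independent samples
# cover each meter - yet the LAS file of Figure 5 reports ten values
# per meter (one every 10 cm).
assert samples_per_meter(90, 8) == 5.0
```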

Depth ROP Qdens1 Qdens2 Qdens3 Qdens4 Density Bit size Caliper Drho GRc
m m/h g/cm3 g/cm3 g/cm3 g/cm3 g/cm3 in in g/cm3 API
2502.0 81.3857 2.1597 2.1506 2.1709 2.1728 2.1594 8.5 8.6004 0.0635 62.3265
2502.1 85.1353 2.1562 2.1430 2.1636 2.1705 2.1554 8.5 8.6465 0.063 61.9269
2502.2 88.8849 2.1459 2.1474 2.1660 2.1656 2.1551 8.5 8.6465 0.0655 60.8635
2502.3 92.6345 2.1346 2.1606 2.1750 2.1603 2.1572 8.5 8.6465 0.0701 58.7918
2502.4 91.4808 2.1283 2.1726 2.1835 2.1570 2.1591 8.5 8.6234 0.0745 59.2365
2502.5 90.3271 2.1306 2.1736 2.1836 2.1575 2.1581 8.5 8.6004 0.0758 62.0729
2502.6 89.1733 2.1372 2.1655 2.1742 2.1614 2.1545 8.5 8.6004 0.0734 62.4981
2502.7 88.0196 2.1475 2.1558 2.1589 2.1687 2.1519 8.5 8.5543 0.0695 62.7434
2502.8 82.5069 2.1600 2.1509 2.1465 2.1764 2.1525 8.5 8.5543 0.0663 63.0336
2502.9 76.9942 2.1702 2.1531 2.1421 2.1824 2.1559 8.5 8.5543 0.0651 60.7639

Fig. 5 This typical data presentation from a LAS format deliverable is misleading. Firstly, the regular sampling every 10 cm is significantly finer than the rate at which the formation was actually sampled; secondly, all logs are presented with many more digits behind the decimal point than their respective precision warrants. The caliper values are produced with a simpler interpolation algorithm, and the successive repeated values, shaded here for emphasis, betray the actual scarcity of the data.

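The repeated caliper values highlighted in Figure 5 suggest a simple screening test a data user could apply to any LAS column: count how often a value merely repeats its predecessor. A high fraction of repeats hints that the curve was up-sampled by sample-and-hold rather than measured at the listed rate. This is a hypothetical sketch (the function name and the repeat criterion are assumptions, not a standard QC test):

```python
def repeated_run_fraction(values):
    """Fraction of steps where a sample merely repeats the previous
    value - a crude indicator that a curve was up-sampled by
    sample-and-hold rather than measured at the listed depth rate."""
    repeats = sum(1 for a, b in zip(values, values[1:]) if a == b)
    return repeats / (len(values) - 1)

# Caliper column from Figure 5: the shaded runs of identical values
# dominate the listing (5 of the 9 steps are exact repeats).
caliper = [8.6004, 8.6465, 8.6465, 8.6465, 8.6234,
           8.6004, 8.6004, 8.5543, 8.5543, 8.5543]
frac = repeated_run_fraction(caliper)
```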
Fig. 6 Link between data rates and rate of penetration. It is challenging to have 10 samples per meter with a rate of penetration of 90 m/h. In red, a data rate of 8 s, a reasonable value to obtain a quantitative formation density with a ROP of 90 m/h, corresponds to five samples per meter. Modified from [Lamont-Doherty Earth Observatory, 2008].

Fig. 7 Exact match of a gamma ray curve between two runs. Two different logging companies have completed these two runs. The perfect match point is annotated. Note that there is no ROP curve for the deeper run.

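A data user can screen for the nudging suspected in Figure 7 with a trivial tie-in check between consecutive runs: an exact match between independently acquired curves is itself a warning sign, while a frank discontinuity (as in Figure 8) is the more realistic outcome. A hypothetical sketch - the function and its messages are illustrative, not an industry-standard test:

```python
def run_boundary_check(run1_last, run2_first):
    """Compare the last value of one run with the first value of the
    next at the tie-in depth. Two independent tools agreeing to the
    last digit usually means the data was adjusted to match."""
    if run1_last == run2_first:
        return "suspicious: exact match - was the data nudged?"
    return f"discontinuity of {abs(run1_last - run2_first):.1f} API"
```

Applied to the example of Figure 8, where the two runs differ by 27 API (almost two divisions at 15 API per division), the check would simply report the discontinuity; applied to Figure 7, it would flag the perfect match for explanation.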

Fig. 8 A more realistic presentation. The gamma ray scale (green curve) is such that there is 15 API per division. The difference between the last value of the previous run and the first value of the next run is 27 API.

WHO SHOULD DEFINE STANDARDS?

More generally, the confident use of log data requires a clear, correct and complete record of all operational and data generation details - which jointly may be called the contextual information - to be available to the data users, present and future.

This contextual information should be an integral part of the acquisition record, similar to the "print" or graphical presentation. Today's situation is that most graphical records are incomplete, misleading, and sometimes incorrect.

Now that the need for more complete information and clearer presentations is understood, it is necessary to define who should develop standards for deliverables. Intuitively, the data users should be the first persons involved in defining logging deliverables.

In fact, considering the complexity of the logging environment and the complication of the systems that bring data to the oil company, the onus is more on the vendor side. The logging company must ensure that all basic and auxiliary data and relevant contextual information are delivered. The similarity with the purchase of a car has often been used. The buyer does not need to specify to the car manufacturer that a clutch should be part of the car. This component is critical, so the car company - that is, the supplier - definitely needs to make sure that this clutch is designed and present in the car.

As mentioned in the introduction, standards have been proposed and poorly followed. In particular, the 1993 RP31A proposed revision [Reedy, 1993] clearly states that the "diagram of the logging instrument as it appears downhole should be provided, and show dimensions, physical measure points, and the placement of such devices as centralizers and standoffs." In addition, "for each curve presented, the version and revision date of all applicable software should be recorded. All variable parameters (e.g., environmental corrections, assumed lithologies and data filters) controlled by the logging engineer should be recorded." Any audit of recent deliverables indicates that these standards are not strictly followed by all logging companies.

Part of the problem is that the existing standards already mentioned, while remaining fully relevant, have become insufficient because they do not go into enough detail in relation to the acquisition of modern data, including LWD data.

Some logging companies claim that oil companies do not know what they want and often complain about being overwhelmed by the delivery of too much data they cannot understand. To this argument it should be retorted that instead of delivering less data, logging companies should provide more documentation that would enable the use of large data sets. A "complete data set" is a required deliverable, and if that results in a large data set (e.g., with waveforms), so be it. If the data users choose to throw most of it away, that is their prerogative and the eventual consequences are their responsibility.

GRAPHICAL STANDARDS

In this second decade of the 21st century, it seems out of fashion to write about print, film or graphical deliverables. Why are digital deliverables not sufficient? They would be if an exact reproduction of the graphical displays could be readily produced from the data set. At the time of writing, this is not yet the


case. While graphical displays are largely subsets of the digital data set, they frequently contain critical information that is not available in the digital data files.

It is for this reason that the proposed standardization starts with the graphical file.

The description of the components hereafter does not replace the existing standard document RP31A [API, 1967], which any company entering the logging business should be familiar with. The components of the graphical file are listed in Figure 9. These components come in addition to the main log, which is always delivered.

- Header
- Remarks Section
- Job Chronology
- Depth Information Box
- Tool Sketch
- Well Sketch
- Well Plot
- Survey (deviation and azimuth data) Listing
- Parameter Listings
- Parameter Change
- Calibration Info Box
- Repeat Passes
- QC Log/QC plots
- LQC Stamp

Fig. 9 The 14 points

Header. The header is the oldest piece of auxiliary information in a log, and the previous documents RP31 and RP31A document it very well. It is important that it is filled correctly, clearly and completely. Of most importance is the information that is found on no other records, e.g., the mud information, the maximum recorded temperature, and the equipment identification and parameters. It is noticeable that logging companies seem to use empirical formulae to derive mud properties instead of measuring them. When oil-base mud is involved, logging engineers tend not to dig any further and do not fill the appropriate boxes. This needs to be corrected. The well location is found elsewhere in the oil company archives, but an effort should be made to report it correctly on the header.

Remarks section. Remarks are essential and must be used to document any nonstandard circumstances regarding tool operation, log presentation, anomalies, special requests, customer orders or authorizations that change the standard operational procedures, and special drilling fluid ingredients.

In the last ten years, this component has been more and more neglected. Logging engineers know that filling in remarks means more time spent explaining anomalies to oil companies. Oil companies dislike remarks because they indicate that the job was less than optimal. But truthful remarks could avoid wasteful work in the future, so every effort, on both sides, should be made to submit meaningful remarks. Remarks are a team effort shared by logging and oil companies, and are best written jointly by the loggers and the witnesses at the wellsite. They can be captured as items for Lessons Learned and be used for further training.

In particular, any annotation found on the main log or repeat log should be repeated in the Remarks section along with an indication of the depth at which the annotation was made.

Job chronology. The job chronology was successfully introduced in the early 2000s in the North Sea. It is now a common log component observed on every North Sea job run for any oil company by any logging company. Its use should be extended worldwide.

What is its rationale? Every logging job is different, and the sequence and nature of even minute events during the acquisition may have a significant impact on the evaluation and interpretation of the data.

The circumstances need to be reported by the logging engineer in a chronology of the job. This valuable information helps the data user considerably, especially for cased hole logs and LWD logs. A well-documented example is shown in Figure 10.

Depth information box. Although depth is recognized as a fundamental measurement, it is today clouded by many questions that will not be discussed here (see [Theys, 1999] and [Loermans et al., 1999] for a discussion of wireline logging depth). In any case, the depth measurement needs to be documented on the data record itself. Depth sensors and related calibration are to be reported.

Wireline: There are (at least) two possible methods to measure depth: magnetic marks or precision wheel. Which method is used needs to be reported. Wireline depth is traditionally corrected for stretch. The amount

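The 14 components of Figure 9 lend themselves to mechanical completeness checking on the receiving side. A minimal sketch, assuming a delivery is represented in-house as a dictionary keyed by component name - the dictionary representation and function name are assumptions; the component list itself is the one in Figure 9:

```python
# The 14 graphical-file components of Figure 9.
REQUIRED_COMPONENTS = [
    "Header", "Remarks Section", "Job Chronology",
    "Depth Information Box", "Tool Sketch", "Well Sketch", "Well Plot",
    "Survey Listing", "Parameter Listings", "Parameter Change",
    "Calibration Info Box", "Repeat Passes", "QC Log/QC plots",
    "LQC Stamp",
]

def missing_components(delivery):
    """Return the Figure 9 components absent from a delivery record,
    so an incomplete delivery can be queried (or rejected) on receipt."""
    return [c for c in REQUIRED_COMPONENTS if c not in delivery]
```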

of stretch and the way it is derived also need to be reported.

LWD: The amount of correction at drillpipe connections needs to be reported. If another organization is measuring depth, there should be full traceability to this third party.

Nov 11th
01:30 Disconnect ABT from CDT string to check it alone
02:30 Start checking ABT
03:00 Start ABS 49 base oil station logs
03:30 Start ABS 48 OBM station logs
04:00 Do simulation logs with parameters as per logging program
05:00 Start checking ABS 173
06:00 Finish checking ABS 173 ---> USE THIS AS MAIN
06:10 Start checking Packer tool
06:45 Finish checking Packer tool
10:30 Finished calibrating tension device TD 1234 for 7-48 and 7-46 cables
11:30 Checked swivel head 567 with mega-ohm-meter
14:00 Operations and safety meeting with contractors personnel onboard
15:00 EFX125-GHT string ready for operational check
16:20 EFX125-GHT OK
16:55 Communication with onshore checked. OK.
17:20 Netsighter checked OK.
17:25 Announcement made to stop all hot work due to gas in mud above 184
20:00 Start checking backup EFX. CANNOT CLOSE EFX caliper. Start troubleshooting.
21:00 Restart surface system; still cannot close EFX caliper.
21:05 Check continuity and insulation on logging head. Checks are OK, but EFX still fails.
21:30 Cable trim. EFX still fails.
22:00 BTMS 3660/ BTMD 3683/ BTCC 3675 FAILED
22:15 Start checking backup CDT modules.
……

Fig. 10 Example of job chronology

When events relating to depth occur during the data acquisition operations, e.g., a problem with a heave compensator, these must be clearly reported in the remarks section.

Tool sketch. The tool sketch gives a visual display of the combination of logging tools. The presence, position and dimensions of auxiliary equipment, such as knuckle joints, standoffs and eccentralizers, must be shown, as they are essential to the acquisition and subsequent use of accurate information. The tool serial numbers are important to verify the match with calibration data. It is also useful to have these numbers to track rogue tools giving erroneous data.

In LWD, an accurate BHA sketch plays the same role.

Well sketch. The well sketch is described in [Reedy, 1993]. It should include a description of the open hole sections and of the tubulars (casings and tubings) used in the well. The well sketch should represent the well at the time when the logging data was acquired. When a hole opener is used, the hole size changes depending on where the opener is and whether the data is acquired while drilling or reaming. These details should be reported.

Well plots and deviation survey listing. Most wells today are deviated. Directional data is almost always available and can be found in a separate data folder. Frequently the survey available at the time of producing the print is not the "Definitive Survey," and this must be specified explicitly too. Considering that, years after logging, well data records often have a difficult time meeting again, it is strongly suggested that this data be made an integral part of the log data file. Well plots (vertical and horizontal projections of the well trajectory) are particularly useful to warn the log analyst when a large apparent dip correction needs to be performed on deep resistivity logs.

Parameter listing and parameter changes. The production of logging curves frequently requires extensive processing controlled by a large number of options and parameters. The value of the log data is greatly reduced unless these parameters are clearly recorded and available to the log analyst. These parameters answer the following questions:

- How much is the raw data filtered? Could thin beds be hidden by the filtering process?
- Which environmental corrections have been performed? Which environmental parameters were used to drive these corrections?
- Which processing options were used?

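Because a chronology like Figure 10 is plain "HH:MM free text" lines, it is straightforward to parse into structured events for archiving or Lessons Learned extraction. A minimal sketch - the line format assumed is the one visible in Figure 10; real chronologies may vary:

```python
import re

def parse_chronology(lines):
    """Parse 'HH:MM free text' chronology lines (Figure 10 style) into
    (minutes-since-midnight, text) events. Continuation lines without
    a timestamp are appended to the previous event."""
    events = []
    for line in lines:
        m = re.match(r"^(\d{2}):(\d{2})\s+(.*)", line.strip())
        if m:
            h, mn, text = int(m.group(1)), int(m.group(2)), m.group(3)
            events.append((h * 60 + mn, text))
        elif events and line.strip():
            # Wrapped line: glue it onto the previous event's text.
            t, text = events[-1]
            events[-1] = (t, text + " " + line.strip())
    return events
```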

If a parameter is changed during the logging run, there should be a box to report the values and the depth of the change.

Calibration box. Most measurements cannot be considered accurate unless they have been calibrated. A calibration record needs to be supplied by the logging company even if the calibration is performed by a third party. The logging company needs to provide documentation stating the requirements of the calibration area (for instance, it may be impossible to perform a valid calibration in a metallic environment, e.g., on an offshore rig) and the calibration frequency. Post-acquisition calibrations have been shown to be useful for LWD density-neutron logging. They help detect excessive wear. All logging companies are encouraged to perform them.

Repeat passes. Repeat passes are particularly useful to assess the precision of the measurement in situ. They are absolutely required for formation evaluation logs acquired with wireline. Repeat logging is also valuable with LWD data, although such data is generally not acquired during a dedicated pass, but instead opportunistically, while reaming, cleaning hole, pulling out or running in. Such LWD data is normally delivered as part of the run-by-run data sets. Their added value in LWD mode is not so obvious. They are not an absolute requirement in this mode of conveyance [Theys, 1994].

QC curves. Many curves (channels) of the raw data have a Quality Control (QC) function. They are essential to assess the quality of the data to be exploited, as well as to reprocess the data at a later time should this be needed or required. The first dedicated quality curve introduced in the industry was the density correction curve. It became so popular that its absence from the log was quickly noticed by the log analyst. The QSS and QLS curves of the Litho-Density tool were widely used in the 1990s. The latest tools often have color-coded quality control flags, and these must be included in the deliverables.

When a new tool is designed, it often happens that not all the environmental corrections are available. This is the case for eccentricity and apparent dip effects on induction tools. These limitations can be overcome by re-processing, but this is only possible if the raw curves are available. It is therefore very important that all the relevant raw data is delivered. In practice many of these curves are often not provided to the customer, and possibly not even recorded by the logging company.

The overwhelming recommendation is that all raw curves and all quality control curves should be delivered to the oil company. This cannot be over-emphasized. It must become mandatory and standard to the extent that a data set delivered without the raw and quality control curves should be rejected outright.

As quality curves are unglamorous and often rather esoteric, clever presentations, such as color-coded flags, should be designed by logging companies. An example of such a design can be found in [Barber, 1999].

LQC stamp. Most organizations manufacturing complicated products include a quality stamp in their delivery. The stamp indicates that the product has been checked in a systematic manner. Figure 11 shows an example of an LQC stamp used by a directional measurement company. The LQC should be confirmed by a person different from the data producer, i.e., the logging engineer, and should be signed. Today, the use and inclusion of the LQC stamp are not widespread and not consistent.

DIGITAL STANDARDS

Currently, the two most popular formats that deal with log data are DLIS [API, 1996] and LAS [CWLS, 2009]. The two formats do not give any instruction on the content. Logging companies may deliver a lot of data, or very little, as illustrated in Figure 12. For similar measurements, company A delivers 15 channels; company B, 289 channels. It is obvious that much less data quality control, validation and reprocessing would be possible with the data set supplied by company A. Company B, though much more generous in terms of delivery, is not consistent. For the same oil company, in the same field and for the same measurement, it would deliver a different number of channels.

Logging companies often prefer to deliver the least amount of digital data possible. They argue that oil companies are often unable to handle the sheer

quantity of records. Nevertheless, this data is needed to perform quality control, to eventually repair or reprocess data sets, and to improve environmental corrections; in reality, to perform better interpretations.

Fig. 11 Example of LQC stamp

Logging company   Measurement           Sampling   Channels
A                 Spectral GR           6 in       5
                  Dual laterolog        6 in       2
                  Dual spaced neutron   6 in       1
                  Gamma ray             6 in       1
                  MicroSFL              6 in       1
                  Spectral density      6 in       5
                  Total                            15
B                 Triple combo          6 in       113
                                        1 in       30
                                        0.1 in     3
                                        0.2 in     5
                                        2 in       138
                  Total                            289
B (well A)        NMR                   7.5 in     68
B (well B)        NMR                   7.5 in     81
B (well A)        Advanced neutron      6 in       65
B (well B)        Advanced neutron      6 in       62

Fig. 12 Comparison of the contents of digital files delivered by two logging companies.

Digital records contain time- or depth-indexed data (channels or arrays), but also parameters. In addition, it is possible to add any contextual information: there is no technical limitation to prevent a complete novel from being added to the digital file. Many logging companies use this capability to add parameter listings, remarks and log chronologies to the file. But again, when it comes to non-depth-indexed data, the data users face a real lottery; they may get a lot, or hardly anything. In the absence of a standard defining the required deliverables, some data sets may be rich and others minimal. The onus is on the oil company to verify that the data delivered is complete.

Critical components of the graphical file. Many log analysts do not look at graphical files. Therefore the digital file should contain any critical information that is shown on the graphical file, as illustrated in Figure 13.

Recommended deliverables. A single digital data file per well is often delivered to oil companies; it is called a composite file. This type of file has a number of shortcomings.
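The channel counts compared in Figure 12 are easy for a data user to verify mechanically. As an illustrative sketch only (not a full LAS reader; production work should use a dedicated parser), the following counts the curve mnemonics declared in the ~Curve section of a LAS 2.0 file. The sample file content is hypothetical:

```python
def count_las_channels(las_text: str) -> list[str]:
    """Return the curve mnemonics declared in the ~Curve section of a LAS 2.0 file."""
    mnemonics, in_curves = [], False
    for line in las_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        if line.startswith("~"):
            # A section header: we are in the curve section iff it starts with ~C
            in_curves = len(line) > 1 and line[1].upper() == "C"
            continue
        if in_curves:
            # LAS curve definition line: MNEM.UNIT  API-code : description
            mnemonics.append(line.split(".", 1)[0].strip())
    return mnemonics

# Hypothetical minimal LAS content for illustration.
sample = """~Version
VERS.   2.0 :
~Curve
DEPT.FT     :  Depth
GR  .GAPI   :  Gamma ray
RHOB.G/C3   :  Bulk density
DRHO.G/C3   :  Density correction (QC curve)
~ASCII
"""
print(count_las_channels(sample))  # ['DEPT', 'GR', 'RHOB', 'DRHO']
```

A simple inventory like this also lets the user check whether QC curves such as the density correction are present at all in a delivery.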
A composite file does not include individual run parameters. A DLIS record, for instance, allows only a single value for a given parameter per data file. So when environmental parameters, tool numbers or calibration coefficients change (as they generally do when several runs are involved), a DLIS file only keeps a record of the last value of the parameter, that is to say, the value the parameter took during the last run. If the log analysts want to fine-tune the environmental corrections, they will be unable to find the values for the runs before the last.

Consequently, the recommended deliverable is a folder containing a separate complete file (with raw data and QC curves) for each run and each pass. The composite file is optional but often requested, and it may contain only a selection of curves rather than the complete set. This is acceptable: the composite file is a derived service, not the raw data product.

Fig. 13 Example of critical information available on a graphical file. This information also needs to be included in the digital file. Top: Critical annotation. Bottom: Important remark on a calibration tail.

ADDITIONAL REPORTING FOR LWD

The previous recommendations apply to all types of logs, including open-hole, cased-hole wireline, and LWD. But additional information is required for a complete package of LWD data.

Tick marks. In a previous section, it was shown that sparse data can be interpolated to appear as data sampled every 10 cm or 6 in, giving the log analysts the comfortable but possibly false feeling that they have plenty of data. One graphical presentation reduces the confusion: a tick mark is shown whenever a measurement is actually performed. Tick marks should be associated with every important curve; for instance, there should be tick marks for the gamma ray curve, another series of tick marks for the density curve, and so on.

Fig. 14 Tick marks shown in the depth track indicate a variable data density. Between 623 ft and 610 ft, there are only four data points. Between 656 ft and 638 ft, there are six data points. Observe the plateau on the gamma ray curve.

Recording rate or memory acquisition rate. This is the duration of a single sensor measurement before any processing. The literature mentions numbers from 5 s to 60 s, but this number is seldom reported. It is required, for each sensor and each measurement, to estimate the precision and resolution of the measurement.
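For a counting-based (nuclear) sensor, the dependence of precision on the memory acquisition time follows Poisson statistics: the relative uncertainty is 1/sqrt(N), where N is the number of counts accumulated during the acquisition time. A minimal sketch, using a hypothetical count rate for illustration:

```python
import math

def relative_precision(count_rate_cps: float, acquisition_time_s: float) -> float:
    """Poisson counting statistics: relative standard deviation = 1/sqrt(N),
    where N = count rate x acquisition time."""
    n = count_rate_cps * acquisition_time_s
    return 1.0 / math.sqrt(n)

# Hypothetical detector at 100 counts/s: quadrupling the memory
# acquisition time (5 s -> 20 s) halves the statistical uncertainty.
print(round(relative_precision(100.0, 5.0), 4))   # 0.0447
print(round(relative_precision(100.0, 20.0), 4))  # 0.0224
```

This is why the acquisition time must be reported: without it, the statistical precision of the delivered measurement cannot be estimated.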
Update rate. This number is sometimes mentioned in the literature, from 11 to 100 s, but is never reported on a log. It is the rate at which real-time data is sent uphole. Given the bottleneck imposed by mud-pulse transmission, this rate is much lower than the memory acquisition rate. It also varies with the capacity of the downhole memory, expressed in hours of acquisition.

Maximum rate of penetration. At a high rate of penetration, it is impossible to obtain data at a rate of two samples/ft or one sample every 10 cm. Knowing the maximum rate of penetration for a given sample rate (2 samples/ft or 1 sample every 10 cm) would be extremely useful for the interpreter. This number can be computed once the memory and transmission settings have been selected, before the actual logging. That number must be communicated to the drillers before the operations, to ensure that data of adequate sampling is acquired, particularly in the sections of most interest.

Information on gaps. Due to the irregular motion of the BHA and of the logging sensors, and to incidents with depth such as retying, recalibration or a computer glitch, it may happen that no measurement is performed in a given depth interval. This can also occur over successive intervals. At which point does the logging company make it obvious that there are gaps? One company does report this information (see Figure 15), and it should be reported by all. In any instance, the raw data, prior to any interpolation, should also be delivered.

Fig. 15 Information on gaps and interpolation

Real Time (RT) versus Recorded Mode (RM) data. During operations, many decisions are taken on the basis of the available Real Time LWD data. Later, the Recorded Mode data becomes available. Later yet, an end-of-well package may include an updated set of RM data, with slightly different curve values and depths. It is essential to include in the end-of-well package the original RT data (with values and depths as received in "real time") to capture exactly what data was available when operational decisions were taken.

Summary table. Several of the above requirements can be summarized in a table proposed in 1994 [Hutchinson, 1994] but not yet implemented by any logging company (Figure 16).

                          AAA    BBB
Memory acquisition time
Update rate
Max ROP (2 samples/ft)

Fig. 16 Summary table for LWD acquisition parameters. AAA and BBB each represent a different measurement.

This table is a strict minimum, and it is recommended to collect the following additional information:
- Number of time frames per depth sample in real time, for each relevant channel
- Number of time frames per depth sample in recorded-mode processing, for each relevant channel

These channels can easily be included in the delivered LAS file [Flaum, 2014]. This is all the more important for information recorded with long measurement sequences (e.g., NMR).

CONCLUSIONS

Establishing standards is a challenging endeavor. A new standard means change, and most would prefer to keep their old ways. Adapting to a new standard requires special efforts in training, implementation and control. But the payoff is important: implementing and adhering to standards means less miscommunication, fewer errors, and higher confidence. Standards-compliant data is higher-value data which benefits the individual data user in the short term, and the entire industry in the longer term.
One important benefit of standardization is easier data quality control. Today, logging data is delivered in so many different formats that it is very difficult to train people to recognize quality issues. For instance, every logging company has different methods of presenting calibrations, and it is not easy to detect non-conforming records.

The recommendations in this document are public and non-proprietary. Exceptions to the standards are not encouraged, but they are possible. An oil company may decide to waive a requirement (for example, not requiring a well plot or a directional survey), but this should be clearly agreed with the data supplier. In all cases, rigorous requirements need to be set before the logging jobs.
REFERENCES

API RP 31A, 1967, Standard form for hardcopy of downhole well log data.

API RP 66, 1996, Recommended practices for exploration and production data digital interchange.

Barber, T., et al., 1999, Real-time open hole evaluation, Oilfield Review, summer edition.

Canadian Well Logging Society, 2009, Floppy disk committee, Log ASCII Standard.

Doll, H., 1948, The S.P. log: theoretical analysis and principles of interpretation.

Flaum, C., 2014, personal communication.

Hutchinson, M., 1994, Log quality assurance of formation evaluation measurement while drilling data, SPWLA 35th annual logging symposium, June 19-22.

Lamont-Doherty Earth Observatory, 2008, Seminar in Marine Geophysics (www.ldeo.columbia.edu).

Loermans, T., Kimminau, S., and Bolt, H., 1999, On the quest for depth, SPWLA 40th annual logging symposium, Oslo, Norway.

Reedy, G. K., 1993, API recommended practice for resistivity and other electromagnetic logs, Paper X, SPWLA 34th annual symposium.

Schneider, D. M., Hutchinson, M., and Deady, R., 1994, Processing and quality assurance of unevenly sampled nuclear data recorded while drilling, SPWLA 35th annual logging symposium, June 19-22.

Theys, P., 1994, A serious look at repeat sections, SPWLA 35th annual logging symposium, June 19-22.

Theys, P., 1998, Le log column, The Log Analyst: report on the Taos data quality topical conference.

Theys, P., 1999, Log data acquisition and quality control, Editions Technip.

ABOUT THE AUTHORS

Philippe Theys graduated from Ecole Centrale de Paris, has written several dozen papers and short notes, and is the author of two books, Log Data Acquisition and Quality Control and Quest for Quality Data. He consults for oil companies. In his spare time, Philippe travels (187 countries to date) and writes non-technical books.

Thuy Rocque is Director of Petrophysical Technology for Anadarko Petroleum Corporation. In her current role, she leads a large team of petrophysicists providing global formation evaluation support to all exploration and development projects. Thuy has been in the petroleum industry for 29 years, working as both a petroleum engineer and a petrophysicist in various international and domestic assignments. Thuy began her career as a petroleum engineer working the large oil & gas fields in East Texas. Eventually, she transitioned to being a petrophysicist as she discovered the powerful combination of integrating engineering & geoscience information. Prior to joining Anadarko, Thuy was with ARCO, Pennzoil, Miller & Lents, and Kerr-McGee.
Thuy has a Bachelor of Science degree in Petroleum Engineering from the University of Texas at Austin. Thuy is active in the petroleum industry, especially the petrophysical community, where she has held several Society of Petrophysicists and Well Log Analysts (SPWLA) Board positions, including Houston Chapter VP, Regional Director, and VP of Membership, Finance, and Administration. Thuy is also active within Anadarko, helping recruit and guide staff career development by serving on the Talent Review Committee and the Technical Recruiting Committee, and she co-founded a Petrotech Women Resource Group. She has also represented Anadarko and the energy industry in Washington DC to inform Senate & Congressional leaders & staffers about the roles of women in oil & gas.

Monica Vik Constable is a Leading Advisor in Petrophysical Operations in Statoil. She started her career in the oilfield service industry in 1997 after receiving her MSc degree in Chemistry at the Norwegian University of Science and Technology. After six years working as a wireline field engineer for Schlumberger in the UK, Holland and the US, she started working as a Petrophysicist for Schlumberger Data and Consulting Services in 2003. In 2005, Monica joined Statoil as a Senior Petrophysicist working on the Kristin field; she subsequently went on to work in several other Statoil assets as a Principal Petrophysicist. Monica took on her current position as a Leading Advisor in Petrophysics in Statoil in 2009.

John Williams is a graduate of Imperial College London and is Senior Petrophysics Advisor in the BP New Well Delivery function. The wide-ranging aspects of data and their acquisition consume much of his time, but it is worth it, as data are one of the industry's most valuable assets. He is a former Regional Director of the SPWLA.

Martin Storey started in the oil and gas industry as a logging engineer in South America some 25 years ago, then worked in Gabon as a wellsite petroleum engineer, petrophysicist and Total Quality Management champion. An assignment in Brunei as a petrophysicist and petrophysical technology planner followed. In both Gabon and Brunei, he implemented a large well data management solution. Since 1998, he has been an independent consultant and a trainer based in Western Australia. His areas of interest include operations, practical petrophysics, auditability and quality. He holds a BSc in Mathematics and Computer Science from Stanford University and an MSc in Electrical Engineering from the California Institute of Technology (USA).