ARTICLE

Strange but True Stories of Synthetic Seismograms

Paul Anderson* and Rachel Newrick**
*Apache Canada; **Nexen Petroleum U.K.

What follows is a summary of the April 2008 Lunchbox Presentation made by the authors. Additional information is available from the CSEG website at http://www.cseg.ca/events/webcasts/2008/lunchbox/04apr-NewrickAnderson/2008Apr21-NewrickAnderson.html.

Abstract

Synthetic seismograms are critical in understanding seismic data. We rely upon them for an array of tasks, from identifying events on seismic data to estimating the full waveform for inversion. That said, we often create and use synthetic seismograms without much thought given to the input log data or erroneous synthetic results. Contrary to popular belief, and despite how we treat them as interpreters, log measurements are not ground truth, and not all synthetic seismogram algorithms are created equal. Have you ever created a synthetic seismogram that has a tenuous tie to seismic? Have you wondered why the synthetic seismogram changes significantly when the density curve is included? Have you queried the quality of the sonic and density logs used to generate your synthetic seismogram? Have you seen a synthetic seismogram change when a different software package is used? If you can answer yes to any of these questions, and/or you would agree that synthetic seismograms are not always simple in your world, then this is the article for you.

We present the nuts and bolts of synthetic seismograms from the perspective of the seismic interpreter; we look at what can go wrong, what does go wrong, and what you can do to prevent falling into some of the pitfalls that arise.

According to the Schlumberger Oilfield Glossary, a synthetic seismogram is defined as:

“The result of one of many forms of forward modeling to predict the seismic response of the Earth. A more narrow definition used by seismic interpreters is that a synthetic seismogram, commonly called a synthetic, is a direct one-dimensional model of acoustic energy traveling through the layers of the Earth. The synthetic seismogram is generated by convolving the reflectivity derived from digitized acoustic and density logs with the wavelet derived from seismic data. By comparing marker beds or other correlation points picked on well logs with major reflections on the seismic section, interpretation of the data can be improved.

The quality of the match between a synthetic seismogram and the seismic data depends on well log quality, seismic data processing quality, and the ability to extract a representative wavelet from seismic data, among other factors. …”

The basic process of creating a synthetic is simple (Figure 1) and described in many geophysical textbooks; however, it is in the details that problems occur. Velocity, derived from the sonic log, is multiplied by density to obtain an impedance log. This

Figure 1. Cartoon describing the process of generating and correlating a synthetic.

December 2008 CSEG RECORDER 51
is converted to time, reflectivity calculated, and convolved with a wavelet to obtain the synthetic seismogram. In the Schlumberger definition presented above, we have highlighted some of the inputs and steps that can have problems and/or complications. Keep in mind that doing everything “correctly” can still lead to issues at any or all of the above steps. Below, we discuss a few pitfalls we have witnessed.

Density and Sonic Logs

As geophysicists, we often assume that the well information, particularly log data, provided by our geologist or petrophysicist is the “ground truth”. Nonetheless, we often find high-amplitude events in our synthetic seismograms that do not correlate to geology. A common “fix” is to throw out the density log and build synthetics without it. Depending on the software we use, that means the density is approximated with a constant, or with some pre-defined regression to the sonic log. In either case, what we have done is declare the entire density log to be unfit, without really understanding the full picture. An old proverb declares that the devil is in the details. In this case, it turns out, as so often in geophysics, the details are a QC issue.

Density logs are greatly affected by borehole conditions such as washouts or invasion. Reliable density log measurements of the rock require good pad contact with the borehole wall. Without good pad contact, density log measurements will be spuriously low. A geologist or petrophysicist may indicate that the density log is “fine” over the zone of interest. “Fine” is a subjective term. The subjectivity is related to the purpose of the log. If one is simply looking for porosity, then the presence of a washout may indicate porosity, and the log is then suitably “fit for purpose”. If one needs a reasonably accurate measurement of the rock’s bulk density, in order to build a representative synthetic seismogram, then a washout zone is definitely not “fit for purpose”. Geophysicists must be clear with their professional colleagues: “fit for purpose” requirements vary.

Simply discarding the density log may not be the best procedure, as it may lead to erroneous correlations or phase assumptions. Discarding the entire log implies that we have concluded that the entire log is free of any meaningful content. This is clearly not the case. A better option may be to regenerate the density log over affected intervals using all available supplemental information, although we suggest that this is best left to a petrophysicist. Figure 2 shows an example of a synthetic created using velocity * “raw density”, velocity * “edited density”, and velocity only (constant density), where velocity is calculated from the sonic log. Differences exist between all three, leading to potentially different interpretations of which events correspond to the tops of the sands.

Despite being less sensitive to borehole conditions, sonic logs can also have issues. Sonic logs are measured over a much larger interval, with redundancies built in to most units. Additionally, sonic logs are the result of a process similar to first-break picking on conventional seismic data: essentially the result of picking the time of the direct arrival on several receivers and computing the time it takes energy to travel from the source to each of the receivers, spaced over several meters. In the end, sonic logs appear to be better behaved because they effectively average over a much larger interval than density. The easiest thing we can do to check the logs is to look at a caliper log. This will give you an indication of the potential for poor readings and where the synthetic amplitudes are suspect. Additionally, this information can be used with additional logs and/or wells to reconstruct the poor log values over that interval, providing a means of creating a more reliable synthetic trace. However, such replacement techniques must be used with care. If a section of log is

Figure 2. Different synthetics created with different density curves. Track 2 shows the raw density log (grey) compared to the edited density log (black); the change in density has been highlighted. Note how the sands change from peaks to troughs on the synthetic trace, which can yield a very different interpretation. Using a constant density (the same as no density) also produces a trace that differs from what should be expected. Courtesy Jay Gunderson.
“bad” in a well, this suggests something about the rock properties in that part of the well. If it is replaced by a “good” section of log from a different hole, it may be that the log was only good because the rock properties were different in the other well, and it is therefore not an appropriate replacement.
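The caliper check mentioned above can be automated as a rough first pass. The sketch below flags over-gauge intervals where density readings are suspect; the curve values, the 216 mm bit size, and the 25 mm tolerance are invented for illustration, not a standard:

```python
import numpy as np

# Hypothetical curves: 216 mm (8.5 in) bit, caliper in mm, short interval.
depth = np.arange(1000.0, 1010.0, 0.5)    # m
bit_size = 216.0
caliper = np.full_like(depth, 220.0)      # slightly over-gauge hole
caliper[8:12] = 310.0                     # washed-out interval

# Density readings require pad contact: flag samples where the hole is
# enlarged well beyond bit size (the 25 mm tolerance is an assumption).
washout = caliper > bit_size + 25.0
suspect_intervals = depth[washout]
print(suspect_intervals)                  # depths where density is suspect
```

In practice the flagged depths would be compared against the synthetic's high-amplitude events before any log editing is attempted.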

One common problem with older logs is that they have often been digitized from paper, and from time to time human error enters the equation. Figure 3a shows a dramatic acoustic velocity increase at 1350 m. Referring to the raster copies of the logs (Figure 3b), we see that there is a change in the grid at the velocity break, suggesting that there may be a digitizing problem. It is possible that the digitizer thought that this was the location of a scale change, although we note that scale changes are more often missed than inserted by mistake. To check, we can often electronically peel back the lower log to see the overlap zone, as shown in Figure 3c. While this is easy to check and to fix once recognized, we must first visually recognize the possible error, making it potentially harder to diagnose.
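A scale break like the one in Figure 3a can also be caught automatically by looking for sample-to-sample jumps far outside the log's own variability. The synthetic log below, its noise level, and the ten-times-median threshold are all assumptions for illustration:

```python
import numpy as np

# Hypothetical digitized sonic (slowness in us/m) with a scale error:
# below 1350 m the apparent slowness halves, as if a scale change had
# been inserted by mistake during digitizing.
rng = np.random.default_rng(0)
depth = np.arange(1300.0, 1400.0, 1.0)    # m
dt_log = np.where(depth < 1350.0, 400.0, 200.0) + rng.normal(0.0, 2.0, depth.size)

# Flag sample-to-sample jumps far larger than the log's own variability.
step = np.abs(np.diff(dt_log))
threshold = 10.0 * np.median(step)
suspects = depth[1:][step > threshold]
print(suspects)                           # candidate splice/scale-break depths
```

Any depth flagged this way is a prompt to pull the raster copy, not an automatic edit.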
Figure 3. Example of a digitizing problem common with older wells. (a) shows the downloaded curve, (b) shows the paper logs, where there is a potential change in the lateral scale. (c) shows the log displayed with the correct scales. Note that the apparent break on the downloaded curve (a) is not present on the edited raster image (c).

Sonic logs are effectively the result of geophysical processing being applied to an acoustic measurement of the subsurface (similar to first-break picking), which can
have the same problems as surface seismic data in terms of
noise, tool problems, etc. An example is
presented in Figure 4a, where erro-
neous values in the sonic log have
made it all but useless for synthetic
generation. When we look at the repeat
log (Figure 4b) we see that the same
erroneous values occur but not always
at the same depths. This does not mean, however, that the data itself is poor; rather, it may not have been fully QC’d.

Looking at the full waveform presented in Figure 4c, we can see that the semblance autopicker has jumped from the compressional waveform (DTCO3) to the shear waveform (DTSH3). The human eye quickly observes the error in the computer semblance pick – so, like seismic, we should be QC’ing the processing of well log data. It is also a good idea to request and archive a copy of the raw waveforms from your logging company so that the processing can be revisited as new technology becomes available.
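A main-pass versus repeat-pass comparison, like the one in Figure 4, can also be scripted as a basic screen for pick errors. The curves below and the 20 µs/m disagreement threshold are hypothetical:

```python
import numpy as np

# Hypothetical main and repeat passes of a p-wave sonic (slowness in us/m).
rng = np.random.default_rng(1)
depth = np.arange(2000.0, 2050.0, 0.5)                 # m
true_dt = 250.0 + 0.1 * (depth - 2000.0)
main_pass = true_dt + rng.normal(0.0, 1.0, depth.size)
repeat = true_dt + rng.normal(0.0, 1.0, depth.size)

# Simulate a skip on the main pass only: the picker jumps toward shear
# slowness (roughly Vp/Vs times the compressional slowness).
main_pass[30:35] += 180.0

# Where the passes disagree strongly, at least one of them is wrong and
# the interval should be inspected before synthetic generation.
disagree = np.abs(main_pass - repeat) > 20.0           # us/m threshold
print(depth[disagree])
```

As the text notes, agreement between passes is no guarantee of quality; this check only localizes intervals where the two passes cannot both be right.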

Another common pitfall often encountered by junior geophysicists is the “casing point reflection”. Logging runs typically extend from TD up to surface casing but should be truncated at the casing point before being interpreted. As we record up the borehole from relatively slow velocity shallow formation into solid steel casing, there is a large
impedance contrast that manifests itself as a high-amplitude event on the synthetic (Figure 5), which is solely associated with the surface casing (SC), not geology. This is easily spotted if all the logs are inspected in conjunction with the density and sonic, and we are aware of our location in the borehole. The resistivity tool shows a large increase when entering casing and provides a useful QC in this case. There are instances where this phantom event has been interpreted as a shallow marker and used to correlate across townships of data, despite not being a real geologic marker. Although this may seem to be an easy pitfall to avoid, it is far more hazardous when we consider all the intermediate casing points in deep boreholes.

We believe that the examples presented above highlight the necessity for interpreters to pay close attention to the well logs, to the point of having a general understanding of the data acquisition and processing and an idea of the borehole conditions. If in doubt, logs should be QC’d by a petrophysicist.

The order of operations

Thankfully, good logging runs without washouts or processing problems do occur, but that is not all that we are concerned about. When correlating logs to seismic, we know that we are calculating a synthetic from flawed log data and correlating it to
flawed seismic data, but how often do we consider the effect of our choice of software? After all, most of the software in use by major E & P companies has certainly undergone rigorous testing, right? True, but that does not mean it is perfect. The software may be running on a flawless computer, but programs do not always work the way we expect or assume.

Figure 1 shows the processing flow used by many software companies to create synthetics; however, are they all the same and, if not, does it matter? A comparison of the order of operations used by several of the major software vendors (Table 1) shows that not every package contains the same order of operations – and it matters.

    Flow A                           Flow B
    1. Load logs (sonic & density)   1. Load logs (sonic & density)
    2. Calculate impedance           2. Calculate impedance
    3. Calculate reflectivity        3. Convert to time
    4. Convert to time               4. Calculate reflectivity
    5. Convolve with wavelet         5. Convolve with wavelet

Table 1. Comparison of flows for creating synthetic seismograms.
The only difference is the order of steps 3 & 4, but does it matter? Three different synthetic seismograms created with identical well logs and wavelets (Figure 6) should theoretically be identical; however, we observe that the first two traces are reasonably similar but the third trace, the one that uses flow B, is quite different. The only thing that changed is the software used to create each trace.

Figure 4. A QC display from a dipole sonic run. Tracks (a) & (b) show the main pass and repeat p-wave sonic logs with cycle skipping present. The two right tracks show different picking displays, showing why the initial processing resulted in cycle skipping, and where the p-wave slowness should have been picked.
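Why the order of steps 3 and 4 matters can be demonstrated numerically: depth-to-time conversion (an interpolation) and reflectivity calculation (a differencing) do not commute. The model below is a toy sketch; the velocity and density functions and the 2 ms sample rate are invented for illustration:

```python
import numpy as np

# Toy impedance log sampled in depth: an interface at 400 m plus gradients.
depth = np.linspace(0.0, 1000.0, 501)                              # m
velocity = np.where(depth < 400.0, 2000.0, 3500.0) + 0.2 * depth   # m/s
density = 2200.0 + 0.1 * depth                                     # kg/m^3
impedance = velocity * density

# Integrate slowness to get two-way time at each depth sample.
dz = np.diff(depth)
twt = np.concatenate([[0.0], np.cumsum(2.0 * dz / velocity[:-1])])  # s

# Regular time axis for the synthetic (2 ms sampling).
t = np.arange(0.0, twt[-1], 0.002)

def reflectivity(z):
    """Normal-incidence reflection coefficients between adjacent samples."""
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

# Flow A: calculate reflectivity in depth, then convert (resample) to time.
rc_flow_a = np.interp(t[1:], twt[1:], reflectivity(impedance))

# Flow B: convert impedance to time first, then calculate reflectivity.
rc_flow_b = reflectivity(np.interp(t, twt, impedance))

# The flows disagree: Flow A smears the interface spike through
# interpolation, while Flow B keeps the contrast between adjacent
# time samples.
print(np.abs(rc_flow_a - rc_flow_b).max() > 1e-4)
```

In this toy model the two flows agree in smoothly varying intervals; the differences concentrate at strong contrasts, which is exactly where ties matter most.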

Does the wavelet make a huge difference?

Let’s assume that the log data is perfect and the software is creating synthetics as we expect – let’s consider the wavelet. Often the first thing an interpreter does with his or her seismic data upon receiving it from the processor is to correlate it with a synthetic, and, in those first few minutes, several judgments about the quality of the data and the seismic processing are made. The wavelet is a critical factor when generating the synthetic, but does it matter which one we use? The answer depends on what we are doing with the synthetic. For example, if we are simply estimating the approximate phase of the data (within 45°) and identifying the zone of interest, then most wavelets will generally be fine. If, however, we are working on a more detailed analysis, for Q-estimation or impedance inversion, then the wavelet can critically influence the result.
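The dependence on wavelet choice can be illustrated with zero-phase Ricker wavelets of different peak frequencies convolved with the same reflectivity. The reflection coefficients and frequencies below are invented for illustration:

```python
import numpy as np

def ricker(f, dt=0.002, n=32):
    """Zero-phase Ricker wavelet of peak frequency f (Hz), 2n+1 samples."""
    t = np.arange(-n, n + 1) * dt
    a = (np.pi * f * t) ** 2
    return t, (1.0 - 2.0 * a) * np.exp(-a)

# Invented reflectivity: one strong event plus a weak target 24 ms later,
# sitting in the side-lobe zone of a low-frequency wavelet.
rc = np.zeros(256)
rc[128] = 0.30       # major event
rc[140] = 0.02       # small hypothetical target

for f in (20.0, 40.0, 60.0):
    _, w = ricker(f)
    trace = np.convolve(rc, w, mode="same")
    # Apparent amplitude at the target, relative to the major event:
    print(f, abs(trace[140]) / abs(trace[128]))
```

In this sketch, the side lobe of the 20 Hz wavelet swamps the small target, while at 60 Hz the target is essentially isolated – the same masking effect described above for targets sitting in the side lobe of a major event.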

Figure 5. Synthetic made including the apparent “casing-point reflection”. Best practice would remove the upper (i.e. non-geologic) portion of the affected logs before generating a synthetic, to avoid potential pitfalls.

A comparison of a strong reflectivity contrast with four different wavelets is shown in Figure 7. While the main event does not vary much in this example, many of our

hydrocarbon targets are contained in the side-lobe of a major event (e.g. Wabamun porosity, Slave Point, Lower Mannville sands, etc.). The frequency content of the wavelet should closely match the wavelet contained within your seismic data, the estimation of which is another paper in itself.

Making a synthetic tie

We’re happy with our input logs, the synthetics are being generated as expected and the wavelet is appropriate for our seismic tie – so let’s tie the synthetic to the seismic. Most likely all interpreters have had some ‘fun’ tying synthetics to seismic data, so we are just going to illustrate one of the more bizarre software glitches.

In our last example, applying a bulk shift to a synthetic within a commercial software program went awry (Figure 8). A standard script was run to apply a specified time shift. Instead of applying the bulk shift, however, the program applied a stretch to the first portion of the log and a bulk shift to the rest, resulting in a well log that appears significantly longer than it is in reality. We note that the synthetic looks reasonable in the upper portion that was stretched, and we can even convince ourselves that the tie is reasonable because the seismic is low fold in the shallow section, so not all the events are correctly represented. But these events are not real! It is important to keep an eye on your time-depth curve and not take your software for granted.

Final comments and key points

We have presented a series of examples in which the creation and use of synthetic seismograms have created issues for interpreters. Synthetic seismograms are not as simple as they seem and many things can go wrong, so we advise:

• Use all available logs for QC
• If in doubt, check the paper logs
• Just like seismic, QC every step of well log processing
• If you need help, call your petrophysicist
• Know the borehole environment
• Keep an eye on your time-depth curve
• Know your wavelet
• Not all software is created equally

Not discussed but just as important are other, more complex factors, such as sample rates during depth-to-time conversion, AVO effects, attenuation and imaging effects. All these factors, and others, are contained within the seismic data we are correlating or our simple 1D synthetics, which can also lead to erroneous conclusions.

Figure 6. Large differences in synthetics using identical input data and wavelet parameters. Note the apparent amplitude/phase differences denoted by the red arrows and the large difference apparent in the cross-correlation of these 3 traces. Traces 1 & 2 were created using flow A from Table 1, trace 3 from flow B. Even following a similar flow, differences exist between the first two traces in how they handle the end of the logs, indicated by the apparent reflectivity present below 810 ms.

Figure 7. Differences in apparent reflectivity from wavelet differences. Note the differences in apparent resolution for these wavelets, for example within the box above. Often
the subtle character changes we are looking for from our targets (e.g. Wabamun porosity) can be masked or accentuated by the wavelet we use.


Acknowledgements

We are grateful to Apache Canada Corp. and Nexen Inc. for permission to present this material at the CSEG lunchbox talk in April 2008. We thank the audience at our lunchbox talk for participating in a lively discussion, providing personal experiences and asking interesting questions. Additionally, we would like to thank Dave Monk, Marco Perez and Ron Larson for their critiques of this article.

We would be remiss if we did not acknowledge the support of our spouses, Ian Crawford & Elizabeth Anderson, in putting up with us while we pulled out our hair trying to understand the issues around what most of us take for granted.

The Schlumberger Oilfield Glossary can be found at: http://www.glossary.oilfield.slb.com/

Figure 8. Software does not always behave. In this example, a bulk shift was applied to a synthetic; however, only the lower half of the log was bulk shifted, while the upper portion was stretched. The pitfall here is the apparent reflectivity created up-hole, which could be erroneously interpreted.
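A glitch like this is easy to catch with a one-line check on the time-depth curve: after a true bulk shift, every knot must move by the same constant. Here is a minimal sketch with invented time-depth pairs, contrasting a correct bulk shift with the stretch-plus-shift behaviour described above:

```python
import numpy as np

# Invented time-depth pairs for a well (depth in m, two-way time in s).
depth = np.array([100.0, 300.0, 600.0, 900.0, 1200.0])
twt = np.array([0.120, 0.310, 0.560, 0.790, 1.010])

def is_pure_bulk_shift(original, shifted, tol=1e-6):
    """A true bulk shift moves every time-depth knot by the same constant."""
    residual = shifted - original
    return bool(np.all(np.abs(residual - residual[0]) < tol))

good = twt + 0.020                                  # correct +20 ms bulk shift
# The glitch: upper knots stretched, lower knots bulk shifted.
bad = np.where(depth < 500.0, twt * 1.15, twt + 0.020)

print(is_pure_bulk_shift(twt, good))                # prints True
print(is_pure_bulk_shift(twt, bad))                 # prints False
```

Comparing the before and after time-depth curves this way takes seconds and would have exposed the stretch immediately.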

Paul Anderson was born and raised in Alberta and received a Bachelor’s degree in geophysics from the University of Calgary. He is currently working as a Senior Geophysicist in the Exploration & Production Technology group at Apache Canada Ltd. Paul’s current responsibilities include seismic processing QC, multicomponent processing & analysis, and AVO analysis & prestack inversions, in Canada and abroad. Paul is also working towards his master’s degree with the University of Calgary CREWES project.

Before joining Apache in January 2006, Paul was with Veritas GeoServices in Calgary (1998-1999, 2000-2005) and Houston (1999-2000), where his responsibilities included AVO & LMR analysis, post-stack inversions, modeling, multicomponent interpretation and fracture detection work. He has a wide range of experience in a number of geologic basins, including Western Canada, onshore Texas, Gulf of Mexico, Beaufort Sea, Mackenzie Delta, Alaska and offshore Australia. Paul is currently a member of APEGGA, CSEG, CSPG & SEG.

Rachel Newrick was born and raised in New Zealand, where she obtained her BSc in geology and BSc Honours in geophysics at Victoria University of Wellington. Since then, she has been a summer student for BHPP in Australia; spent 2 1/2 years traveling the world; worked for Veritas in Canada; undertook 4-month work terms with Occidental Petroleum and Exxon Mobil in the United States; co-authored a textbook on exploration geophysics with Dr. Larry Lines; and obtained a PhD in exploration seismology from the University of Calgary under the supervision of Dr. Don Lawton and Dr. Deborah Spratt. After graduating from the U of C in 2004, Rachel started work with Nexen Inc. in the Canadian Exploration New Growth Team as an exploration geophysicist, worked on prospects across SK, AB and BC, worked on and traveled to Colombia as a geophysicist, visited Yemen and NE BC while in the Formation Evaluation Team, where she worked for 18 months, and most recently has been posted to the UK as an exploration geologist.

She founded a motorcycle group for women in 1998 (Wild West Vixens) and a geoscience networking group for women in 2004 (GoGG – Girls of
Geology and Geophysics). Both are going strong. In her spare time she is either traveling, skiing, or road riding (albeit now in the UK, not Canada!).

