FracaFlow Petrel Comparison


Comparison of Fracture Modeling in OpenFlow 2011.1 & Petrel 2010.2
By Subrata Chakraborty

Contents

Introduction
(A) Summary of Fracture Modeling Workflow in FracaFlow (part of OpenFlow 2011.1)
    Input Data
    (i) ANALYSIS: FRACTURE RESERVOIR CHARACTERIZATION
    (ii) FRACTURE NETWORK MODELING
    (iii) FRACTURE MODEL CALIBRATION
    (iv) FRACTURE EQUIVALENT PARAMETERS COMPUTATION (Upscaling)
(B) Features in FracaFlow not available in Petrel
    (i) Attribute Computation from 2D Maps
    (ii) Curvature Analysis and Fault Picking
    (iii) Fault Density Computation in a Field
    (iv) Fracture Analysis at Wells
    (v) Fracture Aperture Options during Modeling
    (vi) Fault Corridor Analysis
    (vii) Unfolding
    (viii) Dynamic Data Analysis
        (a) Precise Production Data Analysis
        (b) Breakthrough Analysis
        (c) Display of Well Test Curves
    (ix) Subseismic Faults Generation
    (x) Automatic Kh Calibration (AKC)
    (xi) Automated Dynamic Calibration (ADC)
    (xii) Dynamic Tests Simulations
Summary
 


 
Introduction

FracaFlow is the 3D fracture modeling module of Beicip-Franlab's OpenFlow Suite version 2011.1. This write-up has two parts:

(A) The fracture modeling workflow as practiced in FracaFlow.

(B) Features of FracaFlow which, in the author's view, are much better than or nonexistent in Schlumberger's Petrel 2010.2. Only the FracaFlow features are described in detail; the corresponding Petrel details are omitted to keep this document concise.

The versions compared are the latest marketed by the two companies.
Comments about a comparable feature in Petrel are colored blue in this write-up.
An attempt has been made to analyze some of the special features of OpenFlow (FracaFlow) vis-à-vis Petrel with comparative comments. However, the judgment and opinions expressed in the following pages are exclusively the author's.
The purpose of this document is not to criticize the fracture modeling capabilities of Petrel; rather, it is an attempt to analyze one of our competitors in the field of fracture modeling and to understand what they do better than us and where we need to improve. Moreover, our DCS/SIS personnel in many parts of the world often come across clients who are comparing the various software packages available for fracture modeling, or who are transferring models built in FracaFlow to Petrel for its overall advantages. This document provides such an audience with valuable information on FracaFlow.

(A) Summary of Fracture Modeling Workflow in FracaFlow (part of OpenFlow 2011.1)

FracaFlow (part of OpenFlow 2011.1) divides the fracture modeling process into four distinct parts (Figure-1).
I. Data analyses to characterize the fractures and their relationships inside the
network.
II. Fracture and fault modeling using mainly stochastic methods, a realization of this
model is the Discrete Fracture Network (DFN).
III. Fracture model calibration using dynamic data.
IV. Fracture equivalent parameters computation, the final output for the fluid flow
simulator (upscaling).


 
Figure-1 : Broad Processes in Fracture Modeling in FracaFlow (2011).

Input Data
• Reservoir grid : 3D Reservoir grid
• Wells :
o Fractures: interpreted well imagery (FMI, UBI, etc.) or core interpretation (digitized files).
o Classical wireline log data: facies, porosity, gamma ray, etc.
• Horizons: Key structural horizons; if not available, they can be extracted from grid layer tops and units.
• Faults: Interpreted faults in the field; if not available, they can be picked in the Structural Analysis module.
• Property map or grid (density driver): if not available, attributes related to the
fracture density can be built in FracaFlow. A density driver combining different
maps can also be computed in FracaFlow.
• Dynamic data
o Flowmeter, productivity index, skin, permeability from cores, well tests,
mud loss, production data for the dynamic data analysis modules.
o Well and interference tests, flowmeter for the calibration modules
(i) ANALYSIS: FRACTURE RESERVOIR CHARACTERIZATION
The seven key data analysis steps are explained below (Figure-2). The applicability of each depends on the available data and on the field geology.
Step 1: Creation of the LithoStratigraphic scales


 
Figure-2 : Data analysis for Fracture Modeling.

Step 1.a: LithoStratigraphic scale from grid


The LithoStratigraphic scale from grid allows you to define the layering scheme using
the grid layers. This is essential for the software to recognise the stratigraphy it is
dealing with.

Step 1.b: LithoStratigraphic scale from horizons


The LithoStratigraphic scale from horizons allows you to define the layering scheme
using the horizons. It has to be consistent with the LithoStratigraphic scale from grid.
Step 2: Dynamic data analysis (Optional)

Input data: Flowmeter, productivity index, skin, permeability from cores, well tests, mud
losses and production data.

The objective is to determine whether the reservoir is fractured, and to estimate the impact of fractures on its dynamic behaviour. Detailed analysis of the dynamic data and cross-checking of the results allow assessing the occurrence of fractures and their capacity to act as flow conduits in the reservoir. This analysis consists of displaying:
• Histograms of dynamic data.
• Cross plots of the dynamic data versus the distance to faults: to evaluate the link
between fractures and faults (if any) and to measure the range of the fault
influence.


 
 
• Bubble maps of the dynamic data: to analyse the spatial distribution of the fractures.
• Log views of the different kinds of static and dynamic data.
The most useful dynamic data for assessing fracturing are the permeability (especially the comparison of the Kh product from a transient well test with the Kh computed from core measurements), the production data, flowmeter data, mud loss records and the productivity index of each well.
Output data: all the results are displayed in tables and bubble maps in a synthesis module.
The analyses can be weighted so as to emphasize some of them. The final output is an index of fracturing that can be translated into an answer to the question: "is this well fractured?"
Step 3 : Fracture analysis

Input data: fractures interpreted from well imagery or core interpretation.

A global analysis allows creating fracture sets, i.e. groups of fractures with the same orientation (dip and dip-azimuth). Then, with a well-by-well analysis, it is possible to analyse clusters (i.e. intervals of high fracture density), compute fracture density logs and compare them to other logs and facies.
At the end of the fracture analysis, you should have an idea of the scale of your fractures: small-scale objects (joints) and/or large-scale objects (faults and fracture corridors). During this analysis, you will also determine whether there is a link between the fracture density and geological drivers (such as bed thickness, lithology, etc.). The analysis of the fracture density drivers will be refined using the Fracture Density Controller module.
Output data : information about the fracture properties that will be used during the
fracture modelling.

Step 4: Structural analysis (Optional)

Input data: horizons and/or clouds of points and/or faults.

This module allows you to compute attributes (such as curvature for instance) and to
use these attributes to pick faults. This is important if you want to model large scale
objects such as fracture corridors or sub-seismic faults. If you already have a fault
network in your database, you can use this module to check and edit it.

Output data: Structural attributes (on horizons and maps) and fault networks.

Step 5: Fault analysis (Optional)

Input data: 2D faults (lineaments or polygons), 3D triangulated faults, lithostratigraphic scale from horizons.


 
 
Fault lineaments must first be projected onto horizons (this can be done in FracaFlow).
With this module you can create faults sets and analyse data in order to determine the
statistical distribution laws you will use during the fracture modelling step. You will go
through this step only if you want to model large-scale objects such as fracture corridors
or sub-seismic faults.

Output data: information about fault properties that will be used during the fault
modelling.

Step 6: Stress analysis (Optional)

Input data: a set of stress measurements along the well trajectory, pore pressure log,
fracture logs.

The stress state of the fracture can have an impact on the fluid conductivity. The
purpose of this module is to create fracture groups (using labels) to discriminate
potentially conductive fractures, using a Mohr Coulomb failure criterion.

Output data: a new log of fractures labelled according to their stress state.
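FracaFlow's exact failure computation is not documented here; the following is only a minimal sketch of the kind of Mohr-Coulomb discrimination this step performs, assuming a 2-D stress state and hypothetical friction, cohesion and stress values:

```python
import math

def stresses_on_plane(sigma1, sigma3, theta_deg):
    """Normal and shear stress on a plane whose normal makes angle
    theta_deg with the sigma1 direction (2-D Mohr circle)."""
    t = math.radians(2.0 * theta_deg)
    mean = 0.5 * (sigma1 + sigma3)
    dev = 0.5 * (sigma1 - sigma3)
    return mean + dev * math.cos(t), abs(dev * math.sin(t))

def is_critically_stressed(sigma_n, tau, pore_pressure, cohesion=0.0, mu=0.6):
    """Mohr-Coulomb slip criterion on the effective normal stress:
    slip (potentially conductive) if tau >= c + mu * (sigma_n - Pp)."""
    return tau >= cohesion + mu * (sigma_n - pore_pressure)

# Hypothetical example: 60 / 30 MPa principal stresses, 20 MPa pore pressure
sn, tau = stresses_on_plane(60.0, 30.0, 30.0)
label = "conductive" if is_critically_stressed(sn, tau, 20.0) else "sealed"
```

Raising the pore pressure shrinks the effective normal stress and can move the same fracture across the failure line, which is how such a criterion separates potentially conductive fractures from sealed ones.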

Step 7: Fracture density driver determination (Key step but Optional)

There are many ways to obtain the fracture density driver. One workflow is explained here (Figure-3); it can also be run only in part.

Figure-3 : Fracture Density Driver Determination.

Step 7.a: Fracture Density Log Computation

Input data: fractures interpreted from well imagery or digitalized core interpretation.

Here, you can compute fracture density logs of various types from fracture data and by
sets (defined previously in step 3), create fracturing facies and compute statistics
related to these fracturing facies, which will be very useful for the fracture modelling.


 
Output data: fracture density logs and statistics on them.

Step 7.b: Fracture density controller

Input data: wells with fracture density logs and reservoir grid.

The Fracture Density Controller module is designed to analyse the factors controlling fracture density in the reservoir. It provides a synthetic driver combining the various properties controlling the fracturing. These properties can be imported data (porosity, seismic attributes, geomechanical results, etc.) or data computed in FracaFlow, for instance fault attributes (distance to fault, fault density, etc.).
Note that all the properties used to build the density driver have to be transferred onto the reservoir grid, which can be done with the Mapping functionality.
Output data: density driver as a grid property and as a well log.

Step 7.c: Geostatistical simulations

Input data: wells with fracture density logs and reservoir grid

The objective here is to derive a 3D density grid tied to wells from the density driver
computed during step 7.b and honouring the spatial structure of the data. This density
grid will be used later on to constrain the fractures densities, for each fractures set.
The different steps are:
• Well set creation.
• Bivariate histogram computation.
• Experimental variogram computation.
• Variogram modelling.
• Sequential Indicator Simulation (launched as a workflow).

Output data: a 3D density driver that can be used during the fracture modelling.
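The experimental variogram step listed above can be sketched for a 1-D density log; the sampling positions, lag and values below are hypothetical, and the real module works in 3-D with directional variograms:

```python
from collections import defaultdict

def experimental_variogram(samples, lag, tol=None):
    """Classical experimental semivariogram along one axis:
    gamma(h) = sum over pairs at separation ~h of (z_i - z_j)^2 / (2 N(h))."""
    tol = lag / 2.0 if tol is None else tol
    sums, counts = defaultdict(float), defaultdict(int)
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            (xi, zi), (xj, zj) = samples[i], samples[j]
            h = abs(xj - xi)
            k = round(h / lag)
            if k > 0 and abs(h - k * lag) <= tol:
                sums[k] += (zi - zj) ** 2
                counts[k] += 1
    return {k * lag: sums[k] / (2.0 * counts[k]) for k in sorted(counts)}

# Hypothetical density-log values sampled every 10 m along a well
samples = [(0.0, 0.2), (10.0, 0.4), (20.0, 0.3), (30.0, 0.7)]
gamma = experimental_variogram(samples, lag=10.0)
```

The resulting gamma(h) points are what the variogram modelling step then fits with a model (spherical, exponential, etc.) before the sequential indicator simulation is launched.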
(ii) FRACTURE NETWORK MODELING


 
Figure-4 : Fracture Network Modeling Steps.

Step 8: Creation of a zone of interest

A zone of interest (ZOI) is a region in which you will compute the Discrete Fracture
Network (DFN). This DFN is one realization of the stochastic fracture model. You can
define several ZOI in order to quality check your fracture model and, if you have
observed data, to run dynamic tests in order to calibrate it.

Step 9 : Fracture modeling

Input data: grid and lithostratigraphic scale from grid; information about the fractures

Here you will use all the previous analysis results to define your fracture model
(conceptual model, geometrical attributes of your fractures, etc.). Different types of
models are available:
• Diffuse fractures (stratabound and non-stratabound, with different conceptual models such as facies-related, fault-related, etc.).
• Sub-seismic faults, with the density following the fractal dimension.
• Deterministic faults.
• Full-field DFN: large-scale objects generated in a way similar to diffuse fractures (Poissonian process).
Output data: fracture model at the reservoir scale and the Discrete Fracture Network
locally, in the zones of interest.

Step 10: Fracture Model QC (Optional)

Different QCs are available in FracaFlow to analyse fracture models and DFN:


 
 
• 3D view, for visual control of the result; it is possible to display only the fractures intersecting wells, or only the network connected to the well.
• DFN analysis through the Fracture Statistic Viewer, giving information about the orientation, size and connectivity of the generated network.
• Extraction of the fracture log at the well, from which the density log can be computed.
In addition, two specific modules produce more advanced results.
Step 10.a: Connectivity Analysis

Input data: one or more DFN


Here you can check the number of connected fractures in DFN, discriminate these DFN
into clusters of connected fractures and remove the clusters with a low number of
connected fractures. To decrease the computation time, the decimated DFN can be
saved and used for the calibration and upscaling.
Output data: decimated DFN

Step 10.b: Volumetrics

Input data: one or several fracture models and their associated fractures porosity, a
grid with matrix porosity and facies properties, the lithostratigraphic scale, a saturation
law or grid property.

The volume calculation depends on the input data. You can assess volumes in place in a ZOI or in the whole grid.

Output data: table giving oil and gas volumes.
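As a rough sketch of the kind of volume calculation this module performs (the real module also handles facies filtering and saturation laws; the cell values and the formation volume factor `bo` below are hypothetical):

```python
def oil_in_place(cells, bo=1.2):
    """Stock-tank oil in place: sum over cells of
    bulk_volume * (phi_matrix + phi_fracture) * (1 - Sw) / Bo."""
    stoiip = 0.0
    for bulk, phi_m, phi_f, sw in cells:
        stoiip += bulk * (phi_m + phi_f) * (1.0 - sw) / bo
    return stoiip

# Two hypothetical cells of 1e6 m3 bulk volume each:
# (bulk_volume_m3, phi_matrix, phi_fracture, sw)
cells = [(1e6, 0.20, 0.01, 0.3), (1e6, 0.15, 0.02, 0.4)]
volume = oil_in_place(cells)
```

Summing matrix and fracture porosity separately is what makes the fracture model's contribution to volumes in place visible.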


(iii) FRACTURE MODEL CALIBRATION

The main objective of the calibration is to modify the fracture model to ensure its
coherency with observed dynamic data (flowmeter, well and interference tests). The
steps are explained graphically in (Figure-5).

Figure-5 : Fracture Model Calibration.


 
 
Step 11 : Calibration

Once you have built your fracture model and computed the associated DFN, large
uncertainties remain on the fracture parameters. The dynamic calibration is a way to
reduce these uncertainties.

Step 11.a: Automated Kh Calibration (AKC)

Input data: Interpreted Kh value from well tests, a fracture model

The AKC module’s goal is to compute the best fracture model parameters (mean size,
mean conductivity, orientation and density) that will fit the measured Kh at wells
(interpreted well tests).
The AKC method, based on a genetic algorithm, reduces the parameter space to investigate at a low simulation cost and identifies correlations between fracture parameters.
Output data: a calibrated fracture model
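FracaFlow's genetic algorithm is not described in detail here, so the following is only a schematic sketch of a genetic search fitting fracture parameters to a target Kh; the forward model `simulated_kh` is a toy stand-in for the actual well-test simulation, and all parameter ranges are hypothetical:

```python
import random

random.seed(42)

def simulated_kh(size, cond):
    """Toy forward model standing in for the actual well-test simulation:
    Kh grows with mean fracture size and mean conductivity."""
    return size * cond

def genetic_kh_calibration(kh_target, generations=60, pop_size=30):
    """Minimal genetic search over (mean size, mean conductivity):
    selection of the fittest half, averaging crossover, +/-10% mutation."""
    def misfit(ind):
        return abs(simulated_kh(*ind) - kh_target)

    pop = [(random.uniform(1.0, 100.0), random.uniform(0.1, 10.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=misfit)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (0.5 * (a[0] + b[0]), 0.5 * (a[1] + b[1]))   # crossover
            child = (child[0] * random.uniform(0.9, 1.1),        # mutation
                     child[1] * random.uniform(0.9, 1.1))
            children.append(child)
        pop = parents + children
    return min(pop, key=misfit)

best_size, best_cond = genetic_kh_calibration(kh_target=250.0)
```

Note how several (size, conductivity) pairs give the same Kh in this toy model: the surviving population spreads along that trade-off curve, which is exactly the kind of correlation between fracture parameters the AKC is said to identify.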

Step 11.b: Manual calibration: dynamic tests simulations

Input data: Observed well test, interference test or flowmeter and a fracture model

This module aims at simulating a flowmeter curve, a well or an interference test along a
well crossing a 3D fracture network. Comparing a simulated flowmeter curve or well test
with actual measurements gives qualitative information on the generated fracture
network reliability. It enables the validation of the fracture characterization in terms of
network geometry and, moreover, in terms of fracture conductivity distribution. Before launching simulations, it is strongly advised to replace the complex network by an equivalent Warren & Root network (if it is sufficiently connected). This network simplification is done inside the dynamic tests module.

Output data: a calibrated fracture model

Step 11.c: Automated Dynamic Calibration (ADC)

Input data: Simulated well test, interference test or flowmeter and fracture models

Dynamic test simulation is a trial-and-error method; to improve convergence, the strategy is to perform an AKC first and then the ADC, which is an optimization loop over the simulated dynamic parameters.

Output data: a calibrated fracture model


(iv) FRACTURE EQUIVALENT PARAMETERS COMPUTATION (Upscaling)

Step 12 : Upscaling

Figure-6 : Fracture Model Upscaling.

Input data: a fracture model and, for the fractures-and-matrix option, the matrix properties.
The upscaling objective is to reduce the complexity of the actual fracture system to a
few relevant equivalent parameters (permeability, porosity and block dimension) (Figure-
6). These equivalent parameters are the input data for the reservoir simulation model.
The upscaling can be performed at a local scale or at a full-field scale.
Two algorithms are available: analytical upscaling (quick, and recommended if the fracture network is well connected; it uses the Oda 1985 and Poiseuille formulations) and flow-based upscaling. Analytical upscaling uses analytical formulae and can be run as Local Analytical Upscaling (LAU), for a much quicker test job, or as Full Field Analytical Upscaling (FFAU).
Output: equivalent permeabilities and porosity, block sizes.
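The analytical route can be illustrated with a simplified Oda (1985)-style permeability tensor, where each fracture contributes its conductivity times area, projected onto its own plane. This is a sketch of the general technique, not FracaFlow's exact formulation, and the fracture values are hypothetical:

```python
def oda_permeability(fractures, cell_volume):
    """Simplified Oda-style analytical upscaling: each fracture adds
    (conductivity * area / cell_volume) * (I - n n^T), i.e. it conducts
    only within its own plane.  fractures: (area, conductivity, unit normal)."""
    k = [[0.0] * 3 for _ in range(3)]
    for area, cond, n in fractures:
        w = cond * area / cell_volume
        for i in range(3):
            for j in range(3):
                delta = 1.0 if i == j else 0.0
                k[i][j] += w * (delta - n[i] * n[j])
    return k

# One hypothetical vertical fracture with its normal along x
k = oda_permeability([(100.0, 2.0e-12, (1.0, 0.0, 0.0))], cell_volume=1000.0)
```

The projection term (I - n n^T) is why a single fracture set produces a strongly anisotropic equivalent permeability: here the tensor is zero across the fracture (x) and non-zero within its plane (y, z).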

(B) Features in FracaFlow not available in Petrel

Only those features which are not present in Petrel 2010.2 or are present but better in
FracaFlow are described below.

Data Items that can be loaded and used in Fracture Modeling


- Mud Loss data as logs in wells.
- Fracture aperture data as logs in wells.
- Perforation data for wells.
- Production data as daily oil, gas and water volumes.
- Flowmeter data as depth and % contribution.
- Productivity Index data for wells.
- Well test data for wells as pressure against time.
Data Viewer to view the loaded data
- Bubble map viewer to display production data areally, by bubble diameter variation (heterogeneous method) or colour variation (homogeneous method). It also allows viewing multiple fluid phases at a well at the same time. (Petrel 2010.2 allows bubble map viewing.)
(i) Attribute Computation from 2D Maps :
The following attributes can be computed from any 2D horizon created or imported in FracaFlow. They are intended to provide fracture drivers for a field.
- Curvature map : 5 types as mentioned under Curvature Analysis.
- Azimuth of principal curvature map : This attribute represents the azimuth of the minimum principal curvature. The angle is computed clockwise from North. For an anticline oriented north-south, the azimuth will be 90°.

- Slope map : This attribute is the local slope (slope gradient) of the map, defined by a plane tangent to the topographic surface at a point. Slope is a vector: it has a magnitude (gradient) and a direction (aspect). The gradient is the maximum rate of change in altitude, and the aspect is the compass direction of this maximum rate of change.
- Illumination map : This attribute computes an artificial illumination of a map. Sun-shaded maps can be colour or grey-scale; illumination from different directions facilitates the identification of lineaments striking in different directions. The rays are assumed to be parallel and horizontal. The chosen direction of computation (θ) is the azimuth of the light source, measured clockwise from North.
- Normalization :The [0;1] normalization functions as follows: The maximum
(max) and the minimum (min) of the maps are identified. The formula applied is
“new_value = (initial_value – min) / (max – min)”. The 0 centered normalization
functions as follows: The absolute highest value (AH) of the map is identified.
The formula applied is “new_value = initial_value / AH”. Under this formula, the
attribute remains centered on 0. For instance, for a map between –100 and 1000,
the final map will be between –0.1 and 1.

- Minimum or maximum between two topographies


- Adding a constant thickness to a topography
- Minimum or maximum between two property maps
- Calculator : This offers calculator functions for arithmetic, trigonometric, logarithmic, Boolean and logical computations, predefined values and some special constants. It combines the calculator with the settings → operations options available for a 2D surface in Petrel 2010.2.
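The two normalization formulas stated above can be sketched directly; the example values reproduce the -100 to 1000 map from the text:

```python
def normalize_01(values):
    """[0;1] normalization: new_value = (initial_value - min) / (max - min)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def normalize_centered(values):
    """0-centered normalization: new_value = initial_value / AH,
    where AH is the absolute highest value of the map."""
    ah = max(abs(v) for v in values)
    return [v / ah for v in values]

# The example from the text: a map between -100 and 1000
vals = [-100.0, 0.0, 1000.0]
centered = normalize_centered(vals)   # spans -0.1 .. 1.0
scaled = normalize_01(vals)           # spans 0 .. 1
```

The 0-centered variant preserves the sign of the attribute (useful for curvature, where sign distinguishes anticlinal from synclinal bending), while the [0;1] variant does not.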
Comparison with Petrel : Some of these attributes are available in Petrel 2010.2, but some are, to the best of my knowledge, not.

 
(ii) Curvature Analysis and Fault Picking :
Curvature analysis is often listed among the first steps of a fractured reservoir study. Its first aim is to find areas or axes of intense reservoir curvature, using the top horizon mapped from 3D seismic, in order to define fractured zones. Folding is often related to the development of faults and their geometry; in that case, the location and density of fractures related to small-scale faults can be described by detecting bending, which is quantified by curvature. The first step of a curvature analysis is smoothing the horizon to remove noise (areas of low signal, picking errors, etc.) and the shortest structural wavelengths. The curvature of the surface is then computed by one of several methods (Gaussian, maximum principal curvature, minimum principal curvature, mean curvature, or oriented curvature along a specified direction). From the generated curvature map, faults can be picked with the fault picking tool (Figure-7).

Comparison with Petrel : There is no specific tool/workflow in Petrel 2010.2 to do this, but an aware user can achieve a similar result with a combination of methods.

Figure-7 : Example of Fault Picking from Curvature Surface.

(iii) Fault Density Computation in a Field :


The following types of fault density can be computed in FracaFlow.
- P21 density : The P21 fault density in a grid cell is the cumulated length of the fault lineaments in a square window centered on this cell, divided by the window area. The window size is user-defined. The result is expressed in 1/(length unit).
- P32 density : The P32 fault density is computed from P21 maps (at each node of the grid where P21 has been calculated) using the formula P32 = P21 / Mean(sin(dip)), where Mean is the standard mean and dip is the fault dip taken from the fault dip distribution. The computed P32 is correct if the cell in which it is computed contains a large number of faults with a dip distribution independent of fault orientation. To compute P32 from P21, the user must therefore provide the following parameters: the map on which P32 is to be computed, the mean dip (°), the dip dispersion (°) and the strike dispersion (°). Using these parameters, the algorithm computes the average of sin(dip) by random draws of values in a binormal distribution with one horizontal axis (from the fault strike and the strike dispersion given by the user) and a vertical axis (from the dip and dip dispersion given by the user). P21 is then divided by the returned value to obtain P32.
- Throw density : The throw density is calculated as follows: inside a small square window centered on the map cell, the length of each fault-trace segment multiplied by its corresponding mean throw is summed. The window size is user-defined.
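The P21 and P32 definitions above can be sketched as follows; the Monte Carlo draw simplifies the binormal distribution described in the text to a normal dip distribution, and the lineament lengths are hypothetical:

```python
import math
import random

random.seed(0)

def p21_density(segment_lengths, window_size):
    """P21: cumulated fault-lineament length inside a square window,
    divided by the window area (units: 1/length)."""
    return sum(segment_lengths) / (window_size * window_size)

def p32_from_p21(p21, mean_dip_deg, dip_dispersion_deg, n_draws=50_000):
    """P32 = P21 / Mean(sin(dip)), with Mean(sin(dip)) estimated by
    random draws from a normal dip distribution (a simplification of
    the binormal distribution the text describes)."""
    total = 0.0
    for _ in range(n_draws):
        dip = random.gauss(mean_dip_deg, dip_dispersion_deg)
        dip = max(0.0, min(90.0, dip))      # clamp to a valid dip
        total += math.sin(math.radians(dip))
    return p21 * n_draws / total

p21 = p21_density([120.0, 80.0], window_size=100.0)            # 0.02 m^-1
p32 = p32_from_p21(p21, mean_dip_deg=90.0, dip_dispersion_deg=0.0)
```

For perfectly vertical faults (dip 90°, no dispersion) sin(dip) = 1 and P32 equals P21; shallower or more dispersed dips make Mean(sin(dip)) < 1 and P32 correspondingly larger.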
(iv) Fracture Analysis at Wells :
It is possible to compute and display several logs from fracture data at wells loaded
from image logs or cores :

o P10 Density
o Local Density
o Set density (It is a density log corrected with the interval Terzaghi correction)
o P21 density
o P32 from lineic P10 density
o P32 from cylindric P10 density
The Local and P10 options are always available.
The Set density option is available only if a fracture set has been selected.
The P21 option is available only if core fractures are available in the selected wells.
The P32 from lineic P10 option is available only if a fracture set is selected.
The P32 from cylindric P10 option is available only if core fractures are available in the selected wells and a fracture set is selected.
FracaFlow uses the following fracture density indexes (Pxy, where x is the dimension of the fracture measure and y the dimension of the measurement support, 1 = length, 2 = area, 3 = volume):

- P10 : number of fractures per length unit (m^-1); P20 : number of fractures per surface unit (m^-2); P30 : number of fractures per volume unit (m^-3).
- P21 : cumulative fracture length per surface unit (m^-1); P31 : cumulative fracture length per volume unit (m^-2).
- P22 : fracture area per cumulative surface unit (%); P32 : cumulative fracture area per volume unit (m^-1).
- P33 : fracture volume per volume unit (= fracture porosity) (%).
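A minimal sketch of a P10 density log computation (fracture count divided by interval length; the depths and interval below are hypothetical):

```python
def p10_log(fracture_depths, top, bottom, step):
    """P10 density log: number of fractures per length unit (m^-1),
    counted in successive depth intervals [d, d + step) along the well."""
    log, depth = [], top
    while depth < bottom:
        count = sum(1 for d in fracture_depths if depth <= d < depth + step)
        log.append((depth, count / step))
        depth += step
    return log

# Hypothetical interpreted fracture depths (m MD) over a 30 m interval
depths = [1002.0, 1003.5, 1008.0, 1017.0, 1026.0, 1027.5, 1028.0]
log = p10_log(depths, top=1000.0, bottom=1030.0, step=10.0)
```

This simple count-per-interval is the starting point; the Terzaghi correction mentioned above would further weight each fracture by the angle between the fracture and the well trajectory.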

Comparison with Petrel : Petrel 2010.2 can very well calculate the fracture intensity log, but fracture data loading perhaps depends more on the user's knowledge of the subject.

(v) Fracture Aperture Options during Modeling


In FracaFlow, two types of fracture models are recognized :
Stratabound : “Stratabound” fractures means that the fractures are generated from the
center of each “strata” or mechanical bed.
Non-Stratabound : non-stratabound fractures are generated in 3D and are not
constrained by the beds.
In fracture modeling, some types of models have specific options.
In case of the non-stratabound fractures, if further distribution laws for orientation, size,
aperture, and conductivity are added, it is possible to correlate several of these
parameters.

• Correlation between the aperture and the length:


The uncorrelated option means that the aperture and the size of the fracture
are independent variables. The distribution defined by the user then applies
to the aperture.
The correlated option means a coupling between the aperture and the size of
the fracture. The distribution defined by the user applies to the aspect ratio
between the aperture and the size.
• Correlation between conductivity and aperture:

 
The uncorrelated option means that the conductivity and the size of the
fracture are independent variables. The distribution defined by the user then
applies to the conductivity.
The correlated option means a coupling between the conductivity and the
aperture of the fracture. In this case, the distribution defined by the user
applies to λ, where:

C = λ · e³ / 12

with C the conductivity and e the aperture. λ = 0 corresponds to an impermeable fracture, and λ = 1 to an ideal Poiseuille flow between two planes. λ can be set below 1 to model a loss of conductivity, relative to the Poiseuille solution, due to asperities on the fracture faces; λ above 1 gives an improved conductivity compared to the ideal Poiseuille case for a Newtonian flow.
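The conductivity-aperture relation above can be sketched directly; the aperture value is hypothetical:

```python
def fracture_conductivity(aperture, lam=1.0):
    """Cubic-law fracture conductivity C = lam * e**3 / 12.
    lam = 1: ideal Poiseuille flow between two planes; lam < 1: loss of
    conductivity due to asperities; lam = 0: impermeable fracture."""
    return lam * aperture ** 3 / 12.0

ideal = fracture_conductivity(2.0e-4)            # 200-micron aperture
rough = fracture_conductivity(2.0e-4, lam=0.5)   # half the ideal value
```

Because of the cubic dependence, doubling the aperture multiplies the conductivity by eight, which is why the aperture distribution dominates the simulated flow behaviour of a DFN.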
Comparison with Petrel : In Petrel 2010.2, for stratabound fractures you can either give one constant value of fracture aperture or use one of four aperture models (exponential, power, normal, log-normal), all centred around a mean aperture value. How these values relate to fracture length is not clear to me. To assign different aperture values you really need to segment your fracture model within a strata or zone, which creates problems of its own.
Fracture permeability likewise has five options: a constant value or mathematical models (exponential, power, normal, log-normal), the default being exponential, which relates permeability to aperture by a cubic law. This is at least quite an improvement over Petrel 2009, where the user had to assign these fracture properties with the patch calculator.
(vi) Fault Corridor Analysis :
It is also possible to assess the fault zones affecting fracture density by doing corridor plot analysis. The corridor panel allows you to assess the influence of neighbouring faults on wells. To do so, the values of different dynamic data are plotted versus the distance of the well to the selected faults in a corridor plot (Figure-8). Those data are: mud loss per unit length, PI per unit length, the KHtest/KHcore ratio from a well test, and production rates (cumulative or maximum).

If the faults in the field are conductive or if there are zones with increased fracturing
close to them, the values listed above are expected to be higher for wells close to those
faults.

Figure-8 : Example of Corridor Plot.


Two lines are displayed on each plot:
- The vertical line: use the displacement button (in the top right corner of the sheet) to move this line and find the distance of influence of the selected fault sets. The X-axis value is displayed at the top of this line.
- The horizontal line is computed automatically and corresponds to the average value of the data displayed to the right of the vertical line, i.e. the expected value without fault influence. It is updated automatically when you move the vertical line.
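The horizontal line's value, as described, is simply the mean of the dynamic data for wells beyond the chosen influence distance. A small sketch with hypothetical well data:

```python
# Sketch of the corridor plot's horizontal line: the "background" value
# is the mean of the dynamic data for wells farther from the fault than
# the vertical line (the assumed zone of influence). Data are invented.

def background_value(distances, values, influence_distance):
    """Average value over wells beyond the fault zone of influence."""
    outside = [v for d, v in zip(distances, values) if d > influence_distance]
    if not outside:
        raise ValueError("no wells beyond the influence distance")
    return sum(outside) / len(outside)

dist = [50, 120, 300, 800, 1500, 2000]    # distance to nearest fault, m
mudloss = [9.0, 7.5, 2.1, 1.9, 2.0, 1.8]  # mud loss per unit length (hypothetical)
print(background_value(dist, mudloss, 200))  # mean of the four distant wells
```

Moving the vertical line (here, `influence_distance`) and watching where the background value stabilizes is exactly how the distance of fault influence is picked on the plot.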
On these plots, a fault zone of influence can be identified. The example (Figure-9) shows the distribution of dynamic data in a "corridor"-fractured field: wells far from the faults show no evidence of fault influence. This interpretation is only valid when the fracturing in the field is related to fracture corridors or is increased close to the faults.


Figure-9 : Fault zone influence interpretation from Corridor Plot.

Comparison with Petrel : Petrel does not have such a feature.


(vii) Unfolding :
Unfolding is used to rotate the fracture planes in accordance with the identified stratification planes, in order to revert to a flat stratification configuration (prior to folding). In the selected wells, the selected stratification planes and the fracture planes are identified. From the stratification planes, the mean stratification is computed using the Orientation Sort algorithm. The rotation axis and rotation angle are then computed, and the rotated fracture plane normal vector is obtained by multiplying the rotation matrix by the fracture plane normal vector.
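The rotation described above can be sketched with Rodrigues' rotation formula. This is an illustrative reconstruction (the axis/angle construction is the standard one; FracaFlow's exact implementation may differ):

```python
import math

# Unfolding sketch: rotate a fracture-plane normal by the rotation that
# brings the mean bedding normal back to vertical. Rodrigues' formula;
# vectors are plain (x, y, z) tuples. Illustrative, not FracaFlow's code.

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _norm(a):
    n = math.sqrt(_dot(a, a))
    return tuple(x / n for x in a)

def unfold(normal, bedding_normal, up=(0.0, 0.0, 1.0)):
    """Rotate `normal` by the rotation taking `bedding_normal` to `up`."""
    b = _norm(bedding_normal)
    axis = _cross(b, up)
    s = math.sqrt(_dot(axis, axis))      # sin(rotation angle)
    c = _dot(b, up)                      # cos(rotation angle)
    if s < 1e-12:                        # bedding already flat: no rotation
        return tuple(normal)
    k = tuple(x / s for x in axis)       # unit rotation axis
    kxv = _cross(k, normal)
    kdv = _dot(k, normal)
    # Rodrigues: v' = v cos(t) + (k x v) sin(t) + k (k.v)(1 - cos(t))
    return tuple(normal[i]*c + kxv[i]*s + k[i]*kdv*(1.0 - c) for i in range(3))

# Sanity check: the bedding normal itself must map back to vertical
bed = _norm((0.3, 0.0, 0.9539392))
print(unfold(bed, bed))  # ~ (0, 0, 1)
```

Applying `unfold` to every fracture normal in a well, with the mean stratification normal as `bedding_normal`, gives the pre-folding orientations that are then re-plotted on the stereonet.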
The change appears on the Schmidt stereoplot, where the fractures are shown in their rotated positions. This makes the data easier to interpret and helps establish whether a fracture set appeared before or after folding.
Comparison with Petrel : Petrel does not have this unfolding feature. It is important because it allows you to examine whether a set of fracture data is related to folding.
(viii) Dynamic Data Analysis :
In a more general way, analysis of dynamic data can help determine whether the fluid-flow behaviour is typical of a fractured reservoir, and hence whether a field needs to be considered as fractured and whether fractures are present in all wells. The dynamic analysis also helps identify the typical damage-zone length around a fault, which will be a parameter of the fracture model.
The QAF analysis table (on the right of the screen) is updated automatically. Each line corresponds to a well; each column corresponds to a type of analysis.

The module allows you to perform the following analyses, depending on data availability:

- Flowmeter analysis
- Mud loss analysis
- Productivity Index analysis
- Production analysis
- Well test analysis
After the dynamic analysis module is run by right-clicking in the Study Explorer (select New > Analysis > Dynamic Analysis), a blue tick is displayed if the analysis is possible given the type of dynamic data loaded. The results of this analysis can be of the following types (Figure-10, Figure-11).

Figure-10 : The different panels available for viewing the results of Dynamic Analysis.


Figure-11 : Cross plot panel; a graph displays PI/L versus the number of fractures at wells.

(a) Precise Production Data Analysis :

To investigate the production data of wells in detail and understand their relation with fractures, the Production panel can be used. This panel allows you to display the production curves of several wells, for all fluids (water, oil and gas). Both instantaneous and cumulative production can be compared. The quantities that can be displayed are Qo, Qw, Qg, cumulative oil, cumulative water, cumulative gas, FW, WOR, GOR and Pw.
It is therefore possible to make more precise observations than with the bubble map: you can spot a sudden change in production trend and compare it with other wells, or discover that a well you thought was not very productive on the map has in fact just been put into production and might yield a greater recovery in the end.
(b) Breakthrough Analysis :
Another specific feature of fractured reservoirs is early water or gas breakthrough. The
Breakthrough panel is dedicated to the analysis of such data. To define gas
breakthrough, you need to set a GOR threshold (manually, using the cursor, or by
typing the value in the dedicated box). This threshold corresponds to the minimum value of the GOR used to define gas breakthrough.
The depth is in TVDSS and corresponds to the deepest perforation depth (if water
breakthrough is being studied) or the shallowest perforation depth (if gas breakthrough
is being studied). This cross plot is useful to assess the evolution of the depth of the
WOC or GOC.


Figure-12 : Example of a GOR breakthrough analysis plot.

A regression line is also plotted on the graph, as shown in the example (Figure-12). The equation of the regression line is displayed at the top of the sheet. This line represents the trend of evolution of the fluid contact, so if a well is far from it, it has experienced early breakthrough. This graph is used to sort the wells into late and early breakthrough, and this classification will be used later in the Quick Assessment of Fracturing Intensity (QAF) Synthesis module. The wells below the red line are considered as having early breakthrough, whereas the others are considered as having late breakthrough.
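The regression-based classification can be sketched as follows, with invented well data (the fit here is ordinary least squares; the exact regression used by the tool is not stated):

```python
# Sketch of the breakthrough-depth regression: fit a least-squares line
# of breakthrough time vs. perforation depth, then flag wells whose
# breakthrough time falls below the trend as "early". Data are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def classify(depths, bt_times):
    """Wells below the trend line are classified as early breakthrough."""
    slope, intercept = fit_line(depths, bt_times)
    return ["early" if t < slope * d + intercept else "late"
            for d, t in zip(depths, bt_times)]

depths = [2100.0, 2150.0, 2200.0, 2250.0]  # deepest perforation, TVDSS m
bt = [400.0, 150.0, 620.0, 710.0]          # days to water breakthrough
print(classify(depths, bt))
```

The well at 2150 m breaks through much earlier than the depth trend predicts, which in a fractured reservoir would prompt a check for a conductive fracture or fault corridor near that well.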

(c) Display of Well Test Curves :

It is also possible to view the curves of pressure and rate versus time during a well test in the Well Test panel. We can plot the measured pressure curves ("Data curves" graph) and the log-log pressure curves, including the computed pressure derivative curve ("Computed Data" graph). The computation is based on the information found in the well test interpretation (the "Type" indicates whether the test is a build-up or a drawdown test), while the "Well Data" panel allows entering the "Initial pressure" (to indicate the beginning of a drawdown test) or the "Start of Build up". The calculations assume a single period of rate history ending at the "Start of Build up" date. The Option panel offers a smoothing option.
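The pressure derivative shown on such log-log plots is typically the Bourdet derivative, dΔP/d(ln t). A minimal central-difference sketch, without the smoothing option mentioned above (illustrative only):

```python
import math

# Sketch of a log-log pressure-derivative computation (Bourdet style):
# dDeltaP / d(ln t) by central differences on the logarithmic time axis.
# No smoothing window is applied here; real tools usually smooth.

def bourdet_derivative(times, dp):
    """Return (t, dDeltaP/dln t) pairs at the interior points."""
    out = []
    for i in range(1, len(times) - 1):
        dlnt = math.log(times[i + 1]) - math.log(times[i - 1])
        out.append((times[i], (dp[i + 1] - dp[i - 1]) / dlnt))
    return out

# Infinite-acting radial flow: DeltaP = m*ln(t) + b gives a flat derivative m,
# the classic horizontal stabilization on the log-log diagnostic plot.
t = [10 ** (k / 4) for k in range(-8, 9)]        # 0.01 .. 100 hours
dpress = [5.0 * math.log(ti) + 12.0 for ti in t]
deriv = bourdet_derivative(t, dpress)
print(deriv[0][1])  # ~ 5.0 at every interior point
```

In fractured reservoirs the shape of this derivative (e.g. a dip indicating dual-porosity behaviour) is precisely what makes the "Computed Data" graph diagnostic.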


Figure 13 : Display of Well Test curves in FracaFlow.

Comparison with Petrel : In Petrel 2010.2 we can display the pressure data, but I am not sure whether the pressure derivative curve can be displayed.

(ix) Sub-seismic faults generation :

Sub-seismic faults are all fractures below the seismic resolution threshold. Some of
them are still large enough to play an active role in reservoir compartmentalization or
reservoir drainage. A discrete model of the pattern of major sub-seismic fractures is
often necessary to assess their impact on production.
Many fault network studies have demonstrated the fractal nature of some real fault patterns and raised the idea that fractal analysis could be used to constrain the modeling of sub-seismic faults, given a seismically resolved pattern of major faults. The sub-seismic fault generation available in FracaFlow is based on this assumption.
The first phase of the process consists of initializing:
- The local average fault strike; this is estimated by interpolating the local strikes of
the major faults of the fault set chosen by the user.
- The box-counting log-log curve; this is computed with the fault set chosen by the
user to get the initial fractal dimension of the fault network.
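The box-counting curve works by covering the fault traces with grids of decreasing cell size and fitting the slope of log N versus log(1/s). A minimal sketch for a set of 2D fault-trace points (illustrative, not FracaFlow's code):

```python
import math

# Box-counting sketch: estimate the fractal dimension of a set of 2D
# fault-trace points in the unit square by counting occupied grid cells
# at several scales and fitting the slope of log N vs log(1/s).

def box_counting_dimension(points, scales=(8, 16, 32, 64, 128)):
    xs, ys = [], []
    for n in scales:
        s = 1.0 / n                                   # cell size
        occupied = {(int(x / s), int(y / s)) for x, y in points}
        xs.append(math.log(n))                        # log(1/s)
        ys.append(math.log(len(occupied)))            # log N(s)
    # least-squares slope of the log-log curve
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

# Points densely sampling a single straight lineament: dimension ~ 1
line = [(i / 1999.0, 0.4 + 0.2 * i / 1999.0) for i in range(2000)]
print(box_counting_dimension(line))
```

A full fault network typically gives a dimension between 1 and 2; during generation, a candidate fault is kept only if adding it leaves this log-log curve (and hence the dimension) essentially unchanged.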
The object-oriented modeling of the pattern of sub-seismic faults now begins. The sub-
seismic faults are generated one by one with a "point process". The following steps are
run through for each fault:
- A point of initiation (a "seed") is chosen randomly on the top of the reservoir unit; the process can be biased in order to get a heterogeneous distribution of seeds across the reservoir.
- The fault length is drawn at random in the statistical distribution chosen by the
user.


 
- Next, the 2D sub-seismic fault trace is generated incrementally, segment by segment, according to a "growth process". Each segment of this growth process goes through a single cell of the reservoir model; the segment strike is drawn at random from a Von Mises distribution centred on the local average value estimated during the initialization phase.
- The growth stops when the fault has reached its total pre-defined length.
- The sub-seismic fault trace now obtained is added to the network if the fractal
nature of the fault pattern is still respected. This is done by checking the box
counting log-log curve of the new network. If the fractal nature is still respected,
the new sub-seismic fault trace is definitively added to the network; otherwise, it
is rejected and a new seed is generated.
This algorithm is based on the fault lineament generation defined above. It is applied sequentially, horizon by horizon, progressing toward the top or the bottom as chosen by the user, until the last horizon is processed.
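The incremental growth step above can be sketched as follows; `random.vonmisesvariate` is Python's standard-library Von Mises sampler. The cell size, concentration parameter kappa, and fixed-step stopping rule are illustrative assumptions, not FracaFlow's actual settings:

```python
import math
import random

# Sketch of the "growth process": grow a fault trace segment by segment,
# each segment roughly one grid cell long, with the strike of each segment
# drawn from a Von Mises distribution around the local mean fault strike.

def grow_fault_trace(seed_xy, total_length, cell_size, mean_strike, kappa,
                     rng=random):
    """Return the polyline (list of (x, y)) of one synthetic fault trace."""
    trace = [seed_xy]
    grown = 0.0
    x, y = seed_xy
    while grown < total_length:
        step = min(cell_size, total_length - grown)   # stop at target length
        strike = rng.vonmisesvariate(mean_strike, kappa)  # radians
        x += step * math.cos(strike)
        y += step * math.sin(strike)
        trace.append((x, y))
        grown += step
    return trace

random.seed(42)
trace = grow_fault_trace((0.0, 0.0), total_length=500.0, cell_size=50.0,
                         mean_strike=math.radians(30), kappa=20.0)
print(len(trace) - 1)  # 10 segments of 50 m each
```

In the full algorithm each candidate trace produced this way is then tested against the box-counting curve before being accepted into the network.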
(x) Automatic Kh Calibration (AKC) :
Once a fracture model is computed and the associated DFN has been generated, large uncertainties remain on the fracture parameters. Dynamic calibration is a way to reduce these uncertainties.
The Automated Kh Calibration (AKC) module's goal is to compute the fracture and fault model parameters (especially mean conductivity) that will best fit the measured Kh at wells (obtained from well test interpretations).
The strategy for dynamic calibration is to perform the AKC first, before the Automated
Dynamic Calibration (ADC).
The AKC is an automatic iterative process. At each step, an analytical upscaling is performed around each well to calculate its permeability-thickness (Kh). The parameters of the fracture model change at each step of this optimization process, whose final aim is to propose the combination of parameters that best fits the Kh interpreted from actual well tests.
The inversion method used in AKC is a genetic algorithm. This class of algorithm exploits the principles of genetic mechanisms and the laws of evolution to perform a selection within a set of parameters (Figure-14).


Figure-14 : The Workflow of the AKC algorithm.

Step-1: Initialization

During the initialization phase, the values of the invertible fracture-set model parameters (selected in the Parameters tab) are chosen at random within the defined validity intervals (Figure-15).

Figure-15 : Initialization of AKC.

Step-2 : Evaluation
Then, during the evaluation phase, a local analytical upscaling is performed around each selected well, over a region whose size is the ZOI radius entered in the Input tab (Figure-16). The result of this local upscaling is then compared to the measured values through the computation of an objective function f(X).


Figure-16 : AKC Evaluation step.

Step-3 : Selection

This comparison allows the process to select the best solutions among all the results according to the objective function value. The quality of a solution (and hence its selection) is estimated from a probability inversely proportional to its objective function value (Figure-17).

Figure-17 : AKC Selection step.

Step-4 : Cross Over and Mutation


 
After the selection process, the algorithm uses two mechanisms derived from the genetic processes that take place during chromosome duplication: cross over and mutation.

The Cross over mechanism is used to define new offspring from two given parents taken from the previously selected population of results. First of all, the algorithm selects a proportion Pc of the population according to their probabilities p(X); then two siblings are defined from two parents according to the steps given in Figure 18.

Figure 18 : AKC Cross over.

Thus the algorithm performs permutations (cross over) of parameters chosen among parents (i.e., sets of parameters) selected for their ability to minimize the objective function.

The Mutation mechanism is used to randomly redefine the values of the parameter sets defining a population. First of all, the algorithm selects a proportion Pm of the population of solutions according to their probabilities p(X); then, for each selected solution, a new solution is defined by randomly selecting one of its properties and modifying its value as shown in Figure 19.


Figure 19 : AKC Mutation step.

The application of these two mechanisms (cross over and mutation) creates a new generation of parameter sets that is in turn evaluated through the objective function, and so on until the maximum number of generations has been reached.
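The selection / cross over / mutation loop described above can be sketched as a minimal genetic algorithm. This is an illustrative toy: the objective function, target Kh values, and the simple truncation-style selection (rather than the probability-proportional scheme FracaFlow describes) are my own simplifications:

```python
import random

# Toy genetic algorithm in the spirit of AKC: calibrate one parameter
# vector against target Kh values. The "simulation" here is an identity
# placeholder; in AKC it would be the local analytical upscaling.

TARGET = [120.0, 80.0, 45.0]   # "measured" Kh per well (hypothetical)

def objective(x):
    """Sum of squared mismatches between simulated and measured Kh."""
    return sum((xi - ti) ** 2 for xi, ti in zip(x, TARGET))

def evolve(pop_size=20, generations=5, pc=0.7, pm=0.2,
           lo=0.0, hi=200.0, rng=random):
    pop = [[rng.uniform(lo, hi) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)                 # selection: keep better half
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = list(a)
            if rng.random() < pc:               # cross over: swap gene tails
                cut = rng.randrange(1, len(child))
                child = a[:cut] + b[cut:]
            if rng.random() < pm:               # mutation: redraw one gene
                child[rng.randrange(len(child))] = rng.uniform(lo, hi)
            children.append(child)
        pop = survivors + children
    return min(pop, key=objective)

random.seed(1)
best = evolve()
print(round(objective(best), 2))  # error shrinks as generations proceed
```

Because the sorted survivors always include the best solution found so far, the minimum error is non-increasing across generations, which mirrors the convergence behaviour expected of AKC.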

The AKC process requires the definition of a working context. The user has to define: the wells (Kh test values) to be matched by the algorithm; the fracture set and related parameters to be inverted; the minimum and maximum values to be used; the type and parameters of the upscaling run by the algorithm; and the weight attributed to each well in the objective function.

AKC : Input parameters

The fracture model to be calibrated and the well test interpretations to be used are selected. The user has to define a region around the wells (Radius ZOI) in which the local analytical upscaling will be launched, and the perforation limits. The ZOI and the perforation zones should represent the drainage area and are difficult to estimate; therefore, a sensitivity analysis on ZOI size and perforation limits may be useful.

AKC : parameters to be inverted


 
In a calibration strategy, the usual order is to focus first on conductivity and density, then on size and orientation, and finally to identify correlations.

AKC parameters (optimization settings)

The AKC algorithm requires the following parameters :

Number of generations : This is the number of generations (created by the selection, cross over and mutation process) after which the algorithm stops. Generally, it does not need to be large (the default is 5).

Initial number of solutions : This is the number of solutions computed at the first generation and used afterwards in the selection process. The greater this number, the greater the pool of parameters from which the algorithm searches for new solutions, and the better the sampling. However, a high initial number of solutions increases the optimization cost. It is recommended to repeat tests while progressively increasing this parameter.

Maximum number of solutions : This is the maximum number of solutions that can be computed during the optimization process. It plays the same role in the algorithm as the initial number of solutions, but it means that at each optimization loop this number of solutions will be computed; the greater this number, the slower the execution of the algorithm.

The cross over probability : This is the probability (in %) for a solution to cross over with another one. The greater this probability, the wider the range of possibilities tested by the algorithm. It has to be greater than 50% to sample the solution neighbourhood effectively.

The probability of mutation : This is the probability (in %) for a given solution to mutate. This parameter increases the random sampling. It has the same effect on the result of the algorithm as the cross over probability.

AKC Results

The results, for all the computed solutions (up to the Maximum number of solutions defined by the user), are an optimized value for each invertible parameter of every fracture set model. The value of the objective function is displayed for each solution as the Solution error. For each well, the Relative error is also displayed; this is the difference between the permeability from well test interpretation and the simulated permeability, relative to the measured permeability (in %):

((Kh measured – Kh simulated) / Kh measured) * 100
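As a direct transcription of this formula:

```python
# The per-well Relative error, exactly as defined above (in %).

def relative_error(kh_measured: float, kh_simulated: float) -> float:
    return (kh_measured - kh_simulated) / kh_measured * 100.0

# Example: a well test gives Kh = 150 mD.m, the upscaled model gives 120 mD.m
print(relative_error(150.0, 120.0))  # 20.0 % under-prediction
```

Note the sign convention: a positive value means the simulated Kh is lower than the measured one.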


 
The best solution (which is not necessarily the one with the minimum solution error) can then be selected as a new fracture model to be used in a new upscaling process to compute better equivalent parameters.

(xi) Automated Dynamic Calibration (ADC) :

Once you have generated your fracture model and computed the associated DFN, large uncertainties remain on the fracture parameters. Dynamic calibration is a way to reduce these uncertainties. The Automated Dynamic Calibration (ADC) module's goal is to compute, using simulations, the fracture and fault model parameters (mean conductivity, etc.) that will best fit the well tests. An ADC activity can be created via the menu bar (Calibration > Automated Dynamic Calibration) or by right-clicking in the Study Explorer (New > Calibration > Automated Dynamic Calibration).
Once the dynamic data is loaded at the wells, the module shows the following:
- Fracture Models built with properties therein
o Orientation
o Size
o Aperture
o Conductivity
o Density
- Available wells with following information
o Well name
o Type of test (flowmeter, well test or interference test)
o Observed test
o Simulated test
o Fracture model associated to the test
Once the wells and the fracture model (with properties) are selected, the input for ADC is set. The following table lists the invertible parameters according to the chosen model:
                 Facies Based                 Block Related                         Fault Related
Orientation      Von Mises dip &              Von Mises dip &                       Von Mises dip &
                 dip-azimuth                  dip-azimuth per block                 dip-azimuth
Size             mean length, crossing        mean length per block, crossing       mean length, crossing
                 probabilities per facies     probabilities per facies and block    probabilities per facies
Conductivity     mean conductivity            mean conductivity per block           mean conductivity
Density*         average spacing per facies   average spacing per facies            average spacing per facies
                                              and per block
Aperture         constant                     constant per block                    constant

* Density is expressed as average spacing (stratabound case).


An example of the results from the ADC is shown below (Figure-20).

Figure-20 : Example of the results from ADC.

• Upper part of the panel [1]:


o On the left side, a window [1a] lists the optimized fracture models. To display the
objective function of a fracture model on the right-side graph, select it in the list.
By default, the first fracture model of the list is selected. Note that it is possible to
select several fracture models.
o On the right side, a 2D graph [1c] displays the variations of the objective function versus the iteration number. A scrolling menu above this graph allows choosing between:
- The total value of the objective function (Total value); in this case, each curve corresponds to a fracture model.
- The error associated to each test (Per test); in this case, a curve is displayed for each test.
- The error associated to all the tests for a given well (Per well); in this case, a curve is displayed for each well.
The graphical setting of this plot can be changed in the Outline.
If you select an iteration on the graph, this iteration will appear in the table [1b]
along with its associated error.


 
o Selected Fracture Models & Errors table [1b]: for each selected fracture model, this table lists the three iterations that gave the best results (minimum error during optimization step 1) along with their associated error. It is possible to remove an iteration of a fracture model.
You can stop the calibration after step 1 if you are satisfied with the first optimization results. To do so, remove from the table the fracture model iterations that you do not want to keep and click Validate to save the retained fracture model in the database. This new fracture model appears in the Study Explorer.

o Compute Final Optimization on Selection: this button launches a second step of optimization for each fracture model selected in the table Selected Fracture Models & Errors [1b].

• Fracture set properties (middle part of the panel, [2])


o The table Corresponding Fracture Sets [2a] lists the fracture sets associated to the fracture models selected in the table Selected Fracture Models & Errors [1b]. If you select a fracture set in the list, its optimised properties appear in the table below ([2b]).
o The table [2b] summarizes parameters information and optimization results:
− Property type: Orientation, Size, Conductivity, Aperture or Density.
− Parameter: the considered parameter associated to the property.
− Domain of definition of the considered parameter: All (for the whole domain),
Block (block1, block2, etc.), Facies (facies1, facies2, etc.), Unit (unit1, unit2,
etc.)
− Optimized Value: value of the parameter after the optimization process.
− Unit: m, mD.m, m-1, None, etc.

o The graph [2c] on the right displays the variation of the properties selected in the
table [2b] versus the iteration number. Two properties can be selected
simultaneously in the table and displayed on the graph.

• Dynamic tests (lower part of the panel, [3])


A table [3a] displays the tests associated to the current fracture model and their associated error. On the right side, a plot [3b] (pressure versus time and rate versus time) allows comparing the simulated tests (in red) with the observed tests and associated confidence intervals (in blue), for the current iteration of the fracture model.
After the Final Optimization of the ADC is run, the results of the dynamic calibration are displayed, similar to Figure-20 except that:

- There is a table named Fracture Models & Initial Error [1a] (top left of the panel) that lists the fracture models selected during step 1 and their associated initial errors.


 
- The table named Selected Fracture Models & Errors [1b] lists the three
best iterations (minimum error during the optimization step 2) of the
fracture models selected in the table Fracture Models & Initial Error.

After fracture model modification, ADC can be re-run to obtain the best calibration.

(xii) Dynamic Tests Simulations :

This module simulates a flowmeter curve, a well test or an interference test along a well crossing a 3D synthetic fracture network. Comparing a simulated flowmeter curve or well test with actual measurements gives qualitative information on the reliability of the synthetic fracture network. It enables the validation of the fracture characterization in terms of network geometry and, moreover, in terms of fracture conductivity distribution.
Building, for example, a synthetic flowmeter curve involves a numerical simulation of the
oil flow (single phase, compressible) through the reservoir to the producing well.
Physically, this is a steady state flow where the oil contained in the reservoir matrix first
goes to the nearest fracture and then goes through the fracture network to the well.
The methodology is based on a dual-porosity approach. Both single- and dual-permeability models are available. The single-permeability model is valid for connected diffuse fracture networks and large matrix/fracture permeability contrasts. For other types of fracturing (such as fracture swarms), simulation results are more reliable with the dual-permeability model.
This method simulates a process in which the oil contained in the reservoir moves to a producing well. This is a single-phase (the only moving phase is oil), compressible (both rock and oil are compressible) flow.
Using this module, several actions are possible:
• Flowmeter simulation
• Well test simulation
• Interference test simulation
• Network simplification
The network simplification is useful when the DFN to be calibrated is too dense and dynamic simulations lead to memory issues. The simplification turns the existing network into an equivalent one, decreasing the number of fractures to deal with.

Flowmeter Simulation : To perform a flowmeter simulation, the following data are needed:

o A well with its associated properties (diameter, completion data, skin);


o A DFN, simplified or not (or an imported DFN);
o A Zone of Interest associated to the DFN;
o A Flow transfer model (a model 2Φ1K is set by default);
o Matrix properties;
o Fluids properties;


 
o Computation parameters (time steps, maximum and minimum pressure drop…);
o Completion data;
o Initial pressure of the reservoir;
o Reservoir temperature;
o Reference time and storage periods for the pressure field.

Well test Simulation : To perform a well test simulation (transient well test), the following data are needed:

o A well with its associated properties (diameter, completion data, skin);


o A DFN, simplified or not (or an imported DFN);
o A Zone of Interest associated to the DFN;
o A Flow transfer model (a model 2Φ1K is set by default);
o Matrix properties;
o Fluids properties;
o Computation parameters (time steps, maximum and minimum pressure drop…);
o Production rate history;
o Initial pressure of the reservoir;
o Reservoir temperature;
o Reference time and storage periods for the pressure field.

Interference Test Simulation : In the case of an interference test, the only parameters you have to set differently compared to the well test simulation are the well statuses. All the computation parameters, fluid and matrix properties are defined exactly as for a well test simulation.

Network Simplification : The objective of the network simplification is to reduce the number of fractures around a well in order to speed up well test or flowmeter simulations. The fractures directly adjacent to the well are not simplified, whereas those far from the trajectory are turned into an equivalent network based on a Warren and Root scheme.
To open the network simplification process, it is necessary to launch a dynamic test activity (flowmeter or well test simulation). Then, after choosing the input parameters, the DFN to be simplified needs to be highlighted; the parameters of the network simplification appear on the right part of the panel.

Summary

The latest version of FracaFlow has better input data analysis features than the contemporary version of Petrel. When it comes to fracture model calibration, FracaFlow is way ahead of Petrel. A fracture model generated in Petrel can be dynamically validated using Petrel RE, Eclipse and SWPM together, but the approach in FracaFlow is much better: the modeller, or team of modellers, calibrates and improves the model while developing it, and hands over a much better product to the reservoir engineers for dynamic use of the model.
