FracaFlow Petrel Comparison
By Subrata Chakraborty
Contents

Introduction
(A) Summary of Fracture Modeling Workflow in FracaFlow (part of OpenFlow 2011.1)
    Input Data
    (i) ANALYSIS: FRACTURE RESERVOIR CHARACTERIZATION
    (ii) FRACTURE NETWORK MODELING
    (iii) FRACTURE MODEL CALIBRATION
    (iv) FRACTURE EQUIVALENT PARAMETERS COMPUTATION (Upscaling)
(B) Features in FracaFlow not available in Petrel
    (i) Attribute Computation from 2D Maps
    (ii) Curvature Analysis and Fault Picking
    (iii) Fault Density Computation in a Field
    (iv) Fracture Analysis at Wells
    (v) Fracture Aperture Options during Modeling
    (vi) Fault Corridor Analysis
    (vii) Unfolding
    (vii) Dynamic Data Analysis
        (a) Precise Production Data Analysis
        (b) Breakthrough Analysis
        (c) Display of Well Test Curves
    (viii) Subseismic faults generation
    (ix) Automatic Kh Calibration (AKC)
    (x) Automated Dynamic Calibration (ADC)
    (xi) Dynamic Tests Simulations
Summary
Comparison of Fracture Modeling in OpenFlow 2011.1 & Petrel 2010.2 By Subrata Chakraborty
Introduction
FracaFlow is the 3D fracture modeling module of the OpenFlow Suite Version 2011.1 from Beicip-Franlab. This write-up has two parts: (A) a summary of the fracture modeling workflow in FracaFlow, and (B) a description of features in FracaFlow that are not available in Petrel.

(A) Summary of Fracture Modeling Workflow in FracaFlow (part of OpenFlow 2011.1)

FracaFlow divides the fracture modeling process into four distinct parts (Figure-1).
I. Data analyses to characterize the fractures and their relationships inside the
network.
II. Fracture and fault modeling, using mainly stochastic methods; a realization of this
model is the Discrete Fracture Network (DFN).
III. Fracture model calibration using dynamic data.
IV. Fracture equivalent parameters computation, the final output for the fluid flow
simulator (upscaling).
Input Data
• Reservoir grid: 3D reservoir grid.
• Wells:
o Fractures: interpreted well imagery (FMI, UBI, etc.) or core interpretation
(digitized files).
o Classical wireline logging data: facies, porosity, gamma ray, etc.
• Horizons: key structural horizons; if not available, they can be extracted from
the grid layer tops and units.
• Faults: interpreted faults in the field; if not available, they can be picked in the
Structural Analysis module.
• Property map or grid (density driver): if not available, attributes related to the
fracture density can be built in FracaFlow. A density driver combining different
maps can also be computed in FracaFlow.
• Dynamic data
o Flowmeter, productivity index, skin, permeability from cores, well tests,
mud loss, production data for the dynamic data analysis modules.
o Well and interference tests, flowmeter for the calibration modules
(i) ANALYSIS: FRACTURE RESERVOIR CHARACTERIZATION
The seven key data analysis steps are explained below (Figure-2). The applicability of
each of them depends on the available data and on the field geology.
Step 1: Creation of the LithoStratigraphic scales
Step 2: Dynamic data analysis (Quick Assessment of Fracturing)

Input data: flowmeter, productivity index, skin, permeability from cores, well tests, mud
losses and production data.

The objective is to determine whether the reservoir is fractured or not, and to estimate
the impact of fractures on the dynamic behaviour of the reservoir. A detailed analysis
of the dynamic data and a cross-check of the results make it possible to assess the
occurrence of fractures and their capacity to act as flow conduits in the reservoir. This
analysis consists of displaying:
• Histograms of dynamic data.
• Cross plots of the dynamic data versus the distance to faults: to evaluate the link
between fractures and faults (if any) and to measure the range of the fault
influence.
• Bubble maps of the dynamic data: to analyse the spatial distribution of the
fractures.
• Log views of the different kinds of static and dynamic data.

The most useful dynamic data for assessing fracturing are the permeability (especially
the comparison of the Kh product from the transient well test with the Kh computed
from core measurements), the production data, the flowmeter, the mud loss records
and the productivity index of each well.
Output data: in a synthesis module, all the results are displayed in tables and bubble
maps.
It is possible to weight the analyses in order to emphasize some of them. The final
output is a fracturing index that can be translated into an answer to the question: “is
this well fractured?”
Step 3: Fracture analysis

A global analysis allows you to create fracture sets, i.e. fractures with the same
orientation (dip and dip-azimuth). Then, with a well-by-well analysis, it is possible to
analyse clusters (i.e. intervals of high fracture density), compute fracture density logs
and compare them to other logs and to facies.

At the end of the fracture analysis, you should have an idea of the scale of your
fractures: small-scale objects (joints) and/or large-scale objects (faults and fracture
corridors). During this analysis, you will also establish whether or not there is a link
between the fracture density and geological drivers (such as bed thickness, lithology,
etc.). The analysis of the fracture density drivers is refined later using the Fracture
Density Controller module.

Output data: information about the fracture properties that will be used during the
fracture modelling.
Step 4: Structural analysis

This module allows you to compute attributes (such as curvature) and to use these
attributes to pick faults. This is important if you want to model large-scale objects
such as fracture corridors or sub-seismic faults. If you already have a fault network in
your database, you can use this module to check and edit it.
Output data: Structural attributes (on horizons and maps) and fault networks.
Step 5: Fault analysis

Fault lineaments must first be projected onto horizons (this can be done in FracaFlow).
With this module, you can create fault sets and analyse the data in order to determine
the statistical distribution laws you will use during the fault modelling step. You go
through this step only if you want to model large-scale objects such as fracture
corridors or sub-seismic faults.

Output data: information about the fault properties that will be used during the fault
modelling.
Step 6: Geomechanical analysis

Input data: a set of stress measurements along the well trajectory, a pore pressure log
and fracture logs.

The stress state of a fracture can have an impact on its fluid conductivity. The purpose
of this module is to create fracture groups (using labels) that discriminate potentially
conductive fractures, using a Mohr-Coulomb failure criterion.

Output data: a new log of fractures labelled according to their stress state.
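The Mohr-Coulomb labelling described above can be sketched as follows (a minimal 2D principal-stress sketch, not FracaFlow's implementation; the friction coefficient, cohesion and pore pressure defaults, and the function and argument names, are illustrative assumptions):

```python
import numpy as np

def is_potentially_conductive(normal_angle_deg, sigma1, sigma3,
                              mu=0.6, cohesion=0.0, pore_pressure=0.0):
    """Label a fracture as potentially conductive using a Mohr-Coulomb
    failure criterion: the fracture is critically stressed (hence likely
    conductive) if the shear stress resolved on its plane exceeds the
    frictional resistance: tau > cohesion + mu * (sigma_n - Pp).

    normal_angle_deg is the angle between the fracture normal and the
    sigma1 direction (2D principal-stress simplification)."""
    theta = np.deg2rad(normal_angle_deg)
    # Normal and shear stress resolved on the fracture plane (Mohr circle).
    sigma_n = 0.5 * (sigma1 + sigma3) + 0.5 * (sigma1 - sigma3) * np.cos(2 * theta)
    tau = 0.5 * (sigma1 - sigma3) * abs(np.sin(2 * theta))
    return tau > cohesion + mu * (sigma_n - pore_pressure)
```

Raising the pore pressure shrinks the effective normal stress, so a fracture that is stable at hydrostatic pressure can become critically stressed (and thus labelled conductive) at elevated pressure.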
Step 7: Fracture density driver

There are various ways to obtain the fracture density driver. One workflow is explained
here (Figure-3); it can also be applied only in part.

Step 7.a: Fracture density logs and fracturing facies

Input data: fractures interpreted from well imagery or digitized core interpretation.

Here, you can compute fracture density logs of various types from the fracture data
and by set (sets were defined in step 3), create fracturing facies and compute statistics
related to these fracturing facies, which will be very useful for the fracture modelling.
Step 7.b: Fracture Density Controller

Input data: wells with fracture density logs and the reservoir grid.

The Fracture Density Controller module is designed to analyse the factors controlling
the fracture density in the reservoir. It provides a synthetic driver combining the various
properties controlling the fracturing. These properties can be imported data (porosity,
seismic attributes, geomechanical results, etc.) or data computed in FracaFlow, for
instance fault attributes (distance to fault, fault density, etc.).

Note that all the properties that you want to use to build the density driver have to be
transferred onto the reservoir grid. This can be done with the Mapping functionality.

Output data: the density driver, as a grid property and as a well log.
Step 7.c: 3D density grid

Input data: wells with fracture density logs and the reservoir grid.

The objective here is to derive a 3D density grid, tied to the wells, from the density driver
computed during step 7.b, honouring the spatial structure of the data. This density
grid will be used later to constrain the fracture densities, for each fracture set.
The different steps are:
• Well set creation.
• Bivariate histogram computation.
• Experimental variogram computation.
• Variogram modelling.
• Sequential Indicator Simulation (launched as a workflow).
Output data: a 3D density driver that can be used during the fracture modelling.
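The experimental variogram computation listed above can be sketched as follows (an omnidirectional, equal-width-lag sketch only; FracaFlow's directional and anisotropic options are not reproduced, and the function name is an assumption):

```python
import numpy as np

def experimental_variogram(coords, values, lag, n_lags):
    """Omnidirectional experimental variogram: for lag class k,
    gamma(h) = mean of 0.5 * (z_i - z_j)^2 over all point pairs whose
    separation distance falls in (k*lag, (k+1)*lag]."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    # Pairwise separation distances and half squared value differences.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = np.full(n_lags, np.nan)
    for k in range(n_lags):
        mask = (d > k * lag) & (d <= (k + 1) * lag)  # pairs in lag class k
        if mask.any():
            gamma[k] = sq[mask].mean()
    return gamma
```

The resulting curve is what is then fitted in the variogram modelling step before launching the Sequential Indicator Simulation.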
(ii) FRACTURE NETWORK MODELING
Step 8: Zone of interest definition

A zone of interest (ZOI) is a region in which you will compute the Discrete Fracture
Network (DFN). The DFN is one realization of the stochastic fracture model. You can
define several ZOIs in order to quality-check your fracture model and, if you have
observed data, to run dynamic tests in order to calibrate it.
Step 9: Fracture modelling

Input data: the grid and the lithostratigraphic scale from the grid; information about the fractures.

Here you use all the previous analysis results to define your fracture model
(conceptual model, geometrical attributes of your fractures, etc.). Different types of
models are available:
• Diffuse fractures (stratabound and non-stratabound, with different conceptual models
such as facies-related, fault-related, etc.).
• Sub-seismic faults, with the density following the fractal dimension.
• Deterministic faults.
• Full-field DFN: large-scale objects generated in a way similar to diffuse
fractures (Poissonian process).

Output data: the fracture model at the reservoir scale and the Discrete Fracture Network
locally, in the zones of interest.
Different QCs are available in FracaFlow to analyse fracture models and DFN:
• 3D view, to have visual control of the result; it is possible to display only the
fractures intersecting the wells, or only the network connected to the well.
• DFN analysis through the Fracture Statistic Viewer, to obtain information about
the orientation, size or connectivity of the generated network.
• Extraction of the fracture log at the well, in order to compute the density log.
In addition, two specific modules produce more advanced results:
Step 10.a: Connectivity Analysis
Step 10.b: Volumetrics
Input data: one or several fracture models and their associated fracture porosities, a
grid with matrix porosity and facies properties, the lithostratigraphic scale, and a
saturation law or grid property.

The volume calculation is performed according to the input data. You can assess the
volumes in place in a ZOI or in the whole grid.
(iii) FRACTURE MODEL CALIBRATION

The main objective of the calibration is to modify the fracture model to ensure its
coherency with the observed dynamic data (flowmeter, well and interference tests). The
steps are explained graphically in Figure-5.
Step 11: Calibration

Once you have built your fracture model and computed the associated DFN, large
uncertainties remain on the fracture parameters. Dynamic calibration is a way to
reduce these uncertainties.

The goal of the AKC module is to compute the fracture model parameters (mean size,
mean conductivity, orientation and density) that best fit the Kh measured at the wells
(interpreted well tests).

Indeed, the AKC method, based on a genetic algorithm, makes it possible - at a low
simulation cost - to reduce the parameter space to investigate and to identify
correlations between the fracture parameters.
Output data: a calibrated fracture model
Input data: an observed well test, interference test or flowmeter, and a fracture model.

This module simulates a flowmeter curve, a well test or an interference test along a
well crossing a 3D fracture network. Comparing a simulated flowmeter curve or well test
with the actual measurements gives qualitative information on the reliability of the
generated fracture network. It enables the validation of the fracture characterization in
terms of network geometry and, moreover, in terms of fracture conductivity distribution.
Before launching simulations, it is strongly advised to replace the complex network by
an equivalent Warren & Root network (if it is sufficiently connected). This network
simplification is done inside the dynamic tests module.
Input data: simulated well tests, interference tests or flowmeters, and fracture models.

The dynamic test calibration is a trial-and-error method; therefore, to improve the
convergence process, the strategy is to perform an AKC first, then the ADC, which is an
optimization loop for the simulation of the dynamic parameters.
(iv) FRACTURE EQUIVALENT PARAMETERS COMPUTATION (Upscaling)

Step 12: Upscaling
Input data: a fracture model and, for the “Fractures and matrix” option, the matrix properties.

The objective of upscaling is to reduce the complexity of the actual fracture system to a
few relevant equivalent parameters (permeability, porosity and block dimension) (Figure-
6). These equivalent parameters are the input data for the reservoir simulation model.
The upscaling can be performed at a local scale or at full-field scale.

Two algorithms are available: analytical upscaling (quick, and recommended if the
fracture network is well connected; it uses Oda's (1985) and Poiseuille's formulations)
and flow-based upscaling. The analytical upscaling uses analytical formulae and can be
run as Local Analytical Upscaling (LAU), for a much quicker test job, or as Full Field
Analytical Upscaling (FFAU).
Output: equivalent permeabilities and porosity, block sizes.
(B) Features in FracaFlow not available in Petrel

Only those features which are not present in Petrel 2010.2, or which are present but
work better in FracaFlow, are described below.
variation (heterogeneous method) or colour variation (homogeneous method).
It also allows viewing multiple fluid phases at a well at the same time. (Petrel 2010.2
allows bubble map viewing.)
(i) Attribute Computation from 2D Maps:

The following attributes can be computed from any 2D horizon created or imported in
FracaFlow. They are intended to provide drivers for fracturing in a field.
- Curvature map: five types, as listed under Curvature Analysis.
- Azimuth of principal curvature map: this attribute represents the azimuth of
the minimum principal curvature. The angle is computed clockwise from
North. For an anticline oriented North-South, the azimuth will be 90°.
- Slope map: this attribute is the local slope (slope gradient) of the map. It is
defined by a plane tangent to the topographic surface at a point. Slope is a
vector; as such it has a magnitude (gradient) and a direction (aspect). The
slope gradient is defined as the maximum rate of change in altitude, and the
aspect as the compass direction of this maximum rate of change.
- Illumination map: this attribute computes an artificial illumination of a map.
Sun-shaded maps can be either colour or grey-scale maps, and illumination
from different directions facilitates the identification of lineaments striking in
different directions. The rays are assumed to be parallel and horizontal. The
chosen direction of computation (θ) represents the azimuth of the light
source, counted clockwise from North.
- Normalization: the [0;1] normalization works as follows: the maximum (max)
and the minimum (min) of the map are identified, and the formula
“new_value = (initial_value – min) / (max – min)” is applied. The 0-centered
normalization works as follows: the absolute highest value (AH) of the map
is identified, and the formula “new_value = initial_value / AH” is applied.
Under this formula, the attribute remains centered on 0. For instance, a map
ranging between –100 and 1000 will end up between –0.1 and 1.
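The slope and normalization attributes above can be sketched as follows (a minimal sketch for a regular elevation grid; the grid orientation convention, the cell size and the function names are assumptions):

```python
import numpy as np

def slope_and_aspect(z, cell=1.0):
    """Slope gradient (maximum rate of change in altitude) and aspect
    (compass direction of that maximum rate, clockwise from North).
    Assumed convention: axis 0 increases northwards, axis 1 eastwards."""
    dz_dy, dz_dx = np.gradient(z, cell)
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Azimuth (clockwise from North) of the steepest-descent vector -grad z.
    aspect_deg = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0
    return slope_deg, aspect_deg

def normalize_01(m):
    """[0;1] normalization: new_value = (initial_value - min) / (max - min)."""
    mn, mx = np.min(m), np.max(m)
    return (m - mn) / (mx - mn)

def normalize_0_centered(m):
    """0-centered normalization: new_value = initial_value / AH, where AH is
    the absolute highest value of the map; the result stays centered on 0."""
    return m / np.max(np.abs(m))
```

For the document's own example, the 0-centered normalization maps a map ranging between -100 and 1000 onto [-0.1, 1].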
(ii) Curvature Analysis and Fault Picking:

Curvature analysis is often among the first steps of a fractured reservoir study.
Its aim is first to find areas or axes of intense reservoir curvature, using the top
horizon mapped from 3D seismic, in order to define fractured zones. Folding is often
related to the development of faults and to their geometry. In this case, the location
and density of the fractures related to small-scale faults can be described from the
detection of bending, which can be quantified by curvature. The first step of a
curvature analysis is the smoothing of the horizon, performed to remove noise (areas
of low signal, picking errors, etc.) and the shortest structural wavelength features.
Then, by various methods (Gaussian curvature, maximum principal curvature,
minimum principal curvature, mean curvature, oriented curvature along a specified
direction), the curvature of the surface is computed. From the generated curvature
map, faults can be picked with the fault picking tool (Figure-7).
(iii) Fault Density Computation in a Field:

- P32 density: the P32 fault density is computed from P21 maps (at each node
of the grid where P21 has been calculated) using the formula
P32 = P21 / mean(sin(dip)), where mean is the standard mean and dip is the
fault dip taken from the fault dip distribution. The computed P32 is correct if
the cell in which it is computed contains a large number of faults with a dip
distribution independent of the fault orientation. Therefore, to compute P32
from P21, the user must provide the program with the following parameters:
the map where P32 is to be computed, the mean dip (°), the dip dispersion (°)
and the strike dispersion (°). Using these parameters, the algorithm computes
the average of sin(dip) by random draws of values from a binormal distribution
with one horizontal axis (from the fault strike and the strike dispersion given by
the user) and a vertical axis (from the dip and the dip dispersion given by the
user). The P21 is then divided by the returned value to obtain P32.
- Throw density: the throw density is calculated as follows: inside a small square
window centered on the cell of the map, the length of each fault-trace segment
multiplied by its corresponding mean throw is accumulated. The window size is
user defined.
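The two density computations above can be sketched as follows (illustrative sketches: a normal-distribution dip draw stands in for the binormal dip/strike draw described in the text, segment membership in the window is judged by its midpoint, and all names are assumptions):

```python
import numpy as np

def p32_from_p21(p21, mean_dip_deg, dip_dispersion_deg, n=200_000, rng=None):
    """P32 = P21 / mean(sin(dip)), with mean(sin(dip)) estimated by random
    draws of dip values around the user-given mean dip."""
    rng = rng or np.random.default_rng(0)
    dips = np.deg2rad(rng.normal(mean_dip_deg, dip_dispersion_deg, n))
    return p21 / np.mean(np.sin(dips))

def throw_density(segments, center, window):
    """Sum of (segment length x mean throw) over the fault-trace segments
    inside a square window of side `window` centered on `center`.
    `segments` is an iterable of ((x0, y0), (x1, y1), mean_throw)."""
    cx, cy = center
    half = window / 2.0
    total = 0.0
    for (x0, y0), (x1, y1), throw in segments:
        mx, my = (x0 + x1) / 2.0, (y0 + y1) / 2.0  # segment midpoint
        if abs(mx - cx) <= half and abs(my - cy) <= half:
            total += np.hypot(x1 - x0, y1 - y0) * throw
    return total
```

With zero dip dispersion, mean(sin(dip)) reduces to sin(mean dip), so a P21 of 2 with a 30° mean dip gives P32 = 2 / 0.5 = 4.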
(iv) Fracture Analysis at Wells:

It is possible to compute and display several logs from the fracture data at wells,
loaded from image logs or cores:
o P10 density
o Local density
o Set density (a density log corrected with the interval Terzaghi correction)
o P21 density
o P32 from lineic P10 density
o P32 from cylindric P10 density

The Local and P10 options are always available. The Set density option is available
only if a fracture set has been selected. The P21 option is available only if core
fractures are available in the selected wells. The P32 from lineic P10 option is available
only if a fracture set is selected. The P32 from cylindric P10 option is available only if
core fractures are available in the selected wells and a fracture set is selected.
FracaFlow uses the following fracture density indexes (rows: dimension of the fracture
measure; columns: dimension of the measurement support):

Fracture ↓ \ Support →  2: Areas                           3: Volumes
1: Length               P21: cumulative fracture length    P31: cumulative fracture length
                        per surface unit (m⁻¹)             per volume unit (m⁻²)
2: Area                 P22: fracture area per             P32: cumulative fracture area
                        cumulative surface unit (%)        per volume unit (m⁻¹)
3: Volume               -                                  P33: fracture volume per volume
                                                           unit (= fracture porosity) (%)
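The density indexes follow directly from their definitions; a sketch (the function name and argument list are illustrative only):

```python
def density_indexes(cum_length, cum_area, cum_volume, support_area, support_volume):
    """Compute the Pij fracture density indexes, where i is the dimension of
    the measurement support and j the dimension of the fracture measure."""
    return {
        "P21": cum_length / support_area,    # fracture length per surface unit (1/m)
        "P31": cum_length / support_volume,  # fracture length per volume unit (1/m^2)
        "P32": cum_area / support_volume,    # fracture area per volume unit (1/m)
        "P33": cum_volume / support_volume,  # fracture volume per volume unit (= fracture porosity)
    }
```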
Comparison with Petrel: Petrel 2010.2 can calculate the fracture intensity log quite
well, but fracture data loading is perhaps more dependent on the user's knowledge of
the subject.
(v) Fracture Aperture Options during Modeling:

The uncorrelated option means that the conductivity and the size of the fracture are
independent variables. The distribution defined by the user then applies to the
conductivity.

The correlated option means a coupling between the conductivity and the aperture of
the fracture. In this case, the distribution defined by the user applies to λ, where:

C = λ e³ / 12

where C denotes the conductivity and e the aperture. Obviously λ = 0 corresponds to
an impermeable fracture and λ = 1 corresponds to an ideal Poiseuille flow between two
planes. The parameter λ can be chosen smaller than 1 in order to model a loss of
conductivity relative to the Poiseuille solution due to asperities on the fracture faces. If
λ is greater than 1, the result is an improved conductivity compared to the ideal
Poiseuille case for a Newtonian flow.
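The cubic-law relation above is straightforward to evaluate; a sketch (function and argument names are assumptions):

```python
def fracture_conductivity(aperture, lam=1.0):
    """Fracture conductivity from the cubic law C = lam * e**3 / 12
    (aperture e in metres). lam = 1 is ideal Poiseuille flow between two
    parallel planes; lam < 1 models a loss of conductivity due to
    asperities on the fracture faces."""
    return lam * aperture ** 3 / 12.0
```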
Comparison with Petrel: in Petrel 2010.2, for stratabound fractures you can either give
one constant value of fracture aperture, or use one of Petrel's four models for the
fracture aperture (exponential, power, normal, log-normal), all centred around a mean
aperture value. How these values relate to fracture length is not clear to me. You
really need to segment your fracture model within a stratum or zone to give different
fracture aperture values, which creates problems of its own.

Fracture permeability likewise has five options: a constant value or mathematical models
(exponential, power, normal, log-normal), the default being the exponential, which
relates permeability to aperture by a cubic law. This is at least quite an improvement
over Petrel 2009, where the user had to assign these fracture properties with the patch
calculator.
(vi) Fault Corridor Analysis:

It is also possible to assess the fault zones affecting the fracture density by performing
a corridor plot analysis. The corridor panel allows you to assess the influence of
neighbouring faults on the wells. To do so, the values of different dynamic data are
plotted versus the distance of the well to the selected faults in a corridor plot (Figure-8).
Those data are: mud loss per unit length, PI per unit length, the KHtest/KHcore ratio
from a well test, and production rates (cumulative or maximum).

If the faults in the field are conductive, or if there are zones of increased fracturing
close to them, the values listed above are expected to be higher for wells close to those
faults.
(vii) Dynamic Data Analysis:

The module allows you to perform the following analyses, depending on data availability:
- Flowmeter analysis
- Mud loss analysis
- Productivity Index analysis
- Production analysis
- Well test analysis
The dynamic analysis module is run by right-clicking in the Study Explorer (select
New > Analysis > Dynamic Analysis); a blue tick (✓) is displayed if the analysis is
possible based on the type of dynamic data loaded. The results of this analysis can be
of the following types (Figure-10, Figure-11).
(b) Breakthrough Analysis:

A regression line is also plotted on the graph, as shown in the example below (Figure-
12). The equation of the regression line is displayed at the top of the sheet. This line
represents the trend of evolution of the fluid contact, so if a well lies far from it, it has
experienced early breakthrough. This graph is used to sort the wells into late and early
breakthrough, and this classification is used later in the Quick Assessment of
Fracturing Intensity (QAF) Synthesis module. The wells below the red line are
considered as having early breakthrough, whereas the others are considered as having
late breakthrough.
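The sorting into early and late breakthrough can be sketched as follows (the plotted quantities are shown only in the figure, so the x/y variable names here are assumptions):

```python
import numpy as np

def classify_breakthrough(x, y):
    """Fit the least-squares regression line through the points and flag
    each well: below the line -> early breakthrough, otherwise late."""
    slope, intercept = np.polyfit(x, y, 1)
    y_line = slope * np.asarray(x, dtype=float) + intercept
    return ["early" if yi < yl else "late" for yi, yl in zip(y, y_line)]
```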
(c) Display of Well Test Curves:

It is also possible to view the curves of pressure and rate versus time during a well test
in the Well Test panel. We can plot the measured pressure curves (“Data curves”
graph) and the log-log pressure curves, including the computed pressure derivative
curve (“Computed Data” graph). The computation is based on the information found in
the well test interpretation (the “Type” indicates whether the test is a build-up or a
drawdown test), while the “Well Data” panel makes it possible to enter the “Initial
pressure” (to indicate the beginning of a drawdown test) or the “Start of Build-up”. The
calculations assume a single period of rate history ending at the “Start of Build-up”
date. The Options panel offers a smoothing option.
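The pressure derivative plotted on the log-log graph is d(Δp)/d(ln t); a minimal sketch using a simple finite difference in ln t (commercial interpretation tools typically use Bourdet's smoothed three-point algorithm instead):

```python
import numpy as np

def pressure_derivative(t, dp):
    """Log-log pressure derivative d(dp)/d(ln t) for a single-rate period,
    e.g. a drawdown, or a build-up replotted versus elapsed time."""
    lnt = np.log(np.asarray(t, dtype=float))
    return np.gradient(np.asarray(dp, dtype=float), lnt)
```

During infinite-acting radial flow, Δp is linear in ln t, so the derivative plots as a horizontal line, which is why the derivative curve is so diagnostic on the log-log plot.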
Comparison with Petrel: in Petrel 2010.2 we can display the pressure data, but I am
not sure whether the pressure derivative curve can be displayed.
(viii) Subseismic faults generation:

Sub-seismic faults are fractures below the seismic resolution threshold. Some of
them are still large enough to play an active role in reservoir compartmentalization or
reservoir drainage. A discrete model of the pattern of the major sub-seismic fractures is
often necessary to assess their impact on production.

Many fault network studies have demonstrated the fractal nature of some real fault
patterns and raised the idea that fractal analysis could be used to constrain the
modeling of sub-seismic faults, given a seismically resolved pattern of major faults. The
sub-seismic fault generation available in FracaFlow is based on this assumption.
The first phase of the process consists of initializing:
- The local average fault strike; this is estimated by interpolating the local strikes of
the major faults of the fault set chosen by the user.
- The box-counting log-log curve; this is computed with the fault set chosen by the
user to obtain the initial fractal dimension of the fault network.
The object-oriented modeling of the pattern of sub-seismic faults then begins. The sub-
seismic faults are generated one by one with a "point process". The following steps are
run through for each fault:
- A point of initiation (a "seed") is chosen randomly on the top of the reservoir
unit; the process can be biased in order to obtain a heterogeneous distribution of
seeds across the reservoir.
- The fault length is drawn at random from the statistical distribution chosen by the
user.
- Next, the 2D sub-seismic fault trace is generated incrementally, segment by
segment, according to a kind of "growth process". Each segment of this growth
process crosses one single cell of the reservoir model; the segment strike is
drawn at random from a von Mises distribution centred on the local average value
estimated during the initialization phase.
- The growth stops when the fault has reached its total pre-defined length.
- The sub-seismic fault trace thus obtained is added to the network if the fractal
nature of the fault pattern is still respected. This is checked on the box-counting
log-log curve of the new network. If the fractal nature is still respected, the new
sub-seismic fault trace is definitively added to the network; otherwise, it is
rejected and a new seed is generated.
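The segment-by-segment growth process above can be sketched as follows (a minimal sketch: the local average strike is taken as constant instead of interpolated from the major faults, a fixed segment length stands in for "one cell", the box-counting acceptance test is omitted, and all names are assumptions):

```python
import numpy as np

def grow_fault_trace(seed_xy, total_length, seg_length, mean_strike_deg,
                     kappa=20.0, rng=None):
    """Grow a 2D sub-seismic fault trace from a seed point, segment by
    segment, each segment strike drawn from a von Mises distribution
    centred on the local average strike; growth stops at the pre-defined
    total length."""
    rng = rng or np.random.default_rng(0)
    pts = [np.asarray(seed_xy, dtype=float)]
    mu = np.deg2rad(mean_strike_deg)
    grown = 0.0
    while grown < total_length:
        strike = rng.vonmises(mu, kappa)  # random strike about the mean
        # Strike measured clockwise from North: x = East, y = North.
        step = seg_length * np.array([np.sin(strike), np.cos(strike)])
        pts.append(pts[-1] + step)
        grown += seg_length
    return np.array(pts)
```

The concentration parameter kappa controls how sinuous the trace is: large kappa gives a nearly straight trace along the mean strike, small kappa a wandering one.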
This algorithm is based on the fault lineament generation described above. It is applied
sequentially, horizon by horizon, progressing towards the top or the bottom as chosen
by the user, until the last horizon is processed.
(ix) Automatic Kh Calibration (AKC):

Once a fracture model has been computed and the associated DFNs have been
generated, large uncertainties remain on the fracture parameters. Dynamic calibration
is a way to reduce these uncertainties.
The goal of the Automated Kh Calibration (AKC) module is to compute the fracture and
fault model parameters (especially the mean conductivity) that best fit the Kh measured
at the wells (obtained from well test interpretations).
The strategy for dynamic calibration is to perform the AKC first, before the Automated
Dynamic Calibration (ADC).
The AKC is an automatic iterative process. At each step, an analytical upscaling is
performed around each well to calculate its permeability-thickness Kh. The parameters
of the fracture model change at each step of this optimization process, whose final aim
is to propose the combination of those parameters that best fits the Kh interpreted from
the actual well tests.

The inversion method used in AKC is a genetic algorithm. This class of algorithms
takes advantage of the principles of genetic mechanisms and the laws of evolution to
perform a selection inside a set of parameters (Figure-14).
Step-1: Initialization

During the initialization phase, the values of the invertible parameters of the fracture
set model (selected in the Parameters tab) are chosen at random within the defined
validity intervals (Figure-15).
Step-2: Evaluation

Then, during the evaluation phase, a local analytical upscaling is performed around
each selected well (its extent is the ZOI radius entered in the Input tab) (Figure-16).
The result of this local upscaling is then compared to the measured values through the
computation of an objective function f(X).
Step-3: Selection

This comparison allows the process to select the best solutions among all the results
according to the objective function value. The quality of a solution (and hence its
selection) is estimated from a probability inversely proportional to its objective function
value (Figure-17).
After the selection process, the algorithm uses two mechanisms derived from the
genetic processes that take place during chromosome duplication: crossover and
mutation.
The crossover mechanism is used to define new offspring from two given parents
taken from the previously selected population of results. First, the algorithm selects a
proportion Pc of the population according to their probabilities p(X); then two siblings
are defined from two parents following the steps given in Figure-18.
Thus the algorithm defines a permutation (crossover) of parameters chosen among
parents (i.e., sets of parameters) selected for their ability to minimize the objective
function.
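As a sketch of the idea (the exact recombination scheme of Figure-18 is not reproduced here), a classic one-point crossover of two parameter sets looks like this:

```python
import random

def crossover(parent_a, parent_b, seed=None):
    """One-point crossover on the ordered list of invertible parameters.

    Each child inherits the genes before the cut from one parent and the
    genes after the cut from the other. Requires at least two parameters.
    """
    rng = random.Random(seed)
    names = sorted(parent_a)               # fixed parameter ordering
    cut = rng.randint(1, len(names) - 1)   # at least one gene from each parent
    child1 = {n: (parent_a if i < cut else parent_b)[n]
              for i, n in enumerate(names)}
    child2 = {n: (parent_b if i < cut else parent_a)[n]
              for i, n in enumerate(names)}
    return child1, child2
```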
The mutation mechanism is used to randomly redefine the values of the parameter
sets defining a population. First, the algorithm selects a proportion Pm of the population
of solutions according to their probabilities p(X); then, for each selected solution, a new
solution is defined by randomly selecting one of its properties and modifying its value.
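The source's exact value-modification formula was lost with the page figure, so the sketch below uses a common stand-in: redrawing one randomly chosen parameter uniformly within its validity interval.

```python
import random

def mutate(solution, bounds, seed=None):
    """Redraw one randomly chosen parameter within its validity interval.

    A uniform redraw inside the bounds is an assumed stand-in for the
    original mutation formula, which did not survive extraction.
    """
    rng = random.Random(seed)
    mutated = dict(solution)              # do not modify the parent in place
    name = rng.choice(sorted(mutated))    # pick one property at random
    lo, hi = bounds[name]
    mutated[name] = rng.uniform(lo, hi)
    return mutated
```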
The application of these two mechanisms (crossover and mutation) creates a new
generation of parameter sets that will in turn be evaluated through the objective
function, and so on until the maximum number of generations has been reached.
The AKC process requires the definition of a working context. The user has to define:
the wells (kh test values) to be matched by the algorithm; the fracture set and related
parameters to be inverted; the minimum and maximum values to be used; the type and
parameters of the upscaling run by the algorithm; and the weight attributed to each well
in the objective function.
The fracture model to be calibrated and the well test interpretations to be used are
selected. The user has to define a region around the wells (the ZOI radius) in which the
local analytical upscaling will be launched, as well as the perforation limits. The ZOI and
the perforation zones should represent the drainage area and are difficult to estimate;
therefore, a sensitivity analysis on ZOI size and perforation limits may be useful.
In a calibration strategy, the usual approach is to focus first on conductivity and density,
then on size and orientation, and finally on identifying correlations.
Initial number of solutions: This is the number of solutions computed at the first
generation and used afterwards in the selection process. The greater this number, the
larger the pool of parameters from which the algorithm can search for new solutions,
and the better the sampling. However, a high initial number of solutions increases the
optimization cost. It is recommended to repeat tests, progressively increasing this
parameter.
Maximum number of solutions: This is the maximum number of solutions that can be
computed during the optimization process. It plays the same role in the algorithm as the
initial number of solutions, but this many solutions are computed at each optimization
loop, so the greater this number, the slower the execution of the algorithm.
The crossover probability: This is the probability (in %) for a solution to cross over
with another one. The greater this probability, the wider the range of possibilities tested
by the algorithm. It has to be greater than 50% to sample the solution neighborhood
effectively.
The mutation probability: This is the probability (in %) for a given solution to mutate.
This parameter increases the amount of random sampling. It affects the result of the
algorithm in the same way as the crossover probability.
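The four control parameters above can be gathered in a small settings object that also encodes the guidance given in the text. The names and defaults are illustrative, not FracaFlow's dialog values:

```python
from dataclasses import dataclass

@dataclass
class AKCSettings:
    """Control parameters of the AKC genetic algorithm (illustrative names)."""
    initial_solutions: int = 50          # population size of the first generation
    max_solutions: int = 200             # cap on solutions computed per loop
    crossover_probability: float = 0.7   # Pc, recommended > 0.5
    mutation_probability: float = 0.1    # Pm, adds random sampling

    def validate(self):
        """Return advisory messages based on the guidance in the text."""
        notes = []
        if self.crossover_probability <= 0.5:
            notes.append("Pc <= 50%: solution neighborhoods may be under-sampled")
        if self.initial_solutions > self.max_solutions:
            notes.append("initial population exceeds the per-loop solution cap")
        return notes
```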
AKC Results
For each fracture set model, the results consist of an optimized value for each of its
invertible parameters, for all the computed solutions (up to the maximum number of
solutions defined by the user). The value of the objective function is displayed for each
solution as the Solution error. For each well, the Relative error is also displayed: this is
the difference between the permeability from the well test interpretation and the
simulated permeability, divided by the measured permeability (in %).
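The relative-error formula itself was lost at the page break. From the description above, a plausible reconstruction (an assumption, with kh_test the interpreted and kh_sim the simulated permeability-thickness at well i) is:

```latex
\text{Relative error}_i\,(\%) \;=\; \frac{kh_i^{\text{test}} - kh_i^{\text{sim}}}{kh_i^{\text{test}}} \times 100
```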
The best solution (which is not necessarily the one with the minimum solution error) can
then be selected as a new fracture model to be used in a new upscaling process to
compute better equivalent parameters.
Once you have generated your fracture model and computed the associated DFN, large
uncertainties remain on the fracture parameters. Dynamic calibration is a way to reduce
these uncertainties. The goal of the Automated Dynamic Calibration (ADC) module is to
compute, using flow simulations, the fracture and fault model parameters (mean
conductivity,…) that best fit the well tests. An ADC activity can be created via the menu
bar (Calibration > Automated Dynamic Calibration) or by right-clicking in the Study
Explorer (New > Calibration > Automated Dynamic Calibration).
Once the dynamic data are loaded at the wells, the module shows the following:
- Fracture models built, with their properties:
o Orientation
o Size
o Aperture
o Conductivity
o Density
- Available wells, with the following information:
o Well name
o Type of test (flowmeter, well test or interference test)
o Observed test
o Simulated test
o Fracture model associated with the test
Once the wells and the fracture model (with properties) are selected, the input for ADC
is set. The following table lists the invertible parameters according to the chosen
model:
Parameter      Facies Based               Block Related                   Fault Related
------------   ------------------------   -----------------------------   ------------------------
Orientation    Von Mises dip &            Von Mises dip &                 Von Mises dip &
               Von Mises dip-azi          Von Mises dip-azi per block     Von Mises dip-azi
Size           mean length, crossing      mean length per block,          mean length, crossing
               probabilities per facies   crossing probabilities per      probabilities per facies
                                          facies and per block
Conductivity   mean conductivity          mean conductivity per block     mean conductivity
Density        average spacing per        average spacing per facies      average spacing per
(i.e. average  facies                     and per block                   facies
spacing -
stratabound)
Aperture       constant                   constant per block              constant
o Selected Fracture Models & Errors table [1b]: for each selected fracture model,
this table lists the three iterations that gave the best results (minimum error
during optimization step 1), along with their associated errors. It is possible to
remove an iteration of a fracture model.
You can stop the calibration after step 1 if you are satisfied with the first
optimization results. To do so, remove from the table the fracture model
iterations that you do not want to keep and click Validate to save the retained
fracture model in the database. This new fracture model appears in the Study
Explorer.
o The graph [2c] on the right displays the variation of the properties selected in
table [2b] versus the iteration number. Two properties can be selected
simultaneously in the table and displayed on the graph.
- There is a table named Fracture Models & Initial Error [1a] (top left of the
panel) that lists the fracture models selected during step 1 and their
associated initial errors.
- The table named Selected Fracture Models & Errors [1b] lists the three
best iterations (minimum error during optimization step 2) of the fracture
models selected in the Fracture Models & Initial Error table.
After fracture model modification, ADC can be re-run to obtain the best calibration.
This module aims at simulating a flowmeter curve, a well test, or an interference test
along a well crossing a 3D synthetic fracture network. Comparing a simulated flowmeter
curve or well test with actual measurements gives qualitative information on the
reliability of the synthetic fracture network. It enables validation of the fracture
characterization in terms of network geometry and, moreover, in terms of fracture
conductivity distribution.
Building a synthetic flowmeter curve, for example, involves a numerical simulation of the
oil flow (single phase, compressible) through the reservoir to the producing well.
Physically, this is a steady-state flow in which the oil contained in the reservoir matrix
first moves to the nearest fracture and then travels through the fracture network to the
well.
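For intuition only, the classic steady-state, single-phase radial Darcy relation links rate, pressure drop, and permeability-thickness; FracaFlow's numerical simulator is of course far more general. Units here are assumed consistent SI, and the function name is illustrative:

```python
import math

def kh_from_steady_state(q, mu, b_o, p_e, p_wf, r_e, r_w):
    """Permeability-thickness from the steady-state radial Darcy equation.

    Rearranged from  q = 2*pi*kh*(p_e - p_wf) / (mu * B_o * ln(r_e / r_w)),
    with q the surface rate, mu the oil viscosity, B_o the formation volume
    factor, p_e/p_wf the outer and flowing pressures, and r_e/r_w the
    drainage and wellbore radii (consistent SI units assumed).
    """
    return q * mu * b_o * math.log(r_e / r_w) / (2.0 * math.pi * (p_e - p_wf))
```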
The methodology is based on a dual-porosity approach. Both single- and dual-
permeability models are available. The single-permeability model is valid for connected
diffuse fracture networks and large matrix/fracture permeability contrasts. For other
types of fracturing (such as fracture swarms), simulation results are more reliable with
the dual-permeability model.
This method simulates a process in which the oil contained in the reservoir moves to a
producing well. This is a single-phase (the only moving phase is oil), compressible (both
rock and oil are compressible) flow.
o Using this module, several actions are possible:
• Flowmeter simulation
• Well test simulation
• Interference test simulation
• Network simplification
Network simplification is useful when the DFN to be calibrated is too dense and
dynamic simulations lead to memory issues. The simplification replaces the existing
network with an equivalent one, decreasing the number of fractures to deal with.
o Computation parameters (time steps, maximum and minimum pressure drop…);
o Completion data;
o Initial pressure of the reservoir;
o Reservoir temperature;
o Reference time and storage periods for the pressure field.
Well Test Simulation: To perform a well test simulation (a transient well test), the
following data are needed.
Interference Test Simulation: In the case of an interference test, the only parameters
you have to set differently compared to the well test simulation are the well statuses. All
the computation parameters, fluid properties, and matrix properties are defined exactly
as for a well test simulation.
Summary
The latest version of FracaFlow has better input-data analysis features than the
contemporary version of Petrel. When it comes to fracture model calibration, FracaFlow
is far ahead of Petrel. A fracture model generated in Petrel can be dynamically
validated using Petrel RE, Eclipse, and SWPM together, but the approach in FracaFlow
is much better: the modeller, or team of modellers, calibrates and improves the model
while developing it, and hands over a much better product to the reservoir engineers for
dynamic use of the model.