
Philip Ringrose • Mark Bentley

Reservoir Model Design

A Practitioner's Guide

Second Edition
Philip Ringrose
Equinor ASA
Norwegian University of Science and Technology
Trondheim, Norway

Mark Bentley
TRACS International Limited
Heriot-Watt University
Aberdeen and Edinburgh, Scotland, UK

ISBN 978-3-030-70162-8 ISBN 978-3-030-70163-5 (eBook)


https://doi.org/10.1007/978-3-030-70163-5

© Springer Nature Switzerland AG 2015, 2021


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or
part of the material is concerned, specifically the rights of translation, reprinting, reuse of
illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way,
and transmission or information storage and retrieval, electronic adaptation, computer software, or
by similar or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this
publication does not imply, even in the absence of a specific statement, that such names are
exempt from the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in
this book are believed to be true and accurate at the date of publication. Neither the publisher nor
the authors or the editors give a warranty, expressed or implied, with respect to the material
contained herein or for any errors or omissions that may have been made. The publisher remains
neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Cover figure: Multiscale geological bodies and associated erosion, Lower Antelope Canyon,
Arizona, USA. Photograph by Jonas Bruneau. © EAGE, reproduced with permission of the
European Association of Geoscientists and Engineers.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

This book is about the design and construction of subsurface reservoir models.
In the early days of the oil industry, oil and gas production was essentially an
engineering activity, dominated by disciplines related to chemical and mechan-
ical engineering. Three-dimensional (3D) geological reservoir modelling was
non-existent, and petroleum geologists were mostly concerned with the inter-
pretation of wire-line well logs and with the correlation of geological units
between wells. Two important technological developments – computing and
seismic imaging – stimulated the growth of reservoir modelling, with computa-
tional methods being applied to mapping, volumetrics and reservoir simulation.
Geological or ‘static’ reservoir modelling was given further impetus from the
development of geostatistical techniques, allowing the reservoir modeller to
estimate inter-well reservoir properties and hence make statistical predictions.
3D reservoir modelling has now become the norm.
Reservoir models in themselves do not generate prospects or development
scenarios, increase productivity or reduce risk and uncertainty. They are
simply tools which reflect our ability to achieve these goals. Advances in
reservoir modelling cannot therefore be compared to technology
breakthroughs such as 4D seismic imaging, horizontal drilling or EOR
methods. Reservoir models are, however, invaluable tools for
integrating knowledge and for making strategic decisions.
As the world moves into the energy transition and oil and gas production is
likely to decline, energy and transportation will be increasingly powered by
renewable or low-emissions technologies. In this new era, CO2 disposal and
subsurface energy storage will become increasingly important and will make
new demands of subsurface modelling. The fluids used and stored in the
subsurface may change but the value of porous rock reservoirs will remain.
We will never cease to need to know how these fluids move in the subsurface,
and modelling of subsurface reservoirs will remain a tool for building this
understanding and making useful forecasts.

***
The explosion of reservoir modelling software packages and associated
geostatistical methods has created high expectations but also led to periodic
disappointments. This has given birth to an oft quoted mantra “all models are
wrong.”


This book emerged from a series of industry and academic courses given
by the authors and aimed at guiding the reservoir modeller through the pitfalls
and benefits of reservoir modelling, in the search for a reservoir model design
that is useful for forecasting. Reservoir modelling and simulation software
packages often come with guidance about which buttons to press and menus
to use for each operation, but less advice on the objectives and limitations of
the model algorithms. The result is that whereas much time is devoted to
model building, the outcomes of the models often fall short of their objectives.
Our central contention in this book is that problems with reservoir modelling
tend not to stem from hardware limitations or lack of software skills but from the
approach taken to the modelling – the model design. It is essential to think
through the design and to build fit-for-purpose models that meet the requirements
of the intended use. All models are not wrong, but in many cases models are used
to answer questions which they were simply not designed to answer.
We cannot hope to cover all the possible model designs and approaches, and
we have avoided as much as possible reference to specific software modelling
packages. Our aim is to share our experience and present a generic approach to
reservoir model design. Our design approach is geologically based – partly
because of our inherent bias as geoscientists – but mainly because subsurface
reservoirs are composed of rocks. The pore space which houses the ‘black gold’
of the oil age, the ‘golden age’ of gas or the fluids disposed and stored in the
new ‘age of sustainability’, has been constructed by geological processes – the
deposition of sandstone grains and clay layers, processes of carbonate cemen-
tation and dissolution, and the mechanics of fracturing and folding. Good
reservoir model design is therefore founded on good geological interpretation.
There is always a balance between probability (the outcomes of stochastic
processes) and determinism (outcomes controlled by limiting conditions). We
develop the argument that deterministic controls rooted in an understanding of
geological processes are the key to good model design. The use of probabilis-
tic methods in reservoir modelling without these geological controls is a poor
basis for decision making, whereas an intelligent balance between determin-
ism and probability offers a promising path.
We also discuss the decision-making process involved in reservoir
modelling. Human beings are notoriously bad at making good judgements –
a theme widely discussed in the social sciences and behavioural psychology.
The same applies to reservoir modelling – how do you know you have a fit-
for-purpose reservoir model? There are many possible responses, but most
commonly there is a tendency to trust the outcome of a reservoir modelling
process without appreciating the inherent uncertainties.
We hope this book will prove to be a useful guide to practitioners and
students of subsurface reservoir modelling in the fields of petroleum geosci-
ence, environmental geoscience, CO2 and energy storage and reservoir
engineering – an introduction to the complex, fascinating, rapidly-evolving
and multi-disciplinary field of subsurface reservoir modelling.

Trondheim, Norway    Philip Ringrose

Edinburgh and Tenby, UK    Mark Bentley
Prologue: Model Design

Successful Reservoir Modelling

This book offers practical advice and ready-to-use tips on the design and
construction of reservoir models. This subject is variously referred to as
‘geological reservoir modelling,’ ‘static reservoir modelling’, ‘3D reservoir
modelling’ or ‘geomodelling’, and our starting point is very much the geol-
ogy. However, the end point is fundamentally the engineering representation
of the subsurface and in this sense these notes will deal with the important
interface with the world of dynamic, simulation modelling, with which static
modelling is intrinsically integrated.
Our central argument is that whether models succeed in their goals is generally determined by the higher-level issue of model design – building models which are fit for the purpose at hand.
Based on our own experiences and those of colleagues and groups we have
worked with on courses and in reviews, we distinguish five root causes which
commonly determine modelling success or failure:
1. Establishing the model purpose
Why are we logged on in the first place and what do we mean by ‘fit-for-
purpose’?
2. Building a 3D architecture with appropriate modelling elements
The fluid-dependent choice on the level of detail required in a model
3. Understanding determinism and probability
Our expectations of geostatistical algorithms
4. Model scaling
Reconciling the jump between the scale of data and cellular resolution of
our models, and how to represent pore-scale fluid flow reasonably at a
reservoir scale
5. Uncertainty-handling
Moving from single to multi-models and where the design becomes subject
to bias
Strategies for addressing these underlying issues will be dealt with in the
following chapters under the thematic headings of model purpose, the rock
model, the property model, upscaling flow properties and uncertainty-
handling.


In the later chapters we focus on specific reservoir types, as there are generic issues which predictably arise when dealing with certain reservoirs.
We look specifically at how our experiences from modelling hydrocarbon
reservoirs can be applied to modelling reservoirs (and their overburdens) for
disposal and storage of CO2 or hydrogen. We share our experience, gained
from personal involvement in over a hundred modelling studies, significantly
augmented by the experiences of others shared in reviews and reservoir
modelling classes over the past 30 years.
Before we engage in technical issues, however, a reflection on the central
theme of design.

Reservoir modellers in front of rocks, discussing design (Photograph by Philip Ringrose)

Design in General

Design is an essential part of everyday life, compelling examples of which are to be found in architecture. We are aware of famous, elegant and successful
designs, such as the Gherkin – a striking feature of the London skyline
designed for the Swiss Re company by Norman Foster and Partners – but
we are more likely to live and work in less glamorous but hopefully fit-for-
purpose buildings. The Gherkin, or more correctly the 30 St. Mary Axe
building, uses half the energy typically required by an office block, optimises
the use of daylight and natural ventilation (Price 2009) and embodies both
innovative and successful design, although with significant complexity.

There are many more examples, however, of office block and accommo-
dation units that are unattractive and plagued by design faults and
inefficiencies – the carbuncles that should never have been built.
This architectural analogy gives us a useful setting for considering the
more exclusive art of constructing models of the subsurface.

Norman Foster building, 30 St. Mary Axe (Photograph from Foster and Blaser (1993) –
reproduced with kind permission from Springer Science + Business Media B.V.)

What constitutes good design? In our context we suggest the essence of a good design is simply that it fulfils a specific purpose and is therefore fit for
that purpose.
The Petter Dass museum in the small rural community of Alstahaug in
northern Norway offers another architectural statement on design. This fairly
small museum, celebrating a local poet and designed by the architectural firm
Snøhetta, fits snugly and consistently into the local landscape. It is elegant and
practical, giving light, shelter and warmth in a fairly extreme environ-
ment. Although lacking the complexity and scale of the Gherkin, it is equally
fit-for-purpose. Significantly, in the context of this book, it rises out from and

fits into the Norwegian bedrock. It is an engineering design clearly founded in the geology – the essence of good reservoir model design.
When we build models of fluid resources in the subsurface we should never
ignore the fact that those resources are contained within rock formations.
Geological systems possess their own natural forms of design as depositional,
diagenetic and tectonic processes generate intricate reservoir architectures.
We rely on a firm reservoir architectural foundation, based on an understand-
ing of geological processes, which can then be quantified in terms of rock
properties and converted into a form useful to predict fluid flow behaviour.

The Petter Dass Museum, Alstahaug, Norway (The Petter Dass-museum, © Petter Dass-museum, reproduced with permission)

Good reservoir model design therefore involves the digital representation of the natural geological architecture and its translation into useful models of
subsurface fluid resources. Sometimes the representations are complex –
sometimes they can be very simple indeed.

References

Foster N, Blaser W (1993) Norman Foster sketch book. Birkhäuser, Basel


Price B (2009) Great modern architecture: the world’s most spectacular
buildings. Canary Press, New York
Acknowledgements

Before engaging with this subject, we must acknowledge the essential contributions of others. Firstly, and anonymously, we thank our many profes-
sional colleagues in the fields of petroleum geoscience, reservoir engineering,
geostatistics and software engineering. Without their expertise and the
products of their innovation (commercial reservoir modelling packages), we
as users would not have the opportunity to attempt good reservoir
model design in the first place. All the examples and illustrations used in
this book are the result of collaborative work with others – by its very nature
reservoir modelling is done within multi-disciplinary teams. We have
endeavoured to credit our sources with reference to published studies where
possible. Elsewhere, where unpublished case studies are used, these are the
authors’ own work, unless explicitly acknowledged.
More specifically we would like to thank our employers past and present –
Shell, TRACS and AGR (M.B.), Statoil, Equinor and NTNU (P.R.) and
Heriot-Watt University (M.B. & P.R.) for the provision of data, computational
resources and, not least, an invaluable learning experience. The latest versions
of this book have been honed and developed as part of the TRACS and
RPS/Nautilus training programmes (www.tracs.com and www.training.rpsgroup.com, respectively). Participants of these courses have repeatedly
given us valuable feedback, suggesting improvements which have become
embedded in the chapters of this book. Patrick Corbett, Kjetil Nordahl, Gillian
Pickup, Stan Stanbrook, Paula Wigley and Caroline Hern are thanked for
constructive reviews of the book chapters. Thanks are due also to Fiona
Swapp and Susan McLafferty at TRACS for producing many excellent
graphics for the book and the associated courses.
Each reservoir modelling study discussed has benefited from the use of
commercial software packages. We do not wish to promote or advocate any
one package or the other – rather to encourage the growth of this technology in
an open competitive market. We do however acknowledge the use of licensed
software from several sources. The main software packages we have used in
the examples discussed in this book include the Petrel E&P Software Platform
(Schlumberger), the Integrated Irap RMS Solution Platform (Roxar), the
Paradigm SKUA-GOCAD framework for subsurface modelling, the SBED
and ReservoirStudio products from Geomodeling Technology Corp., and the


ECLIPSE suite of reservoir simulation software tools (Schlumberger). This is not an exhaustive list, just an acknowledgement of the tools we have used
most often in developing approaches to reservoir modelling.
And finally, we would like to acknowledge our families and partners, who
have kindly let us out to engage in rather too many reservoir modelling
studies, courses and field trips on every continent (apart from Antarctica).
We hope this book is a small compensation for their patience and support.
Contents

1 Model Purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Modelling for Comfort? . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Models for Visualisation Alone . . . . . . . . . . . . . . . . . . . . 2
1.3 Models for Volumes . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4 Models as a Front End to Simulation . . . . . . . . . . . . . . . . . 4
1.5 Models for Well Planning . . . . . . . . . . . . . . . . . . . . . . . . 4
1.6 Models for Seismic Modelling . . . . . . . . . . . . . . . . . . . . . 5
1.7 Models for IOR/EOR . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.8 Models for Storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.9 The Fit-for-Purpose Model . . . . . . . . . . . . . . . . . . . . . . . . 8
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2 The Rock Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.1 The Rock Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.2 Model Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.3 The Model Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.3.1 Structural Data . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.3.2 Stratigraphic Data . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.4 Model Elements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.4.1 Reservoir Models Not Geological Models . . . . . . . 20
2.4.2 Building Blocks . . . . . . . . . . . . . . . . . . . . . . . . . . 20
2.4.3 Model Element Types . . . . . . . . . . . . . . . . . . . . . . 20
2.4.4 How Much Heterogeneity to Include?
‘Flora’s Rule’ . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
2.5 Determinism and Probability . . . . . . . . . . . . . . . . . . . . . . 28
2.5.1 Balance Between Determinism and Probability . . . . 29
2.5.2 Different Generic Approaches . . . . . . . . . . . . . . . . 31
2.5.3 Forms of Deterministic Control . . . . . . . . . . . . . . . 32
2.6 Essential Geostatistics . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
2.6.1 Key Geostatistical Concepts . . . . . . . . . . . . . . . . . 34
2.6.2 Intuitive Geostatistics . . . . . . . . . . . . . . . . . . . . . . 39
2.7 Algorithm Choice and Control . . . . . . . . . . . . . . . . . . . . . 43
2.7.1 Object Modelling . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.7.2 Pixel-Based Modelling . . . . . . . . . . . . . . . . . . . . . 47
2.7.3 Texture-Based Modelling . . . . . . . . . . . . . . . . . . . 50
2.7.4 Algorithms Compared . . . . . . . . . . . . . . . . . . . . . . 51


2.7.5 Process-Based Modelling . . . . . . . . . . . . . . . . . . . 53


2.7.6 The Importance of Deterministic Trends . . . . . . . . . 54
2.7.7 Alternative Rock Modelling Methods – A
Comparison . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
2.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
2.8.1 Rock Model QC . . . . . . . . . . . . . . . . . . . . . . . . . . 61
2.8.2 Synopsis – Rock Modelling Guidelines . . . . . . . . . 61
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
3 The Property Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
3.1 Which Properties? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
3.2 Understanding Permeability . . . . . . . . . . . . . . . . . . . . . . . 70
3.2.1 Darcy’s Law . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
3.2.2 Upscaled Permeability . . . . . . . . . . . . . . . . . . . . . . 70
3.2.3 Permeability Variation in the Subsurface . . . . . . . . 72
3.2.4 Permeability Averages . . . . . . . . . . . . . . . . . . . . . 72
3.2.5 Numerical Estimation of Block Permeability . . . . . . 75
3.2.6 Permeability in Fractures . . . . . . . . . . . . . . . . . . . . 77
3.3 Handling Statistical Data . . . . . . . . . . . . . . . . . . . . . . . . . 78
3.3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
3.3.2 Variance and Uncertainty . . . . . . . . . . . . . . . . . . . 80
3.3.3 The Normal Distribution and Its Transforms . . . . . . 82
3.3.4 Handling ϕ-k Distributions and Cross Plots . . . . . . 86
3.3.5 Hydraulic Flow Units . . . . . . . . . . . . . . . . . . . . . . 88
3.4 Modelling Property Distributions . . . . . . . . . . . . . . . . . . . 89
3.4.1 Kriging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
3.4.2 The Variogram . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
3.4.3 Gaussian Simulation . . . . . . . . . . . . . . . . . . . . . . . 91
3.4.4 Bayesian Statistics . . . . . . . . . . . . . . . . . . . . . . . . 92
3.4.5 Property Modelling: Object-Based Workflow . . . . . 93
3.4.6 Property Modelling: Seismic-Based Workflow . . . . 95
3.5 Use of Cut-Offs and N/G Ratios . . . . . . . . . . . . . . . . . . . . 97
3.5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
3.5.2 The Net-to-Gross Method . . . . . . . . . . . . . . . . . . . 99
3.5.3 Total Property Modelling . . . . . . . . . . . . . . . . . . . 101
3.6 Vertical Permeability and Barriers . . . . . . . . . . . . . . . . . . . 105
3.6.1 Introduction to kv/kh . . . . . . . . . . . . . . . . . . . . . . . 105
3.6.2 Modelling Thin Barriers . . . . . . . . . . . . . . . . . . . . 106
3.6.3 Modelling of Permeability Anisotropy . . . . . . . . . . 108
3.7 Saturation Modelling . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
3.7.1 Capillary Pressure . . . . . . . . . . . . . . . . . . . . . . . . . 110
3.7.2 Saturation-Height Functions . . . . . . . . . . . . . . . . . 111
3.7.3 Tilted Oil-Water Contacts . . . . . . . . . . . . . . . . . . . 112
3.8 Modelling Fracture Properties . . . . . . . . . . . . . . . . . . . . . . 115
3.8.1 Terminology and Type . . . . . . . . . . . . . . . . . . . . . 116
3.8.2 Fault Zone Properties . . . . . . . . . . . . . . . . . . . . . . 119
3.8.3 Modelling Open Fracture Properties . . . . . . . . . . . . 124
3.8.4 Capturing the Effects of Stress on
Fracture Properties . . . . . . . . . . . . . . . . . . . . . . . . . 126

3.8.5 Summary – Fracture Properties . . . . . . . . . . . . . . . 127


3.9 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
4 Upscaling Flow Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
4.1 Multi-scale Flow Modelling . . . . . . . . . . . . . . . . . . . . . . . 132
4.2 Multi-phase Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
4.2.1 Two-Phase Flow Equations . . . . . . . . . . . . . . . . . . 135
4.2.2 Two-Phase Steady-State Upscaling Methods . . . . . . 138
4.2.3 Heterogeneity and Fluid Forces . . . . . . . . . . . . . . . 142
4.3 Multi-scale Reservoir Modelling Concepts . . . . . . . . . . . . 144
4.3.1 Geology and Scale . . . . . . . . . . . . . . . . . . . . . . . . 144
4.3.2 How Many Scales to Model and Upscale? . . . . . . . 146
4.3.3 Which Scales to Focus On? (The REV) . . . . . . . . . 148
4.3.4 Handling Variance as a Function of Scale . . . . . . . . 151
4.3.5 Construction of Geomodel and Simulator
Grids . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
4.3.6 Which Heterogeneities Matter? . . . . . . . . . . . . . . . 157
4.4 The Way Forward . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
4.4.1 Potential and Pitfalls . . . . . . . . . . . . . . . . . . . . . . . 159
4.4.2 Pore-to-Field Workflow . . . . . . . . . . . . . . . . . . . . . 160
4.4.3 Essentials of Multi-scale Reservoir Modelling . . . . . 160
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
5 Model-Based Uncertainty Handling . . . . . . . . . . . . . . . . . . . . 165
5.1 The Issue . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
5.1.1 Modelling for Comfort . . . . . . . . . . . . . . . . . . . . . 166
5.1.2 Modelling for Discomfort – Quantifying
Uncertainty and Exposing Risk . . . . . . . . . . . . . . . 166
5.2 Differing Methodologies . . . . . . . . . . . . . . . . . . . . . . . . . 169
5.2.1 Best Guess, or ‘Rationalist’ Approaches . . . . . . . . . 169
5.2.2 Multiple Stochastic Approaches . . . . . . . . . . . . . . . 170
5.2.3 Multiple Deterministic Approaches . . . . . . . . . . . . 170
5.3 Bias . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
5.3.1 The Limits of Rationalism . . . . . . . . . . . . . . . . . . . 172
5.3.2 The Limits of Geostatistics . . . . . . . . . . . . . . . . . . 173
5.3.3 Cognitive Limits – Heuristics . . . . . . . . . . . . . . . . 175
5.4 Towards an Unbiased Methodology . . . . . . . . . . . . . . . . . 177
5.4.1 The Uncertainty List . . . . . . . . . . . . . . . . . . . . . . . 178
5.4.2 Scenario Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
5.5 Post-processing the Ensemble . . . . . . . . . . . . . . . . . . . . . . 181
5.5.1 Scenarios . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
5.5.2 Exhaustive Deterministic Ensembles . . . . . . . . . . . 182
5.5.3 Variance Mapping . . . . . . . . . . . . . . . . . . . . . . . . 183
5.5.4 Sampling Probabilistic Ensembles . . . . . . . . . . . . . 185
5.5.5 Experimental Design and Sensitivity Analysis . . . . 185
5.5.6 Clustering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
5.5.7 Updating Reservoir Uncertainty with New
Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
5.5.8 Distinguishing and Illustrating Risk vs. Uncertainty . . . . . . . . . . 190
5.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
6 Reservoir Model Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
6.1 Aeolian Reservoirs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
6.1.1 Elements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
6.1.2 Effective Properties . . . . . . . . . . . . . . . . . . . . . . . . 196
6.1.3 Stacking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
6.1.4 Aeolian System Anisotropy . . . . . . . . . . . . . . . . . . 199
6.1.5 Laminae-Scale Effects . . . . . . . . . . . . . . . . . . . . . . 200
6.2 Fluvial Reservoirs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
6.2.1 Fluvial Systems . . . . . . . . . . . . . . . . . . . . . . . . . . 201
6.2.2 Geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
6.2.3 Connectivity and Percolation Theory . . . . . . . . . . . 202
6.2.4 Hierarchy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 205
6.3 Tidal Deltaic Sandstone Reservoirs . . . . . . . . . . . . . . . . . . 206
6.3.1 Tidal Characteristics . . . . . . . . . . . . . . . . . . . . . . . 206
6.3.2 Handling Heterolithics . . . . . . . . . . . . . . . . . . . . . 207
6.4 Shallow Marine Sandstone Reservoirs . . . . . . . . . . . . . . . . 207
6.4.1 Tanks of Sand? . . . . . . . . . . . . . . . . . . . . . . . . . . . 207
6.4.2 Stacking and Laminations . . . . . . . . . . . . . . . . . . . 209
6.4.3 Large-Scale Impact of Small-Scale
Heterogeneities . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
6.5 Deep Marine Sandstone Reservoirs . . . . . . . . . . . . . . . . . . 211
6.5.1 Relative Confinement . . . . . . . . . . . . . . . . . . . . . . 213
6.5.2 Seismic Limits . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
6.5.3 Thin Beds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 214
6.5.4 Small-Scale Heterogeneity in High
Net-to-Gross ‘Tanks’ . . . . . . . . . . . . . . . . . . . . . . 215
6.5.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 216
6.6 Carbonate Reservoirs . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
6.6.1 Depositional Architecture . . . . . . . . . . . . . . . . . . . 218
6.6.2 Pore Fabric . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
6.6.3 Diagenesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
6.6.4 Fractures and Karst . . . . . . . . . . . . . . . . . . . . . . . . 225
6.6.5 Hierarchies of Scale – The Carbonate REV . . . . . . 225
6.6.6 Conclusion: Forward-Modelling or Inversion? . . . . 228
6.7 Fractured Reservoirs . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
6.7.1 Fracture Concepts . . . . . . . . . . . . . . . . . . . . . . . . . 229
6.7.2 Low Density, Compartmentalised Fracture
Systems (Fault-Dominated) . . . . . . . . . . . . . . . . . . 229
6.7.3 Low Density Fracture Systems, Open to Flow
(Fault-Dominated) . . . . . . . . . . . . . . . . . . . . . . . . 234
6.7.4 High Density Fractured Reservoirs – Open
to Flow (Joint-Dominated) . . . . . . . . . . . . . . . . . . . 239
6.7.5 Forward-Modelling or Inversion in Fractured Reservoirs? . . . . . . . 244
6.8 Fit-for-Purpose Recapitulation . . . . . . . . . . . . . . . . . . . . . 246
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 247
7 Models for Storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
7.1 Displacements of Different Fluids . . . . . . . . . . . . . . . . . . . 252
7.2 Geological Storage of CO2 . . . . . . . . . . . . . . . . . . . . . . . . 252
7.3 CO2 Storage Modelling Objectives . . . . . . . . . . . . . . . . . . 254
7.4 Understanding the CO2 Storage Process . . . . . . . . . . . . . . 257
7.5 The Influence of Geological Heterogeneity . . . . . . . . . . . . 264
7.6 Handling Pressure and Rock Deformation . . . . . . . . . . . . . 267
7.7 Model Design Futures for the Energy Transition . . . . . . . . 273
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 274
8 Modelling Workflows . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
8.1 The Detailed, Full-Field Model Default . . . . . . . . . . . . . . . 278
8.2 Resource and Decision Models . . . . . . . . . . . . . . . . . . . . . 278
8.3 Iterative Workflows – The Forth Bridge . . . . . . . . . . . . . . 280
8.4 Handling Dynamic Data . . . . . . . . . . . . . . . . . . . . . . . . . . 283
8.4.1 History-Matching . . . . . . . . . . . . . . . . . . . . . . . . . 283
8.4.2 History-Comparing . . . . . . . . . . . . . . . . . . . . . . . . 284
8.5 ‘Truth Models’ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 285
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 288
9 Epilogue – Modelling for the Energy Transition . . . . . . . . . . 291
9.1 Synopsis – The Story So Far . . . . . . . . . . . . . . . . . . . . . . 292
9.2 Use of Analogues and Data . . . . . . . . . . . . . . . . . . . . . . . 294
9.3 Restoring Lost Heterogeneity . . . . . . . . . . . . . . . . . . . . . . 296
9.4 New Workflows . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 296
9.4.1 Surface-Based Models . . . . . . . . . . . . . . . . . . . . . . 297
9.4.2 Disposable Grids . . . . . . . . . . . . . . . . . . . . . . . . . 298
9.5 Stepping Beyond the Solution – ‘Modelling for
Understanding’ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 300
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302

Nomenclature . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305

Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307

Appendix A: A Template for Model Design . . . . . . . . . . . . . . . . . . 311

Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
1 Model Purpose

Abstract

Should we aspire to build detailed full-field reservoir models with a view to using those models to answer business questions? In this chapter it is suggested the answer to the above question is 'not necessarily'. Instead we argue the case for building fit-for-purpose models, which may or may not be detailed and may or may not be full-field. This choice triggers the question: 'what is our purpose?' The answer to this question determines the model design.

A reservoir engineer and a geoscientist establish model purpose against an outcrop analogue

Keywords
Design · Reservoir modelling · Simulation · Energy transition · Fit-for-purpose · Volumetrics · Well planning · Forecasting · Development planning

1.1 Modelling for Comfort?

There are two broad schools of thought on the purpose of models:
1. To provide a 3D, digital representation of a hydrocarbon reservoir, which can be built and maintained as new data becomes available, and used to support on-going lifecycle needs such as volumetric updates, well planning and, via reservoir simulation, production forecasting.
2. That there is little value in maintaining a single 'field model'. Instead, we should build and maintain a field database, from which several fit-for-purpose models can be built quickly to support specific decisions.

The first approach seems attractive, especially if a large amount of effort is invested in the first build prior to a major investment decision. However, the 'all-singing, all-dancing' full-field approach tends to result in large, detailed models, generally working at the limit of the available software/hardware and which are cumbersome to update and difficult to pass hand-to-hand as people move between jobs. Significant effort can be invested simply in the on-going maintenance of these models, to the point that the need for the model ceases to be questioned and the purpose of the model is no longer apparent. In the worst case, the maintenance of the model becomes a job in itself and the investment in modelling technology may just be satisfying an urge for technical rigour in the lead up to a business decision – simply 'modelling for comfort' (Bentley 2015).

We argue that the route to happiness lies with the second approach: building fit-for-purpose models which are equally capable of creating comfort or discomfort around a business decision. Choosing this approach immediately raises the question "what purpose?", as the model design will vary according to that purpose. To illustrate this, the following sections look at contrasting purposes of reservoir modelling, and the distinctive design of the models associated with these differing contexts.

1.2 Models for Visualisation Alone

Simply being able to visualise the reservoir in 3D was identified early in the development of modelling tools as a potential benefit of reservoir modelling. Simply having a 3D box in which to view the available data is beneficial in itself.

This is the most intangible benefit of modelling, as there is no output other than a richer mental impression of the subsurface, which is difficult to measure. However, most people benefit from 3D visualisation (Fig. 1.1), consciously or unconsciously, particularly where cross-disciplinary issues are involved.

Fig. 1.1 The value of visualisation: appreciating structural and stratigraphic architecture, during well planning

Some common examples are:

• To show the geophysicist the 3D structural model based on their seismic interpretations. Do they like it? Does it make geological sense? Have seismic artefacts been inadvertently included?
• To show the petrophysicist (well-log specialist) the 3D property model based on the well-log data supplied in 1D. Has the 3D property modelling been appropriate or have features been introduced which are inconsistent with the 1D knowledge along the wells, e.g. correlations and geological or petrophysical trends?
• To show the reservoir engineer the geo-model grid, which will be the basis for subsequent flow modelling. Is it usable? Does it conflict with prior perceptions of reservoir unit continuity?
• To show the well engineer what you are trying to target in 3D with the complex well path you have just planned. Does it look feasible?
• To show the asset team how a conceptual reservoir model sketched on a piece of paper actually transforms into a 3D volume.
• To show the manager or investment fund holder what the subsurface resource actually looks like. That oil and gas do not come from a 'hole in the ground' but from a heterogeneous pore-system requiring significant technical skills to access, and a different set of skills when we come to consider storing or permanently disposing of fluids in the subsurface.

Getting a strong shared understanding of the subsurface concept tends to generate useful discussions on commercial risks and uncertainties, and looking at models or data in 3D often facilitates this process. The value of visualisation alone is the improved understanding it gives.

If this is a prime purpose then the model need not be complex – it depends on the audience. In many cases, the model is effectively a 3D visual database and many of the steps described in the subsequent chapters in this book are not required to achieve the desired understanding.

1.3 Models for Volumes

Knowing how much oil and gas is in place or how much fluid can be stored or disposed of is usually one of the first goals of reservoir modelling. This may be done using a simple map-based approach, but the industry has now migrated to 3D software packages, which is appropriate given that volumetrics are intrinsically a 3D property. The tradition of calculating volumes from 2D maps was a necessary simplification, no longer required.

3D mapping to support volumetrics should be quick, and is ideal for quickly screening uncertainties for their impact on volumetrics, as in the case shown in Fig. 1.2, where the volumetric sensitivity to fluid contact uncertainties is being tested, as part of a quick asset evaluation.

Fig. 1.2 Two models for different fluid contact scenarios built specifically for volumetrics
Models designed for this purpose can be relatively coarse, containing only the outline fault pattern required to define discrete blocks and the gross layering in which the volumes will be reported. The reservoir properties involved (e.g. porosity and net-to-gross) are statistically additive (see Chap. 3 for further discussion) which means cell sizes can be large. There is no requirement to create permeability models and, if this is for quick screening only, it may be sufficient to run 3D volumes for gross rock volume only, combining the remaining reservoir properties on spreadsheets.

Models designed for volumetrics should be coarse and fast.
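To make the screening idea concrete, here is a minimal sketch (ours, not taken from the text) of the standard volumetric calculation that sits behind such coarse models: gross rock volume multiplied by net-to-gross, porosity and hydrocarbon saturation, divided by the oil formation volume factor. All inputs, including the two fluid-contact scenarios, are hypothetical values chosen only for illustration.

```python
# Minimal volumetric screening sketch (hypothetical inputs, illustration only).
# STOIIP = GRV * NTG * porosity * (1 - Sw) / Bo, reported in millions of barrels.

M3_TO_BBL = 6.2898  # barrels per cubic metre

def stoiip_mmbbl(grv_m3, ntg, phi, sw, bo):
    """Stock-tank oil initially in place, in millions of barrels."""
    return grv_m3 * ntg * phi * (1.0 - sw) / bo * M3_TO_BBL / 1e6

# Two fluid-contact scenarios for a quick screening exercise (cf. Fig. 1.2):
# a deeper contact gives a larger gross rock volume.
scenarios = {"shallow contact": 450e6, "deep contact": 620e6}  # GRV in m3

for name, grv in scenarios.items():
    volume = stoiip_mmbbl(grv, ntg=0.7, phi=0.22, sw=0.25, bo=1.3)
    print(f"{name}: {volume:.0f} MMbbl in place")
```

Only the gross rock volume really needs the 3D model here; the remaining terms could just as well be carried on a spreadsheet, which is the point made above.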
1.4 Models as a Front End to Simulation

The majority of reservoir models, however, are built for input to flow simulators. To be successful, such models have to capture the essential permeability heterogeneity which will impact on reservoir performance. If the static models fail to capture this, the subsequent simulation forecasts may be useless.

The requirement for capturing connected permeability usually means finer scale modelling is required because permeability is a multiplicative rather than an additive property. Unlike models for volumetrics, the scope for simple averaging of detailed heterogeneity is limited. Issues of grid geometry and cell shape are also more pressing for flow models (Fig. 1.3); strategies for dealing with this are discussed in Chap. 4.

At this point it is sufficient to simply appreciate that taking a static geological model through to simulation automatically requires additional design, with a focus on permeability architecture.
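The multiplicative point can be shown with a short numerical sketch (ours, not the authors'): for a hypothetical stack of equal-thickness layers, porosity averages arithmetically, whereas the effective permeability depends on flow direction, roughly the arithmetic average along the layers and the harmonic average across them, so a single thin, tight layer can control vertical flow however carefully a coarse cell value is averaged. The layer values below are invented.

```python
# Why simple averaging of permeability is limited (hypothetical layer values).
import math

k = [500.0, 200.0, 0.1, 350.0]   # layer permeabilities in mD, equal thickness
phi = [0.25, 0.21, 0.05, 0.23]   # layer porosities

arithmetic = sum(k) / len(k)                    # controls flow along the layers
harmonic = len(k) / sum(1.0 / ki for ki in k)   # controls flow across the layers
geometric = math.exp(sum(math.log(ki) for ki in k) / len(k))

print(f"porosity, simple average:  {sum(phi) / len(phi):.3f}  (additive, so this is fine)")
print(f"permeability, arithmetic:  {arithmetic:.1f} mD")
print(f"permeability, geometric:   {geometric:.1f} mD")
print(f"permeability, harmonic:    {harmonic:.2f} mD  (the one tight layer dominates)")
```

For the same rock column the candidate 'averages' differ by nearly three orders of magnitude, which is why permeability upscaling gets its own treatment in Chap. 4.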

Fig. 1.3 Rock model (a) and property model (b) designed for reservoir simulation for development planning (c)

1.5 Models for Well Planning

If the purpose of the modelling exercise is to assist well planning and geosteering, the model may require no more than a top structure map, nearby well ties and seismic attribute maps. Wells may also be planned using simulation models, allowing for alternative well designs to be tested against likely productivity.

It is generally preferable to design well paths in reservoir models which capture all factors likely to impact a fairly costly investment decision. Most geoscience software packages have good well design functionality allowing for accurate well-path definition in a high resolution static model. Figure 1.4 shows an example model for a proposed horizontal well, the trajectory of which has been optimised to access oil volumes (HCIIP) by careful geo-steering with reference to expected stratigraphic and structural surfaces.

Some thought is required around the determinism-probability issue referred to in the prologue and explored further in Chap. 2 because, although there are many possible statistical simulations of a reservoir, there will only be one final well path. It is therefore only reasonable to target the wells at the more deterministic features in the model – features that are placed in 3D by the modeller and determined by the underlying reservoir concept. These typically include fault blocks, key stratigraphic rock units, and high porosity features which are well determined, such as channel belts or seismic amplitude 'sweet spots.'
It is wrong to target wells at highly stochastic model features, such as a simulated random channel, stochastic porosity highs or small-scale probabilistic bodies (Fig. 1.5).

The dictum is that wells should only target highly probable features; this means well prognoses (and geosteering plans) can only be confidently conducted on models designed to be largely deterministic. If we try and drill a probabilistic rock body, we will probably be disappointed.

Having designed the well path it can be useful to monitor the actual well path (real-time updates) by incrementally reading in the well deviation file to follow the progress of the 'actual' well vs. the 'planned' well, including uncertainty ranges. Using visualisation, it is easier to understand surprises as they occur, particularly during geosteering (e.g. Fig. 1.4).

Fig. 1.4 Example planned well trajectory with an expected fault, base reservoir surface and well path targets

Fig. 1.5 Modelling for horizontal well planning based on deterministic data (a) vs. a model with significant stochastic elements (b)

1.6 Models for Seismic Modelling

Over the last few decades, geophysical imaging has led to great improvements in reservoir characterisation – better seismic imaging allows us to 'see' progressively more of the subsurface.
However, an image based on sonic wave reflections requires translation into rock and fluid properties. Geological reservoir models are therefore important as a priori input to Quantitative Interpretation (QI) seismic studies.

This may be as simple as providing the layering framework for routine seismic inversion, or as complex as using Bayesian probabilistic rock and fluid prediction to merge seismic and well data. The nature of the required input model varies according to the QI process being followed – this needs to be discussed with the geophysicist.

Fig. 1.6 Reservoir modelling in support of seismic interpretation: (a) rock model; (b) forecast of acoustic impedance change between seismic surveys; (c) 4D seismic difference cube to which the reservoir simulation was matched (Bentley and Hartung 2001). (Redrawn from Bentley and Hartung 2001, © EAGE, reproduced with kind permission of EAGE Publications B.V., The Netherlands)
In the example shown here (Fig. 1.6), a reservoir model (top) has been passed through to the simulation stage to predict the acoustic impedance change to be expected on a 4D seismic survey (middle). The actual time-lapse (4D) image from seismic (bottom) is then compared to the synthetic acoustic impedance change, and the simulation is history-matched to achieve a fit.

If input to geophysical analysis is the key issue, the focus of the model design shifts to the properties relevant to geophysical modelling, notably models of velocity and density changes. There is, in this case, no need to pursue the intricacies of high resolution permeability architecture, and simpler (coarser) model designs may therefore be appropriate.
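As a toy illustration of the Bayesian merging of seismic and well data mentioned above (our sketch, not the workflow used in the case study), the snippet below updates assumed prior facies proportions with an assumed likelihood of observing a low-impedance ('soft') seismic response in each facies. All probabilities are hypothetical.

```python
# Toy Bayesian update of facies probability from a seismic attribute
# (all probabilities are hypothetical, for illustration only).

prior = {"channel sand": 0.3, "heterolithics": 0.5, "shale": 0.2}

# Assumed likelihood of a low-impedance ("soft") response given each facies,
# e.g. calibrated at the wells with a rock-physics model.
likelihood_soft = {"channel sand": 0.8, "heterolithics": 0.4, "shale": 0.1}

# Bayes' rule: P(facies | soft) = P(soft | facies) * P(facies) / P(soft)
evidence = sum(likelihood_soft[f] * prior[f] for f in prior)
posterior = {f: likelihood_soft[f] * prior[f] / evidence for f in prior}

for facies, p in posterior.items():
    print(f"P({facies} | soft response) = {p:.2f}")
```

In a real QI study the likelihoods come from rock-physics calibration at the wells, which is exactly why the required input model needs to be agreed with the geophysicist.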
1.7 Models for IOR/EOR

Efforts to extract maximum possible volumes from oil and gas reservoirs usually fall under the banner of Improved Oil Recovery (IOR) or Enhanced Oil Recovery (EOR). IOR tends to include all options, including novel well design solutions, use of time-lapse seismic and secondary or tertiary flooding methods (water-based or gas-based injection strategies), while EOR generally implies tertiary flooding methods, i.e. something more advanced than primary depletion or secondary waterflood. CO2 flooding and Water Alternating Gas (WAG) injection schemes are typical EOR methods. We will use IOR to encompass all the options.

We started by arguing that there is little value in 'fit-for-all purposes' detailed full-field models. However, IOR schemes generally require very detailed models to give very accurate answers, such as "exactly how much more oil will I recover if I start a gas injection scheme?" This requires detail, but not necessarily at a full-field scale. Many IOR solutions are best solved using detailed sector or near-well models, with relatively simple and coarse full-field grids to handle the reservoir management.

Figure 1.7 shows an example IOR model (Brandsæter et al. 2001). Gas injection was simulated in a high-resolution sector model with fine-layering (metre-thick cells) and various fault scenarios for a gas condensate field with difficult fluid phase behaviour. The insights from this IOR sector model were then used to constrain the coarse-grid full-field reservoir management model.

Fig. 1.7 Gas injection patterns (white) in a thin-bedded tidal reservoir (coloured section) modelled using a multi-scale method and incorporating the effects of faults in the reservoir simulation model

1.8 Models for Storage

The growing interest in CO2 storage as a means of controlling greenhouse gas emissions brings a new challenge for reservoir modelling. Here there is a need for both initial scoping models (for capacity assessment) and for more detailed models to understand injection strategies and to assess long-term storage integrity.
Some of the issues are similar – find the good permeability zones, identify important flow barriers and pressure compartments – but other issues are rather different, such as understanding formation response to elevated pressures and geochemical reactions due to CO2 dissolved in brine. CO2 is also normally compressed into the liquid or dense phase to be stored at depths of c.1–3 km, so that understanding fluid behaviour is also an important factor. CO2 storage generally requires the assessment of quite large aquifer/reservoir volumes and the caprock system – presenting significant challenges for grid resolution and the level of detail required.
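A back-of-the-envelope check (ours, with illustrative gradients) shows why this depth range matters: under a hydrostatic pressure gradient of roughly 10 MPa/km and a geothermal gradient of roughly 30 °C/km, conditions below about 800 m exceed the critical point of CO2 (7.38 MPa, 31.1 °C), so the injected CO2 sits in a dense, supercritical state.

```python
# Rough check of why CO2 is stored in the dense phase at c.1-3 km depth
# (illustrative gradients only; real values vary from basin to basin).

CO2_CRITICAL_P_MPA = 7.38   # critical pressure of CO2
CO2_CRITICAL_T_C = 31.1     # critical temperature of CO2

def conditions_at_depth(depth_m, surface_t_c=15.0, geotherm_c_per_km=30.0,
                        hydrostatic_mpa_per_km=10.0):
    """Approximate pressure and temperature at a given depth."""
    p = hydrostatic_mpa_per_km * depth_m / 1000.0
    t = surface_t_c + geotherm_c_per_km * depth_m / 1000.0
    return p, t

for depth in (800, 1000, 2000, 3000):
    p, t = conditions_at_depth(depth)
    dense = p > CO2_CRITICAL_P_MPA and t > CO2_CRITICAL_T_C
    print(f"{depth} m: ~{p:.0f} MPa, ~{t:.0f} C -> "
          f"{'supercritical (dense) CO2' if dense else 'gaseous or liquid CO2'}")
```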
An example geological model for CO2 storage is shown in Fig. 1.8 from the In Salah CO2 injection project in Algeria (Ringrose et al. 2011). Here CO2, removed from several CO2-rich gas fields, has been stored in the down-flank aquifer of a producing gas field. Injection wells were placed on the basis of a seismic porosity inversion, and analysis of seismic and well data was used to monitor the injection performance and verify the integrity of the storage site. Geological models at a range of scales were required, from near-wellbore models of flow behaviour to large-scale models of the geomechanical response.

Fig. 1.8 Models for CO2 storage: Faulted top structure map with seismic-based porosity model and positions of injection wells. (Figure labels: CO2 injection wells, downflank aquifer; crestal gas production wells)

1.9 The Fit-for-Purpose Model

Given the variety of models described above, we argue that it is best to abandon the notion of a single, all-knowing, all-purpose, full-field model, and replace this with the idea of flexible, faster models based on thoughtful model design, tailored to answer specific questions at hand.

Such models have a short shelf life and are built with specific ends in mind, i.e. there is a clear model purpose. The design of these models is informed by that purpose, as the contrast between the models illustrated in this chapter has shown.

With a fit-for-purpose mind set, the long-term handover items between geoscientists are not a set of 3D property models, but the underlying building blocks from which those models were created, notably the reservoir database (which should remain updated and 'clean') and the reservoir concept, which should be clear and explicit, to the point that it can be sketched.
It is also often practical to hand-over some aspects of the model build, such as a fault model if the software in use allows this to be updated easily, or workflows and macros, if these can be understood and edited readily. The pre-existing model outputs (property models, rock models, volume summaries, etc.) are best archived.

The rest of this book develops this theme in more detail – how to achieve a design which addresses the model purpose whilst representing the essential features of the geological architecture (Fig. 1.9).

When setting about a reservoir modelling project, an overall workflow is required and this should be decided up-front before significant modelling effort is expended. There is no 'correct' workflow, because the actual steps to be taken are an output of the fit-for-purpose design. However, it may be useful to refer to a general workflow (Fig. 1.10) which represents the main steps outlined in this book (see also Cannon 2018 for a description of the standard workflow).

Fig. 1.9 Geological architecture. (Image of geomodel built in SBED Studio™ merged with photograph of Petter Dass Museum (Refer Chap. 1))

Fig. 1.10 Generic reservoir modelling workflow. Steps shown in the figure: decide the model purpose; establish conceptual geological models; build rock models; build property models; assign flow properties and functions; upscale flow properties and functions; make forecasts; assess and handle uncertainties; make an economic or engineering decision. Re-iterate: 1. Maintain subsurface database; 2. Preserve model build decision track; 3. Discard or archive the model results; 4. Address the next question.
References

Bentley MR (2015) Modelling for Comfort. Pet Geosci 22:3–10
Bentley MR, Hartung M (2001) A 4D surprise at Gannet B. Presented at 63rd EAGE conference & exhibition, Amsterdam (extended abstract)
Brandsæter I, Ringrose PS, Townsend CT, Omdal S (2001) Integrated modeling of geological heterogeneity and fluid displacement: Smørbukk gas-condensate field, Offshore Mid-Norway. Paper SPE 66391 presented at the SPE reservoir simulation symposium held in Houston, Texas, 11–14 February 2001
Cannon S (2018) Reservoir modelling – a practical guide. John Wiley & Sons, pp. 312
Ringrose P, Roberts DM, Raikes S, Gibson-Poole C, Iding M, Østmo S, Taylor M, Bond C, Wightman R, Morris J (2011) Characterisation of the Krechba CO2 storage site: critical elements controlling injection performance. Energy Procedia 4:4672–4679
2 The Rock Model

Abstract

If you can sketch it, you can model it

Many static model practitioners embark on exercises of 'geological modelling', attempting to create a digital version of the geology as seen at outcrop. In practice this is futile. Even relatively high resolution models such as that for the Urgonian Shuaiba analogue shown below are clearly unlike the outcrop, and close inspection of the outcrop would reveal a huge amount of missing detail. Fortunately all the detail is rarely, if ever, required and we are saved the daunting task of capturing it all. We are in the business of building model representations: cellular proxies which capture the essence of the underlying complexity, extracting only the detail which is necessary to address commercial questions such as forecasts of resource volumes or fluid flow rates over time. We are building reservoir models rather than geological models and our focus is therefore on achieving a reasonable representation.

Outcrop view and model representation of the Urgonian Limestone at the Gorges de la Nesque, Provence (hot colours = high permeability)
Keywords
Reservoir elements · Reservoir architecture · Correlation · Hierarchy · Concept sketching · Geostatistics · Algorithms · Variograms · Multi-point statistics · Seismic conditioning · Fluid flow · Darcy

2.1 The Rock Model

Most of the outputs from reservoir modelling are quantitative and derive from property models, so the main purpose of a rock model is to get the properties in the right place – to guide the spatial property distribution in 3D. In an applied geoscience and reservoir engineering world, rock modelling is rarely an end in itself. For certain model designs, the rock model component is minimal, for others it is essential.

In a generic reservoir modelling workflow, the construction of a rock or 'facies' model usually precedes the property modelling (Fig. 1.10). Effort is focussed on capturing contrasting rock types identified from sedimentology and representing these in 3D. This is often seen as the most 'geological' part of the model build along with the fault modelling, and it is generally assumed that a 'good' final model is one which is founded on a thoughtfully-constructed rock model. However, given that the rock model is rarely a model deliverable in itself, some reservoirs do not require rock models.

Figure 2.1 shows a porosity model which has been built with and without a rock model. If the upper porosity model is deemed a reasonable representation of the reservoir, then a rock model is not required. If, however, the porosity distribution is believed to be significantly influenced by the rock contrasts shown in the middle image, then the lower porosity model is the one to go for. Rock modelling is therefore a means to an end rather than an end in itself, an optional step which is useful if it helps to build an improved property model.

Fig. 2.1 To model rocks, or not to model rocks? Upper image: porosity model built directly from porosity logs; middle image: a rock model capturing reservoir heterogeneity; lower image: the porosity model rebuilt, conditioned to the rock model

The details of rock model input are software-specific and are not covered here. Typically, the model requires specification of variables such as sand body sizes, facies proportions and reference to directional data such as dip-logs. These are part of a standard model build and need consideration, but are not viewed here as critical to the higher level issue of model design. Moreover, many of these variables cannot be specified precisely enough to determine the modelling outcome: rock body databases are only there as a guide, albeit a very useful one.
Most critical to the design are the issues identified below, mishandling of which is a common source of a poor model build:

• Reservoir concept – is the architecture understood in a way which readily translates into a reservoir model?
• Model elements – from the range of observed structural components and sedimentological facies types, has the correct selection of elements been made on which to base the model?
• Model build – is the conceptual model carried through intuitively into the statistical component of the build?
• Determinism and probability – is the balance of determinism and probability in the model understood, and is the conceptual model firmly represented in the deterministic model components?

These four questions are used in this chapter to structure the discussion on the rock model, followed by a summary of more specific rock model build choices.

2.2 Model Concept

The best hope of building robust and sensible models is to use conceptual models to guide the model design. We favour this in place of purely data-driven modelling because of the issue of under-sampling (see later). The geologist should have a mental picture of the reservoir and use modelling tools to convert this into a quantitative geocellular representation. Using system defaults, or treating the package as a black box that somehow adds value or knowledge to the model, will always result in models that make little or no geological sense, and which usually have poor predictive capacity.

The form of the reservoir concept is not complex. It may be an image from a good outcrop analogue or, better, a conceptual sketch. The sketch may be an aerial view of depositional elements or a 3D block view of the same, incorporating the preserved form of the reservoir (Fig. 2.2). Satellite images are extremely appealing visually, as are interpreted areal sketches, but suffer from being present-day snap-shots in 2D rather than images of the preserved reservoir. 3D blocks are much better, but many find them hard to construct.

A good starting point is often a simple 2D conceptual sketch in section, which can be built with guidance from log and core data. Figure 2.3 is an example built after a day in the core store interpreting heterogeneities in a fluvial reservoir. Elements have been chosen which reflect depositional facies and diagenetic alteration and the sketch illustrates how these elements connect architecturally. The elements are petrophysically distinct and the image was built up in the core store based on a geological-petrophysical conversation.

The ability to draw a conceptual sketch section is highly informative and brings clarity to the mental image of the reservoir held by the modeller. If this conceptual sketch is not clear, the process of model building is unlikely to make it any clearer. If there is no clear up-front conceptual model then the model output is effectively a random draw, hence:

If you can sketch it, you can model it

An early question to address is "what are the fundamental building blocks for the reservoir concept?" These have been referred to here as the 'model elements' and discussed further below. For the moment, the key thing to appreciate is that:

model elements ≠ facies types

With the idea established of a reservoir concept as an architectural sketch constructed from model elements, we will look at the issues surrounding the build of the model framework then return to consider how to select elements to place within that framework in Sect. 2.4.

2.3 The Model Framework

The structural framework for all reservoir models is defined by a combination of structural inputs (faults and surfaces from seismic to impart gross
14 2 The Rock Model

Fig. 2.2 Capturing the reservoir concept for a tidal reser- from the Hugli delta in Bangladesh, or a generic block
voir outcrop in the Lajas Formation in Patagonia (top diagram sketch. (Block diagram adapted from McIlroy
image) in terms of an analogue depositional environment et al. 2005)

Fig. 2.3 Capturing the reservoir concept in a simple sketch showing the architectural arrangement of distinct reservoir
elements observed in core from a Triassic fluvial reservoir
2.3 The Model Framework 15

geometry) and stratigraphic inputs (to define is whether the fault model reflects the seismic
internal layering). interpretation only or whether it has been adjusted
The main point we wish to consider here is with reference to a conceptual structural interpre-
what are the structural and stratigraphic issues tation. We recommend the latter, as a direct
that a modeller should be aware of when thinking expression of a seismic interpretation will tend
through a model design? These are discussed to be a conservative representation of the fault
below. architecture – it is limited to the resolution of
the data.
Facets of an incomplete structural interpreta-
tion (Fig. 2.4) are:
2.3.1 Structural Data
• Fault networks tend to be incomplete,
Building a fault model tends to be one of the more e.g. faults may be missing in areas of poor
time-consuming and manual steps in a modelling seismic quality.
workflow, and is therefore commonly done with • Faults may not be joined (under-linked
each new generation of seismic interpretation. In networks) due to lack of sufficient seismic
the absence of new seismic, a fault model may be resolution in areas of fault intersections.
passed on between users and adopted simply to • Horizon interpretations may stop short of
avoid the inefficiency of repeating the manual faults due to poorer seismic resolution around
fault-building. the fault zone.
Such an inherited fault framework therefore • Horizon interpretations may be extended down
requires quality control. The principal question fault planes or unconformities, i.e. the fault or

Fig. 2.4 Incomplete structural interpretations, inappropri- surface geometries around faults, presumably interpreted
ate for modelling. Highlighted areas pick out well-surface or gridded without reference to the fault planes
misties, disconnections in the fault network and unlikely
16 2 The Rock Model

unconformity is not identified independently inputs are smoothed, not only within-surface but
from each horizon, or not identified at all. more importantly around faults, the latter tending
• Faults may be interpreted on seismic noise to have systematically reduced fault
(artefacts). displacements.
A more rigorous structural model workflow is
Although models made from such ‘raw’ seis-
to:
mic interpretations are honest reflections of that
data, the structural representations are incomplete 1. Determine the structural concept – are faults
and a structural interpretation should be overlain expected to die out laterally or to link? Are en
on the seismic outputs as part of the model echelon faults separated by relay ramps? Are
design. To achieve this, the workflow similar to there small, possibly sub-seismic connecting
that shown in Fig. 2.5 is recommended. faults? Make a sketch of the desired fault
Rather than start with a gridded framework architecture to clarify.
constructed directly from seismic interpretation, 2. Input the fault sticks and grid them as fault
the structural build should start with the raw, planes (Fig. 2.5a).
depth-converted seismic picks and the fault 3. Link faults into a network consistent with the
sticks. This is preferable to starting with horizon concept (1, above, also Fig. 2.5b).
grids, as these will have been gridded without 4. Import depth-converted horizon picks as
access to the final 3D fault network. Working points and remove spurious points, e.g. those
with pre-gridded surfaces means the starting erroneously picked along fault planes rather

Fig. 2.5 A structural build based on fault sticks from using a 3D velocity model. The key feature of this
seismic (a), converted into a linked fault system (b), workflow is the avoidance of intermediate surface
integrated with depth-converted horizon picks (c) to gridding steps which are made independently of the final
yield a conceptually acceptable structural framework interpreted fault network. Example from the Douglas
which honours all inputs (d). The workflow can equally Field, East Irish Sea. (Bentley and Elliott 2008)
well be followed using time data, then converted to depth
2.3 The Model Framework 17

than stratigraphic surfaces (Fig. 2.5c) to avoid correlation. For static models which are to go
the structural errors shown in Fig. 2.4. forward to dynamic modelling, the best correla-
5. Edit the fault network to ensure optimal posi- tion lines for modelling are those which most
tioning relative to the raw picks; this may be an closely reflect paths of fluid-flow during produc-
iterative process with the geophysicist, partic- tion or injection.
ularly if potentially spurious picks are The choice of correlation surfaces hugely
identified. influences the resulting model architecture, as
6. Grid surfaces against the edited fault network illustrated in Fig. 2.6. The example is for a
(Fig. 2.5d). shoreface system, broadly cleaning upwards. A
7. Iterate steps 2–6 to capture alternative poten- ‘geological’ solution would identify time-lines
tial fault networks to capture structural and correlate genetic parasequences (Fig. 2.6a).
uncertainties. In this case, for the water injection scenario
illustrated, the flow in the highly permeable
upper part of the sequence will simply follow
the high permeability pathway which may or
2.3.2 Stratigraphic Data
may not cross timelines. In the lower part of the
sequence, however, the low kv/kh ratio associated
There are two main considerations in the selection
with lower and middle shorefaces will direct flow
of stratigraphic inputs to the geological frame-
lines from the horizontal, will result in poorer
work model: correlation and hierarchy.
macroscopic sweep and will predict unswept
intervals in the left-hand well. A lithostratigraphic
2.3.2.1 Correlation interpretation (Fig. 2.6b) will predict an uneven
In the subsurface, correlation usually begins with floodfront, but with a different production profile
markers picked from well data – well picks. and a prediction that all zones will ultimately be
Important information also comes from correla- swept. The correct model representation is the
tion surfaces picked from seismic data. Numerous one which captures how the fluid will ‘see’ the
correlation picks may have been defined in the reservoir, and this depends on the permeability
interpretation of well data and these picks may architecture, and an idea for how sensitive the
have their origins in lithological, biostrati- reservoir fluids are to heterogeneity (guidance
graphical or chronostratigraphical correlations – on this is given in Sect. 2.4.4).
all of these being influenced directly or indirectly An excellent example of the same issue for
by sequence stratigraphy (see for example Van from a lacustrine reservoir system is given by
Wagoner et al. 1990; Van Wagoner and Bertram Ainsworth et al. (1999).
1995). If multiple stratigraphic correlations are
available these may produce surfaces which spa- 2.3.2.2 Use of Hierarchy
tially cross, such as timelines crossing lithostra- Different correlation schemes have different
tigraphic correlations. Not all these surfaces are influences on the issue of hierarchy, as the stra-
needed in reservoir modelling. A choice is there- tigraphy of most reservoir systems is inherently
fore required and as with the structural framework hierarchical (Campbell 1967) and for reservoir
the selection of surfaces should be made with modelling this impacts on the representation of
reference to the conceptual sketch, which is in length scales, an issue which is central to rock and
turn driven by the model purpose. property geostatistics. For a sequence strati-
For models built for exploration purposes or graphic correlation scheme, a low-stand systems
for in-place resource calculations only, correla- tract may have a length-scale of tens of kilometres
tion lines which track the distribution of reservoir and contain stacked parasequences with a length-
properties are ideal, as the geostatistical simula- scale of kilometres (e.g. Fig. 2.6a). These
tion of properties will follow the 3D grid stratig- parasequences in turn act as the container for
raphy, which is in turn influenced by the zone individual reservoir elements with dimensions of
18 2 The Rock Model

Fig. 2.6 Alternative (a) chronostratigraphic and (b) lines directly influences flow lines in a dynamic model
lithostratigraphic correlations of the same sand simulation. Note: the chronostratigraphic correlation
observations in three wells; the choice of correlation invokes an additional hierarchical level in the stratigraphy

tens to hundreds of metres. Lower levels of the Particular care is required if a sequence strati-
hierarchy are always associated with shorter graphic approach has been used to correlate. The
length scales. method yields a framework of timelines, often
The reservoir model should aim to capture the based on picking the most shaley parts of
levels in the stratigraphic hierarchy which influ- non-reservoir intervals so a rock model for an
ence the spatial distribution of significant interval between two flooding surfaces should
heterogeneities (determining ‘significance’ will contain a shaley portion at both the top and the
be discussed in Sect. 2.4.4). Bounding surfaces base of the interval. The probabilistic aspects of
within the hierarchy (e.g. flooding surfaces in a the subsequent geostatistical modelling, however,
marginal marine interval) may or may not act as can easily degrade the correlatable nature of the
flow barriers – so they may represent important flooding surfaces, with inter-well shales becom-
model elements in themselves or they may merely ing scattered incorrectly throughout the zone.
control the distribution of model elements within This particular case can be resolved by applying
that hierarchy. This applies to structural model deterministic trends; this will be pursued further
elements as well as the more familiar sedimento- in Sect. 2.7.4.
logical elements, as features such as fracture den- In general, hierarchical order is intrinsic to
sity can be controlled by mechanical reservoirs and some degree of hierarchy is usually
stratigraphy – implicitly related to the strati- captured in standard software workflows, but not
graphic hierarchy. necessarily all. The modeller is required to work
2.4 Model Elements 19

out if the default hierarchy is sufficient to capture of the two lithofacies elements. The two rock
the required concept. If not, the workflow should models were then combined using a logical prop-
be modified, most commonly by applying logical erty model operation, which imposed the texture
operations. An example of this is illustrated in of the fine-scale lithofacies heterogeneity, but
Fig. 2.7, from a reservoir model in which the only within the relevant channels. Effectively
first two hierarchical levels were captured by the this created a third hierarchical level within the
default software workflow: tying layering to seis- model.
mic horizons (first level) then infilled by One way or another hierarchy can be
conformant sub-seismic stratigraphy (second represented, but only rarely by using default
level, Fig. 2.7a). An additional hierarchical level model workflows.
was required because an important permeability
heterogeneity existed between lithofacies types
within a particular model element (the main
2.4 Model Elements
channels). The chosen solution was to build the
channel model using channel objects and creating
Having established a structural/stratigraphic
a separate, in this case probabilistic, model which
model framework, we can now return to the
contained the information about the distribution
model concept and consider how to fill the

Fig. 2.7 The addition of


hierarchy by logical
combination: (a)
stratigraphical framework;
(b) single-hierarchy
channel model (left,
blue ¼ mudstone,
yellow ¼ main channel)
built in parallel with a
probabilistic model of
lithofacies types (right,
yellow ¼ better quality
reservoir sands); (c) logical
combination of lithofacies
detail in the main channel
only – an additional level of
hierarchy
20 2 The Rock Model

framework to create an optimal architectural (perhaps incorrectly) excluded from descriptions


representation. of ‘facies’.
The flow-based terms tend to be well-centric
views which, when converted into a 3D reservoir
model, generally lead to a layer-based representa-
2.4.1 Reservoir Models Not
tion. For truly ‘layer-cake’ reservoirs (sensu
Geological Models
Webber and van Guens 1990) this may be a
sufficient representation. Our aim is to define
The rich and detailed geological story that can be
building blocks for any 3D reservoir architecture,
extracted from days or weeks of analysis of the
including parts of a field remote from well and
rock record from the core store need not be
production data – a broader mission.
incorporated directly into the reservoir model,
Modelling elements are defined here as:
and this is a good thing. There is a natural ten-
dency to ‘include all the detail’ just in case some- three-dimensional rock bodies which are
petrophysically and/or geometrically distinct from
thing minor turns out to be important. Models
each other in the specific context of the reservoir
therefore have a tendency to be over-complex fluid system
from the outset, particularly for novice modellers.
The amount of detail required in the model can, to Elements are therefore similar to petrophysical
a large extent, be anticipated. ‘rock types’, but include a geometric distinction,
There is also a tendency for modellers to seize i.e. two rocks which are petrophysically similar
the opportunity to build ‘real 3D geological would be distinguished if they were contained
pictures’ of the subsurface and to therefore make within two differently-shaped rock bodies such
these as complex as the geology is believed to as sheets and channels. This is because bodies
be. This is a hopeless objective as the subsurface with different geometries stack together differ-
is considerably more complex in detail than we ently to produce different architectures. Also,
are capable of modelling explicitly and, thank- petrophysical rock types can often be combined
fully, much of that detail is irrelevant to economic to make one modelling element, depending on the
or engineering decisions. We are building reser- demands of the fluid system. The fluid-fill factor
voir models – reasonable representations of the is important as it highlights the fact that different
detailed geology – not geological models. levels of heterogeneity are important for different
types of fluid, e.g. gas reservoirs behave more
homogeneously than oil reservoirs for a given
reservoir type (see Sect. 2.4.4 below for handy
2.4.2 Building Blocks guidance on this).

Hence the view of the components of a reservoir


model as model elements – the fundamental
building blocks of the 3D architecture. The use 2.4.3 Model Element Types
of this term distinguishes model elements from
geological terms such as ‘facies’, ‘lithofacies’, If we can work beyond the traditional use of
‘facies associations’ and ‘genetic units’ and depositional facies to define rock bodies for
from production-related terms such as ‘hydraulic modelling, a broad spectrum of elements can be
units’ or ‘flow units’. considered for use, i.e. making the sketch of the
The geological terms are required to help inter- reservoir as it is intended to be modelled. Six
pret the richness of the geological story, but are candidate model element types are considered
not necessarily the things we need to put into below.
reservoir models – they usually carry more detail
than we need (see Sect. 2.4.4). Moreover, key 2.4.3.1 Lithofacies Types
elements of the reservoir model may be small- This is sedimentologically-driven and is the tradi-
scale structural or diagenetic features, often tional way of defining the components of a rock
2.4 Model Elements 21

model. Typical lithofacies elements may be related by a depositional process. These include
coarse sandstones, mudstones or grainstones, the rock bodies which typical modelling packages
and will generally be defined from core and or are most readily designed to incorporate, such as
log data (e.g. Fig. 2.8). channels, sheet sands or heterolithics. These usu-
ally comprise several lithofacies, for example, a
2.4.3.2 Genetic Elements fluvial channel might include conglomeratic,
In reservoir modelling, genetic elements are a cross-bedded sandstone and mudstone
component of a sedimentary sequence which are lithofacies. Figure 2.9 shows an example for a

Fig. 2.8 Example


lithofacies elements; left:
coarse, pebbly sandstone;
right: massively-bedded
coarse-grained sandstone

Fig. 2.9 Genetic Units: lithofacies types grouped into channel, upper shoreface (USF) and lower shoreface (LSF)
genetic depositional elements. (Image courtesy of Simon Smith)
22 2 The Rock Model

shoreface in which genetic elements are identified by structural rather than sedimentological or strat-
from core and log data. igraphic aspects. Fault damage zones are impor-
tant volumetric structural elements (e.g. Fig. 2.12)
2.4.3.3 Stratigraphic Elements as are mechanical layers (strata-bound fracture
For models based on a sequence stratigraphic sets) with properties driven by small-scale
framework, the fine-scale components of the jointing or cementation. These are discussed fur-
scheme may be the predominant model elements. ther in Sect. 6.7.
These may be parasequence sets organised within
a larger-scale sequence-based framework, 2.4.3.6 Exotic Elements
e.g. Fig. 2.10. The list of potential model elements matches the
diversity of reservoir types. Reservoirs in volcanic
2.4.3.4 Diagenetic Elements rocks are a good example (Fig. 2.13), in which the
Diagenetic elements commonly overprint key model elements may be zones of differential
lithofacies types, may cross major stratigraphic cooling and hence differential fracture density.
boundaries and are often the predominant feature
of carbonate reservoir models. Typical diagenetic

elements could be zones of meteoric flushing,
dolomitisation or de-dolomitisation (Fig. 2.11). The important point about using the term
‘model element’ is to stimulate broad thinking
2.4.3.5 Structural Elements about the model concept, a thought process
Assuming a definition of model elements as which runs across the reservoir geological
three-dimensional features, structural model sub-disciplines (stratigraphy, sedimentology,
elements should be considered when the structural geology, even volcanology). For avoid-
properties of a reservoir volume are dominated ance of doubt, the main difference between the

Fig. 2.10 Sequence stratigraphic elements


2.4 Model Elements 23

Fig. 2.11 Diagenetic elements in a carbonate build-up; where reservoir property contrasts are driven by differential
development of dolomitisation

Fig. 2.12 Structural Bounding


elements: volumes fault
dominated by minor
fracturing in a fault damage
zone next to a major block- Slip surfaces/
bounding fault. (Bentley Isolated small deformation
faults and bands in best
and Elliot 2008) deformation sands – some
bands open (blue)

A
e1
Zon
C
B e1
e1 Zon e2
A
Zon Zon

OWC
Minor faulting in
poorer sands
(open?)

Damage zone

model framework and the model elements is that 2.4.4 How Much Heterogeneity
2D features are used to define the model frame- to Include? ‘Flora’s Rule’
work (faults, unconformities, sequence
boundaries, simple bounding surfaces) whereas The answer to this fundamental question
it is 3D model elements which fill the volumes depends on the interaction of geology and flow
within that framework. physics. The key criteria for distinguishing which
Having defined the framework and identified model elements are required for the model build
the candidate elements, the next question is how are:
much detail to carry explicitly into the modelling
1. The identification of potential elements from a
process. Everything that can be identified need
large number of ‘candidates’;
not be modelled.
24 2 The Rock Model

• Oil reservoirs under depletion are sensitive to


2 or more orders of magnitude permeability
variation;
Rubble
• Heavy oil reservoirs or lighter crudes under
layer
secondary or tertiary recovery tend to be sen-
Basalt
layer sitive to 1 order of magnitude permeability
contrasts
Lava flow Direction
This simple rule of thumb, which has become
Rubble known to us as ‘Flora’s Rule’ (after an influential
and much-respected reservoir engineering col-
Basalt
Basalt league of one of the authors), has its foundation
Rubble in the viscosity term in the Darcy flow equation:
Basalt
Basalt
k
u¼ ∇ðPÞ ð2:1Þ
Fig. 2.13 Exotic elements: reservoir breakdown for a μ
bimodal-permeability gas-bearing volcanic reservoir in
which model elements are driven by cooling behaviour where:
in a set of stacked lava flows. (Image courtesy of Jenny
Garnham) u ¼ fluid velocity
k ¼ permeability
2. The permeability contrasts between the candi- μ ¼ fluid viscosity
date elements; ∇P ¼ pressure gradient
3. The fluid type;
4. The production and/or injection mechanism. Because the constant of proportionality
The four items above then need to be consid- between flow velocity and the pressure gradient
ered in terms of the length scales of the underly- is k/μ, low viscosity results in a weaker depen-
ing heterogeneities and the scale associated with dence of flow on the pressure gradient whereas
the original model purpose. higher viscosities give increasingly higher depen-
The first steps are illustrated in Fig. 2.14 in dence of flow on the pressure gradient. Combine
which six candidate elements have been identified this with a consideration of the mobility ratio in a
from core and log data (step 1), placed in an two-phase flow system, and the increased sensi-
analogue context (step 2) and their rock property tivity of secondary and tertiary recovery to per-
contrasts compared (step 3). The six candidate meability heterogeneity becomes clear.
elements seem to cluster into three, but is it right Using these criteria, some candidate elements
to lump these together? How great does a contrast which contrast geologically in core may appear
have to be to be ‘significant’? Here we can invoke rather similar to a fluid passing through them. The
some useful guidance. same heterogeneities that are shown to have an
A simple way of combining the factors above important effect in a CO2 disposal scheme (see
is to consider what level of permeability contrast Chap. 7) may have absolutely no effect in a
would generate significant flow heterogeneities hydrocarbon gas reservoir under depletion. The
for a given fluid type and production mechanism. importance of some ‘borderline’ heterogeneities
The handy rule of thumb is as follows (Fig. 2.15): may be unclear – and these could be included on a
‘just in case’ basis. Alternatively, a quick static/
dynamic sensitivity run may be enough to dem-
• Gas reservoirs are generally sensitive to 3 or onstrate that a specific candidate element can be
more orders of magnitude permeability dropped or included with confidence.
variation;
2.4 Model Elements 25

1000.0
Shoreface
trend Channel
trend
100.0
Permeability (mD)

Heterolithic
cluster
10.0

Channel fill
Gravel lag
Floodplain heterolithic
1.0
Middle shoreface sand
Lower shoreface heterolithics
Mudstone

0.1
0 5 10 15 20 25 30 35 40
Porosity (%)

Fig. 2.14 Six candidate model elements and one non-reservoir element identified from core and log data and clustered
into three on a k/phi cross plot – how do we decide if it is acceptable to lump these together?

Fig. 2.15 Critical order of Critical


magnitude permeability permeability
contrasts for a range of fluid contrast
and production
mechanisms – ‘Flora’s Fluid fill dry wet light heavy
Rule’ gas gas oil oil

Production depletion depletion water gas/steam


mechanism (no aquifer) (with aquifer) injection injection
26 2 The Rock Model

The outcome of this line of argument is that geological one. Geological and petrophysical
some reservoirs may not require complex 3D analyses are required to define the degree of per-
reservoir models at all (Fig. 2.16). Gas-charged meability variation and to determine the spatial
reservoirs require high degrees of permeability architecture, but it is the fluid type and the pro-
heterogeneity in order to justify a complex duction or injection process which determine the
modelling exercise – they often deplete as simple level of geological detail needed in the reservoir
tanks. Fault compartments and active aquifers model and hence the selection of ‘model
may stimulate heterogeneous flow production in elements’.
gas fields, but even in this case the model required Beyond the considerations above, there
to capture key fault blocks can be quite coarse. By remains the question of whether a significant het-
contrast, heavy oil fields under water or steam erogeneity should be modelled explicitly or cap-
injection or CO2 storage schemes are highly sus- tured implicitly in a well-selected average. This
ceptible to minor heterogeneities, and benefit comes down to length scales, specifically the
from detailed modelling. The difficulty here lies difference between the length scale of the signifi-
in assessing the architecture of these cant uncertainty and the length scale of the ques-
heterogeneities, which can often be on a very tion being addressed in the modelling exercise
fine, poorly-sampled scale. (Fig. 2.17).
The decision as to which candidate elements to Heterogeneity length scales are an established
include in a model is therefore not primarily a concept and will be discussed fully in Chap. 4 in

Fig. 2.16 What type of


reservoir model? A choice Three
based on heterogeneity and
orders 3D
model
Perm Heterogeneity

fluid type

Two
orders

Simple
One analytical
order model

Fluid fill dry wet light heavy


gas gas oil oil

Production depletion depletion water gas/steam


mechanism (no aquifer) (with aquifer) injection injection

Fig. 2.17 A comparison of


The length scale of the The length scale of the
length scales to assess the heterogeneity (H) development question (Q)
option of
implicit vs. explicit element
representation

vs.
2.4 Model Elements 27

the context of models scales and the REV (the The reason the comparison is important is that
Representative Elementary Volume). Here it the subsurface has a dispersive effect on fluid
suffices to say that the length scale is an assess- flow, which is also rate-dependent. This is
ment of the distance over which the heterogeneity illustrated in Fig. 2.18 which shows a simple
varies: systems with high point-to-point cross-sectional flow experiment involving water
variations over short distances are said to have injecting into an oil reservoir. The homogeneous
short length scales, variations over long distances flow model (a) shows simple piston-like displace-
are said to have long length scales. The concept of ment, distorted only by the effect of gravity. The
questions having scales may be less commonly heterogeneous flow model (b) clearly shows the
considered, but holds true for most subsurface sensitivity to the underlying permeability hetero-
decisions; every model has a purpose, the purpose geneity but the shape of the flood front is not
relates to a question which is being asked and that significantly impacted as the length of the hetero-
question generally has a scale. For infill drilling, geneity is small compared to the length scale of
the scale of the question is the proposed well the question, which is the well spacing (in this
spacing, typically a few hundred metres and typi- experiment this is the width of the model). The
cally shorter onshore than offshore. For gas model is literally ‘so heterogeneous, it is homo-
depletion the scale of the question is the size of geneous’. Only as the length scale of the hetero-
the reservoir compartment being blown down, geneity (H) approaches the length scale of the
which may be the full field. For CO2 storage development question (Q) does the floodfront
schemes it is the size of the storage complex. start to deviate from the simple homogeneous
For well-test interpretations the scale is much case (the lower two sets of figures).
smaller – typically the radius of investigation of In all cases in Fig. 2.18b–d the heterogeneity is
the well test. significant in terms of impacting flow. The

Fig. 2.18 The impact of contrasting heterogeneity length heterogeneous model with very short length scales
scales vs. a fixed length scale of the development question (H < < Q); (c) & (d) models in which the heterogeneity
(the well spacing): left-hand models are static permeabil- length scales approach that of the scale of the development
ity, right-hand models are paired water injection question (where H ~ Q). Images produced courtesy of
simulations, injecting from the left; ornament is water Rayyan Hussein and Ed Stephens
saturation. (a) homogeneous permeability model; (b)
28 2 The Rock Model

Observe Consider fluid,


heterogeneity sensitivity

No
Significant ? Ignore

Yes

Consider length scales


(LS)

H≈Q H << Q
Compare
heterogeneity LS Implicitly
Explicitly model (H) with question average
LS (Q)

Fig. 2.19 How much detail needs to be incorporated in a length scale of the contrasts, the reservoir fluid and pro-
static modelling exercise? Assuming the ultimate purpose duction mechanism and the scale of the question the model
is flow modelling, it depends on the magnitude and the is addressing

difference is that in Fig. 2.18b it can be captured geological heterogeneities in 3D reservoir


using a simple average, whereas in Fig. 2.18c–d it models.
needs to be explicitly modelled. The averaging However, the promise of geostatistics (and
options are discussed in Chap. 4, but for now the ‘knowledge-based systems’) to solve reservoir
question of how many elements need to be forecasting problems sometimes led to disap-
incorporated in a model can be summarised in pointment. Probabilistic attempts to predict desir-
the flow chart in Fig. 2.19. It relates to the magni- able outcomes, such as the presence of a sand
tude of the heterogeneity compared to the fluid fill body, yield naïve results if applied
and the production mechanism (‘Flora’s Rule’) simply (Fig. 2.20).
and a consideration of the length scales of the This potential for disappointment is unfortu-
heterogeneity and the development question at nate as the available geostatistical library of tools
hand. is excellent for applying quantitative statistical
If the fluid doesn’t sense the heterogeneity, the algorithms rigorously and routinely, and is essen-
heterogeneity need not be modelled. tial for filling the inter-well volume in a 3D reser-
voir model. The disappointments come from
incorrect application, nicely summarised by
2.5 Determinism and Probability Pyrcz and Deutch (2014):
Geostatistical methods aim to reproduce (the) input
The use of geostatistics in reservoir modelling statistics; geostatistical models have no predictive
became widely fashionable in the early 1990s power with respect to these statistics
(e.g. Haldorsen and Damsleth 1990; Journel and At the heart of this is an appreciation of the
Albert 1990) and was generally received as a interplay of determinism and probability in the
welcome answer to some tricky questions such reservoir model build.
as how to handle uncertainty and how to represent
2.5 Determinism and Probability 29

Fig. 2.20 A naïve example of an expectation from geostatistical forecasting – the final mapped result simply illustrates
where the wells are

2.5.1 Balance Between Determinism ‘guessing’) is one whose behaviour is completely


and Probability non-deterministic. A probabilistic method is one
in which likelihood or probability theory is
The underlying design issue we stress is the bal- employed. Monte Carlo methods, referred to
ance between determinism and probability in a especially in relation to uncertainty handling, are
model, and whether the modeller is aware of, and a class of algorithms that rely on repeated random
in control of, this balance. sampling to compute a probabilistic result.
To define the terminology as used here: Although not strictly the same, the terms proba-
bilistic and stochastic are often treated synony-
• Determinism is taken to mean an aspect of a
mously and in this book we will restrict the
model which is fixed by the user and imposed
discussion to the contrast between deterministic
on the model, such as placing a fault in the
and probabilistic approaches applied in reservoir
model or precisely fixing the location of a
modelling.
particular rock body;
Put simply: probability involves dice, but in
• Probability refers to aspects of the model
determinism there are no dice. The subtlety is that
which are guided by a random (stochastic)
the balance of deterministic and probabilistic
outcome from running a probabilistic
influences on a reservoir model is not as black
algorithm.
and white as it may at first seem. Consider the
simple range of cases shown in Fig. 2.21,
To complete the terminology, a stochastic pro- showing three generic types of rock body:
cess (from the Greek stochas for ‘aiming’ or
30 2 The Rock Model

Fig. 2.21 Different rock body types as an illustration of the deterministic/probabilistic spectrum

1. Correlatable bodies (Fig. 2.21, left). These are Deterministic constraints will have been
largely determined by correlation choices placed on the algorithm to make sure bodies
between wells, e.g. sand observations are are not unrealistically large or small, and are
made in two wells and interpreted as appropriately numerous.
occurrences of the same extensive sand unit
So, if everything is a mixture of determinism
and are correlated. This is a deterministic
and probability, what’s the problem? The issue is
choice, not an outcome of a probabilistic algo-
that although any reservoir model is rightfully a
rithm. The resulting body is not a 100% deter-
blend of deterministic and probabilistic processes,
mined ‘fact’, however, as the interpretation of
the richness of the blend is a choice of the user so
continuity between the wells is just that – an
this is an issue of model design. Some models are
interpretation. At a distance from the wells, the
highly deterministic, some are highly probabilis-
sand body has a probabilistic component in its
tic and which end of the spectrum a model sits at
interpretation, if not its algorithmic
influences the uses to which it can be put. A
emplacement.
single, highly probabilistic model is not suitable
2. Non-correlated bodies (Fig. 2.21, centre).
for planning one well (probabilistic rock bodies
These are bodies encountered in one well
will probably not be encountered as prognosed).
only. At the well, their presence is determined.
A highly deterministic model may be inappropri-
At increasing distances from the well, the loca-
ate, however, for simulations of reservoirs with
tion of the sand body is progressively less well
small rock bodies and little well data. Further-
determined, and is eventually controlled
more, different modellers might approach the
almost solely by the roll of the dice. These
same reservoir with more deterministic or more
bodies are each partly deterministic and partly
probabilistic mindsets.
probabilistic.
The balance of probability and determinism in
3. Probabilistic bodies (Fig. 2.21, right). These
a model is therefore a subtle issue, and needs to be
are the bodies not encountered by wells, the
understood and controlled as part of the model
position of which will be chosen by a probabi-
design. We suggest here that greater happiness is
listic algorithm. Even these, however, are not
generally to be found in models built with strong
100% probabilistic as their appearance in the
deterministic control, as the deterministic inputs
model is not a completely random surprise.
are the direct carrier of the reservoir concept.
2.5 Determinism and Probability 31

2.5.2 Different Generic Approaches data set, which can yield clear statistical guidance
for a model build. In reservoir modelling we are
To emphasise the importance of user choice in the typically dealing with more sparse data, an excep-
approach to determinism and probability, two tion being direct conditioning of the reservoir
approaches to model design are summarised model by high quality 3D seismic data
graphically. The first is a data-driven approach (e.g. Doyen 2007).
to modelling (Fig. 2.22). In this case, the model An alternative is to take a more concept-
process starts with an analysis of the data, from driven, top-down approach (Fig. 2.23). In this
which statistical guidelines can be drawn. These case, the modelling still starts with an analysis
guidelines are input to a rich statistical model of of the data, but the analysis is used to generate
the reservoir which in turn informs a geostatistical alternative conceptual models for the reservoir.
algorithm. The outcome of the algorithm is a The reservoir concept should honour the data at
model, from which a forecast emerges. its geographic location but, as the dataset is gen-
This is the approach which most closely erally statistically insufficient, the full 3D model
resembles the default path in reservoir modelling, should not be limited to it. The model build is
resulting from the linear workflow of a standard strongly concept-driven, has a strong determin-
reservoir modelling software package, as outlined istic component, and less reliance is placed on
in Fig. 1.10. geostatistical algorithms to produce the desired
The limit of a simple data-driven approach product. The final outcome is not a single fore-
such as this is that there is a reliance on the rich cast, but a set of forecasts based on the
geostatistical algorithm to generate the desired uncertainties associated with the underlying res-
model outcome. This in turn relies on the statisti- ervoir concepts.
cal content of the underlying data set, yet for most The difference between the data- and concept-
of our reservoirs, the underlying data set is sta- driven approaches described above is the expec-
tistically insufficient. This is a critical issue and tation they place on the geostatistical algorithm
distinguishes reservoir modelling from other in the context of data insufficiency. The result
types of geostatistical modelling in earth sciences is a greater emphasis on deterministic model
such as mining and soil science. In the latter aspects, which therefore need some more
cases, there is often a much richer underlying consideration.

Generate concepts (work beyond the data)

Forecast Identify uncertainties

Model build Generate models


using geostats

Rich statistical algorithms


Forecasts

Work up data

Fig. 2.22 The data-driven approach to reservoir Fig. 2.23 The concept-driven approach to reservoir
modelling modelling
32 2 The Rock Model

2.5.3 Forms of Deterministic Control dimensions and statistical success criteria (Sect.
2.6). In the context of the concept-driven logic
The deterministic controls on a model can be seen described above, these limits need to be determin-
as a toolbox of options with which to realise an istically chosen rather than a simple consequence
architectural concept in a reservoir model. These of the spread of a statistically insufficient well
will be discussed further in the last section of this data set.
chapter, and are introduced below.
2.5.3.5 Seismic Conditioning
2.5.3.1 Faulting The great hope for detailed deterministic control
With the exception of some (relatively) specialist is exceptionally good seismic data. This hope is
structural modelling packages, large scale struc- often forlorn, as even good quality seismic data is
tural features are strongly deterministic in a reser- not generally resolved at the level of detail
voir model. Thought is required as to whether the required for a reservoir model. All is not lost,
structural framework is to be geophysically or however, and it is useful to distinguish between
geologically led, that is, are only features resolv- hard and soft conditioning.
able on seismic to be included, or will features be Hard conditioning is applicable in cases where
included which are kinematically likely to occur extremely high quality seismic, sufficiently
in terms of rock deformation. This in itself is a resolved at the scale of interest, can be used to
model design choice, introduced in the discussion directly define the architecture in a reservoir
on model frameworks (Sect. 2.3) and the choice model. An example of this is seismic geobodies
will be imposed deterministically. in cases where the geobodies are believed to
directly represent important model elements.
Some good examples of this have emerged from
2.5.3.2 Correlation and Layering
deepwater clastic environments, but in many of
The correlation framework (Sect. 2.3) is deter-
these cases detailed investigation (or more dril-
ministic, as is any imposed hierarchy. The proba-
ling) ends up showing that reservoir pay extends
bilistic algorithms work entirely within this
sub-seismically, or that the geobody is itself a
framework – layer boundaries are not moved in
composite, heterogeneous feature.
common software packages. Ultimately the
Soft conditioning is the more generally useful
flowlines in any simulation will be influenced by
approach for rock modelling, where information
the fine layering scheme and this is all set
from seismic is used to give a general guide to the
deterministically.
probabilistic algorithms (Fig. 2.24). In this case,
the link between the input from seismic and the
2.5.3.3 Choice of Algorithm probabilistic algorithm may be as simple as a
There are no hard rules as to which geostatistical correlation coefficient. It is the level of the coeffi-
algorithm gives the ‘correct’ result yet the choice cient which is now the deterministic control; and
of pixel-based or object-based modelling the decision to use seismic as either a hard or soft
approaches will have a profound effect on the conditioning tool is also a deterministic one.
model outcome (Sect. 2.7). The best solution is One way of viewing the role of seismic in
the algorithm or combination of algorithms which reservoir modelling is to adapt the frequency/
most closely reflect the desired reservoir concept, amplitude plot familiar from geophysics
and this is a deterministic choice. (Fig. 2.25). These plots are used to show the
frequency content of a seismic data set and can
2.5.3.4 Boundary Conditions be used to indicate how improved seismic acqui-
for Probabilistic Algorithms sition and processing can extend the frequency
All algorithms work within limits, which will be content towards the ends of the spectrum. Fine-
given by arbitrary default values unless imposed. scale reservoir detail sits below the resolution of
These limits include correlation lengths, object the seismic data. The low end of the frequency
2.5 Determinism and Probability 33

spectrum – the large scale layering – is also frequency ‘earth model’ to support seismic inver-
beyond the range of the seismic sample unless sion work.
broadband technologies are in use (e.g. Soubaras The plot is a convenient backdrop for
and Dowle 2010). Even in the case of broadband arranging the components of a reservoir model,
seismic, there is a requirement to construct a low and the frequency/amplitude axes can be alterna-
tively labelled as ‘scale of heterogeneity’ and
‘content’. The reservoir itself exists on all scales
and is represented by the full ‘rectangle of truth’,
which is only partially covered by seismic data.
The missing areas are completed by the frame-
work model at the low frequency end, by core and
log-scale detail at the high frequency end and by
interpretation in between these data scales.
The only full-frequency data set is a well
exposed outcropping reservoir analogue, as it is
only in the field that the reservoir can be accessed
on all scales. Well facilitated excursions to out-
crop analogues are thereby conveniently justified
(an often neglected activity, the value of which is
summarised in Howell et al. 2014).
Is all the detail necessary? Here we can refer
back to Flora’s Rule and the model purpose,
which will inform us how much of the full spec-
Fig. 2.24 Deterministic model control in the form of trum is required to be modelled in any particular
seismic ‘soft’ conditioning of a rock model. Upper case. In terms of seismic conditioning, it is only in
image: AI volume rendered into cells. Lower image: Best the case where the portion required for modelling
reservoir properties (red, yellow) preferentially guided by
high AI values (Image courtesy of Simon Smith)
exactly matches the blue area in Fig. 2.25 that we

Fig. 2.25 Seismic framework improved log & core-scale


conditioning in the context model seismic detail
of the ‘rectangle of truth’ –
the full frequency spectrum
only observed at the amplitude
outcropping reservoir interpretation
analogues

Subsurface concept

reservoir seismic
model
content
frequency
scale of heterogeneity

Outcrop analogue
34 2 The Rock Model

can confidently apply hard conditioning using confounded by complex geostatistical terminol-
geobodies in the reservoir model, and this happy ogy which is difficult to translate into the
coincidence is rarely the case. modelling process. Take for example this quota-
tion from the excellent but fairly theoretical treat-

ment of geostatistics by Isaacs and Srivastava
With the above considered, there can be some (1989):
logic as to the way in which deterministic control in an ideal theoretical world the sill is either the
is applied to a model, and establishing this is part stationary infinite variance of the random function
or the dispersion variance of data volumes within
of the model design process. The probabilistic the volume of the study area.
aspects of the model should be clear, to the
point where the modeller can state whether the The problem for many of us is that we don’t
design is strongly deterministic or strongly prob- work in an ideal theoretical world and are unfa-
abilistic and identify where the deterministic and miliar with the concepts and terminology that are
probabilistic components sit. used in statistical theory. This section therefore
Both components are implicitly required in aims to extract just those statistical concepts
any model and it is argued here that the road to which are essential for an intuitive understanding
happiness lies with strong deterministic control. of what happens in the statistical engines of res-
The outcome from the probabilistic components ervoir modelling packages.
of the model should be largely predictable, and
should be a clear reflection of the input data
combined with the deterministic constraints 2.6.1 Key Geostatistical Concepts
imposed on the algorithms.
Disappointment occurs if the modeller expects A central concept in geostatistics which must be
the probabilistic aspects of the software to take on understood is that of variance. Variance, σ2, is a
the role of model determination, and fill in the measure of the average difference between indi-
knowledge and interpretational gaps in their vidual values and the mean of the dataset they
own mind. come from. It is a measure of the spread of the
So how to approach geostatistics? The first dataset:
step is to ensure we have a firm understanding
of the underlying concepts behind the
algorithms – just the essentials. σ2 ¼ Σðxi  μÞ2 =N ð2:2Þ

where:

2.6 Essential Geostatistics xi ¼ a value for the variable in question,


N ¼ the number of values in the data set, and
Good introductions to the use of statistics in res- μ ¼ the mean of that data set
ervoir modelling can be found in Yarus and
Chambers (1994), Holden et al. (1998), Dubrule Variance-related concepts underlie much of
and Damsleth (2001), Caers (2011) and Pyrcz and reservoir modelling. Two such occurrences are
Deutsch (2014) and these will not be superseded summarised below: the use of correlation
here. In the spirit of a practitioners guide, our coefficients and the variogram.
interest lies in the application of the techniques – The correlation coefficient measures the
seeing the wood for the trees and finding the key strength of the dependency between two
levers and points of qualitative assumption – the parameters by comparing how far pairs of values
points in the workflow where the ‘intuitive leaps’ (x, y) deviate from a straight line function, and is
happen. Very often the reservoir modeller is given by the function:
2.6 Essential Geostatistics 35

Pn¼N  
1=N ðxi  μx Þ yi  μy variogram function, which is most simply
ρ¼ n¼1
ð2:3Þ expressed as:
σx σy

where:  2
2γ ¼ ð1=NÞΣ zi  zj ð2:4Þ
N ¼ number of points in the data set
xi, yi ¼ values of point in the two data sets where:
μx, μy ¼ mean values of the two data sets, and
zi and zj are pairs of points in the dataset
σx, σy ¼ standard deviations of the two data sets
(the square of the variance)
For convenience we generally use the
semivariogram function:
If the outcome of the above function is positive
then higher values of x tend to occur with higher
values of y, and the data sets are said to be  2
γ ¼ ð1=2NÞΣ zi  zj ð2:5Þ
‘positively correlated’. If the outcome is ρ ¼ 1
then the relationship between x and y is a simple The semivariogram function can be calculated
straight line. A negative outcome means high for all pairs of points in a data set, whether or not
values of one data set correlate with low values they are regularly spaced, and can therefore be used
of the other: ‘negative correlation’. A zero result to describe the relationship between data points
indicates no correlation. from, for example, irregularly scattered wells.
Note that correlation coefficients assume the The results of variogram calculations can be
data sets are both linear. For example, two data represented graphically as a scatterplot
sets which have a log-linear relationship might (Fig. 2.26a) to establish the relationship between
have a very strong correlation but still display a the separation distance (known as the ‘lag’) and
poor correlation coefficient. Of course, a coeffi- the average γ value for pairs of points which are
cient can still be calculated if the log-normal data that distance apart. The data is grouped into dis-
set, e.g. permeability, is first converted to a linear tance bins to do the averaging; hence only one
form by taking the logarithm of the data. value appears for any given lag in Fig. 2.26b.
Correlation between datasets (e.g. porosity A more formal definition of semi-variance,
versus permeability) is typically entered into res- incorporating the separation distance term, is
ervoir modelling packages as a value between given by:
0 and 1, in which values of 0.7 or higher generally n o
1
indicate a strong relationship. The value is com- γ ðhÞ ¼ E ½Z ðx þ hÞ  Z ðxÞ2 ð2:6Þ
monly described as the ‘dependency’. 2
Correlation coefficients reflect the variation of where:
values within a dataset, but say nothing about
how those values vary spatially. For reservoir E ¼ the expectation (or mean)
modelling we need to express the spatial variation Z(x) ¼ the value at a point in space
of parameters, and the central concept controlling Z(x + h) ¼ the value at a separation distance,
this is the variogram. h (the lag)
The variogram captures the relationship
between the different values between pairs of Generally, γ increases as a function of separa-
data points, and the distance separating those tion distance. Where there is some relationship
two points. Numerically, this is expressed as the between the values in a spatial dataset, γ shows
averaged squared differences between the pairs of smaller values for points which are closer
data in the data set, given by the empirical together in space, and therefore more likely to
Fig. 2.26 The variogram: a systematic change in variance between data points with increasing distance between those points; left: the raw γ values plotted versus the lag; right: the data binned, averaged and a line-fit applied.

As the separation distance increases the difference between the paired samples tends to increase. The variogram thus provides a quantified measure of spatial heterogeneity – 'spatial correlation'.

The trend line through the points on a semivariogram plot yields the semivariogram model (Fig. 2.27) and it is this model which may be used as input to geostatistical packages during parameter modelling.

A semivariogram model has three defining features:
• the sill, which is a constant γ value that may be approached for widely-spaced pairs and approximates the variance;
• the range, which is the separation distance at which the sill is reached, and
• the nugget, which is the extrapolated γ value at zero separation.

Fig. 2.27 A semivariogram model for the data in Fig. 2.26 with annotated components (nugget, range and sill).

Now recall the definition of the sill, from Isaaks and Srivastava (1989), quoted at the start of this section. In simple terms, the sill is the point at which the semivariogram function is equal to the variance, and the key measure for reservoir modelling is the range – the distance at which pairs of data points no longer bear any relationship to each other. A large range means that data points remain correlated over a large area, i.e. they are more homogeneously spread; a small range means the parameters are highly variable over short distances, i.e. they are heterogeneous on a smaller scale. The presence of a nugget means that although the dataset displays correlation, quite sudden variations between neighbouring points can occur, such as when gold miners come across a nugget, hence the name. The nugget is also related to the sample scale – an indication that there is variation at a scale smaller than the scale of the measurement.

The impact on reservoir models of adjusting the range in a variogram function is illustrated simply in Fig. 2.28. The geostatistical simulation technique itself will be discussed in Sect. 2.7 but
for the moment the connection to make is the link between the variogram range and heterogeneity length scales. Short ranges correspond to short length scale heterogeneities, long ranges to long length scale heterogeneities.

Fig. 2.28 Contrasting reservoir models built from a common data set, but modelled with different ranges: short range to the left (short length scale heterogeneity); long range to the right (long length scale heterogeneity).

There are several standard functions which can be given to semivariogram models, and which appear as options on reservoir modelling software packages. Six types are illustrated in Fig. 2.29. Three are options present in most modelling packages (spherical, exponential and gaussian); the spherical model is probably the most widely used, and is appropriate for most purposes.

The power law semivariogram describes data sets which continue to get more dissimilar with distance. A simple example would be depth points on a tilted surface or a vertical variogram through a data set with a porosity/depth trend. The power law semivariogram has no sill.

It should also be appreciated that, in general, sedimentary rock systems often display a 'hole effect' when data is analysed vertically (Fig. 2.29, lower right). This is a feature of any rock system that shows cyclicity (Jensen et al. 1995), where the γ value decreases as the repeating bedform is encountered. In practice this is generally not required for the vertical definition of layers in a reservoir model, as the layers are usually created deterministically from log data, or introduced using vertical trends (Sect. 2.7).

A final important feature of variograms is that they can vary with direction. The spatial variation represented by the variogram model can be orientated on any geographic axis, N-S, E-W, etc. This has an important application to modelling in sedimentary rocks, where a trend can be estimated based on the depositional environment. For example, reservoir properties may be more strongly correlated along a channel direction, or along the strike of a shoreface. This directional control on spatial correlation leads to anisotropic variograms. Anisotropy is imposed on the reservoir model by indicating the direction of preferred continuity and the strength of the contrast between the maximum and minimum continuity directions, usually represented as an oriented ellipse.

Anisotropic correlation can also occur in the vertical plane, controlled by features such as sedimentary bedding. In most reservoir systems vertical length scales are shorter than horizontal length scales because sedimentary systems tend to be layered. It is generally much easier to calculate vertical variograms directly from subsurface data, because continuous data come from logs in sub-vertical wells. Vertical length scales in layered systems are short, and vertical variograms therefore tend to have short ranges, often less than that set by default in software packages.
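The three most commonly offered model shapes can be written compactly as functions of lag, nugget, sill and range. The sketch below is illustrative only; the parameter values are arbitrary, and the exponential and Gaussian forms are written so that they reach roughly 95% of the sill at the quoted 'practical' range.

```python
# Illustrative sketch: standard variogram model shapes (cf. Fig. 2.29),
# parameterised by nugget, sill and range. Values are arbitrary.
import numpy as np

def spherical(h, rng, sill=1.0, nugget=0.0):
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h < rng, g, sill)          # flat at the sill beyond the range

def exponential(h, rng, sill=1.0, nugget=0.0):
    # approaches the sill asymptotically; ~95% of the sill at the practical range
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * np.asarray(h) / rng))

def gaussian(h, rng, sill=1.0, nugget=0.0):
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (np.asarray(h) / rng) ** 2))

h = np.linspace(0.0, 2000.0, 101)
models = {name: f(h, rng=800.0, sill=1.0, nugget=0.1)
          for name, f in [("spherical", spherical),
                          ("exponential", exponential),
                          ("gaussian", gaussian)]}
```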
Fig. 2.29 Standard semivariogram models (nugget, spherical, exponential, Gaussian, power law and hole models), with γ normalised to 1. (Redrawn from Deutsch 2002, © Oxford University Press, by permission of Oxford University Press, USA (www.oup.com))

Horizontal variograms are likely to have much longer ranges, and may not reach the sill at the scale of the reservoir model. This is illustrated conceptually in Fig. 2.30, based on work by Deutsch (2002).

As vertical semivariograms can be easier to estimate, one approach for geostatistical analysis is to measure the vertical correlation (from well data) and then estimate the likely horizontal semivariogram using a vertical/horizontal anisotropy ratio based on a general knowledge of sedimentary systems. Considerable care should be taken if this is attempted, particularly to ensure that the vertical semivariograms are sampled within distinct (deterministic) zones. Deutsch has estimated ranges of typical anisotropy ratios by sedimentary environment (Table 2.1) and these offer a general guideline.

The figures in Table 2.1 summarise important and useful contrasts but also reflect the residual uncertainty in predicting the range of a horizontal semivariogram model, which is the key parameter required in any model build using variograms. Attempts can be made to derive horizontal semivariograms directly from any data set, but the dataset is only a sample, most likely an imperfect one. This is a particular issue for well data, especially sparse offshore well data, which is generally statistically insufficient for the model purpose. For many datasets, the horizontal variogram is difficult to estimate, and the modeller is therefore often required to choose a variogram model 'believed' to be representative of the system being modelled. This is the intuitive leap and it is an intuitive application of geostatistics that we advocate.
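As a minimal worked example of this approach, a measured vertical range can be scaled by an assumed anisotropy ratio to give a first-pass horizontal range; the numbers below are invented.

```python
# Illustrative only: scaling a measured vertical range by an assumed
# vertical/horizontal anisotropy ratio. Values are invented.
vertical_range_m = 2.0        # e.g. estimated from well-log variograms within one zone
anisotropy_ratio = 75.0       # a value within the estuarine band of Table 2.1 (50:1-150:1)
horizontal_range_m = vertical_range_m * anisotropy_ratio   # -> 150 m
```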
Fig. 2.30 Horizontal-vertical anisotropy ratio in semivariograms. (Redrawn from Deutsch 2002, © Oxford University Press, by permission of Oxford University Press, USA (www.oup.com))

Table 2.1 Typical ranges in variogram anisotropy ratios (from Deutsch 2002)

Element – Anisotropy ratio
Point bars – 10:1–20:1
Braided fluvial – 20:1–100:1
Aeolian – 30:1–120:1
Estuarine – 50:1–150:1
Deepwater – 80:1–200:1
Deltaic – 100:1–200:1
Platform carbonates – 200:1–1000:1

2.6.2 Intuitive Geostatistics

In the discussion of key geostatistical principles above we have tried to make the link between geostatistics and the underlying reservoir concepts which should drive modelling. Although this link is difficult to define precisely, an intuitive link can always be made between the variogram and the reservoir architectural concept.

In the discussion below we develop that link using a satellite image adopted as a conceptual analogue for a potential reservoir system. The image is of a wide fluvial channel complex opening out into a tidally-influenced delta. Assuming the analogue is appropriate, we extract the guidance required for the model design by estimating the variogram range and anisotropy from this image. We assume the image intensity is an indicator for sand, and extract this quantitatively from the image by pixelating the image, converting to a greyscale and treating the greyscale as a proxy for 'reservoir'. This process is illustrated in Figs. 2.31, 2.32, 2.33, 2.34, 2.35 and 2.36.

This example shows how the semivariogram emerges from quite variable line-to-line transects over the analogue image to give a picture of average variance. The overall result suggests pixel ranges of 22 pixels in an E-W direction (Fig. 2.35) and 32 in a N-S direction (Fig. 2.36), reflecting the N-S orientation of the sand system and a 32:22 (1.45:1) horizontal anisotropy ratio.

This example is not intended to suggest that quantitative measures should be derived from satellite images and applied simply to reservoir modelling: there are issues of depositional vs. preserved architecture to consider, and for a sand system such as that illustrated above the system would most likely be broken down into elements which would not necessarily be spatially modelled using variograms alone (see next section).

Fig. 2.31 Image of a present-day sand system – an analogue for lower coastal plain fluvial systems and tidally-influenced deltas. The Brahmaputra Delta (NASA shuttle image).

The example is designed to guide our thinking towards an intuitive connection between the
variogram (geostatistical variance) and reservoir heterogeneity (our concept of the variation). In particular, the example highlights the role of averaging in the construction of variograms. Individual transects over the image vary widely, and there are many parts of the sand system which are not well represented by the final averaged variogram. The variogram is in a sense quite crude and the application of variograms to either rock or property modelling assumes it is reasonable to convert actual spatial variation to a representative average and then apply this average over a wide area. Using sparse well data as a starting point this is a big assumption, and its validity depends on the architectural concept we have for the reservoir. The concept is not a statistical measure; hence the need to make an intuitive connection between the reservoir concept and the geostatistical tools we use to generate reservoir heterogeneity.

Fig. 2.32 Figure 2.31 converted to greyscale (left), and pixelated (right).

Fig. 2.33 Semivariograms for pixel pairs on selected E-W transects (pixel rows 13, 23, 33 and 43).
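A sketch of the transect analysis behind Figs. 2.32, 2.33, 2.34, 2.35 and 2.36 is given below. It assumes a pixelated greyscale array is already in hand and substitutes a synthetic image, so it illustrates the method rather than reproducing the published result.

```python
# Illustrative sketch: directional semivariograms from E-W and N-S transects of a
# pixelated greyscale image. The image here is synthetic; in practice it would be
# a rasterised analogue such as Fig. 2.32.
import numpy as np

def transect_semivariogram(img, axis, max_lag):
    """Average gamma(h) over all rows (axis=1, E-W) or columns (axis=0, N-S)."""
    gammas = []
    for h in range(1, max_lag + 1):
        if axis == 1:   # E-W: compare pixels h columns apart
            diff = img[:, h:] - img[:, :-h]
        else:           # N-S: compare pixels h rows apart
            diff = img[h:, :] - img[:-h, :]
        gammas.append(0.5 * np.mean(diff.astype(float) ** 2))
    return np.arange(1, max_lag + 1), np.array(gammas)

# synthetic stand-in for a greyscale image with N-S elongated features
rng = np.random.default_rng(1)
base = rng.standard_normal((100, 100))
img = sum(np.roll(base, s, axis=0) for s in range(-5, 6)) / 11.0   # smooth N-S only

lag_ew, gamma_ew = transect_semivariogram(img, axis=1, max_lag=50)
lag_ns, gamma_ns = transect_semivariogram(img, axis=0, max_lag=50)
# the lag at which each curve flattens gives the directional range; the ratio of
# the two ranges is the horizontal anisotropy (cf. the 32:22 result in the text)
```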
Fig. 2.34 Semivariograms for pixel pairs on selected N-S transects (pixel columns 8, 28 and 54).

Fig. 2.35 Semivariogram based on all E-W transects (range – 22 pixels).

The intuitive leap in geostatistical reservoir modelling is therefore to repeat this exercise for an analogue of the reservoir being modelled and use the resulting variogram to guide the geostatistical model, assuming the application of an average variogram model is valid. The basic steps are as follows:
1. Select (or imagine) an outcrop analogue.
2. Choose the rock model elements which appropriately characterise the reservoir.
3. Sketch their spatial distribution (the architectural concept sketch) guided by a suitable analogue dataset.
4. Estimate appropriate variogram ranges for individual elements (with different variogram ranges for the horizontal and vertical directions).
5. Estimate the anisotropy in the horizontal plane.
6. Input these estimates directly to a variogram-based algorithm if pixel-based techniques are selected (see next section).
7. Carry through the same logic for modelling reservoir properties, if variogram-based algorithms are chosen.

Fig. 2.36 Semivariogram based on all N-S transects (range – 32 pixels).

The approach above offers an intuitive route to the selection of the key input parameters for a geostatistical rock model. The approach is concept-based and deterministically steers the probabilistic algorithm which will populate the 3D grid.

There are some generalities to bear in mind:

• There should be greater variance across the grain of a sedimentary system than along it (represented by the shorter E-W range for the example above).
• Highly heterogeneous systems, e.g. glacial clastic systems, should have short ranges and are relatively isotropic in (x, y).
• Shoreface systems generally have long ranges, at least for their reservoir properties, and the maximum ranges will tend to be along the strike of the system.
• In high net fluvial systems, local coarse-grained components (if justifiably extracted as model elements) may have very short ranges, often only a nugget effect.
• In carbonate systems, it needs to be clear whether the heterogeneity is driven by diagenetic or depositional elements, or a blend of both; single-step variography described above may not be sufficient to capture this (see Chap. 6 for suggested solutions).

Often these generalities may not be apparent from a statistical analysis of the well data, but they make intuitive sense. The outcome of an 'intuitive' variogram model should of course be sense-checked for consistency against the well data – any significant discrepancy should prompt a re-evaluation of either the concept or the approach to the handling of the data, e.g. the choice of model elements. However, this intuitive approach to geostatistical reservoir modelling is recommended in preference to simple conditioning of the variogram model to the well data – which is nearly always statistically unrepresentative.
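The sketch below illustrates the kind of sense-check referred to above, comparing a concept-based vertical range with the experimental vertical variogram of an upscaled well log. The element, log values and ranges are all invented.

```python
# Illustrative sketch: sense-checking an intuitively chosen vertical range against
# the experimental vertical semivariogram of an upscaled well log. Values invented.
import numpy as np

def vertical_semivariogram(log, dz, max_lag):
    """Experimental gamma(h) for a regularly sampled (upscaled) well log."""
    lags, gammas = [], []
    for k in range(1, max_lag + 1):
        diff = log[k:] - log[:-k]
        lags.append(k * dz)
        gammas.append(0.5 * np.nanmean(diff ** 2))
    return np.array(lags), np.array(gammas)

# concept-based estimates for one model element (hypothetical numbers)
concept = {"vertical_range_m": 2.0, "horizontal_range_m": 150.0,
           "azimuth_deg": 20.0, "aniso_xy": 3.0}

# synthetic porosity log, 0.25 m cells, with roughly 2 m scale layering
rng = np.random.default_rng(2)
noise = rng.standard_normal(400)
log = 0.18 + 0.02 * np.convolve(noise, np.ones(8) / 8.0, mode="same")

lag_m, gamma = vertical_semivariogram(log, dz=0.25, max_lag=40)
# compare the lag at which gamma flattens with concept["vertical_range_m"]; a large
# mismatch should prompt a re-think of the concept or of the element definition
```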
Exercise 2.1 Estimate Variograms from an Outcrop Image

The image below shows an example photo of bedding structures from an outcrop section of a fluvial sedimentary sequence. Redox reactions (related to paleo-groundwater flows) give a strong visible contrast between high porosity (white) and low porosity (red) pore types. Small scours with lag deposits and soft-sediment deformation features are also present. Sketch estimated semivariogram functions for the horizontal and vertical directions assuming that colour (grey-scale) indicates rock quality. The hammer head is 10 cm across. Use the grey-scale image and pixelated grey-scale images to guide you.

Grey-scale image is 22.5 × 22.5 cm; pixelated grey-scale image is 55 by 55 pixels.

2.7 Algorithm Choice and Control

The preceding sections presented the basis for the design of the rock modelling process:
1. Form geological concepts and decide whether rock modelling is required;
2. Select the model elements;
3. Set the balance between determinism and probability;
4. Intuitively set parameters to guide the geostatistical modelling process, consistent with the architectural concepts.

The next step is to select an algorithm and decide what controls are required to move beyond the default settings that all software packages offer. Algorithms can be broadly grouped into four:

• Object modelling, placing bodies with discrete shapes into 3D space for which another model element, or group of elements, has been defined as the background.
• Pixel-based methods, using indicator variograms to create the model architecture by assigning the model element type on a cell-by-cell basis. The indicator variogram is simply a variogram that has been adapted for discrete rather than continuous variables. There are several variants of pixel modelling including Sequential Indicator Simulation (SIS), indicator kriging and various facies trend or facies belt methods which attempt to capture gradational lateral facies changes.
• Texture-based methods, using training images to recreate the desired architecture. Although this has been experimented with since the early days of reservoir modelling, it has only recently 'come of age' through the development of practical multi-point statistical (MPS) algorithms (Strebelle 2002).
• Process-based methods, in which an architecture is built up stratigraphically, often using 2D grids to define the upper and lower surfaces of a rock body or layer.

The pros and cons of these algorithms, including some common pitfalls, are explored below.

2.7.1 Object Modelling

Object modelling uses various adaptations of the 'marked point process' (Holden et al. 1998). A position in the 3D volume, the marked point, is selected at random and a point defined within a modelled 3D body (ellipse, half moon, channel etc.) is attached to that point in space. The main inputs for object modelling are an upscaled element log, a shape template and a set of geometrical parameter distributions such as width, orientation and body thickness, derived from outcrop data (e.g. Fig. 2.37).

Fig. 2.37 An early example of outcrop-derived data used to define geometries in object models. (Redrawn from Fielding and Crane 1987, © SEPM Society for Sedimentary Geology [1987], reproduced with permission)
The algorithm works by implanting objects with dimensions selected from the prescribed input distributions and then rejecting objects which do not satisfy the well constraints (in statistics, the 'prior model'). For example, a channel object which is placed in 3D but intersects a well without a channel observed at that location is rejected. This process continues iteratively until a target proportion of each element is achieved, constrained by the expected total volume fraction of the object, e.g. 30% channels. Objects that do not intersect the wells are also simulated if needed to achieve the specified element proportions. However, spatial trends of element abundance or changing body thickness are not automatically honoured because most algorithms assume stationarity (no interwell trends). Erosional, or intersection, rules are applied so that an object with highest preservation potential can replace previously simulated objects (Fig. 2.38).

Fig. 2.38 Cross section through the 'Moray Field' model, an outcrop-based model through Triassic fluvial clastics in NE Scotland (channel, sheetflood and heterolithic elements; vertical exaggeration ×50). Figures 2.38, 2.40, 2.41 and 2.42 follow the same section line through models which are conditioned to the same well data, with the same choice of elements, differing only in the selection of rock modelling algorithm.

There are issues of concern with object modelling which require user control and awareness of the algorithm limitations. Firstly, it is important to appreciate that the algorithm can generate bodies that cross multiple wells if intervals of the requisite element appear at the right depth intervals in the wells. That is, the algorithm can generate probabilistic correlations without user guidance – something that may or may not be desirable. Some algorithms allow the user to control multiple well intersections of the same object but this is not commonplace.

Secondly, the distribution of objects at the wells does not influence the distribution of inter-well objects because of the assumption of stationarity in the algorithm. Channel morphologies are particularly hard to control because trend maps only influence the location of the 'marked point' for the channel object and not the rest of the body, which generally extends throughout the model.

A key issue with object modelling, therefore, is that things can easily go awry in the inter-well area. Figure 2.39 shows an example of 'funnelling', in which the algorithm has found it difficult to position channel bodies without intersecting wells with no channel observations; the channels have therefore been preferentially funnelled into the inter-well area. Again, some intuitive geological sense is required to control and if necessary reject model outcomes. The issue illustrated in Fig. 2.39 can easily be exposed by making a net sand map of the interval and looking for bulls-eyes around the wells.

Thirdly, the element proportions of the final model do not necessarily give guidance as to the quality of the model. Many users compare the element ('facies') proportions of the model with those seen in the wells as a quantitative check on the result, but matching the well intersections is the main statistical objective of the algorithm so there is a circular logic to this type of QC.
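The accept/reject principle described at the start of this subsection can be illustrated with a toy map-view sketch. This is not any vendor's algorithm; the well observations, body dimensions and target fraction are all invented.

```python
# Illustrative sketch of the accept/reject logic behind object modelling.
# A 2D map-view toy: channel-belt rectangles are placed at random and rejected
# if they contradict a well with no channel observed.
import numpy as np

rng = np.random.default_rng(3)
nx, ny = 100, 100                       # model grid (map view)
grid = np.zeros((nx, ny), dtype=bool)   # True = channel element

# hypothetical well observations: (i, j, channel_present)
wells = [(20, 30, True), (50, 70, False), (80, 40, True)]

target_fraction = 0.30                  # e.g. 30% channel
while grid.mean() < target_fraction:
    # 'marked point' plus a body with dimensions drawn from input distributions
    ci, cj = rng.integers(0, nx), rng.integers(0, ny)
    half_w = int(rng.uniform(2, 6))      # across-belt half-width
    half_l = int(rng.uniform(15, 40))    # along-belt half-length
    candidate = grid.copy()
    candidate[max(ci - half_w, 0):ci + half_w,
              max(cj - half_l, 0):cj + half_l] = True
    # reject any object that places channel across a well with no channel
    if any(candidate[i, j] and not present for i, j, present in wells):
        continue
    grid = candidate

# note: wells logged as channel but still unmatched would need the extra
# conditioning steps real algorithms add; this only shows the rejection principle
```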
Fig. 2.39 'Funnelling' – over-concentration of model objects (channels, yellow) in between wells which have lower concentrations of those objects; a result of inconsistencies between well data, guidance of the model statistics and the model concept. Model shown in simbox view. (Image courtesy of Simon Smith)

The key thing to check is the degree of 'well match' and the spatial distributions and the total element proportions (together). Repeated mismatches or anomalous patterns point to inconsistencies between wells, geometries and element proportions.

Moreover, it is highly unlikely that the element proportions seen in the wells truly represent the distribution in the subsurface as the wells dramatically under-sample the reservoir. It is always useful to check the model element distributions against the well proportions, and the differences should be explained, but differences should be expected. The 'right' element proportion is the one which matches the underlying concept.

The following list of observations and tips provides a useful checklist for compiling body geometries in object modelling:

1. Do not rely on the default geometries.
2. Recall that thickness distributions have to be customised for the reservoir. Depending on the software package in use, the upscaled facies parameter can be based on the measured thickness from deviated wells – not necessarily the stratigraphic thicknesses.
3. Spend some time customising the datasets and collating your own data from analogues. There are a number of excellent data sources available to support this; they do not provide instant answers but do give good guidance on realistic preserved body geometries.
4. The obvious object shape to select for a given element is not always the best one to use. Channels are a good example of this, as the architecture of a channel belt is sometimes better constructed using ellipse- or crescent-shaped objects rather than channel objects per se, which are often more representative of the shape of the rivers that formed the channel belts, rather than the preserved belt geometries of nested barforms. Also, these body shapes are less extensive than the channel shapes, rarely go all the way through a model area and so reflect the trend map inputs more closely and are less prone to the 'bull's eye' effect.
5. There may be large differences between the geometry of a modern feature and that preserved in the rock record. The example above of channels is a case in point; carbonate reservoirs offer a more extreme example as any original depositional architecture may have been completely overprinted by subsequent diagenetic effects. Differential compaction effects (sands vs. muds and coals) may also change the vertical geometry of the original sediment body.
6. It is important to distinguish uncertainty from variability. Uncertainty about the most
appropriate analogue may result in a wide spread of geometrical options. It is incorrect, however, to combine different analogue datasets and so create spuriously large amounts of variation. It is better to make two scenarios using different data sets and then quantify the differences between them.
7. Get as much information as possible from the well and seismic data sets. Do well correlations constrain the geometries that can be used? Is there useful information in the seismic?
8. We will never know what geometries are correct. The best we can do is to use our conceptual models of the reservoir to select a series of different analogues that span a plausible range of geological uncertainty and quantify the impact. This is pursued further in Chap. 5.

2.7.2 Pixel-Based Modelling

Pixel-based modelling is a fundamentally different approach, based on assigning properties using geostatistical algorithms on a cell-by-cell basis, rather than by implanting objects in 3D. It can be achieved using a number of algorithms, the commonest of which are summarised below.

Kriging is the most basic form of interpolation used in geostatistics, developed by the French mathematician Georges Matheron, building on the earlier work of the South African mining engineer Daniel Krige (Matheron 1963). The technique is applicable to property modelling (next section) but rock models can also be made using an adaptation of the algorithm called Indicator Kriging. The kriging algorithm attempts to minimise the estimation error at each point in the model grid. This means the most likely element at each location is estimated using the well data and the variogram model – there is no random sampling. Models made with indicator kriging typically show smooth trends away from the wells, and the wells themselves are often highly visible as 'bulls-eyes', especially in very sparse data sets. These models will have different element proportions to the wells because the algorithm does not attempt to match those proportions to the frequency distribution at the wells. Indicator kriging can be useful for capturing lateral trends if these are well represented in the well data set, and if a high degree of correlation between wells is desired.

In general, it is a poor method for representing reservoir heterogeneity because the lateral heterogeneity in the resulting model is too heavily influenced by the well spacing. For fields with dense, regularly-spaced wells and relatively long correlation lengths in the parameter being modelled, it may still be useful.

Figure 2.40 shows an example of indicator kriging applied to the Moray data set – it is first and foremost an interpolation tool.

Fig. 2.40 Rock modelling using indicator kriging.

Sequential Gaussian Simulation (SGS) is most commonly used for modelling continuous petrophysical properties (Sect. 3.4), but one variant, Sequential Indicator Simulation (SIS), is quite commonly used for rock modelling (Journel and Alabert 1990). SIS builds on the underlying geostatistical method of kriging, but then introduces heterogeneity using a sequential stochastic method to draw Gaussian realisations
using an indicator transform. The indicator is used to transform a continuous distribution to a discrete distribution (e.g. element 1 vs. element 2).

When applied to rock modelling, SIS will generally assume the reservoir shows no lateral or vertical trends of element distribution – the principle of stationarity again – although trends can be superimposed on the simulation (see the important comment on trends at the end of this section).

Models built with SIS should, by definition, honour the input element proportions from wells, and each geostatistical realisation will differ when different random seeds are used. Only when large ranges or trends are introduced will an SIS model realisation differ statistically from the input well data.

The main limitation with such pixel-based methods is that it is difficult to build architectures with well-defined margins and discrete shapes because the geostatistical algorithms tend to create smoothly-varying fields (e.g. Fig. 2.41). Pixel-based methods tend to generate models with limited linear trends, controlled instead by the principal axes of the variogram. Where the rock units have discrete, well-defined geometries or they have a range of orientations (e.g. radial patterns), object-based methods are preferable to SIS.

Fig. 2.41 Rock modelling using SIS.

SIS is useful where the reservoir elements do not have discrete geometries either because they have irregular shapes or variable sizes. SIS also gives good models in reservoirs with many closely-spaced wells and many well-to-well correlations. The method is more robust than object modelling for handling complex well-conditioning cases and the funnelling effect is avoided. The method also avoids the bulls-eyes around wells which are common in indicator kriging.

The algorithm can be used to create correlations by adjusting the variogram range to be greater than the well spacing. In the example in Fig. 2.42, correlated shales (shown in blue) have been modelled using SIS. These correlations contain a probabilistic component, will vary from realisation to realisation and will not necessarily create 100% presence of the modelled element between wells. Depending on the underlying concept, this may be desirable.

When using the SIS method as commonly applied in commercial packages, we need to be aware of the following:

1. Reservoir data is generally statistically insufficient and rarely enough to derive meaningful experimental variograms. This means that the variogram used in the SIS modelling must be derived by intuitive reasoning (see previous section).
2. The range of the variogram is not the same as the element body size. The range is related to the maximum body size, and actual simulated bodies can have sizes anywhere along the slope of the variogram function. The range should therefore always be set larger than your expected average body size, as a rule of thumb – twice the size.
3. The choice of the type of kriging used to start the process off can have a big effect. For simple kriging a universal mean is used and
the algorithm assumes stationarity. For ordinary kriging the mean is estimated locally throughout the model, and consequently allows lateral trends to be captured. Ordinary kriging works well with large numbers of wells and well-defined trends, but can produce unusual results with small data sets.

Fig. 2.42 Creating correlatable shale bodies (shown in blue) in a fluvial system using SIS. (Image courtesy of Simon Smith)

4. Some packages allow the user to specify local azimuths for the variogram. This information can come from the underlying architectural concept and can be a useful way of avoiding the regular linear striping which is typical for indicator models, especially those conditioned to only a small number of wells.
5. Variograms make average statements about spatial correlation in a given direction. This places a limit on the textures that can be modelled; variogram modelling is often described as a two-point process. The impact of this simplification is illustrated in Fig. 2.43, in which the image from Sect. 2.6.2 is revisited. In that case, the exhaustive, true dataset was available and a variogram analysis of that data yielded length scales (variogram ranges) and a horizontal anisotropy ratio of 1.45:1. Yet if those same ranges and ratio are fed back into an SIS run the channelised image does not reappear (Fig. 2.43, left-hand image). Instinctively, the anisotropy should be more pronounced, perhaps 10:1, but feeding this ratio through an SIS run will also not reproduce the original (Fig. 2.43, right-hand image). In short: the SIS algorithm is robust but has limitations; an alternative is required if the desire is to reproduce a model similar to the centre image in Fig. 2.43.

The facies trend simulation algorithm is a modified version of SIS which attempts to honour a logical lateral arrangement of elements, for example, an upper shoreface passing laterally into a lower shoreface and then into mudstone. Figure 2.44 shows an example applied to the Moray data set. The facies trend approach, because it uses SIS, gives a more heterogeneous pattern than indicator kriging and does not suffer from the problem of well bulls-eyes. The latter is because the well data is honoured at the well position, but not necessarily in the area local to the well.

The user can specify stacking patterns, directions, angles and the degree of inter-fingering. The approach can be useful, but it is often very hard to get the desired inter-fingering throughout the model. The best applications tend to be shoreface environments where the logical
sequence of elements, upper to lower shoreface, transition on a large scale. Similar modelling effects can also be achieved by the manual application of trends (see below).

Fig. 2.43 Attempts to recreate a complex image using SIS. Centre: the original image; left: an SIS realisation based on the measured spatial correlation and average anisotropy in the centre image; right: an alternative run with increased anisotropy. The complexity of the image is beyond the reach of a standard variogram workflow.

Fig. 2.44 Rock modelling using facies trend simulation.

2.7.3 Texture-Based Modelling

An alternative approach to the averaging of variography comes from algorithms which aim to honour texture directly. Although there are parallels with very early techniques such as simulated annealing (Yarus and Chambers 1994, Chapter 1 by Srivastava) the approach has become more widely available through the multi-point statistics (MPS) technique (Strebelle 2002; Caers 2003).

The approach starts with a pre-existing training image, a cellular model or a scanned image, which is analysed for textural content. Using a geometric template, the frequency of instances of a model element occurring next to similar and different elements is recorded, as is their relative position (to the west, the east, diagonally etc.). As the cellular model framework is sequentially filled, the record of textural content in the training image is referred to in order to determine the likelihood of a particular cell having a particular model content, given the content of the surrounding cells (Fig. 2.45).
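The counting idea can be sketched as follows. This is a deliberately simplified illustration of the principle, not the Strebelle (2002) implementation, and the training image and template are invented.

```python
# Illustrative sketch of the idea behind multi-point statistics: count how often
# each neighbourhood pattern occurs in a training image, then use those counts to
# estimate the probability of 'sand' at a cell given its filled neighbours.
import numpy as np
from collections import defaultdict

# tiny synthetic training image: 1 = sand, 0 = background (a stand-in for Fig. 2.45)
ti = np.zeros((60, 60), dtype=int)
for row in range(5, 60, 6):              # E-W sand ribbons one cell thick
    ti[row, :] = 1

# template: the four cells W, E, N, S of the centre cell
offsets = [(0, -1), (0, 1), (-1, 0), (1, 0)]

counts = defaultdict(lambda: [0, 0])     # pattern -> [count centre=0, count centre=1]
for i in range(1, ti.shape[0] - 1):
    for j in range(1, ti.shape[1] - 1):
        pattern = tuple(ti[i + di, j + dj] for di, dj in offsets)
        counts[pattern][ti[i, j]] += 1

def prob_sand(pattern):
    n0, n1 = counts.get(pattern, [0, 0])
    return 0.5 if (n0 + n1) == 0 else n1 / (n0 + n1)   # fall back to 50:50 if unseen

# a cell flanked by sand to the west and east (but not north or south) is sand here:
print(prob_sand((1, 1, 0, 0)))
```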
Fig. 2.45 The Multi-Point Statistics (MPS) approach to rock modelling: model and training image (elements: coarse sands, medium sands, heterolithics and non-reservoir).

Although the approach is pixel-based, the key step forward is the emphasis on potentially complex texture rather than relatively simple 'two-point' statistics of variography. The prime limitation of variogram-based approaches – the need to derive simple rules for average spatial correlation – is therefore surmounted by modelling instead an average texture.

An example is shown in Fig. 2.46 – a cross-sectional model based on a sketch of a single bed in a dryland river system. The sketch was hand-drawn, drafted neatly, scanned and input to an MPS algorithm resulting in the digital model illustrated.

MPS is arguably the most desirable algorithm for building 3D reservoir architectures because architecture itself is a heterogeneous textural feature and MPS is designed to model these textures directly. In spite of this there are two reasons why MPS is not necessarily the algorithm of choice:

1. A training image is required, and this is a 3D architectural interpretation in itself. MPS models are therefore not as 'instantaneous' as the simpler pixel-based techniques such as SIS, and require more pre-work. The Moray Field example shown in Fig. 2.47 was built using a training data set which was itself extracted from a model combining object- and SIS-based architectural elements. The MPS algorithm does not 'work alone.'
2. The additional effort of generating and checking a training image may not be required in order to generate the desired architecture; 2-point geostatistics or object modelling may be sufficient.

Despite the above, the technique can provide very realistic-looking architectures which overcome both the simplistic textures of older pixel-based techniques and the simplistic shapes and sometimes unrealistic architectures produced by object modelling.

2.7.4 Algorithms Compared

A comparison of the techniques above is shown in Fig. 2.48. These are the same images shown in Figs. 2.38, 2.40, 2.41, 2.44 and 2.47 but now placed together and shown with less vertical exaggeration. All models are based on the same underlying data set and all are architectural arrangements of the same three elements of an ephemeral braided fluvial system from the 'Moray Field' training set.

There is no technical error in the construction of the models in Fig. 2.48 and each would pass a statistical QC, yet the images are clearly very different and subsequent flow modelling would yield very different results.
Fig. 2.46 MPS model based on a sketch of a bed in a dryland fluvial system: (a) graphic log through a type section of the bed; (b) conceptual sketch of the arrangement of elements within the bed over 50 m used as a training image; (c) MPS realization of the bed over 400 m based on the training image (model courtesy of Tom Buckle and Rhona Hutton).

Fig. 2.47 Rock modelling using MPS.

Fig. 2.48 True-scale comparison of five techniques (MPS, objects, kriging, truncated Gaussian and SIS) applied to the Moray Field outcrop-based model.
Which is correct? We argue it is the one which best matches the underlying conceptual interpretation drawn before the modelling began – the sketch.

2.7.5 Process-Based Modelling

The section above compared modelling algorithms widely available in software packages used in the E&P industry. There are other approaches, however, and these can be considered for use in addition to the standard workflow if the approaches above fail to produce the desired architectures.

Figure 2.49 shows the model outcome from a process-based model for a meandering fluvial system. The image is a map view of a 3D model built stratigraphically with the modelling algorithm depositing layers guided by process sedimentological parameters. These parameters can be varied to produce different architectures, as shown in Fig. 2.50 which compares variations in aggradation rate with preserved net sand volumes. The process can be tuned not only to understand key 'tipping' points for architectural connectivity but also to provide a range of architectures for forward-modelling.

Fig. 2.49 Map view of a process-based fluvial model (left), mimicking a meandering fluvial system (right). (Image courtesy of Tom Buckle)

Fig. 2.50 Four process-based model versions of Fig. 2.49, exploring the relationship between aggradation rate and net-to-gross (NTG). (Image courtesy of Tom Buckle)

Models such as that in Figs. 2.49 and 2.50 can be used to provide a template for modelling in standard software packages. This can be done by using the models as training images for MPS or, as in the case in Figs. 2.51 and 2.52, mapping the process model grid onto a standard 3D modelling grid for onward property modelling (Da Pra et al. 2017).

Process models can become extremely elaborate as more realistic parameters are introduced to the algorithm. The example in Fig. 2.53 is a 3D process model mimicking the evolution of the Nile Delta which was generated by solving interactions between bathymetry, hydrodynamics and sediment transport.
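By way of illustration only, the sketch below mimics the aggradation-rate control on preserved net-to-gross (cf. Fig. 2.50) with a purely geometric stacking rule. It solves no hydrodynamics, is not the modelling tool described in the text, and all parameters are invented.

```python
# A minimal, purely geometric sketch of forward 'process-style' stacking:
# channels are dropped onto an aggrading surface and the preserved net-to-gross
# is read from the stacked record. Parameters are invented for illustration.
import numpy as np

def stack_channels(n_steps, aggradation, channel_depth=2.0, channel_halfwidth=8,
                   nx=200, dz=0.5, nz=800, seed=5):
    rng = np.random.default_rng(seed)
    facies = np.zeros((nz, nx), dtype=int)      # 0 = overbank mud, 1 = channel sand
    surface = 0.0                               # current depositional surface (m)
    for _ in range(n_steps):
        surface += aggradation                  # background mud aggradation per step
        cx = rng.integers(0, nx)                # channel position for this step
        top = int(surface / dz)
        base = max(int((surface - channel_depth) / dz), 0)
        left, right = max(cx - channel_halfwidth, 0), min(cx + channel_halfwidth, nx)
        facies[base:min(top, nz), left:right] = 1   # channel erodes into older deposits
    preserved = facies[:int(surface / dz), :]
    return preserved, preserved.mean()          # cross-section and its net-to-gross

for agg in (0.2, 0.5, 1.0, 2.0):                # low -> high aggradation rate (m/step)
    _, ntg = stack_channels(n_steps=150, aggradation=agg)
    print(f"aggradation {agg:.1f} m/step -> preserved NTG ~ {ntg:.2f}")
```

Low aggradation rates amalgamate successive channels and raise the preserved net-to-gross, which is the qualitative relationship explored in Fig. 2.50.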
Fig. 2.51 Process-based models for a fluvial system; four contrasting map views of the models (Da Pra et al. 2017). (Image reproduced courtesy of the EAGE)

Fig. 2.52 Cross-sections through the fluvial model after translating the process-based model onto a standard modelling grid (upper image) and infilling with the desired reservoir elements (lower image) (Da Pra et al. 2017). (Image reproduced courtesy of the EAGE)

These are complex models which do not go forward simply into a fluid flow simulation. The models are primarily aiming to create realistic architectures, including distribution of sediment bodies and grain size variations within them, which can then be used as training images for rock modelling (Aarnes et al. 2019).

2.7.6 The Importance of Deterministic Trends

All of the algorithms above involve a probabilistic component. In Sect. 2.5 the balance between determinism and probability was discussed and it was proposed that strong deterministic control is generally required to realise the desired architectural concept.
Fig. 2.53 Plan view of a process-based model of the Nile Delta, generated by solving the interaction between bathymetry, hydrodynamics and sediment transport. (Image reproduced courtesy of TUDelft)

Having discussed the pros and cons of the algorithms, the final consideration is therefore how to overlay deterministic control. In statistical terms, this is about overcoming the stationarity that probabilistic algorithms assume as a default. Stationarity is a prerequisite for the algorithms and assumes that elements are randomly but homogeneously distributed in the inter-well space. This is at odds with geological systems, in which elements are heterogeneously distributed and show significant non-stationarity: they are commonly clustered and show patterns in their distribution. Non-stationarity is the geological norm; indeed, Walther's Law – the principle that vertical sequences can be used to predict lateral sequences – is a statement of non-stationarity.

Deterministic trends are therefore required, whether to build a model using object- or pixel-based techniques, or to build a training image for a texture-based technique.

2.7.6.1 Vertical Trends

Sedimentary systems typically show vertical organisation of elements which can be observed in core and on logs and examined quantitatively in the data-handling areas of modelling packages.
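For example, a vertical proportion curve of the kind shown in Fig. 2.54 can be assembled from upscaled well logs as sketched below; the well data and element codes are invented and the layering is assumed to share a common 'simbox' index.

```python
# Illustrative sketch: a vertical proportion (probability) curve computed from
# upscaled element logs in a set of wells (cf. Fig. 2.54). Data are invented.
import numpy as np

# hypothetical upscaled element logs: rows = wells, columns = layers (base to top),
# codes: 0 = mudstone, 1 = sandstone
well_logs = np.array([
    [0, 0, 0, 1, 0, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 1, 1, 0, 1, 1, 1],
    [0, 0, 0, 0, 1, 0, 1, 1, 1, 1],
])

def vertical_proportion_curve(logs, codes=(0, 1)):
    """Fraction of wells showing each element code, layer by layer."""
    return {code: (logs == code).mean(axis=0) for code in codes}

vpc = vertical_proportion_curve(well_logs)
# vpc[1] increases with layer index here, expressing a sand-prone-upwards concept;
# such a curve can be hand-edited before being supplied as a vertical trend.
```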
Any such vertical trends are typically switched off by default – the assumption of stationarity.

As a first assumption, observed trends in the form of vertical probability curves should be switched on, unless there are compelling reasons not to use them. More significantly, these trends can be manually adjusted to help realise an architectural concept perhaps only partly captured in the raw well data.

Figure 2.54 shows an edited vertical element distribution which represents a concept of a depositional system becoming sand-prone upwards. This is a simple pattern, common in sedimentary sequences, but will not be integrated in the modelling process by default.

Thought is required when adjusting these profiles because the model is being consciously steered away from the statistics of the well data. Unless the well data is a perfect statistical sample of the reservoir (rarely the case and never provable) this is not a problem, but the modeller should be aware that hydrocarbon volumes are effectively being adjusted up and down away from well control. The adjustments therefore require justification which comes, as ever, from the underlying conceptual model.

Fig. 2.54 Vertical probability trends; each colour represents a different reservoir element and the probability represents the likelihood of that element occurring at that point in the cell stratigraphy (blue = mudstone; yellow = sandstone).

2.7.6.2 Horizontal Trends

Horizontal trends are most simply introduced as 2D maps which can be applied to a given interval. Figure 2.55 shows the application of a sand trend map to a low net-to-gross system following the steps below:

1. Sand elements are identified in wells based on core and log interpretation.
2. A net-to-gross (sand) value is extracted at each well and gridded in 2D to produce a map illustrating any sand trend apparent from well data alone.
3. The 2D map is hand-edited to represent the desired concept, with most attention being paid to the most poorly sampled areas (in the example shown, the trend grid is also
smoothed – the level of detail in the trend map should match the scale of the sand distribution concept).
4. The trend map is input to, in this case, an SIS algorithm for rock modelling.
5. As a check, the interval average net-to-gross is backed-out of the model as a map and compared with the concept. The map shows more heterogeneity because the variogram ranges have been set low and the model has been tied to the actual well observations; the desired deterministic trends, however, clearly control the overall pattern.

Fig. 2.55 Deterministic application of a horizontal trend.

The influence of the trend on the model is profound in this case as the concept is for the sand system to finger eastwards into a poorly drilled, mud-dominated environment. The fluid volumes in the trended case are half that calculated for a model with the trends removed, with all other model variables unchanged. Stationarity is overcome and the concept dominates the modelling.

The source of the trend can be an extension of the underlying data, as in the example above, or a less data-dependent concept based on a regional model, or a trend surface derived from seismic attributes – the 'soft conditioning' described in Sect. 2.5.

2.7.6.3 3D Probability Volumes

The 3D architecture can be directly conditioned using a 3D volume – a natural extension of the process above. The conditioning volume can be built in a modelling exercise as a combination of the horizontal/vertical trends described above, or derived from a 3D data source, typically a seismic volume.
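A minimal sketch of building such a combined volume is given below. It assumes a hand-edited 2D trend map and a vertical proportion curve are already available and simply multiplies and rescales them; all shapes and values are invented.

```python
# Illustrative sketch: combining a 2D sand-trend map with a vertical proportion
# curve to build a 3D sand-probability volume for soft conditioning.
import numpy as np

nx, ny, nz = 50, 40, 20

# 2D areal trend: sand probability decreasing eastwards (cf. the fingering concept)
x = np.linspace(0.0, 1.0, nx)
trend_map = np.tile((0.7 - 0.5 * x)[:, None], (1, ny))        # shape (nx, ny)

# 1D vertical proportion curve: sand-prone upwards (k = 0 at base)
vpc = np.linspace(0.2, 0.8, nz)                                # shape (nz,)

# combine and rescale so the volume average matches a target net-to-gross
prob_3d = trend_map[:, :, None] * vpc[None, None, :]           # shape (nx, ny, nz)
target_ntg = 0.35
prob_3d *= target_ntg / prob_3d.mean()
prob_3d = np.clip(prob_3d, 0.0, 1.0)
# prob_3d can now be supplied to an SIS-type algorithm as a probability/trend volume
```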
Seismic conditioning directly in 3D creates some issues:

1. The volume needs QC. It is generally easier to check simpler data elements, so if the desired trends are separately captured in 2D trend surfaces and vertical proportion curves then combination into a 3D trend volume is not necessary.
2. If conditioning to a 3D seismic volume, the resolution of the model framework needs to be consistent with the intervals the seismic attribute is derived from. For example, if the parameter being conditioned is the sand content within a 25 m thick interval, it must be assumed that the seismic data from which the seismic attribute is derived is also coming from that 25 m interval. This is unlikely to be the case from a simple amplitude extraction and a better approach is to condition from inverted seismic data. The questions to ask are therefore what the seismic inversion process was inverting for (was it indeed the sand content) and was the earth model used for the inversion the same one as the reservoir model is being built on?
3. If the criteria for using 3D seismic data (2, above) are met, can a probabilistic seismic inversion be called upon? This is the ideal input to condition to.
4. If the criteria in point 2, above, are not met, the seismic can still be used for soft conditioning, but will be more artefact-free and easier to QC if applied as a 2D trend. The noisier the data, the softer the conditioning will need to be, i.e. the lower the correlation coefficient.

2.7.7 Alternative Rock Modelling Methods – A Comparison

An example is given below, to which alternative algorithms have been applied. The case is taken from a fluvio-deltaic reservoir – the Franken Field – based on a type log with a well-defined conceptual geological model (Fig. 2.56). The main reservoir is the Shelley, which divides into a clearly fluvial Lower Shelley characterised by sheetfloods, and an Upper Shelley, the sedimentology for which is less clear and can be viewed as either a lower coastal plain or a river-dominated delta.

Rock model realisations have been built from element distributions in 19 wells. Cross-sections taken at the same location through the models are illustrated in Figs. 2.57, 2.58 and 2.59 for a 2-, 4- and 7-interval correlation, respectively. The examples within each layering scheme explore object vs. pixel (SIS) modelling and the default model criteria (stationarity maintained) vs. the use of deterministic trends (stationarity overwritten). The models contrast greatly and the following observations can be made:

1. The more heavily subdivided models are naturally more 'stripy'. This is partly due to the 'binning' of element well picks into zones, which starts to break down stationarity by picking up any systematic vertical organisation of the elements, irrespective of the algorithm chosen and without separate application of vertical trends.
2. The stripy architecture is further enhanced in the 7-zone model because the layering is based on a flooding surface model, the unit boundaries for which are preferentially picked on shales. The unit boundaries are therefore shale-rich by design and prone to generating correlatable shales if the shale dimension is big enough (for object modelling) or the shale variogram range long enough (for SIS).
3. Across all frameworks, the object-based models are consistently more 'lumpy' and the SIS-based models consistently more 'spotty', a consequence of the different underlying principles of the algorithms.
4. The untrended object model for the two zone realisation is the one most dominated by stationarity, and looks the least realistic geologically.
5. The addition of deterministic trends, both vertical and lateral, creates more ordered, less random-looking models, as the assumption of stationarity is overridden by the conceptual model.

Any of the above models presented in Figs. 2.57, 2.58 and 2.59 could be offered as a 'best guess', and could be supported at least superficially with an appropriate story line.
Fig. 2.56 The Franken Field reservoir – type log and proposed depositional environment analogues. The model elements are shown on the right-hand coloured logs, one of which is associated with a delta interpretation for the Upper Shelley, the other for an alternative coastal plain model for the Upper Shelley. Sands are marked in yellow, muds in blue, intermediate lithologies in intermediate colours. (Image courtesy of Simon Smith)

Presenting multiple models using different layering schemes and alternative algorithms also appears thorough and in a peer-review it would be hard to know which of these models is a 'good' or 'bad' representation of the reservoir. However, a number of the models were made quickly using system defaults and have little substance; stationarity (within
zones) is dominant and although the models are statistically valid, they lack an underlying concept and have poor deterministic control. Only the lower models in each figure take account of the trends associated with the underlying reservoir concept, and it is these which are the superior representations – at least matching the quality of the conceptual interpretation.

The main point to take away from this example is that all the models match the well data and no mechanical modelling errors have been made in their construction, yet the models differ drastically owing to the algorithm choice, the style of correlation and the use of trends. The comparison reinforces the importance of the underlying reservoir concept as the tool for assessing which of the resulting rock models are acceptable representations of the reservoir.

Fig. 2.57 The Franken Field. Top image: the two zone subdivision; middle image: object model (no trends applied, stationarity maintained); bottom image: trended object model.

Fig. 2.58 The Franken Field. Top image: the four zone subdivision; middle image: pixel (SIS) model (no trends applied, stationarity maintained); bottom image: trended pixel (SIS) model.

Fig. 2.59 The Franken Field. Top image: the seven zone subdivision; bottom image: trended SIS model.
2.8 Summary

In this section we have offered an overview of approaches to rock modelling and reviewed a range of geostatistically-based methods, whilst holding the balance between probability and determinism and the primacy of the underlying concept as the core issues. Reservoir modelling is not simply a process of applying numerical tools to the available dataset – there is always an element of subjective design involved – an intuitive leap.

Overall the rock model must make geological sense and to summarise this we offer a brief resume of practical things which can be done to check the quality of the rock model – the QC process.

2.8.1 Rock Model QC

• Make architectural sketches along depositional strike and dip showing the key features of the conceptual models. During the model build, switch the model display to stratigraphic (simbox) view to remove the structural deformation. How do the models compare with the sketches?
• Watch out for the individual well matches reported by the software package – well by well. These are more useful and diagnostic than the overall 'facies' proportions. Anomalous wells point to weaknesses in the model execution.
• Make element proportion maps for each element in each zone and check these against well data and the overall concept. This is an important check on the inter-well probabilistic process.
• Assuming trends have been applied, check the statistics of the modelled element distribution against that for the well data alone; they are highly unlikely to be the same because of the undersampling of the wells, but the differences should be explicable in terms of any applied trends and the spatial location of the wells.
• Make net sand isochore maps for each zone without wells posted; imposed trends should be visible and the well locations should not (no bulls-eyes around wells).
• Make full use of the visualisation tools, especially the ability to scroll through the model vertically, layer by layer, to look for anomalous geometries, e.g. spikes and pinch-outs.

2.8.2 Synopsis – Rock Modelling Guidelines

The first decision to be made is whether or not a rock model is truly required. If rock modelling can add useful control upon the desired distribution of reservoir properties, then it is needed. If the desired property distributions can be achieved directly by property modelling, then rock modelling is not necessary at all.

If it is decided that a rock model is required, it then needs some thought and design. The use of system default values is unlikely to be successful. This chapter has attempted to stress the following things:

1. The model concept needs to be formed before the modelling begins, otherwise the modeller is 'flying blind'. A simple way of checking your (or someone else's) grasp of the conceptual reservoir model is to make a sketch section of the reservoir, or request a sketch, showing the desired architecture. If you can sketch it, you can model it.
2. The model concept needs to be expressed in terms of the chosen model elements, the selection of which is based not only on a consideration of the heterogeneity, but also on the fluid type and production mechanism. Some fluid types are more sensitive to heterogeneity than others; if the fluid molecules do not sense the heterogeneity, there is no need to model it – reference 'Flora's Rule' as a handy rule of thumb.
3. Rock models are mixtures of deterministic and probabilistic inputs. Well data tends to be statistically insufficient, so attempts to extract
statistical models from the well data are often not successful. The road to happiness therefore generally lies with strong deterministic control, as determinism is the most direct method of carrying the underlying reservoir concept into the model.
4. To achieve the desired reservoir architecture, the variogram model has a leading influence if pixel-based methods are employed; we suggest the variogram range and its spatial anisotropy is the most important geostatistical input to reconcile with the conceptual sketch.
5. To get a reasonable representation of the model concept it is generally necessary to impose trends (both vertical and lateral) on the modelling algorithm, irrespective of the chosen algorithm, unless a process-based modelling tool is used to forward-model the likely trends.
6. Guide the geostatistical algorithms during a rock model build by an intuitive understanding of the relationship between the underlying reservoir concept and the geostatistical rules which guide the chosen algorithm.
7. It is unlikely that the element proportions in the model will match those seen in the wells – do not expect this to be the case; the data and the model are statistically different – this is discussed further in the next chapter.

References

Aarnes I, van der Vegy H, Hauge R, Fjellvoll B, Nordahl K (2019) Using sedimentary process-based models as training images for multipoint facies simulations. Bull Can Petrol Geol 67:217–230
Ainsworth RB, Sanlung M, Duivenvoorden STC (1999) Correlation techniques, perforation strategies and recovery factors. An integrated 3-D reservoir modeling study, Sirikit Field, Thailand. Am Assoc Petrol Geol Bull 83(10):1535–1551
Bentley MR, Elliott AA (2008) Modelling flow along fault damage zones in a sandstone reservoir: an unconventional modelling technique using conventional modelling tools in the Douglas Field, Irish Sea, UK. SPE paper 113958 presented at SPE Europec/EAGE conference and exhibition. Society of Petroleum Engineers (SPE). https://doi.org/10.2118/113958-MS
Caers J (2003) History matching under training-image-based geological model constraints. SPE J 8(3):218–226
Caers J (2011) Modeling uncertainty in the earth sciences. Wiley, Hoboken
Campbell CV (1967) Lamina, laminaset, bed, bedset. Sedimentology 8:7–26
Da Pra A, Anastasi P, Corradi C, Sprega G (2017) 3D process based model integration in reservoir modeling – a real case application. Presented at the 79th EAGE Conference and Exhibition 2017, Paris, extended abstract
Deutsch CV (2002) Geostatistical reservoir modeling. Oxford University Press, Oxford, p 376
Doyen PM (2007) Seismic reservoir characterisation. EAGE Publications, Houten
Dubrule O, Damsleth E (2001) Achievements and challenges in petroleum geostatistics. Pet Geosci 7:1–7
Fielding CR, Crane RC (1987) An application of statistical modelling to the prediction of hydrocarbon recovery factors in fluvial reservoir sequences, vol 39. Society of Economic Paleontologists and Mineralogists (SEPM) special publication
Haldorsen HH, Damsleth E (1990) Stochastic modelling. J Petrol Technol 42:404–412
Holden L, Hauge R, Skare Ø, Skorstad A (1998) Modeling of fluvial reservoirs with object models. Math Geol 30(5):473–496
Howell J, Martinius A, Good T (2014) The application of outcrop analogues in geological modelling: a review, present status and future outlook. Geol Soc London Spec Publ 387:1–25
Isaaks EH, Srivastava RM (1989) Introduction to applied geostatistics. Oxford University Press, Oxford
Jensen JL, Corbett PWM, Pickup GE, Ringrose PS (1995) Permeability semivariograms, geological structure and flow performance. Math Geol 28(4):419–435
Journel AG, Alabert FG (1990) New method for reservoir mapping. J Petrol Technol 42:212–218
Matheron G (1963) Principles of geostatistics. Econ Geol 58:1246–1266
McIlroy D, Flint S, Howell J, Timms N (2005) Sedimentology and the tide-dominated Jurassic Lajas Formation, Neuquen Basin, Argentina. Geol Soc London Spec Publ 252:83–107
Pyrcz MJ, Deutsch CV (2014) Geostatistical reservoir modelling, 2nd edn. Oxford University Press, Oxford
Soubaras R, Dowle R (2010) Variable-depth streamer – a broadband marine solution. First Break 28:89–96
Strebelle S (2002) Conditional simulation of complex geological structures using multiple-point statistics. Math Geol 34(1):1–21
Van Wagoner JC, Bertram GT (eds) (1995) Sequence stratigraphy of Foreland basin deposits. American Association of Petroleum Geologists (AAPG), AAPG Memoir 64:137–224
Van Wagoner JC, Mitchum RM, Campion KM, Rahmanian VD (1990) Siliciclastic sequence stratigraphy in well logs, cores, and outcrops. AAPG methods in exploration series, vol 7. American Association of Petroleum Geologists (AAPG)
Webber KJ, Van Geuns LC (1990) Framework for constructing clastic reservoir simulation models. J Pet Technol 42:1–248
Yarus JM, Chambers RL (1994) Stochastic modeling and geostatistics: principles, methods, and case studies. AAPG computer applications in geology, vol 3. American Association of Petroleum Geologists (AAPG), p 379
