
1.01 An Introduction to the Theory of Sampling: An Essential Part of Total Quality Management
F. F. Pitard, Francis Pitard Sampling Consultants, Broomfield, CO, USA
ª 2009 Elsevier B.V. All rights reserved.

1.01.1 Introduction
1.01.2 Scope
1.01.3 Definitions and Notations
1.01.4 Dividing a Complex Problem into its Basic Components
1.01.4.1 Dividing the Small-Scale Variability
1.01.4.2 Optimization of Sampling Protocols
1.01.4.2.1 The in situ nugget effect
1.01.4.2.2 The fundamental sampling error
1.01.4.2.3 The grouping and segregation error
1.01.4.3 The Practical Implementation of the Sampling Protocol
1.01.4.3.1 The increment delimitation error
1.01.4.3.2 The increment extraction error
1.01.4.3.3 The weighting error
1.01.4.4 The Preservation of Samples Integrity
1.01.4.4.1 Sample preparation errors
1.01.5 Exercises Challenging the Reader
1.01.5.1 Exercise #1: A Worldwide Problem for Ore Grade Control in Open Pits: Blasthole Sampling
1.01.5.2 Exercise #2: Correctness of Straight-Path Cross-Stream Sampling Systems
1.01.5.3 Exercise #3: Correctness of Rotating Vezin Cross-Stream Sampling Systems
1.01.5.4 Exercise #4: Correctness of Hammer Cross-Belt Samplers
1.01.6 The Critical Importance of Sampling Courses
1.01.6.1 Case #1: A Bad Protocol for Blastholes Followed by an Incorrect Implementation
1.01.6.2 Case #2: An Incorrect Sampling System for the Final Tail of a Flotation Plant
1.01.7 The Enemies and Their Link to Geostatistics
1.01.7.1 Important Remark
1.01.8 Large-Scale Variability
1.01.8.1 Definition of the Variogram
1.01.8.1.1 Selection of a given process parameter of interest
1.01.8.1.2 Heterogeneity affecting the given process parameter of interest
1.01.8.1.3 Measuring heterogeneity variability with the variogram
1.01.8.2 Extrapolation of the Variogram to Time or Distance Zero
1.01.8.3 The Importance of Large-Scale Variability
1.01.8.3.1 Variability issues during exploration
1.01.8.3.2 Variability issues during mining
1.01.8.3.3 Variability issues within a processing plant
1.01.8.3.4 Variability issues during trade with customers
1.01.9 Conclusions
1.01.10 Recommendations
References


1.01.1 Introduction

Considering the example of the mining industry, out of many other possible examples, the key to total quality
management (TQM) is to optimize the recovery of natural resources and make quality products to satisfy
users on several fronts such as reliability, effectiveness, and minimal cost: this cannot and will not be done without a thorough understanding of the theory of sampling (TOS). Because TOS is not taught at universities, the resulting economic losses plague the industry. Sound management decisions depend on precise and accurate sampling. It is therefore of utmost importance to communicate the benefits of correct sampling to management, boards of directors, shareholders, geologists, drillers, miners, metallurgists, chemists, salespeople, environmentalists, geostatisticians, and statisticians. It is the only way to secure necessary cash flow,
more profit, and added share value. If stakeholders cannot see the value of correct sampling, it is the
company’s responsibility to show them through education of the management team. Training of key
personnel such as geologists, drillers, miners, metallurgists, chemists, and environmentalists is essential to
obtain results. They all must monitor and verify the quality of data, so that geostatisticians and even
conventional statisticians can perform reliable, believable risk assessments, to enable the management to
make ultimately crucial financial decisions.

1.01.2 Scope

The acquisition of a reliable database is always a valuable asset to a company. However, for such a success to
happen, a correct and balanced strategy is needed, which is illustrated as a three-legged table in Figure 1. If one
leg of the table is weak, the entire table may collapse. Usually, the weakest leg is a style of management that does not sufficiently support good sampling practices, even though such support is the only way to allow statisticians to do their work of understanding variability in the deposit or in processes, and subsequently to advise management on wise
courses of action. However, implementing correct sampling is easier said than done. Similar to safety issues for
which companies spend millions of dollars, it must be internally standardized through correctness, internal
guidelines, sustained training, and enforcement auditing. It must also be monitored for its added value through
improved natural resources recovery, improved conciliation between economic units (e.g., mine, mill, smelter,
and refinery), and added stakeholder value.
The collection of samples has only one objective: understanding variability on a large scale. Unfortunately, for practical reasons a sample is often made of a relatively small amount of material.
Therefore, as it is collected, another type of unwanted variability is always introduced, which is a small-scale variability that may often overwhelm the large-scale variability that is important to measure. Many people do not realize that the variability they are looking at has little to do with the variability they want to measure; it is essential to make this distinction as clear as possible, and this is the true, in-depth objective of TOS.

Figure 1 The concept of the three-legged table: company $ benefits, added stakeholder value, and market perception rest on three legs: emphasis on the causes of problems by proactive management, a strong commitment to good sampling and good laboratory practices, and the capability to understand variability and to perform reliable statistical studies.

1.01.3 Definitions and Notations

As the length of this paper is limited, the reader is referred, for definitions and notations, to the textbooks from various authors listed in the References.

1.01.4 Dividing a Complex Problem into its Basic Components

The strength of the TOS is the use of a logical Cartesian approach, which divides small- and large-scale
variability into basic components that can be further analyzed one by one. If everyone understood that basic
concept they would not struggle to understand TOS. It does not matter how deep anyone goes into the
mathematics of TOS; if this basic Cartesian concept is not understood, the result will be confusion, ineffectiveness, and failure to reach economic objectives, which is a common observation in many industries around the
world today.

1.01.4.1 Dividing the Small-Scale Variability


The small-scale variability can be divided into four categories:
1. the optimization of sampling protocols,
2. the practical implementation of sampling protocols,
3. the preservation of samples integrity, and
4. the minimization of analytical errors.
Analytical errors are not the subject of this paper, but can be explored in guidelines for good laboratory
practices.

1.01.4.2 Optimization of Sampling Protocols


The optimization of sampling protocols involves a good understanding of small-scale heterogeneity, which
affects any component of interest in the lot to be sampled.

1.01.4.2.1 The in situ nugget effect


The TOS addresses the difficulties of sampling particulate materials. However, even before the material is broken up, as the distance between samples tends toward zero, small-scale constitution heterogeneity may greatly affect the repeatability of a sample of a given selected mass drilled at any given place. The
preliminary knowledge of local mineralogy may help in the selection of a minimum mass for the basic module
of observation (e.g., selection of a drilling diameter and sample length) during exploration, ore reserves
estimations, and for the capability to be selective at a certain cutoff grade and at any given place during the
mining operation. Failure to address this problem early enough can result in stunning estimation failures. The mean m[NE] of the in situ nugget effect (NE) is zero; however, its large variance s²[NE] may lead to severe data skewness and therefore unwanted local illusions followed by stunning economic consequences. It is suggested to distinguish clearly between m[NE] = 0, when there is free access to an entire population of possibly skewed data points in a given area, and m[NE] ≠ 0, when too few skewed data points affected by a large variance are available in the same given area. Too often, large mistakes are repeated over and over because too few data points belonging to a skewed distribution are available in a given area.
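
To make this tangible, here is a brief Python sketch, using invented, illustrative numbers rather than data from any real deposit, of how a handful of data points drawn from a right-skewed grade population usually understates the true mean, the kind of local illusion described above:

    # Hedged illustration (invented numbers): with few data points from a
    # right-skewed, nugget-affected grade distribution, the local estimate of
    # the mean is usually too low, and occasionally far too high.
    import random

    random.seed(7)
    population = [random.lognormvariate(0.0, 1.5) for _ in range(100_000)]
    true_mean = sum(population) / len(population)

    trials, under = 5000, 0
    for _ in range(trials):
        few = random.sample(population, 5)   # only five data points available locally
        if sum(few) / 5 < true_mean:
            under += 1
    print(f"true mean of the population: {true_mean:.2f}")
    print(f"five-point estimate falls below it in {under / trials:.0%} of trials")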

1.01.4.2.2 The fundamental sampling error


As soon as the in situ material is broken up into fragments, the size distribution, shape, and density of
fragments begin to play an important role in sampling. Typical differences between individual fragments
lead to the concept of constitution heterogeneity in TOS. A good understanding of constitution hetero-
geneity is the key to optimize sample and subsample masses. Heterogeneity tests and sampling nomographs
are used to make such optimization.1–4 It is critically important to understand the various approaches that
were used in TOS to perform this task. Indeed, it would be a great mistake, as frequently happens, to
believe that Dr. Pierre M. Gy created one universal formula to calculate the variance of the fundamental sampling error (s²[FSE]), which is the result of constitution heterogeneity for broken-up materials.5 There
are many different approaches depending on the heterogeneity of the constituents of interest. The main
approaches are:
1. the general case where the constituent of interest is not liberated from its gangue,
2. the special case where the constituent of interest is liberated from its gangue and may show delayed
comminution properties,
3. the case where the constituent of interest is finely disseminated inside another major constituent, and
4. the case where the main objective of sampling is to obtain a sample representative of the particle size
distribution.
Failure to distinguish these different cases leads to a total misunderstanding of TOS. However, there are
more subtleties to understand: suggested formulas have their limitations well addressed in TOS. For example,
if the variance of the FSE becomes very large, formulas become meaningless and a massive misuse of TOS
may take place. It was never the objective of TOS to address out-of-control variances generated by unwise
protocols for which skewness introduces unfortunate illusions and misconceptions. Let us be clear on this issue: TOS cannot help people who make no preventive effort to understand the constitution heterogeneity of their materials.
The mean m[FSE] of the FSE is zero; however, its large variance may lead to severe data skewness and
therefore unwanted local illusions. Wisdom in sampling starts with a good understanding of FSE so that
simplistic Gaussian statistics may indeed reasonably apply. The last thing anyone wants to deal with is an FSE introducing skewness into a generated database, as it would most certainly result in estimation errors very few
people understand.
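
To make the orders of magnitude concrete, the Python sketch below applies the general case (case 1 above), using the form of the FSE variance formula commonly cited from Gy's work, s²[FSE] = (1/MS − 1/ML)·f·g·c·ℓ·d³. Every parameter value is an illustrative assumption, not a value from this chapter; in practice they would be derived from a heterogeneity test:

    # Hedged sketch of the general-case FSE variance; all numbers are
    # illustrative assumptions, not data from this chapter.
    def fse_variance(sample_mass_g, lot_mass_g, top_size_cm,
                     f=0.5, g=0.25, c=100.0, liberation=0.2):
        """Relative variance of the fundamental sampling error (general case):
        s2[FSE] = (1/Ms - 1/ML) * f * g * c * liberation * d^3, with
        f: fragment shape factor, g: granulometric factor,
        c: mineralogical composition factor (g/cm3),
        liberation: liberation factor, top_size_cm: top particle size d (cm)."""
        return ((1.0 / sample_mass_g - 1.0 / lot_mass_g)
                * f * g * c * liberation * top_size_cm ** 3)

    # Example: a 500 g subsample taken from a 50 kg lot crushed to d = 1 cm.
    s2 = fse_variance(500.0, 50_000.0, 1.0)
    print(f"s2[FSE] = {s2:.5f}; relative standard deviation = {s2 ** 0.5:.1%}")

Plotting such variances against sample mass for the successive comminution stages of a protocol is, in essence, how a sampling nomograph is constructed.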

1.01.4.2.3 The grouping and segregation error


Gravity is the mother of segregation. Materials segregate as soon as they are crushed, transported, and stored in
stockpiles, silos, bins, or tanks throughout processes. Increments collected at any given place within a lot of
particulate material to be sampled (i.e., small-scale heterogeneity) can be very different, leading to a certain
amount of distribution heterogeneity. Materials segregate because fragments have a different size, shape, and
density. Because segregation is a transient phenomenon changing all the time, distribution heterogeneity changes with it. The most effective action to minimize the negative effect of distribution heterogeneity in sampling protocols is to collect as many small, correct increments as practically possible to make up a sample. If
problems generated by small-scale distribution heterogeneity are not carefully addressed, a substantial group-
ing and segregation error (GSE) may result.
The mean m[GSE] of the GSE is zero; however, its large variance may lead to catastrophic local illusions.
Experience proves that GSE is far more difficult to minimize than what many people may think. Actually, many
standard practices suggested for homogenizing materials do the opposite and segregate materials even further.
In sampling, beware of old traditions and ceremonial misconceptions.
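
The following Python sketch, with invented numbers rather than real plant data, illustrates the rule stated above: at equal total sample mass, a composite of many small, correctly taken increments carries far less GSE than a single grab from a segregated lot:

    # Hedged illustration (invented numbers): many small correct increments
    # versus one large grab of equal total mass on a segregated 1-D lot.
    import random

    random.seed(1)
    # A lot whose grade drifts from about 10 to about 15 along its length.
    lot = [10.0 + 5.0 * k / 999 + random.gauss(0.0, 1.0) for k in range(1000)]

    def one_grab(lot, size=50):
        """One contiguous grab of 50 units taken at a random position."""
        start = random.randrange(len(lot) - size)
        return sum(lot[start:start + size]) / size

    def many_increments(lot, n=50):
        """Fifty one-unit increments spread at random over the whole lot."""
        return sum(lot[random.randrange(len(lot))] for _ in range(n)) / n

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    grabs = [one_grab(lot) for _ in range(2000)]
    composites = [many_increments(lot) for _ in range(2000)]
    print(f"variance of single-grab samples:  {variance(grabs):.4f}")
    print(f"variance of 50-increment samples: {variance(composites):.4f}")

The composite converges on the true mean of the lot, whereas the single grab inherits the full segregation variance.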

1.01.4.3 The Practical Implementation of the Sampling Protocol


Now that nearly everything is known to minimize the irrelevant, harmful variability introduced by a sampling
protocol, it is still necessary to find out what happens during the practical implementation of that protocol.
During this critically important phase, new sources of annoying and harmful variability, perhaps the most dangerous ones, come into play. There are three independent components that should be carefully reviewed, which are
defined in the TOS with no possible ambiguity.

1.01.4.3.1 The increment delimitation error


For an increment making up a sample to be correct, every part of the lot to be sampled must have exactly the
same chance of becoming part of the sample. Therefore, the boundaries of the selected increment must coincide
with an isotropic volume of observation, which is:
• A sphere if the lot is considered three-dimensional. Nobody knows how to do this, and this is the reason why grab sampling or in-stream stationary probe sampling leads to unsolvable sampling problems. They are nothing more than a form of gambling.
• A cylinder if the lot can be considered two-dimensional. This is possible when drilling ore bodies or drilling a copper cathode.
• A uniform slice across a flowing stream if the stream is considered one-dimensional. We know how to solve these problems under the discharge of a stream. We do not know how to solve them when taking a cut on the top of a conveyor belt using a cross-belt hammer sampler.
The mean m[DE] of the delimitation error (DE) is not zero; therefore, this error is a dangerous bias generator. The
variance of DE can be quite large as it depends on how severe the deviation from the ideal isotropic volume of
observation is and how severe transient segregation taking place at the sampling point is; the combination of
these two problems can be devastating. Rules of increment delimitation correctness listed in the TOS are not
negotiable.

1.01.4.3.2 The increment extraction error


The increment extraction error (EE) could have been called the increment recovery error, which is well known
by geologists drilling a deposit. This error takes place when the sampling tool is selective in what it takes.
The mean m[EE] of EE is not zero; therefore, this error is a dangerous bias generator. The variance of EE can
be quite large as it depends on how severe the recovery problem is. Sampling systems must be carefully
designed to minimize this error. Rules of increment extraction correctness listed in TOS are not negotiable.

1.01.4.3.3 The weighting error


A good sampling system must be reasonably proportional. For example, when the cutter of a cross-stream
sampler crosses a falling stream, it must take an increment the mass of which is proportional to the stream flow
rate at this instant. Excessive deviation from proportionality can lead to a bias. Proportional sampling is
probably the future of sampling.
The sum of the variances of DE, EE, and weighting error (WE) is part of the variance of the materialization
error (ME). Readers familiar with TOS may notice that the author chose to include the WE in the family
making up the ME. The reason is that WE is the direct effect of how the sampling system collects increments. If
the sampling system is not proportional, WE is likely to be significant; whereas if the sampling system is
proportional, WE may be negligible. However, the flow rate fluctuations of a stream, which are part of the large-scale variability, may also greatly affect WE. It is important to regulate flow rates as much as possible so that
WE can be kept negligible during the sample materialization phase.

1.01.4.4 The Preservation of Samples Integrity


1.01.4.4.1 Sample preparation errors
The variance of the preparation errors (PE) is a separate component introduced between sampling stages. It is convenient to include PE in the ME, grouping together all the errors generated by the practical implementation of a sampling protocol. PE can be the result of contamination, losses, alteration of physical or chemical composition, human errors, ignorance, carelessness, fraud, or sabotage.
The mean of PE is not zero; therefore, these errors are dangerous bias generators. The variance of PE can be quite large as it depends on many nonselective factors. Finally, the variance s²[ME] of the ME can be written as illustrated in Figure 2:

$$s^2_{ME} = \sum_n s^2_{DE_n} + \sum_n s^2_{EE_n} + \sum_n s^2_{WE_n} + \sum_n s^2_{PE_n}$$

where DE, EE, and WE arise from the selective process during the sampling stages, and PE from the nonselective process during the preparation stages.

Figure 2 Expanding the materialization error to include all errors introduced by the practical implementation of the sampling protocol.
The components of the ME are not well known by manufacturers of sampling equipment; or, perhaps more accurately, manufacturers do not clearly distinguish between these four independent errors. This problem is also perpetuated by standards on sampling, most of which are obsolete and reluctant to endorse the TOS. Furthermore, these standards usually do not distinctly address the difference between errors generated by the selective process (i.e., DE, EE, and WE) and those generated by the nonselective process (i.e., PE).
The key to understanding problems created by these four errors can be summarized by the following
statement: all the constituents of the lot to be sampled must be given an equal probability p of being selected and
preserved as part of the sample, which leads to the notion of equiprobabilistic sampling, which in turn leads to
the notion of correct sampling systems. Not respecting this cardinal rule almost always results in the presence
of sampling biases that cannot be accounted for and totally ruins the previous efforts made to optimize the
sampling protocol. The correctness or incorrectness of a sampling system is a structural property. If the
sampling system is not correct by design (i.e., structure), devastating effects will result, regardless of results
from bias tests that would at times tend to prove otherwise.

1.01.5 Exercises Challenging the Reader

Solutions for the following exercises are beyond the scope of this paper, but they are well addressed in other
documents from the author listed in References.

1.01.5.1 Exercise #1: A Worldwide Problem for Ore Grade Control in Open Pits: Blasthole
Sampling
Figure 3 illustrates seven areas in sampling blastholes. Each area can be the object of major deviations from
sampling correctness. At each of these areas name the possible sampling error that may take place (e.g., DE,
EE, WE, or PE). The reader has 20 min to sort them by name and to provide a solution. If it cannot be done,
further training in sampling methodology is highly recommended.6 Biases taking place from blasthole drilling
cost mining companies a fortune every year. Usually, and unfortunately, it is a well-hidden cost. Assuming DE, EE, WE, and PE are well taken care of, which is a huge assumption, geostatistical simulations using results from duplicate field samples can quantify such worrisome costs to some extent; indeed, it would be advisable for geostatisticians to address such potential financial losses in feasibility studies if they had a deep knowledge of TOS. Someone may ask why do this if DE, EE, WE, and PE have been taken care of. Make no
mistake, conciliation problems are not only the result of sampling errors introducing biases, they are also the
result of unreasonable variances affecting the in situ Nugget effect, FSE, and GSE, and these indeed can have
their effects simulated. But, nobody can make a simulation of economic losses generated by the bias
generators DE, EE, WE, and PE.

Figure 3 The nightmare of ore-grade control. The figure marks seven areas of concern (A through G) in blasthole sampling, including segregation, the former and current sub-drill, and the ideal versus the actual sample.

1.01.5.2 Exercise #2: Correctness of Straight-Path Cross-Stream Sampling Systems


Straight-path cross-stream sampling systems are critically important to assess the mineral content of the feed to
a plant, concentrates, and tailings. Yet, most of these sampling systems are flawed by design and will never fulfill
their mission. The sampler illustrated in Figure 4 is a typical bottom-dump straight-path cross-stream sampler
and, as drawn, it is apparently correct. Eleven important points are illustrated where other sampling systems may not be built correctly, generating difficulties and sampling biases. At each of these areas, name
the possible sampling error that may take place (e.g., DE, EE, WE, or PE). The reader has 20 min to sort them by
name and to provide a solution. If it cannot be done, further training in sampling methodology is highly
recommended.7 Biases taking place at such sampling points can make it impossible to perform correct
metallurgical accounting, and therefore it is very unlikely that the subsequent process can be optimized.
Nobody can optimize an operation with a false database.

1.01.5.3 Exercise #3: Correctness of Rotating Vezin Cross-Stream Sampling Systems


Rotating Vezin cross-stream sampling systems are critically important for sampling small streams, or perform-
ing secondary or tertiary sampling in large sampling stations. Yet, most of these sampling systems are flawed by
design and will never fulfill their mission. The sampler illustrated in Figure 5 is a well-designed Vezin sampler.
Eleven important points are illustrated where other sampling systems may not be built correctly, generating difficulties and sampling biases. At each of these areas, name the possible sampling error that may
take place (e.g., DE, EE, WE, or PE). The reader has 20 min to sort them by name and to provide a solution. If it
cannot be done, further training in sampling methodology is highly recommended.7 Biases taking place at such
sampling points can make it impossible to perform correct metallurgical accounting, and therefore it is very
unlikely that the subsequent process can be optimized. Nobody can optimize an operation with a false database.

1.01.5.4 Exercise #4: Correctness of Hammer Cross-Belt Samplers


Hammer cross-belt samplers are very popular, but they are not as correct as manufacturers may claim. The
sampler illustrated in Figure 6 is a typical design. Seven important points are illustrated where severe
difficulties may arise and generate sampling biases. At each of these areas, name the possible sampling error that may take place (e.g., DE, EE, WE, or PE). The reader has 20 min to sort them by name and to provide a solution. If it cannot be done, further training in sampling methodology is highly recommended.7 Biases taking place at these sampling points can be financially devastating.

Figure 4 A primary sampler for the feed of a plant, with eleven numbered points of potential sampling incorrectness.

Figure 5 A rotating Vezin sampler, with eleven numbered points of potential sampling incorrectness around the falling stream.

Figure 6 A typical hammer cross-belt sampler, with seven numbered points of potential sampling incorrectness.

1.01.6 The Critical Importance of Sampling Courses

All possible problems created by each point addressed in the above four exercises should be solved within
minutes. Yet, because of a massive ignorance of TOS, these problems are the object of unnecessary doubts and
arguments, time-consuming meetings, endless debates with manufacturers and engineering firms, and very
expensive bias tests followed by doubtful statistics. Someone may wonder why doubtful statistics are used to
interpret bias tests. The reason has something to do with a subtle property of segregation. Biases in sampling are
the result of one form or another of segregation. However, the prime property of segregation is that it is a transient phenomenon, changing all the time. Too many people think of a sampling bias as a constant bias, as is often observed for analytical biases. There is no such thing as a constant bias in sampling. A sample can be biased one
way today, another way tomorrow, and can remain unbiased for a while. Therefore, a bias test for a sampling
system can only declare there is indeed a bias; however, it cannot state that there is no bias. Furthermore, it is
almost common practice to make a sampling bias test by comparing two bad samples.
Each point addressed in the above exercises can lead to devastating money losses for the unaware company.
Let us review a few well-documented cases that were presented at the First World Conference on Sampling and Blending (WCSB1) in Denmark in 2003.

1.01.6.1 Case #1: A Bad Protocol for Blastholes Followed by an Incorrect Implementation
Over a 10-year period a loss of US$134 000 000 was the result of a bad sampling and subsampling protocol used
in a copper mine in Chile.8 The increase in the recovery of natural resources could be measured only after a
better protocol was implemented for several consecutive months.

1.01.6.2 Case #2: An Incorrect Sampling System for the Final Tail of a Flotation Plant
Over a period of 20 years at a large copper mine in Chile, a stunning loss of US$2 000 000 000 was the result of
using a flawed sampling system for the Final Tail of a copper–molybdenum flotation plant.8 Another company
that treated the tailings for that period of time was quite prosperous. The first reaction of management after a
flawless sampling station was installed was denial.

A book could be written about many other examples around the world, but companies are usually very silent about such catastrophic economic outcomes. This is where confidentiality agreements become convenient. There is no doubt that denial can become a form of fraud.

1.01.7 The Enemies and Their Link to Geostatistics

François-Bongarçon rightly said in 2003 that 'TOS and geostatistics are inseparable'. So far, we have made a list of the enemies responsible for slowly inflating one of the most annoying sources of variability plaguing geostatistics. When calculating a variogram, it is often observed that as the distance between samples becomes practically nil a certain amount of variability remains, which is random and discontinuous in nature; it is called V[j = 0], j being the lag between samples. For simplicity, we chose to call it V[0]. Figure 7 summarizes these enemies, expressed as variances; the variance s²[HE1] refers to the partial variance introduced by the protocol. It
is indeed annoying when an exploration or ore-grade control database is affected by a certain amount of
variability that cannot be explained by the structural properties of the deposit. It is indeed annoying when a
process control database is affected by a certain amount of variability that cannot be explained by what is
happening in the process. It is indeed annoying when an expensive environmental assessment database is
affected by a certain amount of variability that cannot be explained by what is happening in the environment.

Figure 7 V[0] can be inflated by many factors; they are the enemies in sampling and they quickly add up: the protocol variance s²[HE1] (with s²[NE], Σn s²[FSEn], and Σn s²[GSEn]), together with Σn s²[DEn], Σn s²[EEn], Σn s²[WEn], Σn s²[PEn], and the analytical variances Σn s²[AEn].

1.01.7.1 Important Remark


V[0] is a variance. Some sampling errors such as DE, EE, WE, and PE are bias generators. A variogram does not
see a bias, though the mean of DE, EE, WE, and PE is not zero. However, because these sampling biases
fluctuate considerably, the variogram may show an increase in V[0], which cannot be explained by the
sampling protocol or the analytical variance.

1.01.8 Large-Scale Variability

The study of the total variability in any lot to be sampled can be broken up into several components, and the
variogram helps us take a close look at each of these components. This is well done in geostatistics9 for the study
of mineral deposits, and in chronostatistics10,11 for the study of variability in a process.

1.01.8.1 Definition of the Variogram


The scientifically sound definition of the variogram as well as its limitations is beyond the scope of this paper.
For these issues the reader is referred to the References section. Nevertheless, we shall proceed with a simplistic
definition. Also, in what follows, the variability of differences between samples with a small lag must remain reasonably constant through time if the variogram is used to study a parameter within a process. This limitation suggests taking precautions when selecting a series of chronological data, such as making reasonably sure that no important changes were made to the way the process works along the selected chronology. Likewise, the suggested applications lose power when the basic sampling interval in the selected chronology is too long (e.g., days, weeks, or months). Aware of these limitations, the practitioner can use common sense not to mix data that do not belong together.

1.01.8.1.1 Selection of a given process parameter of interest


The variability of parameters controlling or making up a process stream is most of the time a stochastic phenomenon. Such a phenomenon is a hybrid in which we can find random and chaotic elements taking place at very short time intervals, but in which we can also find functional and continuous elements. Therefore, a certain parameter of interest (e.g., the proportion of a given size fraction in a stream feeding a SAG mill) characterizing a process stream can be described as follows:

$$f[t] = f_L + f_A[t] + f_B[t] + f_C[t] \qquad (1)$$

where:
f_L is a constant term, such as the average proportion of a given size fraction of a stream feeding a SAG mill, i.e., the mean of f[t].
f_A[t] is a random component characterized by neighboring fragments being different, with the resulting proportion of a given size fraction changing in a chaotic and unpredictable way at very short time intervals, regardless of the fact that these fragments may have originated at the same place earlier in the process.
f_B[t] is a nonrandom, nonperiodic component, essentially continuous, which is an inherent property of the feed heading to the plant, for example, the slowly changing hardness of the ore.
f_C[t] is a periodic component characterized by the fact that people and machinery often work in a cyclic way, for example, a segregating stockpile introducing a cycle in the proportion of coarse fragments and fines feeding a SAG mill.
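
As an illustration, the short Python sketch below simulates a series obeying equation (1); the magnitudes chosen for the three components, a chaotic term, a slow drift, and a 24-increment cycle, are arbitrary assumptions made only for illustration:

    # Hedged sketch of equation (1): f[t] = f_L + f_A[t] + f_B[t] + f_C[t].
    import math
    import random

    random.seed(0)
    N = 500        # chronological measurements taken at a constant interval
    f_L = 30.0     # constant term: average % of the size fraction of interest

    def f(t):
        f_A = random.gauss(0.0, 2.0)                    # chaotic short-term component
        f_B = 0.01 * (t - N / 2)                        # slow, continuous, nonperiodic drift
        f_C = 1.5 * math.sin(2.0 * math.pi * t / 24.0)  # 24-increment cycle
        return f_L + f_A + f_B + f_C

    series = [f(t) for t in range(N)]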

1.01.8.1.2 Heterogeneity affecting the given process parameter of interest


Taking into account the fact that the heterogeneity of a constant, such as f_L, is obviously zero, the total heterogeneity affecting the parameter of interest in the stream is fully characterized by f_A[t], f_B[t], and f_C[t].

1.01.8.1.3 Measuring heterogeneity variability with the variogram


The most natural way to compare two values such as f_1 (e.g., the proportion of a given size fraction in the stream at time t_1) and f_2 (e.g., the proportion of a given size fraction in the stream at time t_2), t_1 and t_2 being separated by a time lag called j, is to measure their difference d:

$$d = f_1 - f_2 \qquad (2)$$

However, what is most relevant is the average difference $\bar{d}$ between the N measurements taken a given interval j apart:

$$\bar{d} = \frac{1}{N - j} \sum_m \left( f_{m+j} - f_m \right) \qquad (3)$$

Such an average difference converges toward zero; therefore, the squared difference should be used instead. Furthermore, differences account for variability twice; therefore, the convention has been taken to calculate the semi-variogram, leading to the following formula used for an absolute variogram:

$$V[j] = \frac{1}{2[N - j]} \sum_m \left( f_{m+j} - f_m \right)^2 \qquad (4)$$

A relative, dimensionless variogram could be calculated as well, making it easier to compare variograms from different experiments:

$$V[j] = \frac{1}{2[N - j]\, f_L^2} \sum_m \left( f_{m+j} - f_m \right)^2 \qquad (5)$$
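
Equations (4) and (5) translate directly into code. The Python sketch below is a plain transcription of both formulas; the simulated series merely stands in for any chronological set of measurements taken at a constant interval:

    # Hedged sketch: absolute and relative semi-variograms, equations (4) and (5).
    import math
    import random

    random.seed(0)
    N = 500
    series = [30.0 + random.gauss(0.0, 2.0) + 0.01 * (m - N / 2)
              + 1.5 * math.sin(2.0 * math.pi * m / 24.0) for m in range(N)]

    def variogram(f, j):
        """Absolute semi-variogram V[j], equation (4)."""
        pairs = len(f) - j
        return sum((f[m + j] - f[m]) ** 2 for m in range(pairs)) / (2.0 * pairs)

    def relative_variogram(f, j):
        """Relative, dimensionless variogram, equation (5); the series mean estimates f_L."""
        f_L = sum(f) / len(f)
        return variogram(f, j) / f_L ** 2

    for j in (1, 2, 5, 12, 24):
        print(f"V[{j:2d}] = {variogram(series, j):7.3f}   "
              f"V_rel[{j:2d}] = {relative_variogram(series, j):.5f}")

Provided each lag is computed with enough pairs (see V_D[j] below), such a table already separates the short-range term, the drift, and the cycle.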

The study of many variograms shows that they are usually made of four major components:

$$V[j] = V_A[j] + V_B[j] + V_C[j] + V_D[j] \qquad (6)$$

where:
V_A[j] is a very short-range term, random and discontinuous. At the limit, when j = 0, this leads to a very important term in chronostatistics called V[j = 0], simplified in further discussions to V[0], which is the variability from sampling, subsampling, and measurement that does not exist in the process.
V_B[j] is a long-range term, usually nonrandom and continuous. This variability is the one that needs to be controlled or tamed in a process.
V_C[j] is a periodic term, continuous, tied to the ways people work or to the ways some process machines work. This is a source of variability usually poorly understood, leading to process overcorrections and therefore to losses of process efficiency. The economic impact of this source of variability, if misunderstood, is enormous.
V_D[j] is a random residual variability tied to the precision of the variogram when the variance V[j] is calculated with too few pairs N − j. V_D[j] tends toward zero as the number of pairs increases. It is not good practice to calculate any point on a variogram with fewer than 20 pairs; actually, 30 pairs or more is strongly recommended.

1.01.8.2 Extrapolation of the Variogram to Time or Distance Zero


An accurate extrapolation of the variogram to time or distance zero is required for many of its applications. The most effective solution is to extrapolate either the first- or the second-order integral of the variogram.4 Indeed, W[j] and W′[j] smooth out residual noise due to V_D[j], cycles, and other features interfering with good extrapolation. However, as demonstrated in the sampling theory,1–4 the variogram and its integrals all intercept the variance axis at the same place, V[0].
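
One simple numerical recipe is sketched below in Python. It assumes that the first-order integral W[j] can be approximated by the running average of V[j] and reads V[0] as the intercept of a least-squares line through the first few lags; it illustrates the principle and is not a substitute for the careful extrapolation treated in the references:

    # Hedged sketch: extrapolating the variogram and its first-order integral W
    # back to lag zero. The V values below are invented for illustration.
    def first_order_integral(V):
        """Discrete W[j]: running average of V over lags 1..j."""
        W, total = [], 0.0
        for j, v in enumerate(V, start=1):
            total += v
            W.append(total / j)
        return W

    def intercept_at_zero(ys):
        """Intercept at j = 0 of a least-squares line through (j, ys[j-1])."""
        xs = list(range(1, len(ys) + 1))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return my - slope * mx

    V = [4.1, 4.4, 4.6, 4.9, 5.1, 5.4, 5.6, 5.9]   # illustrative V[1]..V[8]
    W = first_order_integral(V)
    print(f"V[0] estimated from V: {intercept_at_zero(V[:5]):.2f}")
    print(f"V[0] estimated from W: {intercept_at_zero(W[:5]):.2f}")

Both estimates land near the same intercept, consistent with the property quoted above that the variogram and its integrals meet the variance axis at the same V[0]; the averaged W[j] is simply less sensitive to the residual noise V_D[j].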

1.01.8.3 The Importance of Large-Scale Variability


The ultimate goal of sampling is to understand the variability on a large scale. It is extremely important in a
very large number of areas. Let us give a few examples:
• To find new natural resources
• To quantify natural resources
• To show reasonable continuity of natural resources
• To understand conciliation problems during mining
• To optimize processes at mines and plants
• To improve the quality of products
• To secure fair money return from products
• To diminish penalties
• To curtail fraud
• To minimize environmental liability
• To lift the market perception of the company's fiscal health
• To improve profitability trends, etc.
The list could go on and on, but let us be more specific in some well-known areas and see what could interfere, in a negative way, with understanding large-scale variability.

1.01.8.3.1 Variability issues during exploration


During exploration it is important to measure the anisotropy of a geological unit. We need to find out whether grade trends differ North–South versus East–West, or along a vertical axis. Variogram ranges
need to be defined in different directions. Density of drilling needs to be optimized in different directions. Ore
continuity and zone of influence need to be defined in different directions. Ultimately, a reliable geological
model must be created. With a large V[0] damaging the database, these critically important tasks leading to a
reliable feasibility study become weak.

1.01.8.3.2 Variability issues during mining


During mining it is important to select a logical grade control drilling pattern and drilling density. The
selection of a logical Kriging technique depends on quality data,8 and so does the selection of realistic and
economic cutoff grades. All this may affect the selection of a long-term pit design. A reasonable acceptance level
for conciliation differences between ore-grade control and the prediction from the geological model is crucial.
With a large V[0] damaging the database, these critically important tasks leading to a reasonable and economic recovery of natural resources become weak, which necessarily results in financial losses.

1.01.8.3.3 Variability issues within a processing plant


Within a processing plant, believable metallurgical accounting is needed to monitor performance. Metallurgists
need to control key process-operating parameters. Process trends need to be tamed in due time. Process cycles,
always very costly, need to be identified and either eliminated or minimized. Reliable variographic control
charts must be constantly updated at many places. Overcorrections of the process must be prevented. With a
large V[0] damaging the database, these critically important tasks leading to a reasonable and economic recovery of natural resources become weak, which necessarily results in financial losses.

1.01.8.3.4 Variability issues during trade with customers


Customers like a fair price, but hate bad surprises on product quality. Penalty application is a common way of
doing business: ‘what costs me must cost you!’ The quality of a product cannot be controlled after the fact, and
the key for success is to implement the many things that lead to a good product, all the way from geology,
mining, and the processing plants. With a large V[0] damaging the database, these critically important tasks to keep the quality of products within client specifications become weak, which necessarily results in further financial losses.

1.01.9 Conclusions

There is no such thing as continuous improvement without a clear understanding of TOS followed by a strong commitment to correct sampling practices. Many statisticians believe that interleaved sampling can solve everything if followed by a careful statistical analysis of the results. There is nothing wrong with their approach; however, there are two possible outcomes:
1. The variance analysis shows that there is no problem and all parties involved may feel comfortable enough.
2. The variance analysis shows that there is a problem. Now, without a clear understanding of TOS it is very
unlikely they will find the causes and cure them effectively in the most economic way. The strategy suggested in TOS, dividing the complex problem into its basic components, is the inescapable way to reach solutions quickly and effectively.

1.01.10 Recommendations

The objective of this paper is only to help people looking at TOS for the first time to get organized. It is well
known that newcomers often get discouraged as the way TOS is frequently presented can be intimidating. If the
reader gets well organized, looking at the challenge with a clear Cartesian spirit, TOS is actually a lot simpler
than what people may think.12,13 Follow the strategy illustrated in Figure 8 and quick progress is likely to
occur. With emphasis on the causes of sampling problems, the difficult climb to continuous improvement can take place. With emphasis on the effects of sampling problems, as is often the case, no progress is possible.
After a commitment has been made to continuous improvement, it is still necessary to create a road map so that economic benefits can quickly be measured. A typical road map is illustrated in Figure 9.

Figure 8 Make a commitment for continuous improvement: compulsory action on the causes of sampling problems, supported by short courses, workshops, and training, leads to analyzing existing data and finding structural problems; if a cause is eliminated, standardize, and if not, re-analyze. Emphasis on the effects of problems leads only to lost opportunities, whereas emphasis on causes leads to continuous improvement of the mining process.

Figure 9 A typical road map suggested for the mining industry: a director of standards of the mining process provides the synergy necessary for efficiency through the selection of standards useful to a mining company, implementation of the company's guidelines, selection and offering of short courses, workshops, and training, guidelines of best practices, and selection of world experts; these lead to the identification of structural sampling problems and continuous improvement of the mining process, supported by compulsory actions, accountability, and communication with QA/QC and laboratory managers and with top management at the company's operations.



References

1. Gy, P. Sampling of Particulate Materials, Theory and Practice; Developments in Geomathematics, Vol. 4; Elsevier Scientific Publishing: Amsterdam, 1979 and 1983.
2. Gy, P. Hétérogénéité – Échantillonnage – Homogénéisation: Ensemble cohérent de théories; Masson Éditeur: Paris, 1988; ISBN 2-225-81313-2; ISSN 0765-0221.
3. Gy, P. Sampling of Heterogeneous and Dynamic Material Systems: Theories of Heterogeneity, Sampling and Homogenizing; Elsevier: Amsterdam, 1992.
4. Pitard, F. Pierre Gy's Sampling Theory and Sampling Practice, 2nd ed.; CRC Press: Boca Raton, FL, 1993.
5. Pitard, F. Effects of Residual Variances on the Estimation of the Variance of the Fundamental Error. Chemometr. Intell. Lab. 2004, 74(1), 149–164; specifically equations (2), (3), (4), and (5) therein.
6. Pitard, F. Blasthole Sampling for Grade Control: The Many Problems and Solutions. In Sampling 2008; The Australasian Institute of Mining and Metallurgy: Perth, Australia, May 27–28, 2008.
7. Pitard, F. Sampling Correctness – A Comprehensive Guideline. In Second World Conference on Sampling and Blending 2005; Publication Series No. 4/2005; The Australasian Institute of Mining and Metallurgy, 2005; ISBN 1-920-80628-8.
8. Carrasco, P.; Carrasco, P.; Jara, E. The Economic Impact of Correct Sampling and Analysis Practices in the Copper Mining Industry. Chemometr. Intell. Lab. 2004, 74(1), 209–231.
9. François-Bongarçon, D. Theory of Sampling and Geostatistics: An Intimate Link. Chemometr. Intell. Lab. 2004, 74(1), 143–148.
10. Pitard, F. Practical Statistical Tools for Managers. In Metallurgical Plant Design and Operating Strategies Conference; The Australasian Institute of Mining and Metallurgy: Sydney, April 15–16, 2002.
11. Pitard, F. Chronostatistics – A Powerful, Pragmatic, New Science for Metallurgists. In Metallurgical Plant Design and Operating Strategies (MetPlant 2006); The Australasian Institute of Mining and Metallurgy: Perth, WA, September 18–19, 2006.
12. Esbensen, K. H.; Minkkinen, P., Guest Eds. Special Issue: 50 Years of Pierre Gy's Theory of Sampling; Proceedings of the First World Conference on Sampling and Blending (WCSB1) and Tutorials on Sampling: Theory and Practice. Chemometr. Intell. Lab. 2004, 74(1).
13. Petersen, L. Pierre Gy's Theory of Sampling (TOS) – In Practice: Laboratory and Industrial Didactics Including a First Foray into Image Analytical Sampling. PhD Thesis, ACABS Research Group, Aalborg University Esbjerg, Esbjerg, Denmark, 2005.

Biographical Sketch

Mr. Francis F. Pitard has been a consulting expert in Sampling, Statistical Process Control, and Total Quality Management for 23 years. He is President of Francis Pitard Sampling
Consultants (www.fpscsampling.com) and Technical Director of Mineral Stats Inc.
(www.mineralstats.com) in Broomfield, CO, USA. He provides consulting services in many
countries. Mr. Pitard has 6 years of experience with the French Atomic Energy Commission
and 15 years with Amax Extractive R&D. He has taught Sampling Theory, SPC, and TQM
to approximately 6000 people and more than 275 companies around the world through the Continuing Education Offices of the Colorado School of Mines, the Australian Mineral Foundation, the Mining Department of the Universidad de Chile, and the University of
Witwatersrand. He has degrees in chemistry from the Gay-Lussac Institute in Paris and from
the Academy of Paris.
