
Chem417: Nuclear Chemistry and Radiochemistry, Course Notes Appendix

(Dated: March 15, 2019)


This is an incomplete version of the Appendix of the class notes for 2019. It is intended to be
augmented at the positions marked “Discussion” with notes and diagrams taken in class, and to
be supplemented by some PowerPoint slides that I show in class.

I. STATISTICS, PROBABILITY AND RADIOACTIVITY

Radioactive decay is intrinsically probabilistic. The decay of a nucleus is governed by quantum mechanics. In this theory, only the decay probability per unit time can be calculated, and the precise time of decay of any individual nucleus cannot be predicted. To correctly quantify radioactivity thus requires some basic notions of probability and statistics.

Probability is a concept that we can use to answer questions about random processes, e.g. if I flip a fair coin 8 times, what is the chance of getting 7 heads? This particular question can be answered using the binomial probability distribution, see below.

A. Statistical Measures of Precision

• For any measurement, it is essential to know how precise it is.

• Often this is assessed by making repeated measurements.

• Suppose we measure the quantity X N times to get the set of results Xi for i = 1 . . . N. For example, this could be heights of people, pH, number of heads in a series of coin tosses, or number of radioactive decays per second.

• Typically, the measurements Xi will show some random fluctuations about their average value,

    X̄ = (1/N) Σ_{i=1}^{N} Xi = ⟨X⟩,

the sample average or the sample mean.

• In many cases, the measurements Xi reflect a “true value” obscured by some random noise, i.e. the results of the experiment are probabilistic.

• One can estimate the probability of some outcome by the frequency of its occurrence in the set of N trials,

    f(Xj) = (Number of times Xj occurs in N trials) / N.

As N → ∞, f(Xj) → P(Xj), the probability of result Xj.

• Discussion (3)

• In this case, one can define the true mean as

    µ = Σ_i Xi P(Xi),

where i runs over all possible outcomes Xi.

• If the range of outcomes is continuous rather than discrete,

    µ = ∫ x p(x) dx,

where p(x) is called the probability density function.

• X̄ for the finite set of data is an estimate of µ that becomes a better estimate as N increases.

• f is an estimate of P.

• Each Xi is also an estimate of µ, but how precise? One expects that the average of, say, 10 values will be a more precise estimate than a single measurement, but how can we show this is the case?

• Define the deviation of measurement Xi from the sample mean,

    di = Xi − X̄.

• The average of the deviations is zero, i.e.

    (1/N) Σ_{i=1}^{N} di = 0,

from the definition of the mean X̄.

• Define the sample variance

    σX² = (1/(N−1)) Σ_{i=1}^{N} di².

From this, the sample standard deviation is σX = √(sample variance).

• The variance of the ideal probability distribution is

    σµ² = Σ_i (Xi − µ)² P(Xi)
        = ∫ (x − µ)² p(x) dx
        = ∫ (x² − 2µx + µ²) p(x) dx
        = ⟨x²⟩ − 2µ·µ + µ²
        = ⟨x²⟩ − µ²,

i.e. the average of the square of x minus the square of the average.

• σX is an estimate of σµ.

• σX is an estimate of the precision of any individual measurement Xi as an estimate of µ, i.e. Xi ± σX.

• The average of N measurements is a more precise estimate of µ,

    X̄ ± σX/√N,

where the precision here is called the standard error of the mean.

• Averaging N measurements improves the precision of the estimate by a factor of 1/√N.
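As an illustration of these estimators, here is a minimal Python sketch that computes the sample mean, the sample variance with its N − 1 denominator, the sample standard deviation, and the standard error of the mean; the data values are invented for illustration.

```python
import math

# Hypothetical repeated measurements of the same quantity
# (e.g. counts per second from a detector); values are made up.
data = [12, 15, 11, 14, 13, 12, 16, 13, 14, 12]
N = len(data)

# Sample mean: X-bar = (1/N) * sum of the X_i
mean = sum(data) / N

# Sample variance, from the deviations d_i = X_i - X-bar,
# with the N - 1 denominator
variance = sum((x - mean) ** 2 for x in data) / (N - 1)

# Sample standard deviation and standard error of the mean
std_dev = math.sqrt(variance)
std_err = std_dev / math.sqrt(N)

print(f"X-bar = {mean:.2f}, sigma_X = {std_dev:.2f}, "
      f"standard error = {std_err:.2f}")
```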
B. Probability and Radioactive Decay

• Radioactive decay is intrinsically probabilistic. It is not a deterministic signal with some superimposed random noise.

• Consider a nucleus with half-life τ1/2, i.e. decay constant λ = ln 2/τ1/2.

• The probability that it decays in the next ∆t seconds is

    p = λ∆t

(valid when λ∆t ≪ 1).

• The probability that it survives another ∆t seconds is just 1 − p.

• The outcome is binary: either it decays or it does not.

• This is similar to tossing a coin: heads or tails are the only possibilities.

• A fair coin has p = 1 − p = 0.5.

• Is a coin toss intrinsically probabilistic?
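To make the decay-as-coin-toss analogy concrete, here is a minimal sketch that computes p = λ∆t and treats each of N nuclei as an independent Bernoulli trial; the half-life and all other numbers are invented for illustration.

```python
import math
import random

# Illustrative numbers, not from the notes: a 10-minute half-life,
# observed over a 1-second interval.
half_life = 600.0              # seconds
lam = math.log(2) / half_life  # decay constant, lambda = ln 2 / tau_1/2
dt = 1.0                       # observation interval, seconds
p = lam * dt                   # decay probability per nucleus (p << 1 here)

# Each nucleus independently decays (or not) in dt: a Bernoulli trial.
N = 100_000
decays = sum(random.random() < p for _ in range(N))
print(f"p = {p:.2e}; {decays} of {N} nuclei decayed in {dt} s")
```

The observed number of decays fluctuates around Np from run to run, which is exactly the binomial behavior developed in the next section.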
C. Binomial Experiments

Consider the following type of experiment:

1. N trials.

2. Each trial is binary, i.e. 2 outcomes are possible.

3. Trials are independent of one another.

4. The probabilities of the two outcomes (p and 1 − p) are constant from trial to trial.

This is called a binomial experiment.

• Example: Suppose you have N nuclei and measure whether each decays in the next ∆t seconds. Assuming the decay of one doesn’t interfere with the decay of another, this is a binomial experiment.

• A lot of binomial experiments are carried out all the time, even in Vegas.

• Ask the question: of the N trials, how many are successes (heads, survival, etc.)? Call that number y. y can be 0, 1, 2, . . . N, and in general there is some probability for each of these values.

• Simple counting arguments give the probability,

    P(y) = C(N, y) p^y (1 − p)^(N−y)  for y = 0, 1, . . . N,
    P(y) = 0  otherwise.                                    (1)

Here C(N, y) is the number of combinations of size y from a pool of N,

    C(N, y) = N! / (y!(N − y)!),

the number of ways the y successes can occur in the N trials. N! is the factorial of N. Note that

    C(N, N) = C(N, 0) = 1,  and  C(N, 1) = N,

as you would expect.

• The probability P(y) from Eq. (1) is Pbin, the binomial probability distribution.
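A minimal sketch of Eq. (1), used here to answer the coin question from the introduction; the function name p_binomial is mine, not from the notes.

```python
from math import comb

def p_binomial(y, N, p):
    """Binomial probability P(y) from Eq. (1): exactly y successes in N trials."""
    return comb(N, y) * p**y * (1 - p)**(N - y)

# The coin question from the introduction: chance of exactly 7 heads in 8 flips.
print(p_binomial(7, 8, 0.5))   # C(8,7) * 0.5**8 = 8/256 = 0.03125

# Sanity check: the probabilities over all y sum to 1.
assert abs(sum(p_binomial(y, 8, 0.5) for y in range(9)) - 1) < 1e-12
```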

D. The Poisson Distribution

• A useful limit of the binomial distribution.

• Assume:

1. We have a very large number of trials N.

2. The probability of success is very small for each trial, p ≪ 1.

Then

    Pbin(y) ≈ P(y) = e^(−α) α^y / y!,

where α = Np.

• This P(y) is called the Poisson probability distribution, PP.

• Rule of thumb: for N > 100 and p < 1%, the approximation is very good.

• Poisson processes: any set of rare, random, independent events occurring in time, space, etc. follows a Poisson distribution. For example: computer failures, cosmic ray showers, flaws in boiler tanks, radioactive decay, etc.

• Poisson is still a discrete probability distribution; the allowed values of y are the integers i = 0, 1, 2, . . . N.

• True mean:

    µ = Σ_{i=0}^{N} i PP(i) = α.

• True variance:

    σµ² = Σ_{i=0}^{N} (i − µ)² PP(i) = α.

• So if you know α, you have

    µ ± σµ = α ± √α,

or if you have an experimental estimate of α, X, then you have an estimate of

    µ ± σµ ≈ X ± √X.

• Example: Counting Radioactive Decays

• Discussion (4)
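A quick numerical check of the approximation in the rule-of-thumb regime; the values N = 1000 and p = 0.005 are illustrative, not from the notes.

```python
from math import comb, exp, factorial

def p_binomial(y, N, p):
    """Exact binomial probability, Eq. (1)."""
    return comb(N, y) * p**y * (1 - p)**(N - y)

def p_poisson(y, alpha):
    """Poisson probability P_P(y) with mean alpha = N * p."""
    return exp(-alpha) * alpha**y / factorial(y)

# Rule-of-thumb regime: N > 100 and p < 1%.
N, p = 1000, 0.005
alpha = N * p   # = 5
for y in range(10):
    print(y, f"{p_binomial(y, N, p):.5f}", f"{p_poisson(y, alpha):.5f}")
```

The two columns agree to a few parts in a thousand, as the rule of thumb promises.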
E. The Gaussian or Normal Distribution

It is nice to have a continuous p(y) rather than a discrete P(y), so you can use the tools of calculus. You can check that

    PP(y) ≈ (1/√(2πα)) e^(−(y−α)²/(2α)) = PG(y).

Again, the true mean and variance are both α. This approximation is good when y is a large integer, i.e. a large number of counts, close to its ideal average value (while N is still much larger). PG is the Gaussian or Normal probability distribution.
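A short sketch comparing PP(y) and PG(y) at a large mean count; α = 100 is an arbitrary illustrative choice.

```python
from math import exp, factorial, pi, sqrt

alpha = 100.0  # a large mean count, where the approximation should be good

def p_poisson(y, alpha):
    """Discrete Poisson probability at integer y."""
    return exp(-alpha) * alpha**y / factorial(y)

def p_gauss(y, alpha):
    """Gaussian density with mean and variance both alpha."""
    return exp(-((y - alpha) ** 2) / (2 * alpha)) / sqrt(2 * pi * alpha)

for y in (80, 90, 100, 110, 120):
    print(y, f"{p_poisson(y, alpha):.5f}", f"{p_gauss(y, alpha):.5f}")
```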
F. Combining Measurements with Uncertainties

• Consider M repeated measurements of the same Poisson variable, Xi for i = 1, 2, . . . , M. Each Xi is the number of “successes” in N trials.

• Each Xi has standard deviation √Xi.

• But we know a better estimate of the true average is the mean of the M measurements,

    X̄ = (1/M) Σ_{i=1}^{M} Xi,

which has the standard error of the mean σX/√M, where

    σX = √( (1/(M−1)) Σ_i (Xi − X̄)² ).

• Example: Consider the table of counts collected in 10 different 1-minute intervals of radioactive decays from a long-lived source (τ1/2 ≫ 1 min).

1. Compute the sample mean and standard deviation to check whether the data appear Poissonian.

2. What is the error in the mean?

3. What is the average count rate and its error if you treat the entire data set as a single measurement 10 min long?

• The above example shows how to combine a number of equivalent measurements, but what about when the measurements are not all equivalent? Here, equivalent means equally precise.

• The proper estimate combining a set of inequivalent measurements is the weighted average,

    X̄ = (1/A) Σ_i (Xi/σi²),  where A = Σ_i σi^(−2).

Clearly this reduces to the regular average if all the σi are the same. The uncertainty in this weighted average is

    σX̄² = A^(−1).

You can check that this reduces to the standard error of the mean if all the σi are the same.

• Example: Consider two measurements of the same quantity, U ± u and V ± v. An estimate of the true mean that correctly combines these two is the weighted average (with its error), that is,

    X̄ = (U/u² + V/v²) / (1/u² + 1/v²) ± √( 1 / (1/u² + 1/v²) ).

• If these measurements are equally precise, i.e. u = v, then

    X̄ = (U + V)/2 ± u/√2.
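A minimal sketch of the weighted average and its uncertainty; the function name and the numerical values are invented for illustration.

```python
from math import sqrt

def weighted_average(values, sigmas):
    """Inverse-variance weighted average and its uncertainty:
    X-bar = (1/A) * sum(X_i / sigma_i^2),  sigma_Xbar = sqrt(1/A),
    with A = sum(1 / sigma_i^2)."""
    A = sum(1 / s**2 for s in sigmas)
    xbar = sum(x / s**2 for x, s in zip(values, sigmas)) / A
    return xbar, sqrt(1 / A)

# Two hypothetical measurements U +/- u and V +/- v of the same quantity.
print(weighted_average([103.0, 98.0], [4.0, 2.0]))
# Equal precision reduces to the plain mean with error u/sqrt(2):
print(weighted_average([103.0, 98.0], [3.0, 3.0]))  # (100.5, 2.121...)
```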

G. Error Propagation

• Two independent measured quantities X and Y are often combined to form a third:

    Z = f(X, Y).

• If we know how precise X and Y are, i.e. σX and σY, it is important to be able to estimate how precise Z is, σZ.

• Table I shows the result for some simple common functions f.

TABLE I. Simple Error Propagation

    Z        σZ
    X + Y    √(σX² + σY²)
    X − Y    √(σX² + σY²)
    X × Y    XY √((σX/X)² + (σY/Y)²)
    X / Y    (X/Y) √((σX/X)² + (σY/Y)²)
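A minimal sketch of the Table I rules applied to a made-up counting measurement; all numbers are invented, and the helper names are mine.

```python
from math import sqrt

# Error propagation rules from Table I, for independent X +/- sx and Y +/- sy.
def add_sub_err(sx, sy):
    """sigma_Z for Z = X + Y or Z = X - Y."""
    return sqrt(sx**2 + sy**2)

def mul_div_err(z, x, sx, y, sy):
    """sigma_Z for Z = X*Y or Z = X/Y: relative errors add in quadrature."""
    return abs(z) * sqrt((sx / x)**2 + (sy / y)**2)

# Hypothetical example: background-subtracted counts and a count rate.
x, sx = 250.0, sqrt(250.0)   # gross counts, with Poisson error sqrt(X)
y, sy = 40.0, sqrt(40.0)     # background counts
net = x - y
print(net, "+/-", add_sub_err(sx, sy))

t, st = 10.0, 0.1            # counting time in minutes and its uncertainty
rate = net / t
print(rate, "+/-", mul_div_err(rate, net, add_sub_err(sx, sy), t, st))
```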
