Ijsred V2i5p56
Available at www.ijsred.com
RESEARCH ARTICLE OPEN ACCESS
Abstract:
In this paper, an important mathematical concept with many applications to probabilistic models is presented. Some of the important applications of the moment-generating function to the theory of probability are discussed. Each probability distribution has a unique moment-generating function, which makes it especially useful for solving problems such as finding the distribution of a sum of random variables. Reproductive properties of probability distributions, with illustrated examples, are also described.
B. Definition
Let X be a continuous random variable with probability density function f. The expected value of X is defined as

E(X) = ∫_{−∞}^{+∞} x f(x) dx,  (2)

provided the improper integral is absolutely convergent, that is, provided

∫_{−∞}^{+∞} |x| f(x) dx < ∞.

Fig. 1 X has a uniform distribution on [a, b]

C. Binomial Distribution
Consider an experiment ε and let A be some event associated with ε. Suppose that P(A) = p and hence P(Ā) = 1 − p. Consider n independent repetitions of ε. Then the sample space consists of all possible sequences {a1, a2, …, an}, where each ai is either A or Ā, depending on whether A or Ā occurred on the i-th repetition of ε. Furthermore, suppose that P(A) = p remains the same for all repetitions.

E. The Poisson Distribution
Let X be a discrete random variable assuming the possible values 0, 1, 2, …, n, … . If

P(X = k) = e^{−α} α^k / k!,  k = 0, 1, 2, …, n, … ,  (4)

then X has a Poisson distribution with parameter α > 0.

F. Geometric Distribution
Assume, as in the discussion of the binomial distribution, that we perform ε repeatedly, that the repetitions are independent, and that on each repetition P(A) = p and P(Ā) = 1 − p = q remain the same. Suppose that we repeat the experiment until A occurs for the first time.

Fig. 2 X has a normal distribution (density symmetric about x = µ)
Fig. 4 X has a Gamma distribution (curves shown for r = 1, 2, 4)
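Assuming the standard completion of the geometric setup (repeat ε until A first occurs, so that P(X = k) = q^{k−1} p for k = 1, 2, …), both the Poisson probabilities in (4) and the geometric probabilities can be checked numerically. The sketch below is our own illustration, not part of the paper; the parameter values are arbitrary. It verifies that each set of probabilities sums to 1 and that the means come out as expected (α for the Poisson, 1/p for the geometric):

```python
import math

def poisson_pmf(k: int, alpha: float) -> float:
    """Equation (4): P(X = k) = e^(-alpha) * alpha^k / k!"""
    return math.exp(-alpha) * alpha ** k / math.factorial(k)

def geometric_pmf(k: int, p: float) -> float:
    """Assumed geometric pmf: P(X = k) = (1 - p)^(k - 1) * p."""
    return (1.0 - p) ** (k - 1) * p

alpha, p = 2.5, 0.3

# The probabilities over all possible values must sum to 1.
print(sum(poisson_pmf(k, alpha) for k in range(60)))       # ~ 1.0
print(sum(geometric_pmf(k, p) for k in range(1, 200)))     # ~ 1.0

# Means: alpha for the Poisson, 1/p for the geometric.
print(sum(k * poisson_pmf(k, alpha) for k in range(60)))   # ~ 2.5
print(sum(k * geometric_pmf(k, p) for k in range(1, 200))) # ~ 3.33
```

The sums are truncated, but the neglected tails are astronomically small for these parameter values.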
For the binomial distribution, the moment-generating function follows from the pmf and the binomial theorem:

M_X(t) = Σ_{k=0}^{n} C(n, k) (pe^t)^k (1 − p)^{n−k} = [pe^t + (1 − p)]^n.  (14)

For a normal random variable X with mean µ and variance σ², completing the square in the exponent gives

M_X(t) = e^{tµ + σ²t²/2} · (1/√(2π)) ∫_{−∞}^{+∞} exp(−[s − σt]²/2) ds.

Let s − σt = γ; then ds = dγ and

M_X(t) = e^{tµ + σ²t²/2} · (1/√(2π)) ∫_{−∞}^{+∞} e^{−γ²/2} dγ = e^{tµ + σ²t²/2},  (17)

since the remaining integral equals √(2π).

D. Example
Suppose that X has a Poisson distribution with parameter λ. Thus

M_X(t) = Σ_{k=0}^{∞} e^{tk} e^{−λ} λ^k / k! = e^{−λ} Σ_{k=0}^{∞} (λe^t)^k / k! = e^{−λ} e^{λe^t} = e^{λ(e^t − 1)},

where the series was summed using the expansion of the exponential function.

G. Example
Let X have a Gamma distribution with parameters α and r. Then M_X(t) = [α/(α − t)]^r for t < α.

Once M_X is known, the variance of X can be read off from its derivatives at zero:

V(X) = E(X²) − [E(X)]² = M″(0) − [M′(0)]².

Recall that

e^x = 1 + x + x²/2! + x³/3! + … + xⁿ/n! + … ,

so that

e^{tx} = 1 + tx + (tx)²/2! + … + (tx)ⁿ/n! + … .
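Both closed forms — (14) for the binomial and e^{λ(e^t − 1)} for the Poisson — can be verified numerically against the defining sum E(e^{tX}). The sketch below is our own illustration; function names and parameter values are arbitrary:

```python
import math

def binomial_mgf_sum(t: float, n: int, p: float) -> float:
    """E(e^{tX}) computed term by term from the binomial pmf."""
    return sum(math.comb(n, k) * (p * math.exp(t)) ** k * (1 - p) ** (n - k)
               for k in range(n + 1))

def poisson_mgf_sum(t: float, lam: float, terms: int = 60) -> float:
    """E(e^{tX}) computed term by term from the Poisson pmf."""
    return sum(math.exp(t * k) * math.exp(-lam) * lam ** k / math.factorial(k)
               for k in range(terms))

n, p, lam, t = 10, 0.4, 2.0, 0.5

print(binomial_mgf_sum(t, n, p))         # agrees with ...
print((p * math.exp(t) + (1 - p)) ** n)  # ... the closed form (14)

print(poisson_mgf_sum(t, lam))                 # agrees with ...
print(math.exp(lam * (math.exp(t) - 1)))       # ... e^{lam(e^t - 1)}
```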
Now

M_X(t) = E(e^{tX}) = E[1 + tX + (tX)²/2! + … + (tX)ⁿ/n! + …]

= 1 + tE(X) + t²E(X²)/2! + … + tⁿE(Xⁿ)/n! + … .

Since M_X is a function of the real variable t, we may take the derivative of M_X(t) with respect to t, that is, M′_X(t):

M′_X(t) = E(X) + tE(X²) + t²E(X³)/2! + … + t^{n−1}E(Xⁿ)/(n − 1)! + … .

A. Theorem
Suppose that the random variable X has moment-generating function M_X. Let Y = αX + β. Then M_Y, the moment-generating function of the random variable Y, is given by

M_Y(t) = e^{βt} M_X(αt).  (20)

Proof:
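The expansion above shows that the n-th derivative of M_X at t = 0 equals E(Xⁿ). As a quick numerical illustration (ours, not the paper's), central finite differences applied to the Poisson moment-generating function e^{λ(e^t − 1)} recover the mean λ, and M″(0) − [M′(0)]² recovers the variance, which is also λ:

```python
import math

lam = 3.0

def M(t: float) -> float:
    """Poisson moment-generating function e^{lam(e^t - 1)}."""
    return math.exp(lam * (math.exp(t) - 1.0))

h = 1e-5
# Central differences approximate the first two derivatives at t = 0.
m1 = (M(h) - M(-h)) / (2 * h)          # ~ M'(0)  = E(X)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # ~ M''(0) = E(X^2)

print(m1)            # ~ 3.0 = lam (the mean)
print(m2 - m1 ** 2)  # ~ 3.0 = lam (the variance)
```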
M_Y(t) = E(e^{Yt}) = E(e^{(αX+β)t}) = e^{βt} E(e^{αtX}) = e^{βt} M_X(αt).

B. Theorem
Suppose that X and Y are independent random variables. Let Z = X + Y. Let M_X(t), M_Y(t) and M_Z(t) be the moment-generating functions of the random variables X, Y and Z, respectively. Then

M_Z(t) = M_X(t) M_Y(t).  (21)

As an illustration, consider the following. The length of a rod is a normally distributed random variable with mean 4 inches and variance 0.01 inch². Two such rods are placed end to end and fitted into a slot. The length of this slot is 8 inches with a tolerance of ±0.1 inch. The probability that the two rods will fit can be evaluated. Letting L1 and L2 represent the lengths of rod 1 and rod 2, L = L1 + L2 is normally distributed with E(L) = 8 and V(L) = 0.02. Hence

P[7.9 ≤ L ≤ 8.1] = P[(7.9 − 8)/0.14 ≤ (L − 8)/0.14 ≤ (8.1 − 8)/0.14]

= Φ(0.714) − Φ(−0.714) = 0.526,

from the tables of the normal distribution [4].

Proof:
M_Z(t) = E(e^{Zt})
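The rod-fitting probability can be reproduced in code using the standard normal CDF Φ(x) = (1 + erf(x/√2))/2. Taking σ = √0.02 exactly, rather than the rounded 0.14 used with the tables, gives ≈ 0.52, consistent with the quoted 0.526. A sketch (ours):

```python
import math

def phi(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

mu, var = 8.0, 0.02       # L = L1 + L2: E(L) = 8, V(L) = 0.02
sigma = math.sqrt(var)    # ~ 0.1414 (the example rounds this to 0.14)

p_fit = phi((8.1 - mu) / sigma) - phi((7.9 - mu) / sigma)
print(p_fit)  # ~ 0.52
```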
= E(e^{(X+Y)t}) = E(e^{Xt} e^{Yt}) = E(e^{Xt}) E(e^{Yt}) = M_X(t) M_Y(t),

where the expectation factors because X and Y are independent.

V. REPRODUCTIVE PROPERTIES OF DISTRIBUTIONS
If two or more independent random variables having a certain distribution are added, the resulting random variable has a distribution of the same type as that of the summands.

C. Theorem
Let X1, …, Xn be independent random variables. Suppose that Xi has a Poisson distribution with parameter αi, i = 1, 2, …, n. Let Z = X1 + … + Xn. Then Z has a Poisson distribution with parameter α = α1 + … + αn.

Proof:
For the case of n = 2:

M_Z(t) = M_{X1}(t) M_{X2}(t) = e^{α1(e^t − 1)} e^{α2(e^t − 1)} = e^{(α1 + α2)(e^t − 1)},

which is the moment-generating function of a Poisson distribution with parameter α1 + α2.
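Theorem C can also be spot-checked numerically: convolving two Poisson pmfs must reproduce the Poisson pmf with parameter α1 + α2. The sketch below is our own check, with arbitrary parameter values:

```python
import math

def pois(k: int, a: float) -> float:
    """Poisson pmf P(X = k) = e^(-a) * a^k / k!"""
    return math.exp(-a) * a ** k / math.factorial(k)

a1, a2 = 1.5, 2.5

for n in range(6):
    # P(Z = n) by convolution of the pmfs of X1 and X2 ...
    conv = sum(pois(k, a1) * pois(n - k, a2) for k in range(n + 1))
    # ... must equal the Poisson(a1 + a2) probability.
    print(n, conv, pois(n, a1 + a2))
```

The agreement is exact up to floating-point rounding, since the convolution identity reduces to the binomial theorem.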
VI. CONCLUSIONS
The moment-generating function as defined above