MA-202 Booklet


MA-202: Probability and Statistics

Information Booklet and Tutorial Sheets


Even Semester 2023–2024

Instructors
Dr. Amit Kumar (Convener)

Department of Mathematical Sciences


Indian Institute of Technology (BHU) Varanasi
Varanasi – 221005, India.

Name:
————————————————————————
Roll Number:
————————————————————————
Branch:
————————————————————————
Contents

Basic Information
    Course Contents
    Textbooks
    Reference Books
    Teaching Plan
    Evaluation Scheme
    Lectures and Tutorials
    Policy for Attendance
    Extra Lectures and Tutorials
    Instructors and Their Coordinates
    Timings for Lectures and Tutorials

Tutorial Sheets
    Tutorial 1
    Tutorial 2
    Tutorial 3
    Tutorial 4
    Tutorial 5
    Tutorial 6
    Tutorial 7
    Tutorial 8
    Tutorial 9
    Tutorial 10

Answers of Tutorials
Basic Information

Course Contents
UNIT I: Probability
Classical, Relative Frequency and Axiomatic Definition of Probability, Properties of Probability Function, Conditional Probability, Independence of Events, Theorem of Total Probability, Bayes' Theorem.

UNIT II: Random Variable and Its Distribution


Definition of Random Variable, Distribution Function and Its Properties, Types of Probability Distributions (Discrete, Continuous and Mixed Type), Probability Mass Function, Probability Density Function, Mathematical Expectation, Moments, Variance, Standard Deviation, Measures of Skewness and Kurtosis, Probability and Moment Generating Functions and Their Properties, Mode, Median, Quantiles.

UNIT III: Special Discrete Distributions


Discrete Uniform Distribution, Bernoulli and Binomial Distribution, Geometric and Negative Binomial Distribution, Poisson Distribution, Poisson Limit Theorem, Hyper-Geometric Distribution.

UNIT IV: Special Continuous Distributions


Continuous Uniform Distribution, Exponential Distribution, Gamma Distribution (with parameters r and λ), Standard Gamma Distribution (λ = 1), Beta Distribution, Cauchy Distribution (with parameters λ and µ) and Its Moments, Normal Distribution and Its Properties.

UNIT V: Function of Random Variables and Its Distribution


Function of Random Variable, Methods to Find Distribution of Function of a Random Variable
(Distribution Function, Jacobian and MGF Methods) and Their Expectations.

UNIT VI: Random Vector and Its Joint Distribution


Definition of Random Vector, Distribution Function of a Random Vector and Its Properties,
Joint, Marginal and Conditional Distributions, Product Moments, Covariance and Correlation,
Joint Moment Generating Function and Its Properties, Multinomial Distribution, Bivariate
Normal Distribution.

UNIT VII: Function of Random Vector and Its Distribution


Function of Random Vectors, Methods to Find Distribution of Function of a Random Variable
(Distribution Function, Jacobian and MGF Methods) and Their Expectations.

UNIT VIII: Asymptotic Distributions
Convergence in Distribution, Convergence in Probability, Convergence Almost Surely, Markov
and Chebyshev’s Inequality, Weak Law of Large Numbers, Strong Law of Large Numbers,
Central Limit Theorem.

UNIT IX: Statistics and Sampling Distributions


Chi-Square Distribution, Student's t-distribution, Snedecor's F-Distribution and Their Relation to Normal Distribution, Introduction to Statistical Inference, Population, Random Sample, Statistic, Parameters, Joint Distribution of Sample Mean and Sample Variance Based on a Random Sample from Normal Distribution.

UNIT X: Point Estimation


Point Estimation, Unbiased Estimators, Consistent Estimators, Method of Moments, Method
of Maximum Likelihood Estimator, Invariance of Maximum Likelihood Estimators, Criteria for
Comparing Estimators.

UNIT XI: Interval Estimation


Interval Estimation, Confidence Intervals, Confidence Intervals for Normal Populations: Mean,
Difference of Means, Variance and Ratio of Variance, Confidence Intervals for Proportion and
Difference of Proportions.

UNIT XII: Testing of Hypotheses


Null and Alternative Hypotheses, Simple and Composite Hypotheses, Critical Regions, Type I & II Errors, Neyman-Pearson Lemma, Most Powerful and Uniformly Most Powerful Tests and Their Examples, p-value, Likelihood Ratio Tests: Likelihood Ratio Tests for Statistical Hypotheses in One and Two Sample Problems Involving Normal Populations, Tests for Proportions, Chi-Square Goodness of Fit Test, Contingency Tables.

Textbooks
1. Spiegel, M. R., Schiller, J. J. and Srinivasan, R. A., Probability and Statistics. 3rd ed.,
McGraw-Hill, New York, 2009.

2. Chung, K. L. and AitSahlia, F., Elementary Probability Theory, 4th ed., Undergraduate
Texts in Mathematics, Springer-Verlag, New York, 2003.

3. Gupta, S. C. and Kapoor, V. K., Fundamentals of Mathematical Statistics, 10th ed., Sultan Chand & Sons, New Delhi.

Reference Books
1. Ross, S. M., Introduction to Probability Models, 6th ed., Academic Press, San Diego, CA,
USA, 1997.

2. Kolmogorov, A. N., Foundations of the Theory of Probability, Chelsea, New York, 1950.

3. Feller, W., An Introduction to Probability Theory and its Applications, Vol. I and II,
New York, NY: Wiley, 1968-1971.

4. Casella, G. and Berger, R. L., Statistical Inference, 2nd ed. Belmont CA: Duxbury, 2002.

Teaching Plan
The week-wise teaching plan is as follows:

Week 1: Dec 28 - 29, 2023 (0/1/2 lecture(s)). UNIT I: Probability. Classical, Relative Frequency and Axiomatic Definition of Probability, Properties of Probability Function, Conditional Probability, Independence of Events, Theorem of Total Probability, Bayes' Theorem.

Week 2: Jan 01 - 05, 2024 (3 lectures). UNIT II: Random Variable and Its Distribution. Definition of Random Variable, Distribution Function and Its Properties, Types of Probability Distributions (Discrete, Continuous and Mixed Type), Probability Mass Function, Probability Density Function.

Week 3: Jan 08 - 12, 2024 (3 lectures). Mathematical Expectation, Moments, Variance, Standard Deviation, Measures of Skewness and Kurtosis, Probability and Moment Generating Functions and Their Properties, Mode, Median, Quantiles.

Week 4: Jan 15 - 19, 2024 (2/3 lectures; class suspended Jan 19, 2024, Kashiyatra). UNIT III: Special Discrete Distributions. Discrete Uniform Distribution, Bernoulli and Binomial Distribution, Geometric and Negative Binomial Distribution, Poisson Distribution, Poisson Limit Theorem, Hyper-Geometric Distribution.

Week 5: Jan 22 - 26, 2024 (2/3 lectures; holiday Jan 26, 2024, Republic Day). UNIT IV: Special Continuous Distributions. Continuous Uniform Distribution, Exponential Distribution, Gamma Distribution (with parameters r and λ), Standard Gamma Distribution (λ = 1), Beta Distribution, Cauchy Distribution (with parameters λ and µ) and Its Moments, Normal Distribution and Its Properties.

Week 6: Jan 29 - Feb 02, 2024 (3 lectures). UNIT V: Function of Random Variables and Its Distribution. Function of Random Variable, Methods to Find Distribution of Function of a Random Variable (Distribution Function, Jacobian and MGF Methods) and Their Expectations.

Week 7: Feb 05 - 09, 2024 (3 lectures). UNIT VI: Random Vector and Its Joint Distribution. Definition of Random Vector, Distribution Function of a Random Vector and Its Properties, Joint, Marginal and Conditional Distributions, Product Moments, Covariance and Correlation, Joint Moment Generating Function and Its Properties, Multinomial Distribution, Bivariate Normal Distribution.

Week 8: Feb 12 - 16, 2024 (2/3 lectures; holiday Feb 14, 2024, Basant Panchami). UNIT VII: Function of Random Vector and Its Distribution. Function of Random Vectors, Methods to Find Distribution of Function of a Random Variable (Distribution Function, Jacobian and MGF Methods) and Their Expectations.

Week 9: Feb 19 - 23, 2024. Mid-Term Examination.

Week 10: Feb 26 - Mar 01, 2024 (3 lectures). UNIT VIII: Asymptotic Distributions. Convergence in Distribution, Convergence in Probability, Convergence Almost Surely, Markov and Chebyshev's Inequality, Weak Law of Large Numbers, Strong Law of Large Numbers, Central Limit Theorem.

Week 11: Mar 04 - 08, 2024 (3 lectures; holiday Mar 08, 2024, Maha Shivratri). UNIT IX: Statistics and Sampling Distributions. χ²-Distribution, Student's t-distribution, Snedecor's F-Distribution and Their Relation to Normal Distribution.

Week 12: Mar 11 - 15, 2024 (1/2/3 lecture(s); class suspended Mar 11, 2024, Election, and Mar 15, 2024, Technex & Institute Day). Introduction to Statistical Inference, Population, Random Sample, Statistic, Parameters, Joint Distribution of Sample Mean and Sample Variance Based on a Random Sample from Normal Distribution.

Week 13: Mar 18 - 22, 2024 (3 lectures). UNIT X: Point Estimation. Point Estimation, Unbiased Estimators, Consistent Estimators, Method of Moments, Method of Maximum Likelihood Estimator, Invariance of Maximum Likelihood Estimators, Criteria for Comparing Estimators.

Week 14: Mar 23 - 29, 2024 (0/1 lecture; holidays Mar 25, 2024, Holi, and Mar 29, 2024, Good Friday). Intra-Semester Recess.

Week 15: Apr 01 - 05, 2024 (3 lectures). UNIT XI: Interval Estimation. Interval Estimation, Confidence Intervals, Confidence Intervals for Normal Population(s): Mean, Difference of Means, Variance and Ratio of Variance, Confidence Intervals for Proportion and Difference of Proportions.

Week 16: Apr 08 - 12, 2024 (3 lectures; holiday Apr 11, 2024, Idu'l-Fitr). UNIT XII: Testing of Hypotheses. Null and Alternative Hypotheses, Simple and Composite Hypotheses, Critical Regions, Type I & II Errors, Neyman-Pearson Lemma, Most Powerful and Uniformly Most Powerful Tests and Their Examples, p-value.

Week 17: Apr 15 - 19, 2024 (3 lectures). Likelihood Ratio Tests: Likelihood Ratio Tests for Statistical Hypotheses in One and Two Sample Problems Involving Normal Populations, Tests for Proportions.

Week 18: Apr 22 - 23, 2024 (0/1/2 lecture(s)). Chi-Square Goodness of Fit Test, Contingency Tables.

Weeks 19 - 21: Apr 25 - May 10, 2024. End-Term Examination.

Approximately 37-47 lectures are expected this semester after accounting for all holidays and the Mid-Term Examination.

Evaluation Scheme
The following is the evaluation scheme for this course:

Two Assignments (5% each, one before mid-term and one after mid-term) 10%
Attendance 10%
Mid-Term Examination 30%
End-Term Examination 50%
The syllabus for the Mid-Term and End-Term Examinations will be announced in due time. Any changes to the evaluation scheme will be communicated by the respective teacher in due time.

Lectures and Tutorials
Every week there are three lectures, each of about 55 minutes' duration. In addition, there will be a tutorial of 55 minutes' duration every week. Consult the textbooks (and, if you
wish, the reference books) regularly.
For the purpose of tutorials, your section will be assigned 3-4 teaching assistants (TAs). The
aim of the tutorials is to clear your doubts and to give you practice for problem solving. Based
on the material covered, you are expected to try the problems before coming to the tutorial
class. In case you have doubts, please seek the help of your TAs.

Policy for Attendance


Attendance in lectures and tutorials is compulsory. Students who do not meet the 75% attendance requirement may automatically be given a failing grade. In case you miss lectures for valid
(medical) reasons, get a medical certificate and keep it with you. You can produce it if you fall
short of attendance.

Extra Lectures and Tutorials


Extra lectures and tutorials may be held if required. The details will be communicated in class by the respective teacher.

Instructors and Their Coordinates

Branch: Electrical Engineering, Mining Engineering
Instructor's Name: Dr. Amit Kumar
Address: Room No. 115, Department of Mathematical Sciences, IIT (BHU).

More details about the instructor(s) can be found on the website.

Timings for Lectures and Tutorials


The timings of lectures and tutorials will be communicated by the institute (or the respective teacher) in advance via email. The venue of the lectures will be communicated by the institute in due time.

Tutorial Sheets
Tutorial 1
1. For any two events A and B, prove that

(a) P (Ac ∩ B) = P (B) − P (A ∩ B).


(b) P (A ∩ B c ) = P (A) − P (A ∩ B).

2. Let events A and B be such that

P (A ∪ B) = 3/4, P (A ∩ B) = 1/4 and P (Ac ) = 2/3.

Find P (B) and P (A ∩ B c ).

3. Three newspapers A, B and C are published in a city and a survey of readers indicates
the following: 20% read A, 16% read B, 14% read C, 8% read both A and B, 5% read
both A and C, 4% read both B and C, 2% read all the three. For a person chosen at
random, find the probability that the person reads none of the newspapers.

4. A problem in statistics is given to the three students A, B and C whose chances of solving
it are 1/2, 1/3 and 1/4, respectively. What is the probability that the problem is solved?

5. Suppose there are n (≤ 365) persons at a birthday party. No person has a birthday on
29th February. What is the probability that at least two persons share the same birthday?
For n = 60, deduce that the chance of at least two persons sharing the same birthday is
greater than 99%.

6. If P (A) = a and P (B) = b, then show that P (A | B) ≥ (a + b − 1)/b.

7. Three fair dice are thrown once, and it is given that no two dice show the same face.

(a) What is the probability that the sum of faces is 7?


(b) What is the probability that one is an ace?

8. If A and B are independent and A ⊆ B then show that either P (A) = 0 or P (B) = 1.

9. A single die is rolled; then n coins are tossed, where n is the number shown on the die.
What is the probability of getting exactly two heads?

10. The contents of Urn I, II and III are as follows:


1 white, 2 red and 3 black balls,
2 white, 2 red and 2 black balls and
3 white, 1 red and 2 black balls.
One urn is chosen at random and two balls are drawn. They happen to be white and red.
What is the probability that they come from Urn I (or II or III)?
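
A quick numerical check of Problem 5 above can be done in Python (a minimal sketch, assuming a standard Python 3 interpreter; this is not part of the official solution):

    # Birthday problem: P(at least two of n people share a birthday),
    # assuming 365 equally likely birthdays and no 29th February.
    def p_shared_birthday(n: int) -> float:
        p_all_distinct = 1.0
        for k in range(1, n):
            p_all_distinct *= (1 - k / 365)
        return 1 - p_all_distinct

    print(p_shared_birthday(60))  # ~0.9941 > 0.99, as claimed in the problem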

Tutorial 2

1. Let X be a discrete random variable with cumulative distribution function given by FX (x) = 0 for x < −2; 1/10 for −2 ≤ x < −1; 3/10 for −1 ≤ x < 1; 6/10 for 1 ≤ x < 2; 1 for x ≥ 2.

Find the probability mass function of X.

2. Let X be a random variable that can take the values −2, −1, 0, 1, 2 and

P (X = −2) = P (X = −1),
P (X = 2) = P (X = 1),
P (X > 0) = P (X < 0) = P (X = 0).

Find the probability mass function of X and its distribution function.

3. Find k such that the function fX defined by fX (x) = kx² for 0 < x < 1, and 0 otherwise, is a probability density function. Also, determine P (1/3 < X ≤ 1/2).

4. Let X be a random variable with density function

fX (x) = k/(1 + x²), −∞ < x < ∞.

Determine k and the distribution function.

5. Let X be a random variable with probability mass function

x -3 6 9
pX (x) 1/6 1/2 1/3

Find E(X), E(X²) and E[(2X + 1)²].

6. Find the mean and variance of a discrete random variable whose cumulative distribution function is given by FX (x) = 0 for x < −1; 1/8 for −1 ≤ x < 0; 1/4 for 0 ≤ x < 1; 1/2 for 1 ≤ x < 2; 1 for x ≥ 2.

7. The probability mass function of a discrete random variable X is given by pX (x) = 0.1 for x = −2; 0.2 for x = 0; 0.3 for x = 2; 0.4 for x = 5; and 0 otherwise.

Find the expectation, the second central moment and standard deviation of X.

8. If a person gets ₹(2x + 5), where x denotes the number appearing when a balanced die is
rolled once, then how much money can be expected in the long run per game?

9. Let X be a continuous random variable with density function fX (x) = kx e^{−λx} for x ≥ 0 (λ > 0), and 0 otherwise. Determine the constant k and obtain the mean and variance of X.

10. Let X be a random variable with density function fX (x) = (k/β)[1 − ((x − α)/β)²], α − β < x < α + β, where −∞ < α < ∞ and β > 0.

(a) Find k so that fX (·) is a probability density function.


(b) Find the mean and variance of X.
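
For Problem 9 above, a small numerical sanity check (a sketch only, assuming Python with scipy is available; λ = 2 is an arbitrary illustrative choice) confirms that k = λ² normalises the density and that the mean and variance come out as 2/λ and 2/λ²:

    from scipy.integrate import quad
    import numpy as np

    lam = 2.0          # arbitrary test value of the rate parameter
    k = lam ** 2       # claimed normalising constant

    f = lambda x: k * x * np.exp(-lam * x)          # density for x >= 0

    total, _ = quad(f, 0, np.inf)                   # should be ~1
    mean, _ = quad(lambda x: x * f(x), 0, np.inf)   # should be ~2/lam
    second, _ = quad(lambda x: x**2 * f(x), 0, np.inf)
    var = second - mean**2                          # should be ~2/lam**2

    print(total, mean, var)   # ~1.0, ~1.0, ~0.5 for lam = 2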

Tutorial 3

1. Suppose that X is a random variable with E(X) = 10 and Var(X) = 25. Find the
positive numbers a and b such that Y = aX − b has mean 0 and variance 1.

2. A fair coin is tossed until a head appears. Let X denote the number of tosses required.

(a) Find the pmf of X.

(b) Find the mgf of X.

(c) Using the mgf, find the mean and variance of X.

3. Let X be a random variable with pdf

fX (x) = λ e^{−λx} , x > 0.

Find the mgf, mean and variance of X.

4. Let X be a discrete random variable with pmf P (X = x) = 1/(x(1 + x)) for x = 1, 2, 3, . . . , and 0 otherwise. Show that the mgf of X exists; however, none of its moments exist.

5. Let X be a discrete random variable with pmf

P (X = x) = q^{x−1} p, for x = 1, 2, 3, . . . .

Find the probability generating function and hence the mean, variance, skewness and
kurtosis.

6. Let X be a continuous random variable with pdf

fX (x) = (1/Γ(r)) x^{r−1} e^{−x} , r, x > 0.

Find the characteristic function and hence the mean, variance, skewness and kurtosis.

7. A multiple-choice test consists of 8 questions with 3 answers to each question (of which only one is correct). A student answers each question by rolling a balanced die and checking the first answer if he gets 1 or 2, the second answer if he gets 3 or 4, and the third answer if he gets 5 or 6. To get a distinction, the student must secure at least 75% correct answers; there is no negative marking. What is the probability that the student secures a distinction?

8. In 10 independent throws of a defective die, the probability that an even number will
appear 5 times is twice the probability that an even number will appear 4 times. Find
the probability that an even number will not appear at all in 10 independent throws of the
die.

9. A certain type of missile hits its target with probability 0.3. Find the number of missiles
that should be fired so that there is at least a 90% probability of hitting the target.

10. The mathematics department has 8 graduate assistants who are assigned to the same
office. Each assistant is just as likely to study at home as in the office. Find the minimum number m of desks that should be put in the office so that each assistant has a desk at
least 90% of the time.
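
Problems 9 and 10 above both reduce to searching for the smallest integer satisfying a binomial-type probability requirement; the following sketch (assuming Python with scipy is available) reproduces the answers 7 and 6 respectively:

    from scipy.stats import binom

    # Problem 9: smallest n with P(at least one hit) >= 0.9, each missile hitting w.p. 0.3
    n = 1
    while 1 - 0.7 ** n < 0.9:
        n += 1
    print(n)  # 7

    # Problem 10: smallest m with P(Bin(8, 0.5) <= m) >= 0.9
    m = 0
    while binom.cdf(m, 8, 0.5) < 0.9:
        m += 1
    print(m)  # 6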

Tutorial 4

1. Find the median of the binomial (5, 1/2) distribution.




2. Find the mode or modes of binomial (n, p) distribution.

3. If there is a war every 15 years on the average, then find the probability that there will
be no war in 25 years.

4. In a book of 520 pages, 390 typographical errors occur. Assuming Poisson law for the
number of errors per page, find the probability that a random sample of 5 pages will
contain no error.

5. An urn contains 1 white and 99 black balls. If 1,000 drawings are made with replacement,
then what is the probability that 10 drawings will yield white balls?

6. The probability of hitting a target is 0.001 for each shot. Find the probability of hitting
the target with two or more bullets if the number of shots is 5,000.

7. If X ∼ U (0, 2) and Y ∼ Exp(λ), find the value of λ such that

P (X < 1) = P (Y < 1).

8. Let X ∼ N (30, 5²). Find the following probabilities:

(a) P (26 ≤ X ≤ 40)


(b) P (X ≥ 45)
(c) P (|X − 30| > 5)

9. The local authorities in a certain city install 10,000 electric lamps in the streets of the city. These lamps have an average life of 1,000 burning hours with a standard deviation of 200 hours. Assuming that the life of a lamp is normally distributed, find the number of
lamps expected to fail

(i) in the first 800 burning hours.


(ii) between 800 and 1,200 burning hours.

After what period of burning hours would you expect that

(a) 10% of the lamps would fail?


(b) 10% of the lamps would be still burning?

10. Find the median and mode of the N (µ, σ²) distribution.
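
For Problem 9 above, the normal-distribution calculations can be checked numerically (a sketch, assuming Python with scipy is available; it uses the same N(1000, 200²) model as the problem):

    from scipy.stats import norm

    mu, sigma, n_lamps = 1000, 200, 10_000
    life = norm(loc=mu, scale=sigma)

    print(n_lamps * life.cdf(800))                     # (i)  ~1587 lamps fail before 800 hours
    print(n_lamps * (life.cdf(1200) - life.cdf(800)))  # (ii) ~6826 lamps fail between 800 and 1200 hours
    print(life.ppf(0.10))                              # (a)  ~744 hours: 10% of lamps have failed
    print(life.ppf(0.90))                              # (b)  ~1256 hours: 10% of lamps still burning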

Tutorial 5

1. The cdf of a discrete random variable X is given by FX (x) = 0 for x < 0; 1/4 for 0 ≤ x < 1; 1/2 for 1 ≤ x < 2; 1 for x ≥ 2.

Find the pmf of X and the cdf of Y = X².


2. A discrete random variable X assumes each of the values of the set {−10, −9, . . . , 9, 10}
with equal probability. Compute the following probabilities:
(a) P (4X ≤ 2).
(b) P (4X + 4 ≤ 2).
(c) P (X² − X ≤ 2).
(d) P (|X − 2| ≤ 2).
3. The pmf of a discrete random variable X is given by


pX (x) = 0.2 for x = −2; 0.3 for x = −1; 0.4 for x = 1; 0.1 for x = 2; and 0 otherwise.

Find the expectations of the following functions of X.


(a) Y = 3X − 1.
(b) Z = −X.
(c) W = |X|.
4. Let X be a random variable with pdf fX (x) = 0 for x ≤ 0; 1/2 for 0 < x ≤ 1; 1/(2x²) for x > 1. Show that 1/X has the same distribution as X.
5. If X ∼ N (0, 1), show that X² ∼ G(1/2, 1/2), where G(r, λ) represents the gamma distribution with parameters r and λ.
6. Let X and Y be two discrete random variables with joint pmf given by

              X = −1   X = 0   X = 1
     Y = 0      a       2a       a
     Y = 1      3a      2a       a
     Y = 2      2a       a       2a

Find
(i) marginal distributions of X and Y .
(ii) conditional distribution of X given Y = 2.
7. Two tetrahedra with sides numbered 1 to 4 are rolled. Let X denote the number on the downturned face of the first tetrahedron and Y be the larger of the two downturned numbers.
Find
(a) the joint pmf of X and Y .
(b) the marginal distributions of X and Y .
(c) P (X ≤ 2, Y ≤ 3).
(d) the conditional distribution of Y given X = 2 (and X = 3).
(e) E(Y | X = 2) and E(Y | X = 3).
8. The joint pmf of X and Y is given by pX,Y (x, y) = (x + y)/21, x = 1, 2, 3 and y = 1, 2. Show that the marginal pmfs of X and Y are pX (x) = (2x + 3)/21, x = 1, 2, 3, and pY (y) = (3y + 6)/21, y = 1, 2, respectively. Also, find
(a) P (X ≤ 2) and P (Y ≤ 2).
(b) the conditional distribution of Y given X = x.
9. The joint pdf of (X, Y ) is given by fX,Y (x, y) = α(x + y) for 0 ≤ x ≤ y ≤ 1, and 0 otherwise. Find α and P (X + Y ≤ 1). Also, find the cdf of the joint distribution of (X, Y ).
10. If X and Y are two random variables having joint pdf fX,Y (x, y) = (1/8)(6 − x − y) for 0 < x < 2, 2 < y < 4, and 0 otherwise.

Find
(a) P (X < 1, Y < 3).
(b) P (X + Y < 3).
(c) P (X < 1 | Y < 3).
(d) the marginal distributions of X and Y .
(e) the conditional distribution of Y given X = x and X given Y = y.
(f) E (Y | X = 1) and E(X | Y = 3).
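
A numerical double-integration check of Problem 10(a) above (a sketch, assuming Python with scipy is available) reproduces P (X < 1, Y < 3) = 3/8:

    from scipy.integrate import dblquad

    # Joint density f(x, y) = (6 - x - y)/8 on 0 < x < 2, 2 < y < 4.
    f = lambda y, x: (6 - x - y) / 8   # dblquad integrates over y first, then x

    total, _ = dblquad(f, 0, 2, lambda x: 2, lambda x: 4)  # whole support: should be ~1
    prob, _ = dblquad(f, 0, 1, lambda x: 2, lambda x: 3)   # region X < 1, Y < 3: should be ~0.375

    print(total, prob)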

Tutorial 6

1. Suppose that a random variable X is uniformly distributed over the interval (0, 1). As-
sume that the conditional distribution of Y given X = x has a binomial distribution with
parameters n and p = x. Find the distribution of Y and hence, find its expected value.

2. Let X1 and X2 be independent random variables with pmf given by

P (Xi = −1) = P (Xi = 1) = 1/2, for i = 1, 2.

Are X1 and X1 X2 independent?

3. Let X and Y be two discrete random variables with joint pmf

              Y = 0   Y = 1   Y = 2
     X = 1     0.3     0.2     0.1
     X = 2     0.1     0.0     0.3

(a) Are X and Y independent?

(b) Determine the correlation coefficient between X and Y .

4. The joint pmf of X and Y is given in the following table:

                Y = 5       Y = 7
     X = 3        p        1/3 − p
     X = 6     1/2 − p     1/6 + p

If Cov(X, Y ) = −1/2 then obtain pij , for i = 3, 6 and j = 5, 7 and hence, calculate P (Y = 5 | X = 6).

5. The joint pdf of X and Y is given by fX,Y (x, y) = 2 for 0 < x < 1, 0 < y < x, and 0 otherwise.

(a) Find the marginal density of X and Y .
(b) Find the conditional density of Y given X = x and X given Y = y.

(c) Are X and Y independent?

6. Let the joint pdf of X and Y be fX,Y (x, y) = 6x for 0 < x < y < 1, and 0 otherwise.

Find the correlation coefficient ρX,Y , E(X + Y ), Var(X + Y ) and E(Y | X).

7. Let the joint pdf of X and Y be fX,Y (x, y) = 1 − (x + y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and 0 otherwise.

Find

(a) E(X | Y = 0.5) and E(Y | X = 0.5).

(b) the correlation coefficient ρX,Y .

8. Let X and Y be two discrete random variables with the joint pmf

                Y = −2    Y = 0    Y = 2
     X = −1      1/9       1/9      1/9
     X = 0       1/9       2/9      1/9
     X = 1        0        1/9      1/9

Find the joint pmf of U = X² and V = |Y |.

9. Let X and Y be two independent random variables having Γ(ℓ) and Γ(m) distributions, respectively. Find the distribution of (a) X + Y and (b) X/Y .

10. If the random variables X and Y have joint density function fX,Y (x, y) = xy/96 for 0 < x < 4, 1 < y < 5, and 0 otherwise, find the pdf of X + 2Y .
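
As a cross-check of Problem 3(b) above, the correlation coefficient can be computed directly from the table (a sketch in plain Python; the 0.46 quoted in the answers is this value rounded):

    import math

    # Joint pmf from Problem 3: X takes values 1, 2 and Y takes values 0, 1, 2.
    pmf = {(1, 0): 0.3, (1, 1): 0.2, (1, 2): 0.1,
           (2, 0): 0.1, (2, 1): 0.0, (2, 2): 0.3}

    ex  = sum(x * p for (x, y), p in pmf.items())
    ey  = sum(y * p for (x, y), p in pmf.items())
    exy = sum(x * y * p for (x, y), p in pmf.items())
    vx  = sum(x * x * p for (x, y), p in pmf.items()) - ex ** 2
    vy  = sum(y * y * p for (x, y), p in pmf.items()) - ey ** 2

    rho = (exy - ex * ey) / math.sqrt(vx * vy)
    print(rho)  # ~0.456, i.e. 0.46 to two decimals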

Tutorial 7

1. Let X1 , X2 , . . . , Xn be iid random variables with common density function f (x) = 1/θ for 0 < x < θ (0 < θ < ∞), and 0 otherwise. Let X(n) = max(X1 , X2 , . . . , Xn ). Show that X(n) converges in distribution to X, where the cdf of X is given by FX (x) = 0 for x < θ and 1 for x ≥ θ.

2. Let X1 , X2 , . . . , Xn be iid U (0, θ) random variables. Show that X(n) converges in probability to θ.

3. For a discrete random variable with pmf pX (x) = (1/8) I{−1} (x) + (6/8) I{0} (x) + (1/8) I{1} (x), evaluate P (|X − µ| ≥ 2σ) and show that Chebyshev's inequality cannot be improved.

4. If X is a continuous random variable with E(X) = µ satisfying P (X ≤ 0) = 0, show that

P (X > 2µ) ≤ 1/2.

5. Let {Xn } be a sequence of mutually independent random variables such that

P (Xn = 0) = 2/3 and P (Xn = ±√3) = 1/6.

Show that the WLLN holds.

6. Let {Xn } be a sequence of mutually independent random variables such that

P (Xn = ±1) = (1/2)(1 − 2^{−n}) and P (Xn = ±2^{−n}) = 2^{−n−1}.

Does the WLLN hold for this sequence?

7. Ten fair dice are rolled. Use the CLT to find the approximate probability that the sum obtained is between 30 and 40.

8. A die is thrown 720 times. Find an approximate value of the probability of the following
events:

(a) the number of sixes lies between 100 and 140.

(b) a six comes up more than 130 times.

9. Use the central limit theorem to show that P (a ≤ (X − np)/√(npq) ≤ b) → ϕ(b) − ϕ(a) as n → ∞, where X follows the Binomial distribution with parameters n and p, and ϕ(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−u²/2} du.

10. Let X1 , X2 , . . . , Xn be iid random variables each having exponential distribution with
parameter λ = 1. Use the central limit theorem to show that the distribution of X̄n = (1/n) Σ_{i=1}^{n} Xi tends to normal as n tends to infinity.
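
For Problem 7 above, the CLT approximation (without a continuity correction, which is what the 0.65 in the answers corresponds to) can be evaluated as follows (a sketch, assuming Python with scipy is available):

    from math import sqrt
    from scipy.stats import norm

    # Sum of 10 fair dice: mean 35, variance 10 * 35/12.
    mu = 10 * 3.5
    sigma = sqrt(10 * 35 / 12)

    approx = norm.cdf((40 - mu) / sigma) - norm.cdf((30 - mu) / sigma)
    print(approx)  # ~0.645, i.e. about 0.65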

Tutorial 8

1. Let X ∼ χ²₁₆ . Find P (X > 32).

2. Find the mean, variance, skewness and kurtosis of the χ²ₙ -distribution.


3. Let X1 , X2 , . . . , Xn be iid N (0, 1) random variables. Show that Σ_{i=1}^{n} Xi² ∼ χ²ₙ .

4. Let X ∼ N (0, 1) and Y ∼ χ²ₙ , and let X and Y be independent. Prove that the pdf of the statistic T = X/√(Y /n) is fT (t) = [Γ((n + 1)/2) / (Γ(n/2) √(nπ))] (1 + t²/n)^{−(n+1)/2} , −∞ < t < ∞.

5. If T ∼ tn then, for k < n, show that E(T^k) = 0 if k is odd, and E(T^k) = n^{k/2} k! Γ((n − k)/2) / (2^k (k/2)! Γ(n/2)) if k is even.

Hence, find the mean, variance, skewness and kurtosis of T .


6. Let X1 , X2 , . . . , Xn be iid N (µ, σ²) random variables. Show that √n (X̄n − µ)/s ∼ t_{n−1} .

7. If T ∼ tn then show that T² ∼ F_{1,n} .

8. If P (Fm,n < fα,m,n ) = α, then show that f_{1−α,n,m} = 1/f_{α,m,n} .

9. Let s₁² and s₂² be the sample variances from two independent samples of sizes n1 = 5 and n2 = 4 from two populations having the same unknown variance σ². Find (approximately) the probability that s₁²/s₂² < 1/5.2 or s₁²/s₂² > 6.25.

10. Let X1 , X2 , . . . , Xn be iid N (µ, σ²) random variables. Prove that X̄n and (X1 − X̄n , X2 − X̄n , . . . , Xn − X̄n ) are independent. Hence, show that X̄n and s² are independent.
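
Problem 1 above can be verified with a chi-square tail probability (a sketch, assuming Python with scipy is available):

    from scipy.stats import chi2

    # P(X > 32) for X ~ chi-square with 16 degrees of freedom.
    print(chi2.sf(32, df=16))  # ~0.010, matching the tabulated 1% critical value of 32.0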

Tutorial 9

1. Show that [Σ_{i=1}^{n} Xi (Σ_{i=1}^{n} Xi − 1)] / [n(n − 1)] is an unbiased estimate of θ², for a sample X1 , X2 , . . . , Xn drawn on X which takes the values 0 and 1 with respective probabilities (1 − θ) and θ.

2. Let X1 , X2 , . . . , Xn be iid N (µ, σ²) random variables. Show that X̄n² − s²/n is an unbiased estimator for µ².
3. If X1 , X2 , . . . , Xn are random observations on a Bernoulli variate X taking the value 0

with probability (1 − p) and the value 1 with probability p, show that X̄n (1 − X̄n ) is a
consistent estimator of p(1 − p).

4. Let X1 , X2 , . . . , Xn be iid U (a, b) random variables. Find the method of moments estimators for a and b.

5. Let X1 , X2 , . . . , Xn be iid B(α, β) random variables. Find the method of moments estimators for α and β.

6. Prove that the maximum likelihood estimate of the parameter α of a population having
density function

f (x; α) = (2/α²)(α − x), 0 < x < α,

for a sample of unit size is 2x, x being the sample value. Also, show that the estimate is
biased.

7. Obtain the maximum likelihood estimate of θ for

f (x; θ) = (1 + θ) x^θ , 0 < x < 1,

based on an independent sample of size n.

8. A random sample of size n = 100 is taken from a population with σ = 5.1. Given that the
sample mean is x̄ = 21.6, construct a 95% confidence interval for the population mean µ.

9. We know that silk fibers are very tough but in short supply. Engineers are making breakthroughs to create synthetic silk fibers that can improve everything from car bumpers to bullet-proof vests or to make artificial blood vessels. One research group reports the
summary statistics

n = 18, x̄ = 22.6 and s = 15.7

for the toughness (MJ/m3 ) of processed fibers. Construct a 95% confidence interval for
the mean toughness of these fibers. Assume that the population is normal.

10. A major manufacturer of processed meats monitors the amount of each ingredient. The
weight (lb) of cheese per run is measured on n = 80 occasions:
72.2 67.8 78.0 64.4 76.3 72.3 73.1 71.7 66.2 63.3 85.4 67.4
66.3 76.3 57.7 50.3 77.4 63.1 73.9 67.4 74.7 68.2 87.4 86.4
69.4 58.0 63.3 72.7 73.6 68.8 63.3 63.3 73.0 64.8 73.1 70.9
85.9 74.4 75.9 72.3 84.3 61.8 79.2 64.3 65.4 66.7 77.2 50.0
70.3 90.4 63.9 62.1 68.2 55.1 52.6 68.5 55.2 73.5 53.7 61.7
47.9 72.3 61.1 71.8 83.1 71.2 58.8 61.8 86.8 64.5 52.3 58.3
65.9 80.2 75.1 59.9 62.3 48.8 64.3 75.4
Assuming the population is normal, construct a 95% confidence interval for the population
standard deviation σ.
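
Problems 8 and 9 above are standard z- and t-intervals; the sketch below (assuming Python with scipy is available) reproduces the intervals (20.6, 22.6) and (14.79, 30.41) given in the answers:

    from math import sqrt
    from scipy.stats import norm, t

    # Problem 8: known sigma, large sample -> z-interval.
    n, xbar, sigma = 100, 21.6, 5.1
    z = norm.ppf(0.975)
    print(xbar - z * sigma / sqrt(n), xbar + z * sigma / sqrt(n))   # ~ (20.6, 22.6)

    # Problem 9: unknown sigma, n = 18 -> t-interval with 17 degrees of freedom.
    n, xbar, s = 18, 22.6, 15.7
    tq = t.ppf(0.975, df=n - 1)
    print(xbar - tq * s / sqrt(n), xbar + tq * s / sqrt(n))         # ~ (14.79, 30.41)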

Tutorial 10

1. A new toothpaste company claims that "adding water to toothpaste protects against cavities". To check whether the company's claim is correct using hypothesis testing, a statistician takes a sample of 4 persons from a population with mean µ = µ0 and standard deviation 2. Find the probability of rejecting H0 when the rejection rule is X̄n > µ0 + 2.

2. An LED manufacturer claims that its bulbs last more than 50,000 hours on average over their lifetime. A sample of 20 bulbs is taken, and the null hypothesis is rejected when the sample mean is greater than 56,000 hours. The population standard deviation is 6,000 hours. Find the probability of a type I error.

3. Suppose that, on average, the number of hours spent watching TV in a day is 6. A sample of 35 families is selected, and the null hypothesis is rejected when the sample mean is greater than 10. The population standard deviation is 24 hours. Find the probability of a type I error.

4. Suppose that the manufacturer of a new medicine wants to test the null hypothesis p = 0.90 against the alternative hypothesis p = 0.60. His test statistic is X, the observed number of successes in 20 trials, and he will accept the null hypothesis if X > 14; otherwise he will reject it. Find α (the probability of a type I error) and β (the probability of a type II error).

5. Suppose that we want to test the null hypothesis that the mean of a normal population with σ² = 1 is µ = µ0 against the alternative hypothesis that it is µ = µ1 , where µ1 > µ0 . Find the value of k such that X̄n > k provides a critical region of size α = 0.05 for a random sample of size n.

6. Let X1 and X2 constitute a random sample from a normal population with σ² = 1. If the
null hypothesis µ = µ0 is to be rejected in favor of alternative hypothesis µ = µ1 > µ0
when X̄n > µ0 + 1, what is the size of the critical region?

7. Suppose X1 , X2 , . . . , Xn is a random sample from a normal population with mean µ and


variance 16. Find the test with the best critical region, that is, find the most powerful
test, with a sample size of n = 16 and a significance level α = 0.05 to test the simple null
hypothesis H0 : µ = 10 against the simple alternative hypothesis H1 : µ = 15.

8. A food processing company packages honey in small glass jars. Each jar is supposed to
contain 10 fluid ounces of the sweet and gooey good stuff. Previous experience suggests
that X, the volume in fluid ounces of a randomly selected jar of the company's honey, is normally distributed with a known variance of 2. Derive the likelihood ratio test
for testing, at a significance level of α = 0.05, the null hypothesis H0 : µ = 10 against the
alternative hypothesis H1 : µ ̸= 10.

9. Scientists need to be able to detect small amounts of contaminants in the environment. As a check on current capabilities, measurements of lead content (µg/L) are taken from twelve water specimens spiked with a known concentration (courtesy of Paul Berthouex).

2.4 2.9 2.7 2.6 2.9 2.0 2.8 2.2 2.4 2.4 2.0 2.5

Test the null hypothesis µ = 2.25 against the alternative hypothesis µ ̸= 2.25 at the 0.01
level of significance.
10. In 64 randomly selected hours of production, the mean and the standard deviation of the
number of acceptable pieces produced by an automatic stamping machine are x̄ = 1, 038
and s = 146. At the 0.05 level of significance, does this enable us to reject the null
hypothesis µ = 1, 000 against the alternative hypothesis µ > 1, 000?
11. The lapping process which is used to grind certain silicon wafers to the proper thickness
is acceptable only if σ, the population standard deviation of the thickness of dice cut
from the wafers, is at most 0.50 mil. Use the 0.05 level of significance to test the null
hypothesis σ = 0.50 against the alternative hypothesis σ > 0.50, if the thicknesses of 15
dice cut from such wafers have a standard deviation of 0.64 mil.
12. If 15 determinations of the purity of gold have a standard deviation of 0.0015, test the
null hypothesis that σ = 0.002 for such determinations. Use the alternative hypothesis
σ ̸= 0.002 and the level of significance α = 0.05.
13. The following table gives the number of breakdowns in a factory on various days of a week:
Day Mon Tue Wed Thur Fri Sat Sun
Number of Breakdowns 14 22 16 18 12 19 11

Using a χ²-test, check whether the breakdowns are uniformly distributed or not.


14. When the first proof of 392 pages of a book of 1200 pages was read, the distribution of printing mistakes was found to be as follows:
No. of mistakes in a page 0 1 2 3 4 5 6
No. of pages 275 72 30 7 5 2 1
Fit a Poisson distribution to the above data and test the goodness of fit at the 0.05 level
of significance.
15. Consider the following data on a company's stock returns:
Stock Returns f
10 but <20 2
20 but <30 4
30 but <40 8
40 but <50 15
50 but <60 25
60 but <70 10
70 but <80 5
Test at the 0.05 level of significance whether the data can be looked upon as values of
a random variable having a normal distribution.
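
Two of the tests above lend themselves to quick numerical checks (a sketch, assuming Python with scipy is available): the one-sample z-test of Problem 10 and the chi-square goodness-of-fit test of Problem 13.

    from math import sqrt
    from scipy.stats import norm, chisquare

    # Problem 10: H0: mu = 1000 vs H1: mu > 1000, n = 64, xbar = 1038, s = 146.
    z = (1038 - 1000) / (146 / sqrt(64))
    print(z, z > norm.ppf(0.95))   # ~2.08 > 1.645, so H0 is rejected at the 0.05 level

    # Problem 13: are the breakdowns uniformly distributed over the seven days?
    observed = [14, 22, 16, 18, 12, 19, 11]
    stat, p_value = chisquare(observed)   # expected frequencies default to the common mean
    print(stat, p_value)                  # chi-square ~5.875 with 6 df, p ~0.44: do not reject H0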

Answers of Tutorials
Tutorial 1
2. 2/3 and 1/12.

3. 13/20.

4. 3/4.
    
5. 1 − (1 − 1/365)(1 − 2/365) · · · (1 − (n − 1)/365).
7. (a) 1/20 (b) 1/2.

9. 99/384.

10. 2/11.

Tutorial 2
1. The pmf is given by

x        −2     −1     1      2
pX (x)   1/10   2/10   3/10   4/10

2. The pmf is given by

x -2 -1 0 1 2
pX (x) 1/6 1/6 1/3 1/6 1/6

and the cdf is given by FX (x) = P (X ≤ x) = 0 for x < −2; 1/6 for −2 ≤ x < −1; 2/6 for −1 ≤ x < 0; 4/6 for 0 ≤ x < 1; 5/6 for 1 ≤ x < 2; 1 for x ≥ 2.

3. k = 3 and P (1/3 < X ≤ 1/2) = 19/216.

4. k = 1/π and FX (x) = (1/π) tan⁻¹ x + 1/2, −∞ < x < ∞.

5. E(X) = 11/2, E(X²) = 93/2 and E[(2X + 1)²] = 209.

6. E(X) = 9/8 and Var(X) = 71/64.

7. E(X) = 2.4, E[(X − µ)²] = 5.84 and σ ≈ 2.42.

8. 12.

9. k = λ², E(X) = 2/λ and Var(X) = 2/λ².

10. (a) k = 3/4 (b) E(X) = α and Var(X) = β²/5.

Tutorial 3
1. a = 1/5 and b = 2.
2. (a) P (X = x) = 1/2^x , x = 1, 2, 3, . . ..
(b) MX (t) = e^t /(2 − e^t ), e^t < 2.
(c) E(X) = 2 and Var(X) = 2.

3. MX (t) = λ/(λ − t), λ > t, E(X) = 1/λ and Var(X) = 1/λ².


5. GX (t) = pt/(1 − qt), t < 1/q, E(X) = 1/p, σ² = q/p², β1 = (2 − p)/√q and β2 = 6 + p²/q .
.

6. ϕX (t) = (1 − it)^{−r} , E(X) = σ² = r, β1 = 2/√r and β2 = 6/r.

7. 0.0197.
8. (3/8)^{10} .

9. 7.

10. 6.

Tutorial 4
1. 2.5.

2. Case I. If (n + 1)p is an integer then (n + 1)p − 1 and (n + 1)p both are mode.
Case II. If (n + 1)p is not an integer then the integral part of (n + 1)p is a mode.

3. 0.188.

4. e^{−3.75} .

5. 0.12574.

6. 0.96.

7. λ = ln 2.

8. (a) 0.7653 (b) 0.0013 (c) 0.6826.

9. (i) 1587 (ii) 6826 (a) 744 (b) 1256.

10. median = µ = mode.

Tutorial 5
1. The pmf of Y = X² is given by pY (y) = 1/4 for y = 0; 1/4 for y = 1; 1/2 for y = 4; and 0 otherwise. The cdf of Y = X² is given by FY (y) = 0 for y < 0; 1/4 for 0 ≤ y < 1; 1/2 for 1 ≤ y < 4; 1 for y ≥ 4.

2. (a) 11/21 (b) 10/21 (c) 4/21 (d) 2/7.

3. (a) −1.3 (b) 0.1 (c) 1.3.

6. (a) The marginal of X is given by


x -1 0 1
pX (x) 2/5 1/3 4/15
The marginal of Y is given by
y 0 1 2
pY (y) 4/15 2/5 1/3
(b) P (X = −1 | Y = 2) = 2/5, P (X = 0 | Y = 2) = 1/5 and P (X = 1 | Y = 2) = 2/5.

7. (a) The joint pmf of X and Y is given by

             X = 1    X = 2    X = 3    X = 4    fY (y)
   Y = 1     1/16      0        0        0       1/16
   Y = 2     1/16     2/16      0        0       3/16
   Y = 3     1/16     1/16     3/16      0       5/16
   Y = 4     1/16     1/16     1/16     4/16     7/16
   fX (x)    4/16     4/16     4/16     4/16      1
(b) The marginal of X is given by

x 1 2 3 4
pX (x) 4/16 4/16 4/16 4/16
The marginal of Y is given by
y 1 2 3 4
pY (y) 1/16 3/16 5/16 7/16
(c) 3/8.
(d) The conditional distribution of Y given X = 2 is given by

P (Y = 2 | X = 2) = 1/2, P (Y = 3 | X = 2) = 1/4 and P (Y = 4 | X = 2) = 1/4.
The conditional distribution of Y given X = 3 is given by

P (Y = 3 | X = 3) = 3/4 and P (Y = 4 | X = 3) = 1/4.

(e) E(Y | X = 2) = 11/4 and E(Y | X = 3) = 13/4.

8. (a) 4/7 and 1.


(b) The conditional distribution of Y for X = x is

pY |X=x (y | x) = (x + y)/(2x + 3), x = 1, 2, 3, y = 1, 2.

9. α = 2, P (X + Y ≤ 1) = 1/3 and the joint cdf is given by FX,Y (x, y) = 0 for x < 0 or y < 0 (or both); x²y + xy² − x³ for 0 ≤ x ≤ y ≤ 1; x + x² − x³ for 0 < x < 1 and y > 1; y³ for x > y and 0 < y < 1; 1 for x > 1 and y > 1.

10. (a) 3/8.


(b) 5/24.
(c) 3/5.
(d) The marginal distribution of X is given by fX (x) = (3 − x)/4 for 0 < x < 2, and 0 otherwise. The marginal distribution of Y is given by fY (y) = (5 − y)/4 for 2 < y < 4, and 0 otherwise.

(e) The conditional distributions of X and Y are

fX|Y (x | y) = (6 − x − y)/(2(5 − y)), 0 < x < 2, 2 < y < 4, and fY |X (y | x) = (6 − x − y)/(2(3 − x)), 0 < x < 2, 2 < y < 4.

(f) E(Y | X = 1) = 17/6 and E(X | Y = 3) = 5/6.

Tutorial 6
1. The pmf of Y is given by fY (y) = 1/(n + 1), y = 0, 1, . . . , n, and E(Y ) = n/2.

2. Yes.

3. (a) No (b) 0.46.

4. The joint pmf is given by

               Y = 5    Y = 7
     X = 3     1/12      1/4
     X = 6     5/12      1/4

and P (Y = 5 | X = 6) = 5/8.

5. (a) The marginal density functions of X and Y are given by fX (x) = 2x, 0 < x < 1, and fY (y) = 2(1 − y), 0 < y < 1.

(b) The conditional density function of Y given X = x is fY |X (y | x) = 1/x, 0 < x < 1. The conditional density function of X given Y = y is fX|Y (x | y) = 1/(1 − y), 0 < y < 1.

(c) No.

6. 1/√3, 5/4, 11/80 and (x + 1)/2.

7. (a) 11/24 and 7/9 (b) −0.0817861.
and 9
(b) −0.0817861.

8. The joint pmf of U = X² and V = |Y | is given by

               V = 0    V = 2
     U = 0      2/9      2/9
     U = 1      2/9      1/3

9. X + Y ∼ Γ(ℓ + m) and X/Y ∼ β(ℓ, m), where β(ℓ, m) denotes the beta distribution of
second kind.

10. The pdf of X + 2Y is given by fX+2Y (u) = (u − 2)²(u + 4)/2304 for 2 < u < 6; (3u − 8)/144 for 6 < u < 10; and (348u − u³ − 2128)/2304 for 10 < u < 14.

Tutorial 7
3. 1/4.

6. Yes.

7. 0.65.

8. (a) 0.9544 (b) 0.1587.

Tutorial 8
1. 0.01.

2. n, 2n, √(8/n) and 12/n.
5. µ = 0, σ² = n/(n − 2) for n > 2, β1 = 0 for n > 3, and β2 = 6/(n − 4) for n > 4.

9. 0.154599.

Tutorial 9
4. âMME = X̄n − √((3/n) Σ_{i=1}^{n} (Xi − X̄n )²) and b̂MME = X̄n + √((3/n) Σ_{i=1}^{n} (Xi − X̄n )²).

5. α̂MME = X̄n (X̄n − (1/n) Σ_{i=1}^{n} Xi²) / ((1/n) Σ_{i=1}^{n} Xi² − X̄n²) and β̂MME = (1 − X̄n )(X̄n − (1/n) Σ_{i=1}^{n} Xi²) / ((1/n) Σ_{i=1}^{n} Xi² − X̄n²).

7. θ̂MLE = −n / ln(Π_{i=1}^{n} xi ) − 1.
8. (20.6, 22.6).

9. (14.79, 30.41).

10. (8.29, 11.35).

Tutorial 10
1. 0.02275.

2. 0.

3. 0.16154.

4. 0.0112531 and 0.125599.


5. k = µ0 + 1.645/√n .

6. 0.08.

7. x̄ ≥ 11.645.

9. The null hypothesis µ = 2.25 can not be rejected.

10. The null hypothesis µ = 1000 is rejected.

11. The null hypothesis σ = 0.50 can not be rejected.

12. The null hypothesis σ = 0.002 can not be rejected.

13. The null hypothesis (the breakdowns are uniformly distributed) cannot be rejected.

14. The Poisson distribution is not a good fit to the given data.

15. The normal distribution is not a good fit to the given data.
