

2. Probability

(a) [1 point] Describe sample space associated with flipping a coin until either heads
or tails occurs twice. Choose one.

(i) {𝐻𝐻𝑇, 𝑇 𝐻𝐻, 𝐻𝑇 𝐻, 𝑇 𝑇, 𝐻𝑇 𝑇, 𝑇 𝐻𝑇 }


(ii) {𝐻𝐻, 𝑇 𝐻𝐻, 𝐻𝑇 𝐻𝑇, 𝑇 𝑇, 𝐻𝑇 𝑇, 𝑇 𝐻𝑇 }
(iii) {𝐻𝐻, 𝑇 𝐻𝐻, 𝐻𝑇 𝐻, 𝑇 𝑇, 𝐻𝑇 𝑇, 𝑇 𝐻𝑇 }
(iv) {𝐻𝐻, 𝑇 𝐻𝐻, 𝐻𝑇 𝐻, 𝑇 𝑇, 𝑇 𝑇 𝐻, 𝑇 𝐻𝑇 }
(v) {𝐻𝐻, 𝐻𝐻𝑇, 𝐻𝑇 𝐻, 𝑇 𝑇, 𝑇 𝑇 𝐻, 𝑇 𝐻𝑇 }

(b) [1 point] Number of four–digit numbers that can be formed from digits 1, 2 and
3, if each four–digit number must be odd is (choose one)

(i) 27
(ii) 35
(iii) 44
(iv) 54
(v) 67
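
A brute-force count, as a quick check (an illustrative Python sketch, standard library only):

    from itertools import product

    # Enumerate every four-digit number over the digits 1, 2, 3 and keep the odd ones.
    digits = (1, 2, 3)
    odd_count = sum(1 for n in product(digits, repeat=4) if n[-1] % 2 == 1)
    print(odd_count)  # 54: the last digit must be 1 or 3, so 3*3*3*2 = 54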

(c) [1 point] In two rolls of a fair die, let event 𝐴 be the event that no fours, fives or
sixes are rolled. Then, 𝑃 (𝐴) = (choose one)
(i) 8/36
(ii) 9/36
(iii) 10/36
(iv) 11/36
(v) 13/36
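
The 36-outcome sample space can be enumerated directly (illustrative Python sketch, standard library only):

    from itertools import product

    # Count outcomes of two fair-die rolls in which neither roll shows a 4, 5 or 6.
    outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs
    favorable = [o for o in outcomes if max(o) <= 3]  # no fours, fives or sixes
    print(len(favorable), len(outcomes))              # 9 36, i.e. P(A) = 9/36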

(d) [1 point] Let 𝐸 and 𝐹 be two events of an experiment where 𝑃 (𝐸) = 0.35,
𝑃 (𝐹 ) = 0.15 and 𝑃 (𝐸 ∩ 𝐹 ) = 0.03. Then 𝑃 (𝐸¯ ∪ 𝐹¯ ) =

(i) 0.96
(ii) 0.97
(iii) 0.98
(iv) 0.99
(v) 1.00

(e) [1 point] A survey was conducted comparing age with number of visits per year
to doctor. One person is chosen at random.

visits ↓ / age →   youth   middle–aged   elderly   row totals
1 to 3                70        95           35       200
4 to 8               130       450           30       610
9 to 11               90        30           70       190
column totals        290       575          135      1000

Chance person is a youth, given s/he makes 4–8 visits is (choose closest one):

(i) 0.112
(ii) 0.130
(iii) 0.183
(iv) 0.213
(v) 0.303
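
The conditional probability can be read off the “4 to 8” row of the table (illustrative Python sketch):

    # Row "4 to 8 visits" from the survey table: youth, middle-aged, elderly counts.
    row_4_to_8 = {"youth": 130, "middle-aged": 450, "elderly": 30}

    # P(youth | 4-8 visits) = count(youth and 4-8 visits) / count(4-8 visits)
    p = row_4_to_8["youth"] / sum(row_4_to_8.values())
    print(round(p, 3))  # 0.213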

(f ) [1 point] Two tickets drawn at random without replacement from following box.

1𝑎 2𝑎 1𝑏 3𝑏 2𝑐 3𝑐

Probability first ticket is a “1” and second ticket is a “2” is (choose closest one)

(i) 0.1333
(ii) 0.2163
(iii) 0.2566
(iv) 0.3777
(v) 0.4333

(g) [1 point] Urn A has 10 red and 9 blue marbles; urn B has 10 red and 10 blue
marbles. A fair coin is tossed. If coin comes up heads, a marble from urn A is
chosen, otherwise a marble from urn B is chosen. Chance coin is flipped heads
given a red marble is chosen is (choose closest one)
(i) 17/39
(ii) 18/39
(iii) 19/39
(iv) 20/39
(v) 21/39
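
A Bayes’-rule check in exact fractions (illustrative Python sketch, standard library only):

    from fractions import Fraction

    p_heads = Fraction(1, 2)
    p_red_given_A = Fraction(10, 19)   # urn A: 10 red of 19 marbles
    p_red_given_B = Fraction(10, 20)   # urn B: 10 red of 20 marbles

    # P(heads | red) = P(red | A) P(heads) / P(red)
    p_red = p_heads * p_red_given_A + (1 - p_heads) * p_red_given_B
    print(p_heads * p_red_given_A / p_red)  # 20/39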

3. Discrete Random Variables and Their Probability Distributions

(a) [1 point] Number of sales of household appliances, 𝑌 , Whirlpool representative
Darlene makes in a day is given by following probability distribution.

𝑦 0 1 2 3 4 5
𝑝(𝑦) 0.10 0.28 0.18 0.11 0.16 0.17

Expected number of sales she makes is (choose closest one):

(i) 0.41
(ii) 1.45
(iii) 2.46
(iv) 3.45
(v) 3.76

(b) [1 point] Number of sales of household appliances, 𝑌 , Whirlpool representative
Darlene makes in a day is given by following probability distribution.

𝑦 0 1 2 3 4 5
𝑝(𝑦) 0.10 0.28 0.18 0.11 0.16 0.17

Standard deviation in number of sales she makes is (choose closest one):

(i) 0.37
(ii) 0.40
(iii) 1.66
(iv) 2.75
(v) 3.76
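
Both the mean in (a) and the standard deviation asked for here follow from the same table (illustrative Python sketch, standard library only):

    from math import sqrt

    y = [0, 1, 2, 3, 4, 5]
    p = [0.10, 0.28, 0.18, 0.11, 0.16, 0.17]

    mean = sum(yi * pi for yi, pi in zip(y, p))               # E(Y)
    var = sum((yi - mean) ** 2 * pi for yi, pi in zip(y, p))  # V(Y)
    print(round(mean, 2), round(sqrt(var), 2))                # 2.46 1.66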

(c) [1 point] If 𝑉 (𝑌 ) = 6, then 𝑉 (2𝑌 − 4) = (choose one)

(i) 8
(ii) 16
(iii) 20
(iv) 24
(v) 32

(d) [1 point] On a multiple choice exam with 5 possible answers for each of 10 ques-
tions, what is probability a student gets 8 or more correct answers just by
guessing? Choose closest one. [Hint: binomial.]
(i) 5.7926 × 10−5
(ii) 6.7926 × 10−5
(iii) 7.7926 × 10−5
(iv) 8.7926 × 10−5
(v) 9.7926 × 10−5
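
The binomial tail can be summed directly (illustrative Python sketch, standard library only):

    from math import comb

    n, p = 10, 1 / 5   # 10 questions, 1-in-5 chance of guessing each correctly
    p_8_or_more = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(8, n + 1))
    print(p_8_or_more)  # about 7.79e-05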
(e) [1 point] There is a 43% chance of making a basket on a free throw and each
throw is independent of each other throw. What is expected number of throws
to make first basket? Choose one. [Hint: geometric.]
(i) 2.33
(ii) 4.65
(iii) 6.11
(iv) 8.39
(v) 10.42
(f ) [1 point] There is a 95% chance of passing any exam. What is variance in number
of attempts until third exam is passed? Choose closest one. [Hint: negative
binomial.]
(i) 0.146
(ii) 0.156
(iii) 0.166
(iv) 0.176
(v) 0.186
(g) [1 point] Eight journalists randomly picked from a pack of 240 of which 15 are
also photographers. Chance 3 of 8 picked are photographers is (choose one)
Writing C(n, k) for the binomial coefficient:

(i) C(8, 3) C(232, 5) / C(240, 8)
(ii) C(15, 3) C(225, 5) / C(225, 8)
(iii) C(15, 3) C(225, 5) / C(240, 8)
(iv) C(15, 5) C(225, 3) / C(240, 8)
(v) C(15, 3) C(5, 5) / C(15, 8)
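
The hypergeometric probability of picking exactly 3 photographers can be evaluated with stdlib combinatorics (illustrative Python sketch):

    from math import comb

    # 240 journalists, 15 of whom are also photographers; pick 8 without replacement.
    p_exactly_3 = comb(15, 3) * comb(225, 5) / comb(240, 8)
    print(p_exactly_3)  # probability that exactly 3 of the 8 picked are photographers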

(h) [1 point] Average of 𝜆 = 7 particles hit a magnetic detection field per microsec-
ond. What is probability at most 5 particles hit in one microsecond? Choose
closest one. [Hint: poisson.]

(i) 0.231
(ii) 0.254
(iii) 0.273
(iv) 0.293
(v) 0.301
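
The Poisson cumulative probability can be summed term by term (illustrative Python sketch, standard library only):

    from math import exp, factorial

    lam = 7
    p_at_most_5 = sum(exp(-lam) * lam**k / factorial(k) for k in range(6))
    print(round(p_at_most_5, 3))  # about 0.301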

(i) [1 point] Identify the moment generating function

𝑚(𝑡) = (1/4)𝑒^𝑡 / [1 − (3/4)𝑒^𝑡].

(i) binomial, 𝝁 = 4
(ii) binomial, 𝝈 = 4
(iii) geometric, 𝝁 = 4
(iv) geometric, 𝝈 = 4
(v) poisson, 𝝁 = 4

(j) [1 point] According to Tchebysheff’s Theorem, if 𝜇 = 2 and 𝜎 = 0.5 for random
variable 𝑌 , then 𝑃 (1 < 𝑌 < 3) ≥ 𝑎 where 𝑎 = (choose one)

(i) 0.75
(ii) 0.80
(iii) 0.85
(iv) 0.90
(v) 0.95

4. Continuous Variables and Their Probability Distributions

(a) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = 𝑘𝑦 + 5 for 0 ≤ 𝑦 ≤ 10, and 𝑓(𝑦) = 0 otherwise.

Then constant 𝑘 is (choose one)

(i) −47/50
(ii) −48/50
(iii) −49/50
(iv) −50/50
(v) does not exist

(b) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = 1/𝑘 for −3 ≤ 𝑦 ≤ 15, and 𝑓(𝑦) = 0 otherwise.

Then constant 𝑘 = (choose one)

(i) 3
(ii) 9
(iii) 12
(iv) 15
(v) 18

(c) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = 1/18 for −3 ≤ 𝑦 ≤ 15, and 𝑓(𝑦) = 0 otherwise.

Then, for −3 ≤ 𝑦 ≤ 15, distribution 𝐹 (𝑦) =


(i) (𝑦 − 3)/18
(ii) 𝑦/15
(iii) (𝑦 − 3)/15
(iv) 𝑦/18
(v) (𝑦 + 3)/18

(d) [1 point] Let 𝑌 be a continuous random variable where



𝐹(𝑦) = 0 for 𝑦 < −3,   𝐹(𝑦) = (𝑦 + 3)/18 for −3 ≤ 𝑦 < 15,   𝐹(𝑦) = 1 for 𝑦 ≥ 15.

𝑃 (−2 < 𝑌 < 9) ≈ (choose closest one)

(i) 0.61
(ii) 0.68
(iii) 0.73
(iv) 0.79
(v) 0.81

(e) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = 1/18 for −3 ≤ 𝑦 ≤ 15, and 𝑓(𝑦) = 0 otherwise.

Then expected value 𝜇 = (choose closest one)

(i) 3
(ii) 6
(iii) 9
(iv) 15
(v) 18

(f ) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = 1/18 for −3 ≤ 𝑦 ≤ 15, and 𝑓(𝑦) = 0 otherwise.

Then variance 𝜎 2 = (choose closest one)

(i) 23
(ii) 24
(iii) 25
(iv) 26
(v) 27

(g) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = 1/18 for −3 ≤ 𝑦 ≤ 15, and 𝑓(𝑦) = 0 otherwise.

Then 𝐸[2𝑌 3 − 𝑌 2 ] = (choose closest one)

(i) 1241
(ii) 1341
(iii) 1441
(iv) 1541
(v) 1641
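
The expectation can be checked by numerically integrating against the uniform density (illustrative Python sketch, standard library only, using a simple midpoint rule):

    # E[g(Y)] = integral of g(y) * f(y) dy with f(y) = 1/18 on [-3, 15].
    a, b, n = -3.0, 15.0, 200_000
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        y = a + (i + 0.5) * h            # midpoint of the i-th subinterval
        total += (2 * y**3 - y**2) * (1 / 18) * h
    print(round(total))  # 1341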

(h) [1 point] Let 𝑍 be a standard normal variable.


𝑃 (−2.3 < 𝑍 < 0.14) = (choose closest one)

(i) 0.4449
(ii) 0.5449
(iii) 0.6449
(iv) 0.7449
(v) 0.8449

(i) [1 point] Gamma function evaluated at 6 is Γ(6) = (choose one)

(i) 6
(ii) 24
(iii) 120
(iv) 720
(v) 5040

(j) [1 point] A chi–squared random variable is a special case of a gamma random
variable with parameters (𝛼, 𝛽) = (choose one)

(i) (𝜈/2, 0)
(ii) (𝜈/2, 1/2)
(iii) (𝜈/2, 1)
(iv) (𝜈/2, 2)
(v) (𝜈/3, 2)

(k) [1 point] Memoryless property of exponential distribution is (choose one)

(i) 𝑷 (𝒀 > 𝒔 + 𝒕∣𝒀 > 𝒕) = 𝑷 (𝒀 > 𝒕); 𝒔, 𝒕 ≥ 0


(ii) 𝑷 (𝒀 > 𝒔 + 𝒕∣𝒀 > 𝒕) = 𝑷 (𝒀 > 𝒔); 𝒔, 𝒕 ≥ 0
(iii) 𝑷 (𝒀 > 𝒔∣𝒀 > 𝒕) = 𝑷 (𝒀 > 𝒔); 𝒔, 𝒕 ≥ 0
(iv) 𝑷 (𝒀 > 𝒔 + 𝒕∣𝒀 > 𝒕) = 𝑷 (𝒀 > 𝒔 + 𝒕); 𝒔, 𝒕 ≥ 0
(v) 𝑷 (𝒀 > 𝒔∣𝒀 > 𝒕) = 𝑷 (𝒀 > 𝒕)𝑷 (𝒀 > 𝒔); 𝒔, 𝒕 ≥ 0

(l) [1 point] For a Beta random variable, parameters (𝛼, 𝛽) = (4.5, 6.5),
𝜇 = (choose closest one)

(i) 0.409
(ii) 0.419
(iii) 0.429
(iv) 0.439
(v) 0.449
(m) [1 point] Moment–generating function for normal random variable 𝑌 is 𝑒^(𝜇𝑡 + 𝜎²𝑡²/2)
and so, for 𝑚(𝑡) = 𝑒^(−5𝑡 + 6𝑡²), 𝑃 (𝑌 ≤ −7) ≈ (choose closest one)

(i) 0.104
(ii) 0.211
(iii) 0.233
(iv) 0.254
(v) 0.282
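
Reading 𝜇 and 𝜎² off the given MGF and evaluating the normal probability with the standard-normal CDF (illustrative Python sketch, standard library only):

    from math import erf, sqrt

    mu, sigma2 = -5.0, 12.0             # from m(t) = exp(-5t + 6t^2): sigma^2 / 2 = 6
    z = (-7 - mu) / sqrt(sigma2)
    phi = 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z
    print(round(phi, 3))                # about 0.282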

5. Multivariate Probability Distributions

(a) [1 point] Consider joint density 𝑝(𝑦1 , 𝑦2 )

𝑦2 ↓ 𝑦1 → 1 2 3
-1 0 0.4 0.1
-2 0.3 0.2 0

The marginal density for 𝑌2 is (choose one)

(i)   𝑦1:     -1    -2
      𝑝(𝑦1):  0.5   0.5
(ii)  𝑦1:      1     2     3
      𝑝(𝑦1):  0.3   0.6   0.1
(iii) 𝑦2:     -1    -2
      𝑝(𝑦2):  0.5   0.5
(iv)  𝑦2:      1     2     3
      𝑝(𝑦2):  0.3   0.6   0.1
(v)   𝑦2:     -1    -2
      𝑝(𝑦1):  0.5   0.5
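
Summing the joint table over 𝑦1 produces the marginal for 𝑌2 (illustrative Python sketch):

    # Joint probabilities p(y1, y2) keyed by (y1, y2).
    joint = {(1, -1): 0.0, (2, -1): 0.4, (3, -1): 0.1,
             (1, -2): 0.3, (2, -2): 0.2, (3, -2): 0.0}

    marginal_y2 = {}
    for (y1, y2), p in joint.items():
        marginal_y2[y2] = marginal_y2.get(y2, 0) + p
    print(marginal_y2)  # {-1: 0.5, -2: 0.5}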

(b) [1 point] Consider joint density 𝑝(𝑦1 , 𝑦2 )

𝑦2 ↓ 𝑦1 → 1 2 3
-1 0 0.4 0.1
-2 0.3 0.2 0

𝐹 (3, −1) = (choose closest one)

(i) 0.1
(ii) 0.2
(iii) 0.3
(iv) 0.4
(v) 0.5

(c) [1 point] Consider joint density of 𝑌1 and 𝑌2


𝑓(𝑦1, 𝑦2) = (1/4)(3𝑦1 + 5𝑦2) for 0 ≤ 𝑦1 ≤ 1, 0 ≤ 𝑦2 ≤ 1, and 𝑓(𝑦1, 𝑦2) = 0 otherwise,

and also marginal densities for 𝑌1 and 𝑌2

𝑓1(𝑦1) = (3/4)𝑦1 + 5/8 for 0 < 𝑦1 < 1, and 0 elsewhere,

and

𝑓2(𝑦2) = 3/8 + (5/4)𝑦2 for 0 < 𝑦2 < 1, and 0 elsewhere.

Then 𝑓(𝑦1 ∣ 𝑦2) = (choose one)

(i) [(1/4)(3𝑦1 + 5𝑦2)] / [3/8 + (5/4)𝑦1]
(ii) [(3/4)𝑦1 + 5/8] / [(1/4)(3𝑦1 + 5𝑦2)]
(iii) [(1/4)(3𝑦1 + 5𝑦2)] / [(3/4)𝑦1 + 5/8]
(iv) [3/8 + (5/4)𝑦2] / [(1/4)(3𝑦1 + 5𝑦2)]
(v) [(1/4)(3𝑦1 + 5𝑦2)] / [3/8 + (5/4)𝑦2]

(d) [1 point] Random variables 𝑌1 and 𝑌2 are independent if (choose one)

(i) 𝑓(𝑦1, 𝑦2) = 𝑓1(𝑦1)𝑓2(𝑦2)
(ii) 𝑓(𝑦1, 𝑦2) ≠ 𝑓1(𝑦1)𝑓2(𝑦2)
(iii) 𝑓(𝑦1, 𝑦2) = 𝑓1(𝑦1) + 𝑓2(𝑦2)
(iv) 𝑓(𝑦1, 𝑦2) ≠ 𝑓1(𝑦1) + 𝑓2(𝑦2)
(v) 𝑓(𝑦1, 𝑦2) ≠ 𝑓1(𝑦1)/𝑓2(𝑦2)

(e) [1 point] Consider joint density 𝑝(𝑦1 , 𝑦2 )

𝑦2 ↓ 𝑦1 → 1 2 3
-1 0 0.4 0.1
-2 0.3 0.2 0

𝑉 (𝑌1 ) = (choose one)

(i) 0.16
(ii) 0.22
(iii) 0.28
(iv) 0.32
(v) 0.36

(f ) [1 point] Let

𝑓(𝑦1, 𝑦2) = 6(1 − 𝑦2) for 0 ≤ 𝑦1 ≤ 𝑦2 ≤ 1, and 𝑓(𝑦1, 𝑦2) = 0 otherwise.

Then 𝐸(𝑌1𝑌2) = (circle one)

(i) 1/20
(ii) 2/20
(iii) 3/20
(iv) 4/20
(v) 5/20
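
A crude grid approximation of the double integral over the triangle 0 ≤ 𝑦1 ≤ 𝑦2 ≤ 1 (illustrative Python sketch, standard library only):

    # E(Y1*Y2) = double integral of y1*y2*6*(1 - y2) over 0 <= y1 <= y2 <= 1.
    n = 1000
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        y2 = (i + 0.5) * h
        for j in range(n):
            y1 = (j + 0.5) * h
            if y1 <= y2:
                total += y1 * y2 * 6 * (1 - y2) * h * h
    print(round(total, 3))  # about 0.15 = 3/20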

(g) [1 point] If

𝑓(𝑦1, 𝑦2) = 6(1 − 𝑦2) for 0 ≤ 𝑦1 ≤ 𝑦2 ≤ 1, and 𝑓(𝑦1, 𝑦2) = 0 otherwise,

and two marginal densities are

𝑓1(𝑦1) = 3 − 6𝑦1 + 3𝑦1²

and

𝑓2(𝑦2) = 6𝑦2 − 6𝑦2²,

and 𝐸(𝑌1𝑌2) = 3/20, then Cov(𝑌1, 𝑌2) = (choose one)

(i) 1/40
(ii) 2/40
(iii) 3/40
(iv) 4/40
(v) 5/40

(h) [1 point] Consider joint density 𝑝(𝑦1 , 𝑦2 )

𝑦2 ↓ 𝑦1 → 1 2 3
-1 0 0.4 0.1
-2 0.3 0.2 0

Cov(3𝑌1 , 4𝑌2 ) = (choose one)

(i) 2.1
(ii) 2.2
(iii) 2.3
(iv) 2.4
(v) 2.5
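
A direct computation from the joint table, using Cov(𝑎𝑌1, 𝑏𝑌2) = 𝑎𝑏 Cov(𝑌1, 𝑌2) (illustrative Python sketch):

    joint = {(1, -1): 0.0, (2, -1): 0.4, (3, -1): 0.1,
             (1, -2): 0.3, (2, -2): 0.2, (3, -2): 0.0}

    e_y1 = sum(y1 * p for (y1, y2), p in joint.items())
    e_y2 = sum(y2 * p for (y1, y2), p in joint.items())
    e_y1y2 = sum(y1 * y2 * p for (y1, y2), p in joint.items())
    cov = e_y1y2 - e_y1 * e_y2
    print(round(3 * 4 * cov, 1))  # 2.4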

(i) [1 point] Consider density


𝑓(𝑦1, 𝑦2) = (1/8) 𝑦1 𝑒^(−(𝑦1 + 𝑦2)/2) for 𝑦1 > 0, 𝑦2 > 0, and 𝑓(𝑦1, 𝑦2) = 0 otherwise,

then 𝑌1 and 𝑌2 are independent, where (choose one)

(i) 𝑌1 is gamma where 𝛼 = 2 and 𝛽 = 2, and 𝑌2 is exponential where 𝜈 = 1


(ii) 𝑌1 is gamma where 𝛼 = 2 and 𝛽 = 2, and 𝑌2 is exponential where 𝜈 = 3
(iii) 𝑌1 is gamma where 𝛼 = 2 and 𝛽 = 3, and 𝑌2 is exponential where 𝜈 = 2
(iv) 𝑌1 is gamma where 𝛼 = 2 and 𝛽 = 2, and 𝑌2 is exponential where 𝜈 = 2
(v) 𝑌1 is gamma where 𝛼 = 3 and 𝛽 = 2, and 𝑌2 is exponential where 𝜈 = 2

(j) [1 point] There are 9 different faculty members and 3 subjects: mathematics,
statistics and physics. There is a 60%, 35% and 15% chance a faculty mem-
ber teaches mathematics, statistics and physics, respectively. Let 𝑌1 , 𝑌2 and
𝑌3 represent number of faculty teaching mathematics, statistics and physics,
respectively. Then 𝑉 (𝑌1 + 3𝑌2 ) = (choose closest one)

(i) 9.0475
(ii) 9.1475
(iii) 9.2475
(iv) 9.3475
(v) 9.4475
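
A check using the multinomial moment formulas 𝑉(𝑌𝑖) = 𝑛𝑝𝑖(1 − 𝑝𝑖) and Cov(𝑌1, 𝑌2) = −𝑛𝑝1𝑝2, taking the stated percentages at face value (illustrative Python sketch):

    n, p1, p2 = 9, 0.60, 0.35

    v_y1 = n * p1 * (1 - p1)
    v_y2 = n * p2 * (1 - p2)
    cov_12 = -n * p1 * p2

    # V(Y1 + 3*Y2) = V(Y1) + 9*V(Y2) + 2*3*Cov(Y1, Y2)
    print(round(v_y1 + 9 * v_y2 + 6 * cov_12, 4))  # 9.2475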

6. Functions of Random Variables

(a) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = (3/2)𝑦² for −1 ≤ 𝑦 ≤ 1, and 𝑓(𝑦) = 0 elsewhere.

Determine density for 𝑈 = 3 − 𝑌 . Choose one.

(i) 𝑓𝑈(𝑢) = (1/2)(3 − 𝑢)², 2 ≤ 𝑢 ≤ 4
(ii) 𝑓𝑈(𝑢) = (2/2)(3 − 𝑢)², 2 ≤ 𝑢 ≤ 3
(iii) 𝑓𝑈(𝑢) = (3/2)(3 − 𝑢)², 2 ≤ 𝑢 ≤ 4
(iv) 𝑓𝑈(𝑢) = (4/2)(3 − 𝑢)², 2 ≤ 𝑢 ≤ 4
(v) 𝑓𝑈(𝑢) = (5/2)(3 − 𝑢)², 2 ≤ 𝑢 ≤ 6

(b) [1 point] Let 𝑌 be a continuous random variable where


𝑓(𝑦) = 1/2 for 9 ≤ 𝑦 ≤ 11, and 𝑓(𝑦) = 0 elsewhere.

If 𝑈 = 2𝑌², then 𝑓𝑈(𝑢) = 𝑓𝑌(ℎ⁻¹(𝑢)) · (𝑑/𝑑𝑢) ℎ⁻¹(𝑢) = (choose one)

(i) 𝑓𝑌[(𝑢/2)^(1/2)] · (1/4)(𝑢/2)^(−1/3) = (1/8)(𝑢/2)^(−1/3)
(ii) 𝑓𝑌[(𝑢/2)^(1/2)] · (1/3)(𝑢/2)^(−1/2) = (1/6)(𝑢/2)^(−1/2)
(iii) 𝑓𝑌[(𝑢/3)^(1/2)] · (1/4)(𝑢/2)^(−1/2) = (1/12)(𝑢/2)^(−1/3)
(iv) 𝑓𝑌[(𝑢/2)^(1/2)] · (1/4)(𝑢/2)^(−1/2) = (1/8)(𝑢/2)^(−1/2)
(v) 𝑓𝑌[(𝑢/2)^(1/3)] · (1/4)(𝑢/2)^(−1/3) = (1/12)(𝑢/3)^(−1/3)

(c) [1 point] Consider independent geometric variables 𝑌1 , 𝑌2 , 𝑌3 , all with parameter
𝑝, 𝑖 = 1, 2, 3, and so all with moment generating function

𝑚𝑌𝑖(𝑡) = 𝑝𝑒^𝑡 / [1 − (1 − 𝑝)𝑒^𝑡],   𝑖 = 1, 2, 3.

Calculate moment generating function of 𝑈 = 𝑌1 + 𝑌2 + 𝑌3 to determine distribution
of 𝑈 (choose one):

(i) binomial, with parameters (𝑟 = ∑_{𝑖=1}^{3} 𝑛𝑖 , 𝑝)
(ii) negative binomial with parameters (𝑟 = 3, 𝑝)
(iii) negative binomial with parameters (𝑟 = ∑_{𝑖=1}^{3} 𝑛𝑖 , 𝑝)
(iv) geometric with parameter 𝑝
(v) geometric with parameter 3

(d) [1 point] Consider 𝑌1 , . . . , 𝑌𝑛 independent beta with 𝛼 = 4 and 𝛽 = 1,


[Γ(𝛼 + 𝛽)/(Γ(𝛼)Γ(𝛽))] 𝑦^(𝛼−1) (1 − 𝑦)^(𝛽−1) = [Γ(5)/(Γ(4)Γ(1))] 𝑦^(4−1) (1 − 𝑦)^(1−1) = 4𝑦³,

with distribution function


𝐹𝑌(𝑦) = ∫₀^𝑦 4𝑡³ 𝑑𝑡 = 𝑦⁴.

Expected value of 𝑌(𝑛) = max(𝑌1 , . . . , 𝑌𝑛 ) is (choose one)


(i) 𝑛/(4𝑛 + 1)
(ii) 4𝑛/(4𝑛 + 1)
(iii) 4𝑛/(4𝑛 + 3)
(iv) 𝑛/(4𝑛 + 5)
(v) 𝑛/(4𝑛 + 6)
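
A simulation check for one sample size, using the fact that 𝐹(𝑦) = 𝑦⁴ means 𝑌 can be generated as 𝑈^(1/4) with 𝑈 uniform on (0, 1); the choice 𝑛 = 5 is arbitrary (illustrative Python sketch, standard library only):

    import random

    random.seed(0)
    n, reps = 5, 100_000
    total = 0.0
    for _ in range(reps):
        sample_max = max(random.random() ** 0.25 for _ in range(n))
        total += sample_max
    print(round(total / reps, 3), 4 * n / (4 * n + 1))  # both close to 20/21 = 0.952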

7. Sampling Distributions and the Central Limit Theorem

(a) [1 point] Assume number of fish caught, 𝑌 , at a lake on any trip, is a random
variable with following distribution.

𝑦 1 2 3
𝑝(𝑦) 0.1 0.8 0.1

Two parameters, 𝜇𝑋̄ and 𝜎𝑋̄², for sampling distribution of average number of
fish caught on two trips to lake are given by, respectively, (choose closest pair)

(i) (2, 0.1)


(ii) (2, 0.2)
(iii) (2, 0.3)
(iv) (2, 0.4)
(v) (2, 0.5)
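
Enumerating the nine equally weighted two-trip outcomes gives the mean and variance of the sample mean directly (illustrative Python sketch, standard library only):

    from itertools import product

    dist = {1: 0.1, 2: 0.8, 3: 0.1}

    mean = var = 0.0
    for (y1, y2) in product(dist, repeat=2):
        prob = dist[y1] * dist[y2]
        xbar = (y1 + y2) / 2
        mean += xbar * prob
        var += xbar**2 * prob
    var -= mean**2
    print(round(mean, 2), round(var, 2))  # 2.0 0.1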

(b) [1 point] Consider 𝑇 , which follows a 𝑡 distribution where 𝑛 = 15.

If 𝑃 (𝑇 ≤ 𝜙0.75 ) = 0.75, 𝜙0.75 = (choose one)

(i) 0.61
(ii) 0.65
(iii) 0.69
(iv) 0.73
(v) 0.77

(c) [1 point] Suppose lake level, 𝑌 , on any given day in Lake Michigan is normally
distributed; variance in lake level, 𝑆1², is measured over 𝑛1 = 5 random days at
St. Joseph harbor, and variance in lake level, 𝑆2², is measured over 𝑛2 = 7 random
days at South Haven harbor. If 𝜎1² = 3𝜎2² and 𝑃 (𝑆1²/𝑆2² < 𝑏) = 0.95, then 𝑏 =
(choose one)

(i) 9.34
(ii) 10.24
(iii) 10.75
(iv) 10.85
(v) 11.03

(d) [1 point] We want to know fraction of times a measuring instrument is incorrect.


How many measurements should be taken by instrument if we want sample
fraction incorrect to within 0.05 of population fraction incorrect with probability
0.80? (Hint: maximum number occurs at 𝑝 = 0.5.) Choose one.

(i) 160
(ii) 164
(iii) 170
(iv) 174
(v) 180
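
The usual sample-size formula 𝑛 = 𝑝(1 − 𝑝)(𝑧/𝐸)² with 𝑝 = 0.5, checked numerically; the table value 𝑧 ≈ 1.28 for 80% central probability is assumed (illustrative Python sketch, standard library only):

    from math import ceil

    z = 1.28   # z-value with 10% in each tail, i.e. P(-z < Z < z) = 0.80
    E = 0.05   # desired margin between sample and population fraction
    p = 0.5    # worst case, maximizes p*(1 - p)

    n = p * (1 - p) * (z / E) ** 2
    print(round(n, 2), ceil(n))  # 163.84 -> 164 measurements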
