
Ashland MultiComm Services (AMS) provides high-quality telecommunications services in the Greater

Ashland area. AMS traces its roots to a small company that redistributed the broadcast television signals
from nearby major metropolitan areas but has evolved into a provider of a wide range of broadband
services for residential customers.

AMS management believes that a combination of increased promotional expenditures, adjustment in


subscription fees, and improved customer service will allow AMS to successfully face these challenges.
To help determine the proper mix of strategies to be taken, AMS management has decided to organize a
research team to undertake a study. The managers suggest that the research team examine the
company’s own historical data for number of subscribers, revenues, and subscription renewal rates for
the past few years. They direct the team to examine year-to-date data as well, as the managers suspect
that some of the changes they have seen have been a relatively recent phenomenon.

1. What type of data source would the company’s own historical data be? Identify other possible
data sources that the research team might use to examine the current marketplace for
residential broadband services in a city such as Ashland.
• The company's own historical data already exists and was not collected specifically for this study, so it is an internal, secondary data source. Primary data, by contrast, is collected directly by the researchers who then perform the analysis on it. To examine the current marketplace for residential broadband services in a city such as Ashland, the research team could also draw on external secondary sources, such as government and census data, industry and trade publications, and syndicated market-research reports, or collect primary data through its own customer surveys.

2. What type of data collection techniques might the team employ?

The data-collection team may employ any of the following techniques:

• Personal one-on-one interviews
• Questionnaires and surveys
• Observational studies, including analysis of pre-existing data
• Focus groups, in which a moderated meeting is held and the assembled participants are asked questions about AMS's services

3. In their suggestions and directions, the AMS managers have named a number of possible
variables to study, but offered no operational definitions for those variables. What types of
possible misunderstandings could arise if the team and managers do not first properly define
each variable cited?
• The variables identified from the company's historical data are the number of subscribers, revenues, and subscription renewal rates for the past few years. The managers also direct the team to examine year-to-date data, because they suspect that some of the changes they have seen are relatively recent phenomena.
• Problems that could arise if these variables are not given operational definitions:
o Segregation of customers. Customers should be categorized according to the services they receive, such as digital cable television, local and long-distance telephone service, and high-speed Internet access, and the analysis should then be completed for each of these sub-categories rather than using a single all-inclusive approach. If all of the data are grouped together, it becomes more difficult to pinpoint the weakest service or the problem area. Because each service generates a different amount and type of data, pooling them is likely to distort the results and make it difficult to reach accurate, useful conclusions about the targeted variables.
o The variables should also be broken out by service because the associated costs differ, for example installation costs and data-transmission costs.
o Analyzing a full year as a single block may also introduce error, because seasonal trends may cause customers to use their services differently at different times of the year.

Meanwhile, the AMS technical services department has embarked on a quality improvement effort. Its
first project relates to maintaining the target upload speed for its Internet service subscribers. Upload
speeds are measured on a standard scale in which the target value is 1.0. Data collected over the past
year indicate that the upload speed is approximately normally distributed, with a mean of 1.005 and a
standard deviation of 0.10. Each day, one upload speed is measured. The upload speed is considered
acceptable if the measurement on the standard scale is between 0.95 and 1.05.

4. Assuming that the distribution of upload speed has not changed from what it was in the past
year, what is the probability that the upload speed is
• less than 1.0?
• between 0.95 and 1.0?
• between 1.0 and 1.05?
• less than 0.95 or greater than 1.05?

The mean μ = 1.005 and the standard deviation σ = 0.10.

The following formula converts an upload speed X to a Z value, which is then used to find the probability from the standard normal distribution: Z = (X - μ)/(σ/√n). Because only one upload speed is measured each day, n = 1, so the formula reduces to Z = (X - μ)/σ.

a) Probability that the upload speed is less than 1.0

= P[Z < (1 - 1.005)/(0.10/sqrt(1))]


= P[Z < -0.05]. The standard normal distribution function in Excel can be used to calculate this probability:
= NORMSDIST(-0.05)
= 0.4801

b) Probability that the upload speed is between .95 and 1.0

= P[Z < (1 - 1.005)/(0.10/sqrt(1))] - P[Z < (0.95 - 1.005)/(0.10/sqrt(1))]


= NORMSDIST(-0.05) - NORMSDIST(-0.55)
= 0.48 - 0.2911
= 0.1889

c) Probability that the upload speed is between 1.0 and 1.05

= P[Z < (1.05 - 1.005)/(0.10/sqrt(1))] - P[Z < (1 - 1.005)/(0.10/sqrt(1))]


= NORMSDIST(0.45) - NORMSDIST(-0.05)
= 0.6736 - 0.4801
= 0.1935

d) Probability that the upload speed is less than .95 or greater than 1.05

= P[Z < (0.95 - 1.005)/(0.10/sqrt(1))] + 1 - P[Z < (1.05 - 1.005)/(0.10/sqrt(1))]


= NORMSDIST(-0.55) + 1 - NORMSDIST(0.45)
= 0.2911 + 1 - 0.6736
= 0.6175
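As an optional cross-check (this assumes Python with the scipy package is available; the original work uses Excel), the four probabilities above can be reproduced with a short script:

from scipy import stats

# Single daily measurement: X ~ Normal(mean = 1.005, standard deviation = 0.10)
mu, sigma = 1.005, 0.10

p_a = stats.norm.cdf(1.0, mu, sigma)                                    # a) ~0.4801
p_b = stats.norm.cdf(1.0, mu, sigma) - stats.norm.cdf(0.95, mu, sigma)  # b) ~0.1889
p_c = stats.norm.cdf(1.05, mu, sigma) - stats.norm.cdf(1.0, mu, sigma)  # c) ~0.1935
p_d = stats.norm.cdf(0.95, mu, sigma) + (1 - stats.norm.cdf(1.05, mu, sigma))  # d) ~0.6175

print(round(p_a, 4), round(p_b, 4), round(p_c, 4), round(p_d, 4))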

5. The objective of the operations team is to reduce the probability that the upload speed is below
1.0. Should the team focus on process improvement that increases the mean upload speed to
1.05 or on process improvement that reduces the standard deviation of the upload speed to
0.075? Explain.

If the std dev is reduced to 0.075, the probability that the speed is less than 1 is

= P[Z < (1 - 1.005)/(0.075/sqrt(1))]


= P[Z < -0.067]
= 0.4734

If the mean is improved to 1.05, then the probability that the speed is less than 1 is

= P[Z < (1 - 1.05)/(0.10/sqrt(1))]


= P[Z < -0.5]
= 0.3085

Therefore, the operations team should focus on the process improvement that increases the mean upload speed to 1.05, because the probability that the upload speed falls below 1.0 is lower in that case (0.3085 versus 0.4734).
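The same comparison can be made in a couple of lines (again assuming Python with scipy, which is not part of the original Excel-based work):

from scipy import stats

# Option 1: keep the mean at 1.005 but reduce the standard deviation to 0.075
p_smaller_sd = stats.norm.cdf(1.0, 1.005, 0.075)   # ~0.4734

# Option 2: keep the standard deviation at 0.10 but raise the mean to 1.05
p_higher_mean = stats.norm.cdf(1.0, 1.05, 0.10)    # ~0.3085

print(round(p_smaller_sd, 4), round(p_higher_mean, 4))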

Continuing the quality improvement effort described above, the target upload speed for AMS Internet
service subscribers has been monitored. As before, upload speeds are measured on a standard scale in
which the target value is 1.0. Data collected over the past year indicate that the upload speeds are
approximately normally distributed, with a mean of 1.005 and a standard deviation of 0.10.

6. Each day, at 25 random times, the upload speed is measured. Assuming that the distribution has
not changed from what it was in the past year, what is the probability that the mean upload
speed is
a) less than 1.0?
b) between 0.95 and 1.0?
c) between 1.0 and 1.05?
d) less than 0.95 or greater than 1.05?
e) Suppose that the mean upload speed of today’s sample of 25 is 0.952. What conclusion can
you reach about the mean upload speed today based on this result? Explain.

a) Probability that the mean upload speed is less than 1.0:

With n = 25 measurements per day, the standard error of the mean is 0.10/√25 = 0.02. The Excel function used is:
=NORMDIST(1,1.005,0.02,1)
= 0.4013

b) Probability that the mean upload speed is between 0.95 and 1.0:

=NORMDIST(1,1.005,0.02,1) - NORMDIST(0.95,1.005,0.02,1)
= 0.3983

c) Probability that the mean upload speed is between 1.0 and 1.05:
=NORMDIST(1.05,1.005,0.02,1) - NORMDIST(1,1.005,0.02,1)
= 0.5865

d) Probability that the mean upload speed is less than 0.95 or greater than 1.05:
= 1 - {P(0.95 < mean < 1) + P(1 < mean < 1.05)}
= 1 - 0.9848
= 0.0152

e) Comparing today's sample mean to the assumed population mean:
z = (0.952 - 1.005)/(0.10/√25) = -2.65
Today's sample mean lies 2.65 standard errors below the assumed mean of 1.005. If the distribution had not changed, a sample mean this low or lower would occur with probability of only about 0.004, so there is reason to believe that the mean upload speed today is lower than it was in the past year.
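These sampling-distribution calculations can also be checked with a short script (assuming Python with scipy; the 0.02 standard error follows from the case, and the calls mirror the Excel formulas above):

from math import sqrt
from scipy import stats

# Daily sample of n = 25 upload speeds drawn from Normal(1.005, 0.10)
mu, sigma, n = 1.005, 0.10, 25
se = sigma / sqrt(n)                                               # standard error = 0.02

p_a = stats.norm.cdf(1.0, mu, se)                                  # a) ~0.4013
p_b = stats.norm.cdf(1.0, mu, se) - stats.norm.cdf(0.95, mu, se)   # b) ~0.3983
p_c = stats.norm.cdf(1.05, mu, se) - stats.norm.cdf(1.0, mu, se)   # c) ~0.5865
p_d = 1 - (p_b + p_c)                                              # d) ~0.0152

z_today = (0.952 - mu) / se                                        # e) ~ -2.65
print(round(p_a, 4), round(p_b, 4), round(p_c, 4), round(p_d, 4), round(z_today, 2))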

7. Compare the results of AMS item 6 (a) through (d) above to those of AMS item 4. What conclusions can you reach concerning the differences?

Part   Sample size n = 1   Sample size n = 25

a)     0.4801              0.4013
b)     0.1889              0.3983
c)     0.1935              0.5865
d)     0.6175              0.0152

Comparing the results, the probabilities change with the larger sample size because the standard error of the mean shrinks from 0.10 (n = 1) to 0.10/√25 = 0.02 (n = 25). The sampling distribution of the sample mean is therefore much more tightly concentrated around 1.005: the probabilities of falling in the intervals near the mean (parts b and c) increase, while the probability of falling in the tails below 0.95 or above 1.05 (part d) drops sharply.

8. Continuing its monitoring of the upload speed described in item 4 above, the technical
operations department wants to ensure that the mean target upload speed for all Internet
service subscribers is at least 0.97 on a standard scale in which the target value is 1.0. Each day,
upload speed was measured 50 times, with the following results:
9. Compute the sample statistics and determine whether there is evidence that the population
mean upload speed is less than 0.97. Note your conclusions based on your statistical test.

Given: μ0 = 0.97, α = 0.05. From the given data: s = 0.1560, x̄ = 0.9587, n = 50.

• The Hypotheses: H0: μ ≥ 0.97, Ha: μ < 0.97. This is a left-tailed test.
• The Test Statistic: Since the population standard deviation is unknown, the Student's t-test is used. The test statistic is
t = (x̄ - μ0)/(s/√n) = (0.9587 - 0.97)/(0.1560/√50) = -0.512
• The p-value: The p-value for t = -0.512 with degrees of freedom df = n - 1 = 49 is approximately 0.305.
• The Critical Value: The critical value (left tail) at α = 0.05, for df = 49, is t-critical = -1.677.
• The Decision Rule: If t-observed < t-critical, then H0 will be rejected. Likewise, if the p-value < α, then H0 will be rejected.
• The Decision: Since t-observed (-0.512) is greater than t-critical (-1.677), H0 is not rejected. Likewise, since the p-value (≈0.305) is greater than α (0.05), H0 is not rejected.
• Conclusion: There is insufficient evidence to conclude that the population mean upload speed is less than 0.97.
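For reference, the same left-tailed test can be run from the summary statistics with a few lines of Python (assuming scipy is available; the raw 50 daily measurements are not reproduced here, so the sketch works from x̄, s, and n as given above):

from math import sqrt
from scipy import stats

# Summary statistics from the answer above; H0: mu >= 0.97 vs Ha: mu < 0.97 (left-tailed)
xbar, s, n, mu0, alpha = 0.9587, 0.1560, 50, 0.97, 0.05

t_stat = (xbar - mu0) / (s / sqrt(n))      # observed t statistic
p_value = stats.t.cdf(t_stat, df=n - 1)    # left-tail p-value
t_crit = stats.t.ppf(alpha, df=n - 1)      # critical value, ~ -1.677

reject_h0 = (t_stat < t_crit) or (p_value < alpha)
print(round(t_stat, 3), round(p_value, 4), round(t_crit, 3), reject_h0)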

Sources:
Levine, David M. Statistics for Managers Using Microsoft Excel, 8th Edition. Pearson, 2016. VitalBook file. (Levine 28, 213, 235, 304)
