What Is Normal Distribution?

Normal distribution, also known as the Gaussian distribution, is a probability distribution that is symmetric about the mean, showing that data near the mean are more frequent in occurrence than data far from the mean.

In graphical form, the normal distribution appears as a "bell curve".

The normal distribution is a symmetrical, bell-shaped distribution in which the mean, median, and mode are all equal. It is a central component of inferential statistics.

The standard normal distribution is a normal distribution represented in z-scores. It always has a mean of zero and a standard deviation of one. We can use the standard normal table to calculate the area under the curve between any two points.

The Formula for the Normal Distribution


The normal distribution follows the formula below. Note that only the values of the mean (μ) and the standard deviation (σ) are needed:

f(x) = (1 / (σ√(2π))) · e^(−(x − μ)² / (2σ²))

where:

 x = the value of the variable or data being examined, and f(x) = the probability density function
 μ = the mean
 σ = the standard deviation
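As an illustrative sketch (not part of the original notes), the density formula can be evaluated directly with only the Python standard library; the function name `normal_pdf` is our own choice:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Probability density of a normal distribution with mean mu and SD sigma."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    exponent = -((x - mu) ** 2) / (2.0 * sigma ** 2)
    return coeff * math.exp(exponent)

# For the standard normal, the density peaks at the mean with value
# 1/sqrt(2*pi) ≈ 0.3989, and is symmetric about it.
print(round(normal_pdf(0.0), 4))  # 0.3989
```

Because the formula depends on x only through (x − μ)², the curve is symmetric about the mean, which is why the mean, median, and mode coincide.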
WHY IS THE MEAN VALUE ZERO?

A z-score is computed as z = (x − mean)/SD. After this transformation, the mean of the standardized data is always 0 and the standard deviation is always 1: substituting the mean itself for x gives z = 0, so the center of the distribution maps to zero. Any normally distributed data set, when graphically visualized, will follow a bell-shaped symmetrical curve centered around the mean.

If all of the values in the sample are identical, the sample standard deviation will be zero.

When we convert our data into z scores, the mean will always end up being zero (it is, after
all, zero steps away from itself) and the standard deviation will always be one. Data
expressed in terms of z scores are known as the standard normal distribution, shown below in
all of its glory.

The standard normal distribution, also called the z-distribution, is a special normal
distribution where the mean is 0 and the standard deviation is 1. Any normal distribution can
be standardized by converting its values into z-scores. Z-scores tell you how many standard
deviations from the mean each value lies.

The mean is the average of the data, calculated by dividing the sum of the data by the number of data points. The mean of an arbitrary normal distribution is not zero. However, we can normalize the data so that it has zero mean and a standard deviation of one, which is called the standard normal distribution.

To standardize a value from a normal distribution, convert the individual value into a z-score: subtract the mean from your individual value, then divide the difference by the standard deviation.
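The two-step recipe above can be sketched as a one-line helper (the function name and the example numbers are illustrative, not from the original text):

```python
def z_score(x, mean, sd):
    """Standardize a value: subtract the mean, then divide by the standard deviation."""
    return (x - mean) / sd

# A score of 85 from a distribution with mean 70 and SD 10
# lies 1.5 standard deviations above the mean.
print(z_score(85, 70, 10))  # 1.5
```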

WHY IS THE STANDARD DEVIATION EQUAL TO ONE?

When you standardize a normal distribution, the mean becomes 0 and the standard deviation
becomes 1. This allows you to easily calculate the probability of certain values occurring in
your distribution, or to compare data sets with different means and standard deviations.

A mean of 0 and a standard deviation of 1 define the standard normal distribution, often called the bell curve. The most likely value is the mean, and the density falls off as you move farther away. In a truly flat (uniform) distribution, by contrast, no value is more likely than another.
The standard deviation of the z-scores is always 1. The graph of the z-score distribution
always has the same shape as the original distribution of sample values. The sum of the
squared z-scores is always equal to the number of z-score values.
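These three properties can be checked numerically. The sketch below uses a small made-up data set and the population standard deviation (dividing by n), under which the sum of squared z-scores equals n exactly:

```python
import math

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative sample

n = len(data)
mean = sum(data) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / n)  # population SD

z = [(x - mean) / sd for x in data]

print(round(sum(z) / n, 10))                # mean of z-scores: 0.0
print(round(sum(zi ** 2 for zi in z), 10))  # sum of squared z-scores: 8.0 (= n)
```

Standardizing shifts and rescales every value by the same amounts, so the shape of the distribution is unchanged; only its location and spread are normalized.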

Skewness

Skewness measures the degree of symmetry of a distribution. The normal distribution is symmetric and has a skewness of zero.

If the distribution of a data set instead has a skewness less than zero, or
negative skewness (left-skewness), then the left tail of the distribution is longer
than the right tail; positive skewness (right-skewness) implies that the right tail
of the distribution is longer than the left.

Kurtosis
Kurtosis measures the thickness of the tail ends of a distribution in relation to the tails of the normal distribution. The normal distribution has a kurtosis equal to 3.0.

Distributions with kurtosis greater than 3.0 exhibit tail data exceeding
the tails of the normal distribution (e.g., five or more standard deviations from
the mean). This excess kurtosis is known in statistics as leptokurtic, but is more
colloquially known as "fat tails." The occurrence of fat tails in financial
markets describes what is known as tail risk.

Distributions with kurtosis less than 3.0 (platykurtic) exhibit tails that are
generally less extreme ("skinnier") than the tails of the normal distribution.
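Both measures can be computed as standardized central moments. This is a minimal sketch using the population convention (dividing by n); the function name `moments` and the sample data are our own illustrations:

```python
import math

def moments(data):
    """Return (skewness, kurtosis) as 3rd and 4th standardized moments."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / n)
    skew = sum(((x - mean) / sd) ** 3 for x in data) / n   # 0 for a symmetric shape
    kurt = sum(((x - mean) / sd) ** 4 for x in data) / n   # ≈ 3.0 for a normal
    return skew, kurt

symmetric = [1, 2, 3, 4, 5, 6, 7]         # mirror-image around its mean
right_tailed = [1, 1, 2, 2, 3, 3, 4, 20]  # one large value stretches the right tail

print(round(moments(symmetric)[0], 6))    # 0.0
print(moments(right_tailed)[0] > 0)       # True: positive skew
```

Kurtosis above 3.0 flags a leptokurtic ("fat-tailed") shape; below 3.0, platykurtic.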

POSITIVE SKEWNESS:
In statistics, a positively skewed (or right-skewed) distribution is a type of distribution in
which most values are clustered around the left tail of the distribution while the right tail of
the distribution is longer. The positively skewed distribution is the direct opposite of the
negatively skewed distribution.
Income distribution is a prominent example of positively skewed distribution. This is because
a large percentage of the total people residing in a particular state tends to fall under the
category of a low-income earning group, while only a few people fall under the high-income
earning group. In a positively skewed distribution, the mean of the data is greater than the median, because the long right tail pulls the mean upward. In other words, most of the results cluster toward the lower side.
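The income example above can be sketched numerically (the figures are invented for illustration): a handful of very high earners pulls the mean above the median.

```python
import statistics

# Hypothetical annual incomes (in thousands): most people earn modest amounts,
# a few earn a great deal -- a right-skewed (positively skewed) shape.
incomes = [25, 28, 30, 32, 35, 38, 40, 45, 120, 400]

mean = statistics.mean(incomes)      # pulled up by the two large values
median = statistics.median(incomes)  # resistant to the long right tail

print(mean > median)  # True: in a positive skew the mean exceeds the median
```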
What causes positive skewness?
Another cause of skewness is start-up effects. For example, if a procedure initially has a lot of
successes during a long start-up period, this could create a positive skew in the data. (On the other hand, a start-up period with several initial failures can negatively skew the data.)
Skewness tells us the direction of outliers. In a positive skew, the tail of a distribution curve
is longer on the right side. This means the outliers of the distribution curve are further out
towards the right and closer to the mean on the left.
A positive skew means that the extreme data results are larger. This skews the data in that it
brings the mean (average) up. The mean will be larger than the median in a skewed data set.
A negative skew means the opposite: that the extreme data results are smaller.

NEGATIVE SKEWNESS:
Negative skew refers to a longer or fatter tail on the left side of the distribution, while
positive skew refers to a longer or fatter tail on the right. These two skews refer to the
direction or weight of the distribution. In addition, a distribution can have a zero skew.

An example of negatively skewed data could be the exam scores of a group of college
students who took a relatively simple exam. If you draw a curve of the group of students'
exam scores on a graph, the curve is likely to be skewed to the left.
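The exam-score example can be checked the same way (the scores are invented for illustration): the few very low scores drag the mean below the median, the mirror image of the income case.

```python
import statistics

# Hypothetical scores on an easy exam: most students score high, a few
# score very low, giving a long left tail (negative skew).
scores = [35, 55, 78, 82, 85, 88, 90, 92, 94, 96]

print(statistics.mean(scores) < statistics.median(scores))  # True
```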

What causes negative skewness?

Left-skewed distributions are also called negatively skewed distributions. That's because there is a long tail in the negative direction on the number line. The mean is also to the left of the peak.

In statistics, a negatively skewed (also known as left-skewed) distribution is a type of distribution in which more values are concentrated on the right side of the distribution graph while the left tail of the distribution graph is longer.
