S7-Slo1 & Slo2
The simple definition is that entropy is a measure of the disorder of a system. An
ordered system has low entropy, while a disordered system has high entropy. Physicists
often state the definition a bit differently: entropy is the energy of a closed system
that is unavailable to do work.
In machine learning, entropy measures the randomness or disorder of the data being
processed. In other words, entropy is the metric that quantifies a system's impurity
or unpredictability; a pure, uniform set of labels has zero entropy, while an evenly
mixed set has maximal entropy.
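As a sketch of how this impurity measure is computed in practice (for example, when scoring candidate splits in a decision tree), the helper below calculates the Shannon entropy of a list of class labels. The function name and the sample labels are illustrative, not from the original text.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy, in bits, of a collection of class labels."""
    counts = Counter(labels)
    total = len(labels)
    # Sum of -p * log2(p) over each observed class.
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# A pure (single-class) set is perfectly predictable: entropy 0.0.
print(entropy(["a", "a", "a", "a"]))  # 0.0
# A 50/50 two-class mix is maximally impure for two classes: entropy 1.0.
print(entropy(["a", "a", "b", "b"]))  # 1.0
```

Note that the result depends only on the class proportions, not on the order of the labels, which matches the intuition that entropy measures impurity rather than arrangement.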
Shannon's theorem
Shannon's source coding theorem is a fundamental result in information theory that
establishes a relationship between the entropy of a data source and the best lossless
compression that can be achieved.
According to the theorem, the average number of bits needed to encode each symbol
cannot be less than the entropy of the source, H(X) = -Σ p(x) log₂ p(x); conversely,
there exist codes whose average length comes arbitrarily close to this bound.
In other words, Shannon's theorem says that entropy sets the limit on compression. The
higher the entropy of the data, the less redundancy there is to exploit, and the less
the data can be compressed.
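The limit described above can be observed empirically. The sketch below (illustrative, not from the original text) estimates the per-byte entropy of two byte strings and compares it with what a real compressor achieves; the low-entropy input shrinks dramatically, while the high-entropy (random) input barely compresses at all. The function `bits_per_symbol` and the sample inputs are assumptions for the demonstration.

```python
import math
import os
import zlib
from collections import Counter

def bits_per_symbol(data: bytes) -> float:
    """Empirical Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

low = b"aaaaaaab" * 1000   # highly repetitive: low entropy, very compressible
high = os.urandom(8192)    # random bytes: ~8 bits/byte, essentially incompressible

for name, data in [("low-entropy", low), ("high-entropy", high)]:
    h = bits_per_symbol(data)
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{name}: {h:.2f} bits/byte, compressed to {ratio:.2%} of original size")
```

The repetitive input compresses to a tiny fraction of its size, in line with its low entropy, while the random input stays roughly the same size (zlib even adds a small header overhead), matching the theorem's prediction that data cannot be compressed below its entropy.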
Overall, Shannon's theorem makes precise the relationship between the entropy of a
dataset and the compression that can be achieved on it. It is the essential tool for
understanding data compression limits and for designing efficient compression
algorithms.