

CCTP-607: "Big Ideas": AI to the Cloud


Spring 2019

An Application of Information Theory in Music


This week’s reading introduced Shannon’s information theory. What is fascinating in his argument is that information is independent of meaning. He held that information can be measured and standardized, and information theory therefore allows us to understand information and data at a fundamental level.

In his paper, Shannon argued that “the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point” (Shannon, 1948). Music, for example, can be thought of as the transmission of information from one point to another. In terms of a communication system, the sound of music is a message, and an encoder generates a distinct signal for that message. The signal travels through a channel connecting transmitter and receiver, and a decoder on the receiving end converts it back into sound waves that we can perceive.
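Shannon’s transmitter–channel–receiver pipeline can be sketched in a few lines of Python. The function names (`encode`, `channel`, `decode`) and the symbolic note message are illustrative choices, not part of Shannon’s paper or any MIDI standard; the point is only that the decoder reproduces the selected message at the other point.

```python
def encode(message: str) -> str:
    """Transmitter: turn the message into a binary signal."""
    return "".join(f"{byte:08b}" for byte in message.encode("utf-8"))

def channel(signal: str) -> str:
    """Noiseless channel: the signal arrives unchanged."""
    return signal

def decode(signal: str) -> str:
    """Receiver: reconstruct the original message from the signal."""
    data = bytes(int(signal[i:i + 8], 2) for i in range(0, len(signal), 8))
    return data.decode("utf-8")

message = "C4 E4 G4"          # a chord, as a symbolic music message
received = decode(channel(encode(message)))
assert received == message    # reproduced exactly at the other point
```

A real channel would add noise between transmitter and receiver, which is exactly the case Shannon’s theory quantifies.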

According to Shannon, “information is entropy.” Entropy is a measure of disorder or uncertainty about the state of a system: the more disordered a set of states is, the higher the entropy. Shannon considered entropy to be the measure of the inherent information in a source (Gleick, 2011). Denning also pointed out that information exists as physically observable patterns. Building on these ideas, Febres and Jaffé found a way to classify different musical genres automatically.
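The intuition that disorder means higher entropy can be made concrete with Shannon’s formula, H = −Σ p·log₂(p), summed over the probabilities of a source’s symbols. A minimal sketch (the example strings are my own, not from the readings):

```python
from collections import Counter
from math import log2

def entropy(symbols) -> float:
    """Shannon entropy, in bits per symbol, of an observed sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

entropy("AAAA")  # perfectly predictable source: 0 bits per symbol
entropy("ABCD")  # uniform, maximally disordered source: 2 bits per symbol
```

A source that always emits the same symbol carries no information, while a uniformly random source maximizes it, which is what “information is entropy” captures.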

Febres and Jaffé approached music classification by using the entropy of MIDI files. A MIDI file is a digital representation of a piece of music that can be read by a wide variety of computers, music players, and electronic instruments. Each file contains information about a piece of music’s pitch, velocity, volume, vibrato, and so on, which enables the music to be reproduced accurately from one point to another. In fact, a MIDI file is composed of an ordered series of 0s and 1s, which allowed the researchers to compress each set of symbols into the minimum number necessary to regenerate the original music. They then measured the entropy associated with each piece of music based on this fundamental set, and eventually found that music from the same genre shared similar values of second-order entropy. This case is an application of information theory, and it is inspiring that information theory has the potential to be applied in many other fields.
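To give a flavor of what “second-order entropy” means here, one simple reading is the entropy of the distribution of adjacent symbol pairs rather than single symbols. The sketch below uses toy pitch sequences standing in for decoded MIDI note streams; Febres and Jaffé’s actual procedure for extracting a fundamental set of symbols is more involved, so this is an assumption-laden illustration, not their method.

```python
from collections import Counter
from math import log2

def second_order_entropy(symbols) -> float:
    """Entropy, in bits, of the distribution of adjacent symbol pairs."""
    pairs = list(zip(symbols, symbols[1:]))
    counts = Counter(pairs)
    total = len(pairs)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Toy MIDI-style pitch sequences: a repetitive piece yields lower
# second-order entropy than a varied one.
repetitive = [60, 62, 60, 62, 60, 62, 60, 62]
varied     = [60, 64, 67, 59, 72, 65, 61, 68]
assert second_order_entropy(repetitive) < second_order_entropy(varied)
```

On this reading, pieces in the same genre would cluster because their note-to-note transition patterns have similar statistical regularity.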

References

Peter J. Denning and Craig H. Martell, Great Principles of Computing, chap. 3, “Information.”

Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (Champaign, IL: University of Illinois Press, 1949).

James Gleick, The Information: A History, a Theory, a Flood. (New York, NY: Pantheon, 2011).
Martin Irvine, “Introduction to the Technical Theory of Information.”

“Musical Genres Classified Using the Entropy of MIDI Files,” MIT Technology Review. https://www.technologyreview.com/s/542506/musical-genres-classified-using-the-entropy-of-midi-files/

This entry was posted in Week 5 on February 13, 2019 [https://blogs.commons.georgetown.edu/cctp-607-spring2019/2019/02/13/an-application-of-information-theory-in-music/] by Beiyuan Gu.

