
Effect of parameter q in the estimation of entropy

Jayro Santiago-Paz

Let $X$ be a random variable that takes values in the set $\{x_1, x_2, \ldots, x_M\}$, $p_i := P(X = x_i)$ the probability of occurrence, and $M$ the cardinality of the finite set; hence the Shannon entropy is

\[
H^S(P) = -\sum_{i=1}^{M} p_i \log p_i. \tag{1}
\]
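As a concrete illustration, here is a minimal Python sketch of eq. (1). The function name `shannon_entropy` and the use of the natural logarithm are assumptions; the paper writes $\log$ without fixing a base.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution, eq. (1).

    Natural log is assumed (the paper does not fix the base).
    Zero probabilities are dropped, matching the 0*log(0) = 0 convention.
    """
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

# A fair coin attains the maximum binary entropy, log(2) ~ 0.693 nats.
print(shannon_entropy([0.5, 0.5]))  # 0.6931...
print(shannon_entropy([0.9, 0.1]))  # 0.3251...
```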

Figure 1(a) shows the Shannon entropy of a r.v. $X$,

\[
H^S(P) = H^S(p, 1-p) = p \log\frac{1}{p} + (1-p) \log\frac{1}{1-p}, \tag{2}
\]

which takes on one of a set of possible values $A_X = \{a, b\}$, having probabilities $P_X = \{p, 1-p\}$ and $p \in [0, 1]$. For a r.v. with possible values $A = \{a, b, c\}$, $P(X) = \{r, s, t\}$, and $\sum_{i=1}^{3} P_{X_i} = 1$, the Shannon entropy is $H^S(P) = H^S(r, s, t) = r \log\frac{1}{r} + s \log\frac{1}{s} + t \log\frac{1}{t}$; see Figure 1(b) for a graphic representation. The convention that $0 \log(0) = 0$ is followed, which can be justified by the fact that $\lim_{x \to 0} x \log(x) = 0$.
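A short numeric sketch of the binary entropy curve of eq. (2), handling the endpoints with the $0 \log(0) = 0$ convention just stated (natural log assumed, as above):

```python
import numpy as np

def binary_entropy(p):
    """Binary Shannon entropy H(p, 1-p) of eq. (2), elementwise in p."""
    p = np.asarray(p, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    # Enforce the convention 0*log(0) = 0 at the endpoints p = 0 and p = 1.
    return np.where((p == 0) | (p == 1), 0.0, h)

grid = np.linspace(0.0, 1.0, 5)
print(binary_entropy(grid))  # [0. 0.562 0.693 0.562 0.] -- peak at p = 0.5
```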

Figure 1: Shannon entropy for a r.v. $X$ with different probabilities: (a) probabilities $P_X = \{p, 1-p\}$; (b) probabilities $P_X = \{r, s, t\}$.


Similarly to eq. (2), one can define the entropy in the Rényi and Tsallis cases:

\[
H^R(P, q) = \frac{1}{1-q} \log\left(p^q + (1-p)^q\right), \tag{3}
\]

and

\[
H^T(P, q) = \frac{1}{q-1} \left[1 - \left(p^q + (1-p)^q\right)\right]. \tag{4}
\]
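A hedged sketch of eqs. (3) and (4) for the binary case (the function names and the natural logarithm are assumptions, not from the paper); note that both estimators approach the Shannon entropy as $q \to 1$:

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy of eq. (3) for the binary distribution {p, 1-p}, q != 1."""
    return np.log(p**q + (1 - p)**q) / (1 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy of eq. (4) for the binary distribution {p, 1-p}, q != 1."""
    return (1 - (p**q + (1 - p)**q)) / (q - 1)

# Near q = 1 both approach the Shannon value H(0.3, 0.7) ~ 0.611 nats.
print(renyi_entropy(0.3, 0.999))    # ~0.611
print(tsallis_entropy(0.3, 0.999))  # ~0.611
```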

Figure 2 shows plots of the Shannon entropy (2) compared to the Rényi entropy (3) and the Tsallis entropy (4) for various values of the parameter $q$.

Figure 2: Entropy estimators for a r.v. $X$ with probabilities $P_X = \{p, 1-p\}$.
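For a quick numeric sense of the effect seen in Figure 2, the following sketch evaluates both estimators at a fixed $p = 0.3$ (an arbitrary choice, not from the paper) for the same $q$ values used in the figures:

```python
import numpy as np

p = 0.3
shannon = -(p * np.log(p) + (1 - p) * np.log(1 - p))
print(f"Shannon: {shannon:.3f}")  # 0.611
for q in (0.1, 0.5, 1.5, 5.0):
    s = p**q + (1 - p)**q
    renyi = np.log(s) / (1 - q)
    tsallis = (1 - s) / (q - 1)
    print(f"q={q}: Renyi={renyi:.3f}  Tsallis={tsallis:.3f}")
# Both estimators decrease monotonically in q, bracketing the Shannon value
# from above for q < 1 and from below for q > 1.
```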

Figure 3: Rényi entropy for a r.v. $X$ with probabilities $P_X = \{r, s, t\}$ for several values of $q$: (a) $q = 0.1$; (b) $q = 0.5$; (c) $q = 1.5$; (d) $q = 5.0$.

Figure 4: Tsallis entropy for a r.v. $X$ with probabilities $P_X = \{r, s, t\}$ for several values of $q$: (a) $q = 0.1$; (b) $q = 0.5$; (c) $q = 1.5$; (d) $q = 5.0$.
