Properties of Shannon q-ary and binary entropy functions #
The binary entropy function
`binEntropy p := - p * log p - (1 - p) * log (1 - p)`
is the Shannon entropy of a Bernoulli random variable with success probability `p`.
More generally, the q-ary entropy function is the Shannon entropy of the random variable with possible outcomes `{1, ..., q}`, where outcome `1` has probability `1 - p` and all other outcomes are equally likely.
`qaryEntropy (q : ℕ) (p : ℝ) := p * log (q - 1) - p * log p - (1 - p) * log (1 - p)`
This file assumes that entropy is measured in Nats, hence the use of natural logarithms. Most lemmas are also valid using a logarithm in a different base.
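For instance, to work in bits instead of nats one can divide by `log 2`. A minimal sketch; `binEntropyBits` is a hypothetical helper name, not a Mathlib declaration:

```lean
import Mathlib

open Real

/-- Hypothetical helper (not in Mathlib): binary entropy measured in bits,
obtained from the nat-valued `binEntropy` by a change of logarithm base. -/
noncomputable def binEntropyBits (p : ℝ) : ℝ :=
  binEntropy p / log 2
```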
Main declarations #
- `Real.binEntropy`: the binary entropy function
- `Real.qaryEntropy`: the `q`-ary entropy function
Main results #
The functions are also defined outside the interval `Icc 0 1` due to `log x = log |x|`.
- They are continuous everywhere (`binEntropy_continuous` and `qaryEntropy_continuous`).
- They are differentiable everywhere except at points `0` or `1` (`hasDerivAt_binEntropy` and `hasDerivAt_qaryEntropy`). In addition, due to junk values, `deriv binEntropy p = log (1 - p) - log p` holds everywhere (`deriv_binEntropy`).
- They are strictly increasing on `Icc 0 (1 - 1/q)` (`qaryEntropy_strictMonoOn`, `binEntropy_strictMonoOn`) and strictly decreasing on `Icc (1 - 1/q) 1` (`binEntropy_strictAntiOn` and `qaryEntropy_strictAntiOn`).
- They are strictly concave on `Icc 0 1` (`strictConcaveOn_qaryEntropy` and `strictConcave_binEntropy`).
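For orientation, the statements behind these names can be displayed with `#check`. A sketch, assuming a Mathlib version in which the declarations still carry the names listed above (they live in the `Real` namespace):

```lean
import Mathlib

open Real

-- Print the statements of the main results under the names listed above.
#check @binEntropy_continuous
#check @qaryEntropy_continuous
#check @deriv_binEntropy
#check @qaryEntropy_strictMonoOn
#check @binEntropy_strictAntiOn
#check @strictConcaveOn_qaryEntropy
```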
Tags #
entropy, Shannon, binary, nit, nepit
Binary entropy #
The binary entropy function `binEntropy p := - p * log p - (1 - p) * log (1 - p)` is the Shannon entropy of a Bernoulli random variable with success probability `p`.
`binEntropy` is symmetric about `1/2`.
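Concretely, symmetry means `binEntropy (1 - p) = binEntropy p`. A sketch of a proof from the definition; the tactic script is an assumption about what closes the goal, not a quote from Mathlib:

```lean
import Mathlib

open Real

-- After unfolding, `1 - (1 - p)` reduces to `p` and the two summands swap.
example (p : ℝ) : binEntropy (1 - p) = binEntropy p := by
  simp only [binEntropy, sub_sub_cancel]
  ring
```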
Outside the usual range of `binEntropy`, it is negative. This is due to `log p = log |p|`.
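For instance, at `p = 2` one gets `binEntropy 2 = -2 * log 2 - (-1) * log (-1) = -2 * log 2 + log 1 = -2 * log 2 < 0`, using `log (-1) = log |-1| = log 1 = 0`.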
For probability `p ≠ 0.5`, `binEntropy p < log 2`.
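The bound is attained exactly at `p = 1/2`: `binEntropy (1/2) = -(1/2) * log (1/2) - (1/2) * log (1/2) = -log (1/2) = log 2`.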
Binary entropy is continuous everywhere. This is due to the definition of `Real.log` for negative numbers.
q-ary entropy #
Shannon q-ary entropy function (measured in Nats, i.e., using natural logs).
It's the Shannon entropy of a random variable with possible outcomes `{1, ..., q}`, where outcome `1` has probability `1 - p` and all other outcomes are equally likely.
The usual domain of definition is `p ∈ [0, 1]`, i.e., the input is a probability.
This is a generalization of the binary entropy function `binEntropy`.
Equations
- `Real.qaryEntropy q p = p * Real.log (↑q - 1) + Real.binEntropy p`
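As a sanity check on the generalization, for `q = 2` the term `p * log (q - 1)` vanishes because `log 1 = 0`, so the q-ary entropy reduces to the binary one. A sketch; the tactic script is an assumption, not a quote from Mathlib:

```lean
import Mathlib

open Real

-- For q = 2 the first summand is p * log (2 - 1) = p * log 1 = 0.
example (p : ℝ) : qaryEntropy 2 p = binEntropy p := by
  norm_num [qaryEntropy]
```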
Outside the usual range of `qaryEntropy`, it is negative. This is due to `log p = log |p|`.
The q-ary entropy function is continuous everywhere. This is due to the definition of `Real.log` for negative numbers.
Binary entropy has derivative `log (1 - p) - log p`.
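For `p` away from `0` and `1` this is the product rule: `d/dp (-p * log p) = -log p - 1` and `d/dp (-(1 - p) * log (1 - p)) = log (1 - p) + 1`, so the constants cancel and the derivative is `log (1 - p) - log p`.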
Strict monotonicity of entropy #
Q-ary entropy is strictly increasing on the interval `[0, 1 - q⁻¹]`.
Q-ary entropy is strictly decreasing on the interval `[1 - q⁻¹, 1]`.
Binary entropy is strictly increasing on the interval `[0, 1/2]`.
Binary entropy is strictly decreasing on the interval `[1/2, 1]`.
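Note that for `q = 2` one has `1 - q⁻¹ = 1/2`, so the binary statements are the `q = 2` specializations of the q-ary ones.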