Variance of random variables #
We define the variance of a real-valued random variable as Var[X] = 𝔼[(X - 𝔼[X])^2] (in the ProbabilityTheory locale).
Main definitions #
- ProbabilityTheory.evariance: the variance of a real-valued random variable as an extended non-negative real.
- ProbabilityTheory.variance: the variance of a real-valued random variable as a real number.
Main results #
- ProbabilityTheory.variance_le_expectation_sq: the inequality Var[X] ≤ 𝔼[X^2].
- ProbabilityTheory.meas_ge_le_variance_div_sq: Chebyshev's inequality, i.e., ℙ {ω | c ≤ |X ω - 𝔼[X]|} ≤ ENNReal.ofReal (Var[X] / c ^ 2).
- ProbabilityTheory.meas_ge_le_evariance_div_sq: Chebyshev's inequality formulated with evariance, without requiring the random variables to be L².
- ProbabilityTheory.IndepFun.variance_add: the variance of the sum of two independent random variables is the sum of the variances.
- ProbabilityTheory.IndepFun.variance_sum: the variance of a finite sum of pairwise independent random variables is the sum of the variances.
- ProbabilityTheory.variance_le_sub_mul_sub: the variance of a random variable X satisfying a ≤ X ≤ b almost everywhere is at most (b - 𝔼 X) * (𝔼 X - a).
- ProbabilityTheory.variance_le_sq_of_bounded: the variance of a random variable X satisfying a ≤ X ≤ b almost everywhere is at most ((b - a) / 2) ^ 2.
The ℝ≥0∞-valued variance of a real-valued random variable, defined as the Lebesgue integral of (X - 𝔼[X])^2.
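In symbols (a sketch, hedging on the exact Lean phrasing: Mathlib squares the extended non-negative norm of the difference rather than the real difference itself), this is the lower Lebesgue integral
\[
\mathrm{evariance}(X, \mu) \;=\; \int^{-}_{\Omega} \bigl(X(\omega) - \mathbb{E}[X]\bigr)^2 \, d\mu(\omega) \;\in\; [0, \infty],
\]
which makes sense with no integrability hypothesis on X.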
The ℝ-valued variance of a real-valued random variable, defined by applying ENNReal.toReal to evariance.
Equations
- ProbabilityTheory.variance X μ = (ProbabilityTheory.evariance X μ).toReal
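As a quick sanity check (an illustrative example, not from the source), take X to be a fair coin flip valued in {0, 1}:
\[
\mathbb{E}[X] = \tfrac12, \qquad
\mathrm{Var}[X] = \mathbb{E}\bigl[(X - \tfrac12)^2\bigr]
= \tfrac12 \cdot \bigl(\tfrac12\bigr)^2 + \tfrac12 \cdot \bigl(-\tfrac12\bigr)^2
= \tfrac14,
\]
consistent with both the Bhatia-Davis bound (1 - 1/2)(1/2 - 0) = 1/4 and the Popoviciu bound ((1 - 0)/2)^2 = 1/4 stated further down.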
Chebyshev's inequality for the ℝ≥0∞-valued variance.
Chebyshev's inequality: one can control the deviation probability of a real random variable from its expectation in terms of the variance.
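The usual one-line derivation (a sketch, assuming X is square-integrable and c > 0) applies Markov's inequality to the non-negative variable (X - 𝔼[X])^2:
\[
\mathbb{P}\bigl(|X - \mathbb{E}[X]| \ge c\bigr)
= \mathbb{P}\bigl((X - \mathbb{E}[X])^2 \ge c^2\bigr)
\le \frac{\mathbb{E}\bigl[(X - \mathbb{E}[X])^2\bigr]}{c^2}
= \frac{\mathrm{Var}[X]}{c^2}.
\]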
The variance of the sum of two independent random variables is the sum of the variances.
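The reason is the standard bilinearity computation (a sketch, assuming X and Y are square-integrable): the cross term is the covariance, which vanishes once 𝔼[XY] = 𝔼[X] 𝔼[Y], as independence guarantees:
\[
\mathrm{Var}[X + Y]
= \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\bigl(\mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y]\bigr)
= \mathrm{Var}[X] + \mathrm{Var}[Y].
\]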
The variance of a finite sum of pairwise independent random variables is the sum of the variances.
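Only pairwise independence is needed, because each cross term in the expansion involves just two of the variables (again a sketch for square-integrable X i):
\[
\mathrm{Var}\Bigl[\sum_{i} X_i\Bigr]
= \sum_{i} \mathrm{Var}[X_i]
+ \sum_{i \ne j} \bigl(\mathbb{E}[X_i X_j] - \mathbb{E}[X_i]\,\mathbb{E}[X_j]\bigr)
= \sum_{i} \mathrm{Var}[X_i].
\]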
The Bhatia-Davis inequality on variance: the variance of a random variable X satisfying a ≤ X ≤ b almost everywhere is at most (b - 𝔼 X) * (𝔼 X - a).
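A sketch of the standard argument (not spelled out in this file): a ≤ X ≤ b almost everywhere gives 𝔼[(b - X)(X - a)] ≥ 0, and expanding yields
\[
\mathbb{E}[X^2] \le (a + b)\,\mathbb{E}[X] - ab,
\qquad\text{hence}\qquad
\mathrm{Var}[X] = \mathbb{E}[X^2] - \mathbb{E}[X]^2
\le (b - \mathbb{E}[X])(\mathbb{E}[X] - a).
\]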
Popoviciu's inequality on variance: the variance of a random variable X satisfying a ≤ X ≤ b almost everywhere is at most ((b - a) / 2) ^ 2.
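This follows from the Bhatia-Davis bound above via the AM-GM inequality (a one-line sketch):
\[
(b - \mathbb{E}[X])(\mathbb{E}[X] - a)
\le \Bigl(\frac{(b - \mathbb{E}[X]) + (\mathbb{E}[X] - a)}{2}\Bigr)^2
= \Bigl(\frac{b - a}{2}\Bigr)^2.
\]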