Exploring the Variance and Expectation Inequality for Random Variables
The relationship between the expectation of the square of a random variable and the square of its expectation is a fundamental concept in probability theory and statistics. Specifically, we aim to prove that for any random variable \(X\), the expectation of its square is greater than or equal to the square of its expectation:

\[ E[X^2] \geq (E[X])^2. \]
Proof Using the Definition of Variance and Linearity of Expectation
To prove the inequality \(E[X^2] \geq (E[X])^2\), we can start with the definition of variance. The variance of a random variable \(X\) is defined as:

\[ \text{Var}(X) = E[(X - E[X])^2]. \]

Expanding the squared term, we have:

\[ E[(X - E[X])^2] = E[X^2 - 2X E[X] + (E[X])^2]. \]

Using the linearity of expectation, we can rewrite this as:

\[ E[X^2 - 2X E[X] + (E[X])^2] = E[X^2] - 2E[X]E[X] + (E[X])^2 = E[X^2] - 2(E[X])^2 + (E[X])^2 = E[X^2] - (E[X])^2. \]

This simplifies to:

\[ \text{Var}(X) = E[X^2] - (E[X])^2. \]

Since variance is always non-negative, we have \(\text{Var}(X) \geq 0\). Therefore, it follows that:

\[ E[X^2] - (E[X])^2 \geq 0, \]

which implies:

\[ E[X^2] \geq (E[X])^2. \]

Equality Condition

Equality holds when the variance is zero, which occurs if and only if \(X\) is a constant random variable. This means that \(X\) takes on a single value with probability 1, i.e., \(X = c\) almost surely for some constant \(c\). In this case, we have:

\[ E[X] = c \quad \text{and} \quad E[X^2] = c^2. \]

Therefore, \(E[X^2] = (E[X])^2\).
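The derivation above, including the equality case, can be checked numerically. A minimal sketch in Python, using an arbitrary small discrete distribution (the values and probabilities are illustrative assumptions, not taken from the text):

```python
# Numerical check of E[X^2] >= (E[X])^2 on a small discrete random
# variable. The values and probabilities below form an arbitrary
# illustrative distribution.
values = [1.0, 2.0, 5.0]
probs = [0.5, 0.3, 0.2]

e_x = sum(p * x for x, p in zip(values, probs))       # E[X]
e_x2 = sum(p * x * x for x, p in zip(values, probs))  # E[X^2]
variance = e_x2 - e_x ** 2                            # Var(X) = E[X^2] - (E[X])^2

print("E[X]   =", e_x)
print("E[X^2] =", e_x2)
print("Var(X) =", variance)
assert e_x2 >= e_x ** 2  # the inequality holds, since Var(X) >= 0

# Equality case: a constant random variable, X = c with probability 1.
e_c = sum(p * x for x, p in zip([3.0], [1.0]))
e_c2 = sum(p * x * x for x, p in zip([3.0], [1.0]))
assert e_c2 == e_c ** 2  # Var(X) = 0, so E[X^2] = (E[X])^2
```

For the constant case the two sides agree exactly, mirroring the \(E[X] = c\), \(E[X^2] = c^2\) argument above.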
Using Jensen's Inequality
An alternative way to prove the same result is by using Jensen's Inequality. Jensen's Inequality states that for a convex function \(f\), the following holds:

\[ E[f(X)] \geq f(E[X]). \]

The function \(f(x) = x^2\) is convex, and therefore:

\[ E[X^2] \geq (E[X])^2. \]

This confirms our earlier result.
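Jensen's inequality for \(f(x) = x^2\) can also be illustrated empirically. A minimal sketch, assuming an arbitrary uniform sampling distribution (the range is an illustrative choice):

```python
import random

# Empirical illustration of Jensen's inequality for the convex
# function f(x) = x^2: the sample mean of f(X) dominates f applied
# to the sample mean. The uniform sampling range is arbitrary.
random.seed(0)
samples = [random.uniform(-1.0, 3.0) for _ in range(100_000)]

def f(x):
    return x * x  # a convex function

mean = sum(samples) / len(samples)                   # estimate of E[X]
mean_f = sum(f(x) for x in samples) / len(samples)   # estimate of E[f(X)]

print("f(E[X]) ~", round(f(mean), 4))
print("E[f(X)] ~", round(mean_f, 4))
assert mean_f >= f(mean)  # Jensen: E[X^2] >= (E[X])^2
```

Note that the sample version of the inequality holds exactly for any sample, since the sample variance is itself non-negative.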
Relationship Between Variance and Expectation
The variance of a random variable \(X\) can also be expressed as:

\[ \text{Var}(X) = E[X^2] - (E[X])^2. \]

Since \(\text{Var}(X) \geq 0\), it follows that:

\[ E[X^2] \geq (E[X])^2. \]

Equality holds if and only if \(X\) is a constant random variable, as discussed earlier.
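As a quick sanity check, the two equivalent expressions for the variance can be compared on a small example distribution (values and probabilities chosen arbitrarily for illustration):

```python
# Check that the two variance formulas agree on a small discrete
# distribution: E[(X - E[X])^2] versus E[X^2] - (E[X])^2.
values = [0.0, 1.0, 4.0]
probs = [0.25, 0.5, 0.25]

mu = sum(p * x for x, p in zip(values, probs))                      # E[X]
var_def = sum(p * (x - mu) ** 2 for x, p in zip(values, probs))     # definition
var_short = sum(p * x * x for x, p in zip(values, probs)) - mu**2   # shortcut

print("Var (definition) =", var_def)
print("Var (shortcut)   =", var_short)
assert abs(var_def - var_short) < 1e-12
assert var_short >= 0  # hence E[X^2] >= (E[X])^2
```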
Expectation and Linearity
Expectation is a linear operation, meaning that for any random variables \(X\) and \(Y\) and any constants \(a\) and \(b\), we have:

\[ E[aX + bY] = aE[X] + bE[Y]. \]

The expectation of a random variable \(X\) is a number called its mean, usually denoted \(\mu_X\) or simply \(\mu\) when only one random variable is under discussion. The variance of \(X\) can then be written as:

\[ \text{Var}(X) = E[X^2] - \mu^2. \]

Since variance is non-negative, we have:

\[ E[X^2] - \mu^2 = \text{Var}(X) \geq 0. \]

Therefore:

\[ E[X^2] \geq \mu^2. \]

Equality holds when \(\text{Var}(X) = 0\), which is the case when \(X\) is a constant random variable.
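The linearity property can be illustrated by simulation. A minimal sketch, using hypothetical dependent variables \(X\) and \(Y = X + \text{noise}\) chosen for illustration (linearity does not require independence):

```python
import random

# Monte Carlo sketch of linearity of expectation,
# E[aX + bY] = a E[X] + b E[Y]. X and Y need not be independent;
# here Y = X + noise is a hypothetical dependent choice.
random.seed(1)
a, b = 2.0, -3.0
xs = [random.gauss(1.0, 1.0) for _ in range(200_000)]
ys = [x + random.gauss(0.5, 0.2) for x in xs]  # Y depends on X

lhs = sum(a * x + b * y for x, y in zip(xs, ys)) / len(xs)  # E[aX + bY]
rhs = a * sum(xs) / len(xs) + b * sum(ys) / len(ys)         # aE[X] + bE[Y]

print("E[aX + bY]      ~", round(lhs, 4))
print("a E[X] + b E[Y] ~", round(rhs, 4))
assert abs(lhs - rhs) < 1e-6  # equal up to floating-point rounding
```

On the same samples the two sides agree by algebra; the point of the identity is that it holds for the true expectations as well, whatever the dependence between \(X\) and \(Y\).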