Expectation of Random Variables

1. Expectation of a Random Variable

The expectation of a random variable {X} is its average value, with each possible value weighted by its probability.

Definition 1 The expectation, mean, or first moment of {X} is defined to be

\displaystyle   \mathbb{E}(X) = \int x \thinspace dF(x) = \begin{cases} \sum_x x f(x) & \text{if } X \text{ is discrete} \\ \int x f(x) \thinspace dx & \text{if } X \text{ is continuous}. \end{cases} \ \ \ \ \ (1)

The following notations are also used.

\displaystyle  \mathbb{E}(X) = \mathbb{E}X = \int x \thinspace dF(x) = \mu_X = \mu \ \ \ \ \ (2)
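
As a quick numerical illustration (a minimal NumPy sketch, not part of the definitions above), the two cases of equation (1) can be checked for a fair die and a Uniform(0, 1) variable; the continuous integral is approximated by a midpoint Riemann sum:

```python
import numpy as np

# Discrete case: a fair die, f(x) = 1/6 for x = 1, ..., 6.
x = np.arange(1, 7)
print(np.sum(x * (1 / 6)))      # E(X) = 3.5

# Continuous case: X ~ Uniform(0, 1), f(x) = 1 on [0, 1].
# Midpoint Riemann sum approximating \int x f(x) dx.
t = np.linspace(0, 1, 10_001)
mid = (t[:-1] + t[1:]) / 2
print(np.sum(mid * 1.0 * np.diff(t)))   # ≈ 0.5
```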

Theorem 2 (The Rule of the Lazy Statistician) Let {Y = r(X)}; then the expectation of {Y} is

\displaystyle  \mathbb{E}(Y) = \int r(x) \thinspace dF_X(x). \ \ \ \ \ (3)
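
The point of the rule is that {\mathbb{E}(Y)} is computed by integrating {r} against the distribution of {X}, without ever deriving the distribution of {Y}. A Monte Carlo sketch with {r(x) = x^2} and {X \sim \text{Uniform}(0,1)} (an illustrative choice, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)  # draws of X ~ Uniform(0, 1)

# E(Y) for Y = r(X) = X**2, averaged directly over draws of X;
# the distribution of Y itself is never needed.
print(np.mean(x ** 2))  # ≈ 1/3
```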

2. Properties of Expectation

Theorem 3 If {X_1, \dotsc, X_n} are random variables and {a_1, \dotsc, a_n} are constants, then

\displaystyle  \mathbb{E}\left(\sum_{i=1}^n a_iX_i\right) = \sum_{i=1}^na_i\mathbb{E}(X_i). \ \ \ \ \ (4)
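
Note that Theorem 3 requires no independence. The following simulation sketch (deliberately using dependent variables; the distributions are illustrative choices) makes the point:

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(2.0, 1.0, size=1_000_000)
x2 = x1 + rng.normal(0.0, 1.0, size=1_000_000)  # strongly dependent on x1
a1, a2 = 3.0, -2.0

# Both sides of (4) agree even though x1 and x2 are dependent.
print(np.mean(a1 * x1 + a2 * x2))            # ≈ 3*2 - 2*2 = 2
print(a1 * np.mean(x1) + a2 * np.mean(x2))   # same value
```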

Theorem 4 If {X_1, \dotsc, X_n} are independent random variables, then

\displaystyle  \mathbb{E}\left(\prod_{i=1}^n X_i\right) = \prod_{i=1}^n\mathbb{E}(X_i). \ \ \ \ \ (5)
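
Here, unlike in Theorem 3, independence is essential. A small sketch contrasting independent draws with a perfectly dependent pair (the distributions are again illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=1_000_000)
y = rng.normal(0.0, 1.0, size=1_000_000)   # independent of x

print(np.mean(x * y), np.mean(x) * np.mean(y))  # both ≈ 0, as (5) predicts

z = x  # perfectly dependent on x
print(np.mean(x * z))  # ≈ E(X^2) = 1, not E(X)E(Z) = 0: (5) fails without independence
```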

3. Variance and Covariance

Definition 5 Let {X} be a random variable with mean {\mu}. The variance of {X}, denoted by {\mathbb{V}(X)}, {\mathbb{V}X}, {\sigma^2}, or {\sigma_X^2}, is defined by:

\displaystyle  \mathbb{V}(X) = \mathbb{E}((X - \mu)^2) = \int (x - \mu)^2 \thinspace dF(x) \ \ \ \ \ (6)

assuming the variance exists. The standard deviation is the square root of the variance.
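
Definition (6) can be read numerically by treating the integral as an average over draws (a Monte Carlo sketch with an illustrative Uniform(0, 12) variable, whose variance is {12^2/12 = 12}):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 12.0, size=1_000_000)  # V(X) = 12**2 / 12 = 12
mu = x.mean()

var = np.mean((x - mu) ** 2)   # definition (6) as an average over draws
print(var, np.sqrt(var))       # ≈ 12 and the standard deviation ≈ 3.46
```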

Definition 6 If {X_1, \dotsc, X_n} are random variables, then we define the sample mean as

\displaystyle  \overline{X}_n = \frac{1}{n}\left(\sum_{i=1}^n X_i\right). \ \ \ \ \ (7)

Definition 7 If {X_1, \dotsc, X_n} are random variables, then we define the sample variance as

\displaystyle  S_n^2 = \frac{1}{n - 1}\left(\sum_{i=1}^n (X_i - \overline{X}_n)^2\right). \ \ \ \ \ (8)
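
Both definitions map directly onto NumPy; in particular, np.var with ddof=1 uses the same {n - 1} denominator as equation (8). A short sketch on made-up data:

```python
import numpy as np

data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # made-up sample
xbar = data.mean()                                  # sample mean, eq. (7)
s2 = np.sum((data - xbar) ** 2) / (len(data) - 1)   # sample variance, eq. (8)
print(xbar, s2)                  # 5.0 and 32/7
print(np.var(data, ddof=1))      # ddof=1 reproduces the n - 1 denominator
```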

4. Properties of Variance

Theorem 8

\displaystyle  \mathbb{V}(X) = \mathbb{E}(X^2) - \mu^2. \ \ \ \ \ (9)
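
A quick empirical check that the shortcut (9) agrees with the definition (6), here with an illustrative Exponential variable of mean 2 and variance 4:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(2.0, size=1_000_000)  # mean 2, variance 4

print(np.mean((x - x.mean()) ** 2))       # definition (6)
print(np.mean(x ** 2) - x.mean() ** 2)    # shortcut (9); both ≈ 4
```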

Theorem 9 If {a} and {b} are constants, then

\displaystyle  \mathbb{V}(aX + b) = a^2 \mathbb{V}(X). \ \ \ \ \ (10)
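
A sketch confirming that the additive constant {b} drops out while {a} enters squared (illustrative normal example):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(0.0, 3.0, size=1_000_000)  # V(X) = 9
a, b = 2.0, 5.0

print(np.var(a * x + b))    # ≈ a**2 * 9 = 36; the shift b has no effect
print(a ** 2 * np.var(x))
```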

Theorem 10 If {X_1, \dotsc, X_n} are independent random variables and {a_1, \dotsc, a_n} are constants, then

\displaystyle  \mathbb{V}\left(\sum_{i=1}^n a_iX_i\right) = \sum_{i=1}^n{a_i}^2\mathbb{V}(X_i). \ \ \ \ \ (11)
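
A numerical check of (11) with two independent normals (illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(6)
x1 = rng.normal(0.0, 1.0, size=1_000_000)  # V(X1) = 1
x2 = rng.normal(0.0, 2.0, size=1_000_000)  # V(X2) = 4, independent of x1
a1, a2 = 3.0, 0.5

print(np.var(a1 * x1 + a2 * x2))                      # ≈ 9*1 + 0.25*4 = 10
print(a1 ** 2 * np.var(x1) + a2 ** 2 * np.var(x2))
```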

Theorem 11 If {X_1, \dotsc, X_n} are IID random variables with mean {\mu} and variance {\sigma^2}, then

\displaystyle  \mathbb{E}(\overline{X}_n) = \mu, \quad \mathbb{V}(\overline{X}_n) = \frac{\sigma^2}{n} \quad \text{ and } \quad \mathbb{E}(S_n^2) = \sigma^2. \ \ \ \ \ (12)
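
All three identities in (12) can be checked at once by simulating many IID samples and comparing the empirical moments of {\overline{X}_n} and {S_n^2} (a Monte Carlo sketch; the parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
mu, sigma, n, reps = 5.0, 2.0, 25, 200_000
samples = rng.normal(mu, sigma, size=(reps, n))  # each row: one IID sample

xbar = samples.mean(axis=1)        # sample mean of each row
s2 = samples.var(axis=1, ddof=1)   # sample variance of each row

print(xbar.mean())   # ≈ mu = 5
print(xbar.var())    # ≈ sigma**2 / n = 4 / 25 = 0.16
print(s2.mean())     # ≈ sigma**2 = 4
```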