1. Introduction
Definition 1 Random Variable: A random variable is a mapping $X : \Omega \to \mathbb{R}$ which assigns a real number $X(\omega)$ to each outcome $\omega$ in $\Omega$.
2. Distribution Functions
Definition 2 Distribution Function: Given a random variable $X$, the cumulative distribution function (also called the \textsc{cdf}) is the function $F_X : \mathbb{R} \to [0, 1]$ defined by:
$$F_X(x) = \mathbb{P}(X \le x).$$
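As a concrete illustration (our own sketch, not part of the text), the \textsc{cdf} of a discrete random variable such as a fair six-sided die can be evaluated directly from the definition $F_X(x) = \mathbb{P}(X \le x)$ by summing the mass at all outcomes not exceeding $x$:

```python
from fractions import Fraction

# P(X = k) = 1/6 for a fair six-sided die, k = 1..6 (our illustrative example)
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def cdf(x):
    """F_X(x) = P(X <= x): sum the mass of all outcomes not exceeding x."""
    return sum(p for k, p in pmf.items() if k <= x)

print(cdf(0))    # 0    (no mass at or below 0)
print(cdf(3))    # 1/2  (outcomes 1, 2, 3)
print(cdf(6.5))  # 1    (all of the mass)
```

Note that $F_X$ is defined for every real $x$, not only at the values $X$ can take, which is why `cdf(6.5)` is a legitimate query.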
Theorem 3 Let $X$ have \textsc{cdf} $F$ and let $Y$ have \textsc{cdf} $G$. If $F(x) = G(x)$ for all $x$, then $\mathbb{P}(X \in A) = \mathbb{P}(Y \in A)$ for all sets $A$.
Definition 4 $X$ is discrete if it takes countably many values. We define the probability function or the probability mass function for $X$ by $f_X(x) = \mathbb{P}(X = x)$.
Definition 5 A random variable $X$ is said to be continuous if there exists a function $f_X$ such that $f_X(x) \ge 0$ for all $x$, $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$, and for all $a \le b$ we have
$$\mathbb{P}(a < X < b) = \int_a^b f_X(x)\,dx.$$
The function $f_X$ is called the probability density function and we have
$$F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt \quad \text{and} \quad f_X(x) = F_X'(x)$$
at all points $x$ for which $F_X$ is differentiable.
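The two relations between $f_X$ and $F_X$ can be checked numerically. Below is a small sketch (the exponential density with scale $\beta = 2$ is our own choice of example): a central difference of $F_X$ recovers $f_X$, and a midpoint Riemann sum of $f_X$ recovers $F_X$.

```python
import math

BETA = 2.0  # illustrative scale parameter (our choice)

def f(x):
    """pdf of an Exp(beta) variable: f(x) = (1/beta) e^(-x/beta) for x > 0."""
    return math.exp(-x / BETA) / BETA if x > 0 else 0.0

def F(x):
    """cdf: F(x) = integral of f from -inf to x = 1 - e^(-x/beta) for x > 0."""
    return 1.0 - math.exp(-x / BETA) if x > 0 else 0.0

# f(x) = F'(x) wherever F is differentiable (central difference approximation)
x, h = 1.5, 1e-6
deriv = (F(x + h) - F(x - h)) / (2 * h)
print(abs(deriv - f(x)) < 1e-8)  # True

# F(x) = integral of f over (-inf, x] (midpoint Riemann sum on [0, x])
n = 100_000
riemann = sum(f((i + 0.5) * x / n) * x / n for i in range(n))
print(abs(riemann - F(x)) < 1e-6)  # True
```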
3. Important Discrete Random Variables
Remark 1 We write $X \sim F$ to denote that the random variable $X$ has \textsc{cdf} $F$.
3.1. The Point Mass Distribution
\textsc{The Point Mass Distribution}. $X$ has a point mass distribution at $a$, written $X \sim \delta_a$, if $\mathbb{P}(X = a) = 1$. Hence the \textsc{cdf} is
$$F(x) = \begin{cases} 0 & x < a \\ 1 & x \ge a. \end{cases}$$
3.2. The Discrete Uniform Distribution
\textsc{The Discrete Uniform Distribution}. Let $k > 1$ be a given integer. Let $X$ have a probability mass function given by:
$$f(x) = \begin{cases} 1/k & \text{for } x = 1, \dots, k \\ 0 & \text{otherwise.} \end{cases}$$
Then $X$ has a discrete uniform distribution on $\{1, \dots, k\}$.
3.3. The Bernoulli Distribution
\textsc{The Bernoulli Distribution}. Let $X$ be a random variable with $\mathbb{P}(X = 1) = p$ and $\mathbb{P}(X = 0) = 1 - p$ for some $p \in [0, 1]$. We say that $X$ has a Bernoulli distribution, written as $X \sim \text{Bernoulli}(p)$. The probability function is given by $f(x) = p^x (1 - p)^{1 - x}$ for $x \in \{0, 1\}$.
3.4. The Binomial Distribution
\textsc{The Binomial Distribution}. Flip a coin $n$ times and let $X$ denote the number of heads. If $p$ denotes the probability of getting heads in a single coin toss and the tosses are assumed to be independent, then the probability mass function of $X$ can be shown to be:
$$f(x) = \binom{n}{x} p^x (1 - p)^{n - x} \quad \text{for } x = 0, 1, \dots, n.$$
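The formula $\binom{n}{x} p^x (1-p)^{n-x}$ can be verified by brute force: enumerate all $2^n$ independent toss sequences, multiply the per-toss probabilities, and group by the number of heads. A small sketch with illustrative values of $n$ and $p$ (our own choice):

```python
import math
from itertools import product

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) p^x (1-p)^(n-x)."""
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 4, 0.3  # illustrative values (our choice)

# Brute force: enumerate all 2^n toss sequences (1 = head, 0 = tail)
brute = [0.0] * (n + 1)
for seq in product([0, 1], repeat=n):
    prob = math.prod(p if s else 1 - p for s in seq)  # independence: multiply
    brute[sum(seq)] += prob

for x in range(n + 1):
    print(abs(brute[x] - binom_pmf(x, n, p)) < 1e-12)  # True for every x
```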
3.5. The Geometric Distribution
\textsc{The Geometric Distribution}. $X$ has a geometric distribution with parameter $p \in (0, 1)$, written as $X \sim \text{Geom}(p)$, if $X$ is the number of flips needed until the first head appears, so that
$$\mathbb{P}(X = k) = (1 - p)^{k - 1} p \quad \text{for } k = 1, 2, 3, \dots$$
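Since the first head appearing on flip $k$ means $k - 1$ tails followed by one head, independence gives $\mathbb{P}(X = k) = (1-p)^{k-1} p$. A quick check (with an illustrative $p$ of our own) that these masses sum to $1$, as a geometric series must:

```python
def geom_pmf(k, p):
    """P(X = k) = (1-p)^(k-1) p: k-1 tails, then the first head on flip k."""
    return (1 - p) ** (k - 1) * p

p = 0.25  # illustrative head probability (our choice)

# Truncated sum; the remaining tail mass (1-p)^199 is negligibly small
total = sum(geom_pmf(k, p) for k in range(1, 200))
print(abs(total - 1.0) < 1e-10)  # True: the masses sum to 1
```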
3.6. The Poisson Distribution
\textsc{The Poisson Distribution}. $X$ has a Poisson distribution with parameter $\lambda > 0$, written as $X \sim \text{Poisson}(\lambda)$, if
$$f(x) = e^{-\lambda} \frac{\lambda^x}{x!} \quad \text{for } x = 0, 1, 2, \dots$$
$X$ models the number of events occurring in a fixed interval of time and/or space when these events occur with a known average rate and independently of the time since the last event.
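A quick numerical sanity check on the mass function $e^{-\lambda}\lambda^x/x!$ (the value of $\lambda$ is our own illustration): the masses sum to $1$, and the mean works out to the "known average rate" $\lambda$ itself.

```python
import math

def poisson_pmf(x, lam):
    """P(X = x) = e^(-lam) lam^x / x!"""
    return math.exp(-lam) * lam**x / math.factorial(x)

lam = 3.5  # illustrative rate (our choice)

# Truncate at 100 terms; the remaining tail mass is astronomically small
masses = [poisson_pmf(x, lam) for x in range(100)]
print(abs(sum(masses) - 1.0) < 1e-12)  # masses sum to 1

mean = sum(x * m for x, m in enumerate(masses))
print(abs(mean - lam) < 1e-9)          # the mean equals lambda
```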
4. Important Continuous Random Variables
4.1. The Uniform Distribution
\textsc{The Uniform Distribution}. For $a < b$, $X$ has a uniform distribution over $[a, b]$, written $X \sim \text{Uniform}(a, b)$, if
$$f(x) = \begin{cases} \frac{1}{b - a} & \text{for } x \in [a, b] \\ 0 & \text{otherwise.} \end{cases}$$
4.2. The Normal Distribution
\textsc{The Normal Distribution}. We say that $X$ has a normal (or Gaussian) distribution with parameters $\mu$ and $\sigma^2$, written as $X \sim N(\mu, \sigma^2)$, if
$$f(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right).$$
The parameter $\mu$ is the “center” (or mean) of the distribution and $\sigma$ is the “spread” (or standard deviation) of the distribution. We say that $X$ has a standard Normal distribution if $\mu = 0$ and $\sigma = 1$. A standard Normal random variable is denoted by $Z$. The \textsc{pdf} and \textsc{cdf} of a standard Normal are denoted by $\phi(z)$ and $\Phi(z)$. The \textsc{pdf} is plotted in the figure. There is no closed-form expression for $\Phi$. Here are some useful facts:
1. If $X \sim N(\mu, \sigma^2)$, then $Z = (X - \mu)/\sigma \sim N(0, 1)$.
2. If $Z \sim N(0, 1)$, then $X = \mu + \sigma Z \sim N(\mu, \sigma^2)$.
It follows from these facts that if $X \sim N(\mu, \sigma^2)$, then
$$\mathbb{P}(a < X < b) = \Phi\left( \frac{b - \mu}{\sigma} \right) - \Phi\left( \frac{a - \mu}{\sigma} \right).$$
Example 1 Suppose that $X \sim N(\mu, \sigma^2)$. Find $\mathbb{P}(X > x)$ for a given $x$.
Solution: Standardizing, $\mathbb{P}(X > x) = 1 - \mathbb{P}(X \le x) = 1 - \Phi\left( \frac{x - \mu}{\sigma} \right)$, which can be read from a normal table.
Example 2 For the above problem, also find the value of $x$ such that $\mathbb{P}(X < x) = q$ for a given $q$.
Solution: We need $\Phi\left( \frac{x - \mu}{\sigma} \right) = q$. From the normal table we obtain $\frac{x - \mu}{\sigma} = \Phi^{-1}(q)$, and hence $x = \mu + \sigma\,\Phi^{-1}(q)$.
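In place of a normal table, $\Phi$ can be evaluated via the error function, $\Phi(z) = \tfrac{1}{2}\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$. The sketch below (the values of $\mu$ and $\sigma$ are our own illustration) computes an upper-tail probability by standardizing, and inverts $\Phi$ by bisection since no closed form exists:

```python
import math

def Phi(z):
    """Standard normal cdf via the error function: (1 + erf(z/sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 3.0, 2.0  # illustrative parameters (our choice)

# P(X > x) = 1 - Phi((x - mu)/sigma), by standardization
x = 1.0
p_upper = 1.0 - Phi((x - mu) / sigma)
print(round(p_upper, 4))  # 0.8413, since (1 - 3)/2 = -1 and Phi(-1) ~ 0.1587

# Quantile: find x with P(X < x) = q by bisecting Phi on the z-scale
def quantile(q, lo=-10.0, hi=10.0):
    for _ in range(100):
        mid = (lo + hi) / 2
        if Phi(mid) < q:
            lo = mid
        else:
            hi = mid
    return mu + sigma * (lo + hi) / 2  # un-standardize: x = mu + sigma z

print(abs(Phi((quantile(0.2) - mu) / sigma) - 0.2) < 1e-9)  # True
```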
4.3. The Exponential Distribution
\textsc{The Exponential Distribution}. $X$ has an exponential distribution with parameter $\beta > 0$, written as $X \sim \text{Exp}(\beta)$, if
$$f(x) = \frac{1}{\beta} e^{-x/\beta} \quad \text{for } x > 0.$$
4.4. The Gamma Distribution
\textsc{The Gamma Distribution}. For $\alpha > 0$, the Gamma function is defined as
$$\Gamma(\alpha) = \int_0^{\infty} y^{\alpha - 1} e^{-y}\,dy.$$
$X$ has a Gamma distribution with parameters $\alpha$ and $\beta$ (where $\alpha, \beta > 0$), written as $X \sim \text{Gamma}(\alpha, \beta)$, if
$$f(x) = \frac{1}{\beta^{\alpha} \Gamma(\alpha)} x^{\alpha - 1} e^{-x/\beta} \quad \text{for } x > 0.$$
The exponential distribution is the special case $\text{Gamma}(1, \beta)$.
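The Gamma density normalizes precisely because of the $\Gamma(\alpha)$ in the denominator. A small sketch (the values of $\alpha$ and $\beta$ are our own illustration) checks that $\text{Gamma}(1, \beta)$ collapses to the exponential density, and that the density integrates to $1$:

```python
import math

def gamma_pdf(x, alpha, beta):
    """f(x) = x^(alpha-1) e^(-x/beta) / (beta^alpha Gamma(alpha)) for x > 0."""
    if x <= 0:
        return 0.0
    return x ** (alpha - 1) * math.exp(-x / beta) / (beta**alpha * math.gamma(alpha))

beta = 2.0  # illustrative scale (our choice)

# With alpha = 1, Gamma(1, beta) is exactly Exp(beta): f(x) = (1/beta) e^(-x/beta)
x = 1.3
print(abs(gamma_pdf(x, 1.0, beta) - math.exp(-x / beta) / beta) < 1e-15)  # True

# Midpoint Riemann sum over [0, 60] integrates the density to ~1
alpha = 2.5
n, hi = 100_000, 60.0
total = sum(gamma_pdf((i + 0.5) * hi / n, alpha, beta) * hi / n for i in range(n))
print(abs(total - 1.0) < 1e-4)  # True
```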
5. Bivariate Distributions
Definition 6 Given a pair of discrete random variables $X$ and $Y$, their joint mass function is defined as
$$f_{X,Y}(x, y) = \mathbb{P}(X = x, Y = y).$$
Definition 7 For two continuous random variables $X$ and $Y$, we call a function $f_{X,Y}$ a \textsc{pdf} of the random variables $(X, Y)$ if $f_{X,Y}(x, y) \ge 0$ for all $(x, y)$,
$$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\,dy = 1,$$
and for any set $A \subseteq \mathbb{R}^2$ we have
$$\mathbb{P}((X, Y) \in A) = \iint_A f_{X,Y}(x, y)\,dx\,dy.$$
Example 3 Let $X$ and $Y$ have a density of the form $f(x, y) = c\,g(x, y)$ on some region (and $0$ elsewhere), where $c$ is an unknown constant. Find the value of $c$.
Solution: We equate the integral of $f$ over $\mathbb{R}^2$ to $1$ and solve for $c$.
6. Marginal Distributions
Definition 8 For the discrete case, if $X$ and $Y$ have a joint mass function $f_{X,Y}$, then the marginal distribution of $X$ is given by
$$f_X(x) = \mathbb{P}(X = x) = \sum_y f_{X,Y}(x, y)$$
and that of $Y$ is given by
$$f_Y(y) = \mathbb{P}(Y = y) = \sum_x f_{X,Y}(x, y).$$
Definition 9 For the continuous case, if $X$ and $Y$ have a joint probability density function $f_{X,Y}$, then the marginal distribution of $X$ is given by
$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy$$
and that of $Y$ is given by
$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx.$$
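In the discrete case, marginalization is simply summing rows and columns of the joint table. A minimal sketch (the joint mass values are our own illustration, chosen to sum to $1$):

```python
from fractions import Fraction as Fr

# A small joint mass function for discrete (X, Y); values are our own illustration
joint = {
    (0, 0): Fr(1, 10), (0, 1): Fr(2, 10),
    (1, 0): Fr(3, 10), (1, 1): Fr(4, 10),
}

# Marginals: f_X(x) = sum over y of f(x, y), and f_Y(y) = sum over x of f(x, y)
f_X, f_Y = {}, {}
for (x, y), p in joint.items():
    f_X[x] = f_X.get(x, Fr(0)) + p
    f_Y[y] = f_Y.get(y, Fr(0)) + p

print(f_X[0], f_X[1])          # 3/10 7/10
print(f_Y[0], f_Y[1])          # 2/5 3/5
print(sum(f_X.values()) == 1)  # each marginal is a genuine mass function: True
```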
7. Independent Random Variables
Definition 10 Two random variables $X$ and $Y$ are said to be independent if for every pair of sets $A$ and $B$ we have
$$\mathbb{P}(X \in A, Y \in B) = \mathbb{P}(X \in A)\,\mathbb{P}(Y \in B).$$
Theorem 11 Let $X$ and $Y$ have a joint \textsc{pdf} $f_{X,Y}$. Then $X$ and $Y$ are independent if and only if
$$f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$$
for all values of $x$ and $y$.
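The factorization criterion is easy to exercise on a discrete example (the marginal values below are our own illustration): build the joint table as the product of two marginals and confirm that event probabilities factor as in Definition 10.

```python
from fractions import Fraction as Fr

# Marginals of two independent discrete variables (our own illustration)
f_X = {0: Fr(1, 4), 1: Fr(3, 4)}
f_Y = {0: Fr(2, 5), 1: Fr(3, 5)}

# Under independence the joint mass factorizes: f(x, y) = f_X(x) f_Y(y)
joint = {(x, y): f_X[x] * f_Y[y] for x in f_X for y in f_Y}

# The factorized joint is a genuine mass function ...
print(sum(joint.values()) == 1)  # True

# ... and events factor: P(X in A, Y in B) = P(X in A) P(Y in B)
A, B = {1}, {0, 1}
lhs = sum(p for (x, y), p in joint.items() if x in A and y in B)
rhs = sum(f_X[x] for x in A) * sum(f_Y[y] for y in B)
print(lhs == rhs)  # True
```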
8. Conditional Distributions
Definition 12 Let $X$ and $Y$ have a joint \textsc{pdf} $f_{X,Y}$. Then the conditional distribution of $X$ given $Y = y$ is defined as
$$f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}, \quad \text{provided } f_Y(y) > 0.$$
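Conditioning amounts to renormalizing one slice of the joint table by the marginal mass of that slice. A minimal discrete sketch (the joint values are our own illustration):

```python
from fractions import Fraction as Fr

# A joint mass function for (X, Y); the values are our own illustration
joint = {
    (0, 0): Fr(1, 8), (1, 0): Fr(3, 8),
    (0, 1): Fr(2, 8), (1, 1): Fr(2, 8),
}

# Marginal of Y, needed as the denominator
f_Y = {}
for (x, y), p in joint.items():
    f_Y[y] = f_Y.get(y, Fr(0)) + p

def cond(x, y):
    """f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y), defined when f_Y(y) > 0."""
    return joint[(x, y)] / f_Y[y]

print(cond(0, 0), cond(1, 0))        # 1/4 3/4
print(cond(0, 0) + cond(1, 0) == 1)  # each conditional slice sums to 1: True
```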
9. Multivariate Distributions and \textsc{iid} Samples
Definition 13 Independence of $n$ random variables: Let $X = (X_1, \dots, X_n)$ where $X_1, \dots, X_n$ are random variables. Let $f(x_1, \dots, x_n)$ denote their joint \textsc{pdf}. We say that $X_1, \dots, X_n$ are independent if for every $A_1, \dots, A_n$,
$$\mathbb{P}(X_1 \in A_1, \dots, X_n \in A_n) = \prod_{i=1}^{n} \mathbb{P}(X_i \in A_i).$$
Definition 14 If $X_1, \dots, X_n$ are independent random variables with the same marginal distribution $F$, we say that $X_1, \dots, X_n$ are \textsc{iid} (independent and identically distributed) random variables and we write:
$$X_1, \dots, X_n \sim F.$$
If $F$ has density $f$, then we also write $X_1, \dots, X_n \sim f$. We also call $X_1, \dots, X_n$ a random sample of size $n$ from $F$.
10. The Multivariate Normal Distribution
\textsc{The Multivariate Normal Distribution}. In the multivariate normal distribution, the parameter $\mu$ is a vector and the parameter $\Sigma$ is a matrix. Let $Z = (Z_1, \dots, Z_k)^T$ where $Z_1, \dots, Z_k \sim N(0, 1)$ are independent. The joint density of $Z$ is
$$f(z) = \prod_{i=1}^{k} \phi(z_i) = \frac{1}{(2\pi)^{k/2}} \exp\left( -\frac{1}{2} \sum_{i=1}^{k} z_i^2 \right) = \frac{1}{(2\pi)^{k/2}} \exp\left( -\frac{1}{2}\, z^T z \right).$$
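Because the coordinates are independent, the joint density of the standard multivariate normal is just the product of univariate standard normal densities, and the exponents combine into $z^T z$. A short numerical check (the evaluation point is our own arbitrary choice):

```python
import math

def phi(z):
    """Standard normal pdf: e^(-z^2/2) / sqrt(2 pi)."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def joint_density(z):
    """f(z) = (2 pi)^(-k/2) exp(-(1/2) z^T z) for k independent N(0,1) coordinates."""
    k = len(z)
    return (2 * math.pi) ** (-k / 2) * math.exp(-0.5 * sum(v * v for v in z))

z = [0.3, -1.2, 0.7]  # an arbitrary point (our choice)
prod = math.prod(phi(v) for v in z)
print(abs(joint_density(z) - prod) < 1e-15)  # True: the product of marginals matches
```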