Expectation

Let X be a random variable. Then the expectation of X is defined by

\begin{eqnarray*}E[X]&=&\sum_{x}\ x\cdot f(x) \ \ \ \ \ \ \ \ \ \text{(discrete variable)}\\E[X]&=&\int_{-\infty}^{\infty}\ x\cdot f(x)\ dx\ \ \ \ \text{(continuous variable)}\end{eqnarray*}

where \(f(x)\) is the probability function of X: in the discrete case \(P(X=x)=f(x)\), and in the continuous case \(f(x)\) is the probability density function.
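As a quick numerical illustration (not part of the definition), both formulas can be evaluated directly in Python; the support, probabilities, and the standard normal density below are assumed examples, and SciPy is used only for the numerical integration.

```python
import math
from scipy.integrate import quad

# Discrete case: E[X] = sum over x of x * f(x)
values = [0, 1, 2]           # assumed support of X
probs = [0.25, 0.5, 0.25]    # assumed probability function f(x)
expectation_discrete = sum(x * p for x, p in zip(values, probs))
print(expectation_discrete)  # 1.0

# Continuous case: E[X] = integral of x * f(x) dx over the real line.
# Assumed example: f is the standard normal density, so E[X] should be 0.
f = lambda x: math.exp(-x ** 2 / 2) / math.sqrt(2 * math.pi)
expectation_continuous, _ = quad(lambda x: x * f(x), -math.inf, math.inf)
print(round(expectation_continuous, 6))  # 0.0
```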


The expectation of X gives a representative or average value of X.

Notice that if a discrete variable X takes the values \(x_{1}, x_{2}, \cdots, x_{n}\), each with equal probability \(\frac{1}{n}\), we have

$$E[X]=\frac{x_{1}+x_{2}+\cdots +x_{n}}{n}$$

which is called the arithmetic mean.

Example: Suppose that a fair die is to be rolled. The die shows one of \(1, 2,\cdots ,6\), each with probability \(\frac{1}{6}\). Thus the expectation is

$$E[X]=\frac{1+2+3+4+5+6}{6}=3.5$$
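A one-line check of this calculation (just the discrete-sum formula applied to the six equally likely faces; a sketch, with rounding only to guard against floating-point noise):

```python
# Fair die: each of the faces 1..6 has probability 1/6
faces = range(1, 7)
expectation = sum(x * (1 / 6) for x in faces)
print(round(expectation, 6))  # 3.5
```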

Example

The table below gives the values of X and the corresponding probability function.

$$\begin{array}{c|c}x & P(X=x)=f(x)\\ \hline 1 & 0.1\\ 2 & 0.4\\ 3 & 0.3\\ 4 & 0\\ 5 & 0.1\\ 6 & 0.1\\ \text{others} & 0\end{array}$$

Find the expectation of X.

Solution:

By definition, the expected value of X is computed as below:

\begin{eqnarray*}E[X]&=&\sum_{x}\ x\cdot f(x)\\&=&1\times 0.1+2\times 0.4+3\times 0.3+4\times 0+5\times 0.1+6\times 0.1\\&=&2.9\end{eqnarray*}
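The same sum can be reproduced directly from the table above (a small sketch; rounding only guards against floating-point noise):

```python
# Values of X and their probabilities, taken from the table
values = [1, 2, 3, 4, 5, 6]
probs = [0.1, 0.4, 0.3, 0.0, 0.1, 0.1]  # all other values have probability 0
expectation = sum(x * p for x, p in zip(values, probs))
print(round(expectation, 2))  # 2.9
```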


Properties

Let X and Y be random variables, and a,b be any constants.

\(E[a]=a\)
\(E[aX+b]=aE[X]+b\)
\(E[aX+bY]=aE[X]+bE[Y]\)
\(E[X-E[X]]=E[X]-E[X]=0\)
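These identities are easy to check numerically. The sketch below verifies \(E[aX+b]=aE[X]+b\) on a small distribution; the values, probabilities, and constants are arbitrary assumptions chosen for illustration.

```python
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]   # assumed probability function
a, b = 4.0, -1.0          # arbitrary constants

E_X = sum(x * p for x, p in zip(values, probs))
E_aXb = sum((a * x + b) * p for x, p in zip(values, probs))

print(E_aXb)        # expectation of aX + b computed directly: 7.4
print(a * E_X + b)  # same value via the linearity property: 7.4
```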

Independence

If X and Y are independent random variables, then

$$E[XY]=E[X]E[Y]$$

See also Independent random variables.
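A quick Monte Carlo check of this product rule, assuming two independent Uniform(0, 1) variables (a simulation sketch, not a proof):

```python
import random

random.seed(0)
n = 200_000
xs = [random.random() for _ in range(n)]  # X ~ Uniform(0, 1)
ys = [random.random() for _ in range(n)]  # Y ~ Uniform(0, 1), independent of X

E_X = sum(xs) / n
E_Y = sum(ys) / n
E_XY = sum(x * y for x, y in zip(xs, ys)) / n

# For independent X and Y these two numbers agree up to sampling error.
print(E_XY, E_X * E_Y)  # both close to 0.25
```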

Expectations (one-variable function)

Let X be a random variable, having the probability function such that

$$P(X=x)=f(x).$$

Then the expectation of \(g(X)\) is given by

\begin{eqnarray*}E[g(X)]&=&\sum_{x}\ g(x)\cdot f(x) \ \ \ \ \ \ \ \ \ \text{(discrete variable)}\\E[g(X)]&=&\int_{-\infty}^{\infty}\ g(x)\cdot f(x)\ dx\ \ \ \ \text{(continuous variable).}\end{eqnarray*}

Note that \(Y=g(X)\) is also a random variable when X is a random variable.
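For instance, \(E[X^{2}]\) for the fair die of the earlier example can be obtained by applying \(g(x)=x^{2}\) inside the same sum (a small sketch):

```python
# E[g(X)] with g(x) = x^2 for a fair die
faces = range(1, 7)
E_g = sum(x ** 2 * (1 / 6) for x in faces)
print(round(E_g, 4))  # 15.1667, i.e. 91/6
```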

Expectations (two-variable functions)

Let X and Y be random variables, having the joint probability function such that

$$P(X=x, Y=y)=f(x, y)$$

Then the expectation of \(g(X, Y)\) is given by

\begin{eqnarray*}E[g(X, Y)]&=&\sum_{x, y}\ g(x, y)\cdot f(x, y) \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \text{(discrete variable)}\\E[g(X, Y)]&=&\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\ g(x, y)\cdot f(x, y)\ dxdy\ \ \ \ \text{(continuous variable)}\end{eqnarray*}

If X and Y are independent and \(g(X, Y)\) factors as \(g(X, Y)=g_{X}(X)\cdot g_{Y}(Y)\), then

$$E[g(X, Y)]=E[g_{X}(X)]\times E[g_{Y}(Y)].$$
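The sketch below evaluates \(E[g(X, Y)]\) for a small assumed discrete joint distribution with \(g(x, y)=xy\); because this particular joint table is the product of its marginals (X and Y independent), the result also matches \(E[X]\times E[Y]\).

```python
# Assumed joint probability function f(x, y), stored as a dict
joint = {
    (0, 0): 0.12, (0, 1): 0.18,
    (1, 0): 0.28, (1, 1): 0.42,
}  # here f(x, y) = f_X(x) * f_Y(y), so X and Y are independent

g = lambda x, y: x * y
E_g = sum(g(x, y) * p for (x, y), p in joint.items())

# Marginal expectations, for comparison with E[X] * E[Y]
E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())

print(round(E_g, 6))        # 0.42
print(round(E_X * E_Y, 6))  # 0.7 * 0.6 = 0.42
```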

Conditional Expectations

Let X and Y be random variables. Then the conditional expectation of X given \(Y=y\) is defined by

\begin{eqnarray*}E[X|Y=y]&=&\sum_{x}\ x\cdot P(X=x|Y=y) \ \ \ \ \ \ \ \ \ \text{(discrete variable)}\\E[X|Y=y]&=&\int_{-\infty}^{\infty}\ x\cdot f_{X|Y}(x|y)\ dx\ \ \ \ \text{(continuous variable).}\end{eqnarray*}

We note that the conditional density function of X given Y is

$$f_{X|Y}(x|y)=\frac{f(x, y)}{f_{Y}(y)},$$

which in the discrete case reduces to

$$f_{X|Y}(x|y)=\frac{P(X=x, Y=y)}{P(Y=y)}=P(X=x|Y=y),$$

where \(f(x, y)\) is the joint probability function and \(f_{Y}(y)\) is the marginal probability function of Y.
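As a worked sketch, \(E[X|Y=y]\) can be computed from a discrete joint table by forming the conditional probabilities \(P(X=x|Y=y)\) first; the joint values below are assumed for illustration.

```python
# Assumed joint probability function f(x, y)
joint = {
    (1, 0): 0.10, (2, 0): 0.30,
    (1, 1): 0.20, (2, 1): 0.40,
}

def conditional_expectation(joint, y):
    """E[X | Y = y] = sum over x of x * P(X = x | Y = y)."""
    f_Y = sum(p for (x, yy), p in joint.items() if yy == y)  # marginal P(Y = y)
    return sum(x * p / f_Y for (x, yy), p in joint.items() if yy == y)

print(round(conditional_expectation(joint, 0), 4))  # (1*0.10 + 2*0.30) / 0.40 = 1.75
print(round(conditional_expectation(joint, 1), 4))  # (1*0.20 + 2*0.40) / 0.60 = 1.6667
```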