The moment generating function of a random variable X is defined by
$$M_{X}(t)=E[\mathrm{e}^{Xt}]$$
where \(E[\cdot]\) denotes the expectation (the definition applies for those \(t\) at which the expectation is finite).
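The name comes from the series expansion of \(\mathrm{e}^{Xt}\): expanding and taking the expectation term by term (assuming the interchange is justified) gives
$$M_{X}(t)=E\left[\sum_{k=0}^{\infty}\frac{(Xt)^{k}}{k!}\right]=\sum_{k=0}^{\infty}\frac{E[X^{k}]}{k!}t^{k},$$
so the \(k\)-th moment \(E[X^{k}]\) appears as the coefficient of \(t^{k}/k!\).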
Since a moment generating function corresponds uniquely to its probability distribution, it is used in various proofs and in the derivation of distributions.
Also, once you have the moment generating function, the expected value and variance of X follow easily from the properties below.
Discrete / Continuous Case
Moment generating functions can be defined for both the discrete and the continuous case.
$$\begin{eqnarray*}E[\mathrm{e}^{Xt}]&=&\sum_{x} \mathrm{e}^{xt}f(x)\ \ \ (X \text{: discrete variable}) \\E[\mathrm{e}^{Xt}]&=&\int_{-\infty}^{\infty} \mathrm{e}^{xt}f(x)\, \mathrm{d}x \ \ \ (X \text{: continuous variable})\end{eqnarray*}$$
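As a quick worked instance of the discrete formula, let X be a Bernoulli variable with \(P(X=1)=p\) and \(P(X=0)=q=1-p\); the sum then has only two terms:
$$M_{X}(t)=\sum_{x} \mathrm{e}^{xt}f(x)=\mathrm{e}^{0\cdot t}q+\mathrm{e}^{t}p=q+p\,\mathrm{e}^{t}$$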
Properties
\(E[X^{k}]= \left[\frac{d^{k}}{dt^{k}}M_{X}(t)\right]_{t=0}\) | Proof |
\(E[X]= M'_{X}(0)\) | Proof |
\(V[X]=E[X^{2}]-(E[X])^{2}=M''_{X}(0)-(M'_{X}(0))^{2}\) | Proof |
If X and Y are independent, \(M_{X+Y}(t)=M_{X}(t)M_{Y}(t)\) | Proof |
\(M_{aX+b}(t)=\mathrm{e}^{tb}M_{X}(at)\) | Proof |
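As an illustration of the last property, the MGF of the standardized variable \(Z=(X-\mu)/\sigma\) follows by taking \(a=1/\sigma\) and \(b=-\mu/\sigma\):
$$M_{Z}(t)=\mathrm{e}^{-\mu t/\sigma}M_{X}\!\left(\frac{t}{\sigma}\right)$$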
Example
Suppose that a random variable X is binomially distributed. Find the expectation and variance of X by using the moment generating function.
Solution: If X is binomially distributed with parameters \(n\) and \(p\), the probability function is given by
$$f(x)=P(X=x)={}_{n} C_{x}p^{x}q^{n-x}$$
where \(p+q=1\) and \(x=0,1,\ldots,n\).
Then the moment generating function is obtained as follows:
\begin{eqnarray*}M_{X}(t)=E[\mathrm{e}^{Xt}]&=&\displaystyle\sum_{x} \mathrm{e}^{xt}f(x)\\&=&\displaystyle\sum_{x=0}^{n} \mathrm{e}^{xt}{}_{n} C_{x}p^{x}q^{n-x}\\&=&\displaystyle\sum_{x=0}^{n} {}_{n} C_{x}(p\mathrm{e}^{t})^{x}q^{n-x}\\&=&(p\mathrm{e}^{t}+q)^{n}\end{eqnarray*}
where the last equality is the binomial theorem.
Now, we have
\begin{eqnarray*}M'_{X}(t)&=&np\mathrm{e}^{t}(p\mathrm{e}^{t}+q)^{n-1}\\M''_{X}(t)&=&np\mathrm{e}^{t}(p\mathrm{e}^{t}+q)^{n-2}\{(p\mathrm{e}^{t}+q)+(n-1)p\mathrm{e}^{t}\}\end{eqnarray*}
Thus we obtain
\begin{eqnarray*}E[X]&=& M'_{X}(0)=np(p+q)^{n-1}=np\ \ \ (\because p+q=1)\\E[X^{2}]&=&M''_{X}(0)=np(1+np-p)\\V[X]&=&E[X^{2}]-(E[X])^{2}=M''_{X}(0)-(M'_{X}(0))^{2}\\&=&np(1+np-p)-(np)^{2}=np-np^{2}=np(1-p)=npq\end{eqnarray*}
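For readers who want to double-check the differentiation, here is a minimal symbolic sketch in Python with SymPy (the library choice and variable names are assumptions of this sketch, not part of the derivation above):
```python
import sympy as sp

# Symbols: t is the MGF argument; n and p are the binomial parameters.
t, n, p = sp.symbols('t n p', positive=True)
q = 1 - p

# Moment generating function of the binomial distribution, derived above.
M = (p * sp.exp(t) + q) ** n

# E[X^k] is the k-th derivative of M evaluated at t = 0 (property 1).
EX = sp.diff(M, t, 1).subs(t, 0)    # E[X]   = M'(0)
EX2 = sp.diff(M, t, 2).subs(t, 0)   # E[X^2] = M''(0)

print(sp.simplify(EX))              # expected: n*p
print(sp.simplify(EX2 - EX**2))     # expected: n*p*(1 - p), i.e. npq
```
Running it should print \(np\) and \(np(1-p)\), matching the result obtained by hand.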