Let X and Y be discrete random variables. Then the joint probability function \(f(x, y)\) is defined by
$$f(x, y)=P(X=x, Y=y)$$
where \(P(X=x, Y=y)\) is the probability of the event “X=x and Y=y”, and \(f(x,y)\) is also called the joint probability distribution.
Notice that the joint probability function \(f(x,y)\) satisfies the following two properties:
(1)\(f(x, y)\geq 0\)
(2)\(\displaystyle\sum_{x}\sum_{y} f(x, y) =1\)
and (2) states that the probabilities of all pairs \((x, y)\) sum to 1.
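The two properties above can be checked directly for a small joint table. Below is a minimal sketch in Python; the joint pmf (two independent fair coin flips, coded as 0 and 1) is an illustrative assumption, not taken from the text.

```python
# Hypothetical joint pmf for two fair coin flips, with X, Y in {0, 1},
# stored as a table {(x, y): f(x, y)}; the values are an assumed example.
joint_pmf = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.25, (1, 1): 0.25,
}

# Property (1): every entry is nonnegative.
nonnegative = all(p >= 0 for p in joint_pmf.values())

# Property (2): summing over all pairs (x, y) gives total probability 1.
total = sum(joint_pmf.values())

print(nonnegative, total)  # True 1.0
```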
Independent Random Variables
If X and Y are independent, the joint probability function can be written as
$$f(x,y)=f_{X}(x)f_{Y}(y)\ \ \ \text{or equivalently}\ \ \ P(X=x, Y=y)=P(X=x)P(Y=y)$$
where \(f_{X}(x)\) and \(f_{Y}(y)\) are the marginal probability functions.
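As a concrete check of the factorization, the sketch below builds a joint pmf from two assumed marginals (the numbers are illustrative), recovers the marginals by summing the joint pmf over the other variable, and verifies \(f(x,y)=f_{X}(x)f_{Y}(y)\) for every pair.

```python
from itertools import product

# Assumed marginal pmfs; the joint pmf is built as their product,
# so X and Y are independent by construction.
f_X = {0: 0.3, 1: 0.7}   # marginal pmf of X
f_Y = {0: 0.6, 1: 0.4}   # marginal pmf of Y
joint = {(x, y): f_X[x] * f_Y[y] for x, y in product(f_X, f_Y)}

# Recover each marginal by summing the joint pmf over the other variable.
marg_X = {x: sum(joint[(x, y)] for y in f_Y) for x in f_X}
marg_Y = {y: sum(joint[(x, y)] for x in f_X) for y in f_Y}

# Independence: f(x, y) = f_X(x) f_Y(y) holds for every pair
# (compared with a small tolerance for floating-point error).
independent = all(
    abs(joint[(x, y)] - marg_X[x] * marg_Y[y]) < 1e-12
    for x, y in joint
)
print(independent)  # True
```

Summing the joint pmf over one variable to get the other's marginal is exactly the definition of the marginal probability function used above.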
Continuous Case
Let X and Y be continuous random variables. If the probability that X lies between a and b and Y lies between c and d can be written as
$$P(a\leq X\leq b,\ c\leq Y\leq d)=\int_{a}^{b}\!\int_{c}^{d} f(x, y)\,dy\,dx$$
we call \(f(x,y)\) the joint density function (also called the joint probability density function).
Notice that the function \(f(x,y)\) satisfies the following two properties:
(1)\(f(x, y)\geq 0\)
(2)\(\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\ dxdy=1\)
and (2) states that the total probability over the whole plane is 1.
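The continuous-case properties can be checked numerically. The sketch below uses a midpoint Riemann sum; the density \(f(x,y)=4xy\) on \([0,1]\times[0,1]\), the helper name `double_integral`, and the grid size are all illustrative assumptions. This density is nonnegative and integrates to 1, and the probability of the quarter square \([0,\tfrac12]\times[0,\tfrac12]\) is \(\tfrac{1}{16}\) analytically.

```python
# Assumed example density f(x, y) = 4xy on [0, 1] x [0, 1]
# (nonnegative, and its integral over the unit square is 1).
def f(x, y):
    return 4 * x * y

def double_integral(f, a, b, c, d, n=400):
    """Approximate the integral of f over [a, b] x [c, d]
    with a midpoint Riemann sum on an n-by-n grid."""
    hx, hy = (b - a) / n, (d - c) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * hx
        for j in range(n):
            y = c + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

# Property (2): the density integrates to 1 over its support.
total_mass = double_integral(f, 0, 1, 0, 1)

# P(0 <= X <= 1/2, 0 <= Y <= 1/2) should be close to 1/16 = 0.0625.
p = double_integral(f, 0, 0.5, 0, 0.5)
print(total_mass, p)
```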