Jointly Distributed Random Variables

Posted by Beetle B. on Wed 07 June 2017

Probability

Given two discrete random variables \(X,Y\), the joint pmf is \(p(x,y)=P(X=x,Y=y)\).

Let \(A\) be a set of \((x,y)\) pairs. Then the probability that \((X,Y)\) falls in \(A\) is:

\begin{equation*} P[(X,Y)\in A]=\sum_{(x,y)\in A}p(x,y) \end{equation*}
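A minimal sketch of this in Python, using a made-up joint pmf over \(\{0,1\}\times\{0,1\}\) and an event \(A\) chosen purely for illustration:

```python
# A made-up joint pmf p(x, y) over {0, 1} x {0, 1}, stored as a dict.
p = {
    (0, 0): 0.125, (0, 1): 0.250,
    (1, 0): 0.375, (1, 1): 0.250,
}

# Event A: the pairs with X + Y >= 1.
A = {(x, y) for (x, y) in p if x + y >= 1}

# P[(X, Y) in A] is the sum of p over the pairs in A.
prob_A = sum(p[pair] for pair in A)
print(prob_A)  # 0.25 + 0.375 + 0.25 = 0.875
```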

The marginal pmf of \(X\) is denoted by \(p_{X}(x)\):

\begin{equation*} p_{X}(x)=\sum_{y}p(x,y) \end{equation*}
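Computing that marginal from the same made-up pmf:

```python
from collections import defaultdict

# The same made-up joint pmf as above.
p = {
    (0, 0): 0.125, (0, 1): 0.250,
    (1, 0): 0.375, (1, 1): 0.250,
}

# Marginal pmf of X: for each x, sum p(x, y) over all y.
p_X = defaultdict(float)
for (x, y), prob in p.items():
    p_X[x] += prob

print(dict(p_X))  # {0: 0.375, 1: 0.625}
```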

For continuous random variables, the joint density function \(f(x,y)\) satisfies:

\begin{equation*} P[(X,Y)\in A]=\iint_{A}f(x,y)\ dx\, dy \end{equation*}

The marginal pdf of \(X\) is:

\begin{equation*} f_{X}(x)=\int_{-\infty}^{\infty}f(x,y)\ dy \end{equation*}
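A numeric sketch of both formulas, assuming a hypothetical density \(f(x,y)=x+y\) on the unit square (it integrates to 1 there, so it is a valid pdf) and using SciPy quadrature:

```python
from scipy import integrate

# Hypothetical joint density f(x, y) = x + y on the unit square.
# Note: dblquad passes its integrand as func(y, x), with y the inner
# variable; f here is symmetric, so the order is harmless.
f = lambda y, x: x + y

# P[(X, Y) in A] for the region A = [0, 0.5] x [0, 0.5].
prob_A, _ = integrate.dblquad(f, 0, 0.5, lambda x: 0, lambda x: 0.5)
print(prob_A)  # 0.125

# Marginal pdf of X at a point: integrate y out over [0, 1].
def f_X(x):
    val, _ = integrate.quad(lambda y: x + y, 0, 1)
    return val

print(f_X(0.3))  # x + 1/2 = 0.8
```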

Independence

Two random variables \(X\) and \(Y\) are independent if \(p(x,y)=p_{X}(x)p_{Y}(y)\) for all \((x,y)\) (in the continuous case, \(f(x,y)=f_{X}(x)f_{Y}(y)\)).

I think the author's claim is that \(X\) and \(Y\) are independent exactly when \(f(x,y)\) factors as \(g(x)k(y)\) and the region of positive density is a (possibly infinite) rectangle whose sides are parallel to the axes.

Random variables \(X_{1},\dots,X_{n}\) are mutually independent if, for every subset of them, the joint pmf or pdf factors into the product of the corresponding marginals.
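Checking the factorization criterion cell by cell on the toy pmf from above (the values are invented, and they do fail the test):

```python
import math

# Marginals of the made-up joint pmf from above.
p = {
    (0, 0): 0.125, (0, 1): 0.250,
    (1, 0): 0.375, (1, 1): 0.250,
}
xs = {x for x, _ in p}
ys = {y for _, y in p}
p_X = {x: sum(p[(x, y)] for y in ys) for x in xs}
p_Y = {y: sum(p[(x, y)] for x in xs) for y in ys}

# Independence requires p(x, y) == p_X(x) * p_Y(y) for every pair.
independent = all(
    math.isclose(p[(x, y)], p_X[x] * p_Y[y])
    for x in xs for y in ys
)
print(independent)  # False: p(0,0) = 0.125 but p_X(0) * p_Y(0) = 0.1875
```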

Conditional Probability

Let \(X,Y\) be two continuous random variables. For any \(x\) with \(f_{X}(x)>0\), the conditional pdf of \(Y\) given \(X=x\) is:

\begin{equation*} f_{Y|X}(y|x)=\frac{f(x,y)}{f_{X}(x)} \end{equation*}
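Using the same hypothetical density \(f(x,y)=x+y\) on the unit square, a quick sanity check that the conditional pdf integrates to 1 in \(y\):

```python
from scipy import integrate

# Same hypothetical density as above: f(x, y) = x + y on [0, 1]^2.
def f(x, y):
    return x + y

def f_X(x):
    return x + 0.5  # integral of (x + y) dy over [0, 1]

# Conditional pdf of Y given X = x, per the definition above.
def f_Y_given_X(y, x):
    return f(x, y) / f_X(x)

# For fixed x, the conditional pdf should integrate to 1 in y.
total, _ = integrate.quad(lambda y: f_Y_given_X(y, 0.3), 0, 1)
print(total)  # 1.0 (up to quadrature error)
```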