Monday, September 3, 2012

Characterization of independence of random variables

Surely you know that two random variables $X$ and $Y$ are independent if and only if $P_{XY}(x,y) = P_{X}(x)P_{Y}(y)$. What you might not know is that if
$$
\max_{f \in \mathcal{F},\, g \in \mathcal{G}} \operatorname{corr}\bigl(f(X), g(Y)\bigr) = 0
$$
where $\mathcal{F}$ and $\mathcal{G}$ are sufficiently large families of functions containing all continuous functions on the domains where $X$ and $Y$ take values (say $\mathcal{X}$ and $\mathcal{Y}$), then $X$ and $Y$ are independent.
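To make this concrete, here is a minimal numerical sketch, assuming Python with NumPy: restrict $\mathcal{F}$ and $\mathcal{G}$ to the span of a few polynomial features and take the largest canonical correlation between the two feature blocks, which is exactly the maximal correlation over those restricted families. The polynomial basis, the ridge term `eps` and the toy data are illustrative choices, not part of the original result.

```python
# Maximal correlation over restricted function families, estimated as the
# top canonical correlation between polynomial features of X and of Y.
import numpy as np

def max_correlation(x, y, degree=3, eps=1e-6):
    """Largest corr(f(X), g(Y)) with f, g in the span of low-order polynomials."""
    Fx = np.column_stack([x**d for d in range(1, degree + 1)])
    Gy = np.column_stack([y**d for d in range(1, degree + 1)])
    Fx -= Fx.mean(axis=0)
    Gy -= Gy.mean(axis=0)
    Cxx = Fx.T @ Fx / len(x) + eps * np.eye(degree)   # ridge keeps Cholesky stable
    Cyy = Gy.T @ Gy / len(y) + eps * np.eye(degree)
    Cxy = Fx.T @ Gy / len(x)
    # Whiten both feature blocks; the singular values of the whitened
    # cross-covariance are the canonical correlations.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    M = Wx @ Cxy @ Wy.T
    return np.linalg.svd(M, compute_uv=False)[0]

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
print(max_correlation(x, rng.standard_normal(2000)))  # near 0: independent
print(max_correlation(x, x**2))                       # near 1: dependent, yet corr(X, X^2) = 0
```

With $Y = X^2$ and $X$ standard normal the ordinary correlation is zero, yet the restricted maximal correlation comes out near one, which is exactly the point of the characterization.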

This was first discovered by Sarmanov in 1958 and generalized to the multivariate case by Lancaster in 1962. Building on this, Bach and Jordan created kernel Independent Component Analysis (Kernel ICA).

Although it is not a mainstream characterization of independence, it is listed in the book Probability Essentials by Jacod and Protter.

Fukumizu, Gretton and Sriperumbudur have also been working on independence measures using RKHS theory. If one uses characteristic kernels, whose feature maps again span all the Fourier features, one again has families rich enough to contain all continuous functions. They compute the mean function (the kernel mean embedding) of a measure as
$$
\mu_X = \int k(x,\cdot) d P_X (x)
$$
Now, given two mean embeddings $\mu_X \in H_X$ and $\mu_Y \in H_Y$, independence holds if and only if the Hilbert-Schmidt norm of the cross-covariance operator $C_{XY} : H_X \rightarrow H_Y$ is zero.
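The empirical version of this norm is the Hilbert-Schmidt Independence Criterion (HSIC) of Gretton and colleagues. A minimal sketch in Python/NumPy follows; the Gaussian kernels, the fixed bandwidth and the simple biased estimator $\operatorname{tr}(KHLH)/n^2$ are illustrative assumptions rather than the only possible choices.

```python
# Biased empirical HSIC: squared Hilbert-Schmidt norm of the empirical
# cross-covariance operator, computed from Gaussian (characteristic) kernels.
import numpy as np

def gaussian_gram(z, bandwidth=1.0):
    """Gram matrix k(z_i, z_j) = exp(-||z_i - z_j||^2 / (2 * bandwidth^2))."""
    sq = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * bandwidth ** 2))

def hsic(x, y, bandwidth=1.0):
    """Biased HSIC estimate between samples x of shape (n, dx) and y of shape (n, dy)."""
    n = len(x)
    K = gaussian_gram(x, bandwidth)
    L = gaussian_gram(y, bandwidth)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal((500, 1))
print(hsic(x, rng.standard_normal((500, 1))))  # close to 0: independent
print(hsic(x, np.cos(3 * x)))                  # clearly larger: dependent
```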

Another characterization of independence was given by Rényi in 1959. Let $\hat{f}_{X}(\omega)$ and $\hat{f}_{Y}(\eta)$ be the Fourier transforms of the two marginal probability density functions, and let $\hat{f}_{XY}(\omega,\eta)$ be the Fourier transform of the joint probability density function. If
$$
\int \left| \hat{f}_{XY}(\omega,\eta) - \hat{f}_X(\omega)\hat{f}_Y(\eta) \right|^2 d\omega \, d\eta = 0
$$
then independence holds. From there, Székely and Rizzo built a statistic, the distance covariance, and a non-linear correlation estimator, the distance correlation.
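Their sample statistic replaces the characteristic functions by pairwise distances through a double-centering trick. A minimal Python/NumPy sketch for one-dimensional samples is below; the toy data and the V-statistic (biased) form of the estimator are illustrative choices.

```python
# Sample distance covariance: double-center the pairwise distance matrices of
# x and y, then average their elementwise product (V-statistic form).
import numpy as np

def distance_covariance(x, y):
    """Sample distance covariance between 1-D samples x and y of equal length."""
    a = np.abs(x[:, None] - x[None, :])   # pairwise distances within x
    b = np.abs(y[:, None] - y[None, :])   # pairwise distances within y
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()  # double centering
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return np.sqrt(np.mean(A * B))

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
print(distance_covariance(x, rng.standard_normal(1000)))  # near 0: independent
print(distance_covariance(x, x**2))                       # clearly positive: dependent
```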

