Measuring statistical dependence with Hilbert-Schmidt norms.

*(English)* Zbl 1168.62354
Jain, Sanjay (ed.) et al., Algorithmic learning theory. 16th international conference, ALT 2005, Singapore, October 8–11, 2005. Proceedings. Berlin: Springer (ISBN 3-540-29242-X/pbk). Lecture Notes in Computer Science 3734. Lecture Notes in Artificial Intelligence, 63-77 (2005).

Summary: We propose an independence criterion based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm of the cross-covariance operator (we term this a Hilbert-Schmidt Independence Criterion, or HSIC). This approach has several advantages compared with previous kernel-based independence criteria. First, the empirical estimate is simpler than that of any other kernel dependence test, and requires no user-defined regularisation. Second, there is a clearly defined population quantity which the empirical estimate approaches in the large sample limit, with exponential convergence guaranteed between the two: this ensures that independence tests based on HSIC do not suffer from slow learning rates. Finally, we show in the context of independent component analysis (ICA) that the performance of HSIC is competitive with that of previously published kernel-based criteria, and of other recently published ICA methods.
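The biased empirical estimate described in the summary reduces to a trace of products of centred kernel matrices. A minimal sketch in NumPy, assuming Gaussian RBF kernels and a fixed bandwidth `sigma` (both illustrative choices, not prescribed by the abstract):

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian RBF kernel matrix for rows of X (shape m x d)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: (m-1)^{-2} * trace(K H L H),
    where K, L are kernel matrices on X, Y and
    H = I - (1/m) 1 1^T is the centring matrix."""
    m = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(m) - np.ones((m, m)) / m
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2
```

The statistic is non-negative and is small when the samples are drawn independently, growing with the strength of the dependence; an independence test thresholds this value.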

For the entire collection see [Zbl 1089.68011].

For the entire collection see [Zbl 1089.68011].