Non-Gaussian Component Analysis using Entropy Methods

Abstract: Non-Gaussian component analysis (NGCA) is a problem in multidimensional data analysis which, since its formulation in 2006, has attracted considerable attention in statistics and machine learning. In this problem, we have a random variable X in n-dimensional Euclidean space. There is an unknown subspace V of the n-dimensional Euclidean space such that the orthogonal projection of X onto V is standard multidimensional Gaussian, and the orthogonal projection of X onto U, the orthogonal complement of V, is non-Gaussian, in the sense that all of its one-dimensional marginals differ from the Gaussian in a certain metric defined in terms of moments. The NGCA problem is to approximate the non-Gaussian subspace U given samples of X.
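As a concrete illustration, data from the model above can be simulated as follows. The specific setup here (non-Gaussian part on the first k coordinates with uniform marginals, Gaussian part on the rest, then a hidden random rotation) is an illustrative assumption, not a detail from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 6, 2, 10000   # ambient dimension, non-Gaussian dimension, sample count

# Non-Gaussian part: uniform marginals on [-sqrt(3), sqrt(3)], which have unit variance.
non_gaussian = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(m, k))
# Gaussian part: standard normal on the complementary n - k coordinates.
gaussian = rng.standard_normal((m, n - k))
X0 = np.hstack([non_gaussian, gaussian])

# Hide the subspace with a random rotation Q; the NGCA task is to recover
# the non-Gaussian subspace span(Q[:, :k]) from samples of X alone.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
X = X0 @ Q.T
```

Note that both parts have identity covariance, so X is isotropic and second moments carry no information about the hidden subspace; recovering it requires higher-order (non-Gaussian) structure.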

We give an algorithm that runs in time polynomial in the dimension n and inverse polynomial in the error parameter, which measures the angular distance between the non-Gaussian subspace and the subspace output by the algorithm. Our algorithm uses relative entropy as the contrast function and fits within the projection pursuit framework. The techniques we develop for analyzing our algorithm may be of use for other related problems.
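A rough sketch of the projection-pursuit idea: search for directions whose one-dimensional projections maximize a non-Gaussianity contrast. Here the absolute excess kurtosis is used as a simple stand-in contrast and the maximization is a naive random search; the paper's actual contrast function is relative entropy, and its algorithm is not a random search.

```python
import numpy as np

def kurtosis_contrast(v, X):
    # |excess kurtosis| of the 1-d projection X @ v. This is a simple
    # stand-in for the relative-entropy contrast used in the paper;
    # it is zero (in expectation) exactly for Gaussian projections.
    y = X @ v
    y = (y - y.mean()) / y.std()
    return abs((y ** 4).mean() - 3.0)

def projection_pursuit(X, iters=500, seed=0):
    # Naive random search over unit directions for one maximizing the
    # contrast; practical projection pursuit uses gradient or
    # fixed-point updates instead.
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    best_v, best_c = None, -np.inf
    for _ in range(iters):
        v = rng.standard_normal(n)
        v /= np.linalg.norm(v)
        c = kurtosis_contrast(v, X)
        if c > best_c:
            best_v, best_c = v, c
    return best_v
```

On a toy two-dimensional example with one uniform and one Gaussian coordinate, the returned direction concentrates on the non-Gaussian coordinate, since the contrast vanishes along the Gaussian one.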

The talk is based on joint work with Navin Goyal and was presented at STOC 2019.