Ilin, Alexander; Raiko, Tapani
Practical approaches to principal component analysis in the presence of missing values. (English) Zbl 1242.62047
J. Mach. Learn. Res. 11, 1957-2000 (2010).

Summary: Principal component analysis (PCA) is a classical data analysis technique that finds linear transformations of the data retaining the maximal amount of variance. We study the case where some of the data values are missing and show that this problem has many features usually associated with nonlinear models, such as overfitting and poor locally optimal solutions. A probabilistic formulation of PCA provides a good foundation for handling missing values, and we provide formulas for doing so. In the case of high-dimensional and very sparse data, overfitting becomes a severe problem and traditional PCA algorithms are very slow. We introduce a novel fast algorithm and extend it to variational Bayesian learning. Different versions of PCA are compared in artificial experiments, demonstrating the effects of regularization and of modeling the posterior variance. The scalability of the proposed algorithm is demonstrated by applying it to the Netflix problem.

Cited in 11 Documents

MSC:
62H25 Factor analysis and principal components; correspondence analysis
65C60 Computational problems in statistics (MSC2010)
62F15 Bayesian inference

Keywords: missing values; overfitting; regularization; variational Bayes

Cite: \textit{A. Ilin} and \textit{T. Raiko}, J. Mach. Learn. Res. 11, 1957--2000 (2010; Zbl 1242.62047)
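To make the problem setting concrete, the sketch below shows one classical baseline for PCA with missing values: iterative imputation, where missing entries are repeatedly filled in from a low-rank SVD reconstruction. This is only an illustration of the general task the summary describes, not the authors' novel fast algorithm or its variational Bayesian extension; the function name `pca_with_missing` and all parameter choices are hypothetical.

```python
import numpy as np

def pca_with_missing(Y, n_components, n_iter=100, tol=1e-6):
    """Fit a rank-`n_components` PCA model to Y (samples x features)
    containing NaNs, alternating between an SVD of the completed matrix
    and re-imputation of the missing entries from the reconstruction."""
    Y = np.asarray(Y, dtype=float)
    missing = np.isnan(Y)
    # Initialize missing entries with column means.
    col_means = np.nanmean(Y, axis=0)
    Y_filled = np.where(missing, col_means, Y)

    prev_err = np.inf
    for _ in range(n_iter):
        mean = Y_filled.mean(axis=0)
        U, s, Vt = np.linalg.svd(Y_filled - mean, full_matrices=False)
        # Rank-k reconstruction of the data.
        recon = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mean
        # Re-impute only the missing entries.
        Y_filled[missing] = recon[missing]
        # Convergence check on the observed-entry reconstruction error.
        err = np.sqrt(np.mean((recon[~missing] - Y[~missing]) ** 2))
        if abs(prev_err - err) < tol:
            break
        prev_err = err

    components = Vt[:n_components]                    # principal directions
    scores = U[:, :n_components] * s[:n_components]   # projected data
    return components, scores, Y_filled

# Example usage on synthetic low-rank data with 30% of entries missing at random.
rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 10))
Y[rng.random(Y.shape) < 0.3] = np.nan
components, scores, Y_completed = pca_with_missing(Y, n_components=3)
```

As the summary notes, an unregularized scheme of this kind tends to overfit badly when the data are high-dimensional and very sparse, which is the motivation for the regularized and variational Bayesian formulations studied in the paper.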