Paper: nips2003-115
Source: pdf
Author: Nathan Srebro, Tommi S. Jaakkola
Abstract: We formulate linear dimensionality reduction as a semi-parametric estimation problem, enabling us to study its asymptotic behavior. We generalize the problem beyond additive Gaussian noise to (unknown) non-Gaussian additive noise and to unbiased non-additive models.
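The following is a minimal sketch, not taken from the paper, of the additive Gaussian baseline that the abstract generalizes: under Y = L + Z with rank(L) <= k and i.i.d. Gaussian noise Z, the maximum-likelihood estimate of L is the rank-k truncated SVD of Y (the Frobenius-optimal low-rank approximation). The matrix sizes, noise level, and the helper name rank_k_approximation are illustrative assumptions.

import numpy as np

def rank_k_approximation(Y, k):
    """Best rank-k approximation of Y in the Frobenius norm (truncated SVD)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k, :]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d, k = 200, 50, 3
    L = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))  # low-rank signal
    Y = L + 0.1 * rng.standard_normal((n, d))                      # additive Gaussian noise
    L_hat = rank_k_approximation(Y, k)
    print("relative error:", np.linalg.norm(L_hat - L) / np.linalg.norm(L))

Under non-Gaussian or non-additive noise, as studied in the paper, this SVD estimate is no longer the maximum-likelihood solution; the sketch only illustrates the Gaussian starting point.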
[1] Daniel D. Lee and H. Sebastian Seung. Learning the parts of objects by non-negative matrix factorization. Nature, 401:788–791, 1999.
[2] Orly Alter, Patrick O. Brown, and David Botstein. Singular value decomposition for genomewide expression data processing and modeling. PNAS, 97(18):10101–10106, 2000.
[3] Yossi Azar, Amos Fiat, Anna R. Karlin, Frank McSherry, and Jared Saia. Spectral analysis of data. In 33rd ACM Symposium on Theory of Computing, 2001.
[4] M. E. Tipping and C. M. Bishop. Probabilistic principal component analysis. Journal of the Royal Statistical Society, Series B, 61(3):611–622, 1999.
[5] Nathan Srebro and Tommi Jaakkola. Weighted low rank approximation. In 20th International Conference on Machine Learning, 2003.
[6] M. Collins, S. Dasgupta, and R. E. Schapire. A generalization of principal components analysis to the exponential family. In Advances in Neural Information Processing Systems 14, 2002.
[7] Geoffrey J. Gordon. Generalized² linear² models. In Advances in Neural Information Processing Systems 15, 2003.
[8] G. W. Stewart and Ji-guang Sun. Matrix Perturbation Theory. Academic Press, 1990.
[9] Michal Irani and P. Anandan. Factorization with uncertainty. In 6th European Conference on Computer Vision, 2000.
[10] T. W. Anderson and Herman Rubin. Statistical inference in factor analysis. In Third Berkeley Symposium on Mathematical Statistics and Probability, volume V, pages 111–150, 1956.
[11] M. J. Wainwright and E. P. Simoncelli. Scale mixtures of Gaussians and the statistics of natural images. In Advances in Neural Information Processing Systems 12, 2000.