
31 nips-2005-Asymptotics of Gaussian Regularized Least Squares


Source: pdf

Author: Ross Lippert, Ryan Rifkin

Abstract: We consider regularized least-squares (RLS) with a Gaussian kernel. We prove that if we let the Gaussian bandwidth σ → ∞ while letting the regularization parameter λ → 0, the RLS solution tends to a polynomial whose order is controlled by the relative rates of decay of σ^{−2} and λ: if λ = σ^{−(2k+1)}, then, as σ → ∞, the RLS solution tends to the kth order polynomial with minimal empirical error. We illustrate the result with an example.
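To fix notation, here is a standard RLS formulation in the style of [2] and [7], together with the limit the abstract describes. The empirical-term normalization and the kernel convention below are our assumptions, not taken verbatim from this paper:

```latex
% Gaussian RLS (normalization and kernel convention assumed):
f_{\sigma,\lambda} = \arg\min_{f \in \mathcal{H}_\sigma}
  \frac{1}{n}\sum_{i=1}^{n} \bigl(f(x_i) - y_i\bigr)^2
  + \lambda \,\|f\|_{\mathcal{H}_\sigma}^2,
\qquad K_\sigma(x, x') = e^{-\|x - x'\|^2 / (2\sigma^2)}.

% The paper's claim: with \lambda = \sigma^{-(2k+1)},
f_{\sigma,\lambda} \xrightarrow{\ \sigma \to \infty\ }
  \arg\min_{\deg p \,\le\, k} \; \sum_{i=1}^{n} \bigl(p(x_i) - y_i\bigr)^2 .
```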


reference text

1. Aronszajn. Theory of reproducing kernels. Transactions of the American Mathematical Society, 68:337–404, 1950.
2. Evgeniou, Pontil, and Poggio. Regularization networks and support vector machines. Advances in Computational Mathematics, 13(1):1–50, 2000.
3. Keerthi and Lin. Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Computation, 15(7):1667–1689, 2003.
4. Lippert and Rifkin. Asymptotics of Gaussian regularized least-squares. Technical Report MIT-CSAIL-TR-2005-067, MIT Computer Science and Artificial Intelligence Laboratory, 2005.
5. Rifkin. Everything Old Is New Again: A Fresh Look at Historical Approaches to Machine Learning. PhD thesis, Massachusetts Institute of Technology, 2002.
6. Rifkin and Lippert. Practical regularized least-squares: λ-selection and fast leave-one-out computation. In preparation, 2005.
7. Wahba. Spline Models for Observational Data, volume 59 of CBMS-NSF Regional Conference Series in Applied Mathematics. Society for Industrial & Applied Mathematics, 1990.
8. Yang, Duraiswami, and Davis. Efficient kernel machines using the improved fast Gauss transform. In Advances in Neural Information Processing Systems, volume 16, 2004.

[Fig. 3: four panels showing the 0th, 1st, 4th, and 5th order polynomial solutions and successive Gaussian RLS approximations, with legends "Deg. k polynomial" and s ranging over 10^1 through 10^6. Caption: As s → ∞, with σ² = s² and λ = s^{−(2k+1)}, the solution to Gaussian RLS approaches the kth order polynomial solution.]
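A minimal numerical sketch of the experiment behind Fig. 3, not the authors' code: it fits Gaussian RLS on toy 1-D data with σ = s and λ = s^{−(2k+1)} and compares against the best degree-k least-squares polynomial as s grows. The data, the kernel convention exp(−(x − x')²/(2σ²)), and the unnormalized regularizer (K + λI) are all our assumptions:

```python
# Sketch of the Fig. 3 experiment under assumed conventions (see lead-in).
import numpy as np

def gaussian_rls(x_train, y_train, x_eval, sigma, lam):
    """Gaussian-kernel RLS: solve (K + lam*I) c = y, then predict K_eval @ c."""
    K = np.exp(-(x_train[:, None] - x_train[None, :]) ** 2 / (2 * sigma ** 2))
    c = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    K_eval = np.exp(-(x_eval[:, None] - x_train[None, :]) ** 2 / (2 * sigma ** 2))
    return K_eval @ c

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 15)                      # toy training inputs
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)
grid = np.linspace(0.0, 1.0, 101)

k = 1  # target: the degree-1 polynomial with minimal empirical error
best_poly = np.polyval(np.polyfit(x, y, k), grid)

# As s grows with lam = s**(-(2k+1)), the RLS curve should approach best_poly.
for s in (1e1, 1e2, 1e3):
    f = gaussian_rls(x, y, grid, sigma=s, lam=s ** -(2 * k + 1))
    print(f"s = {s:.0e}: max |RLS - poly| = {np.max(np.abs(f - best_poly)):.3g}")
```

Note that the kernel matrix approaches the all-ones matrix as s grows, so the linear solve becomes ill-conditioned; pushing to the figure's s = 10^6 in plain double precision may not be reliable.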