
NIPS 2007, Paper 43: Catching Change-points with Lasso



Author: Céline Levy-leduc, Zaïd Harchaoui

Abstract: We propose a new approach to estimating the locations of change-points in one-dimensional piecewise constant signals observed in white noise. Our approach consists in reframing this task as a variable selection problem, using a penalized least-squares criterion with an ℓ1-type penalty. We prove some theoretical results on the estimated change-points and on the underlying estimated piecewise constant function. We then explain how to implement the method in practice by combining the LAR algorithm with a reduced version of the dynamic programming algorithm, and we apply it to synthetic and real data.
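The reframing described in the abstract can be sketched in a few lines: column j of the design matrix is a step function starting at index j, so a nonzero Lasso coefficient at j corresponds to a jump at that index. The sketch below is illustrative only, not the authors' implementation: it uses scikit-learn's `lars_path` in place of the paper's LAR routine, omits the reduced dynamic-programming refinement stage, and the signal (jumps at indices 100 and 200, noise level 0.3) is an invented toy example.

```python
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)

# Toy piecewise constant signal: jumps at indices 100 and 200, white noise.
n = 300
signal = np.concatenate([np.zeros(100), 2.0 * np.ones(100), -1.0 * np.ones(100)])
y = signal + 0.3 * rng.standard_normal(n)

# Reframe change-point detection as variable selection: column j of X is a
# step function starting at index j + 1, so a nonzero coefficient at j
# corresponds to a jump (change-point) at index j + 1.
X = np.tril(np.ones((n, n)))[:, 1:]

# Center X and y to absorb the overall mean (lars_path fits no intercept).
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# A few LAR/Lasso steps; the active set gives candidate change-points, which
# the paper would then prune with a reduced dynamic programming step.
alphas, active, coefs = lars_path(Xc, yc, method="lasso", max_iter=8)
change_points = sorted(j + 1 for j in active)
print(change_points)
```

Because adjacent step columns are highly correlated, the candidate set typically contains small clusters of indices around each true jump; this is exactly why the paper follows the Lasso stage with a dynamic programming pass to pick one change-point per cluster.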


References

[1] M. Basseville and I. V. Nikiforov. Detection of Abrupt Changes: Theory and Application. Information and System Sciences Series. Prentice-Hall, 1993.

[2] R. Bellman. On the approximation of curves by line segments using dynamic programming. Communications of the ACM, 4(6), 1961.

[3] P. Bickel, Y. Ritov, and A. Tsybakov. Simultaneous analysis of Lasso and Dantzig selector. Preprint 2007.

[4] L. Boysen, A. Kempe, A. Munk, V. Liebscher, and O. Wittich. Consistencies and rates of convergence of jump penalized least squares estimators. Annals of Statistics, In revision.

[5] B. Brodsky and B. Darkhovsky. Non-parametric statistical diagnosis: problems and methods. Kluwer Academic Publishers, 2000.

[6] O. Cappé, E. Moulines, and T. Rydén. Inference in Hidden Markov Models. Springer Series in Statistics. Springer-Verlag New York, Inc., 2005.

[7] B. Efron, T. Hastie, and R. Tibshirani. Least angle regression. Annals of Statistics, 32:407–499, 2004.

[8] P. Fearnhead. Exact and efficient Bayesian inference for multiple changepoint problems. Statistics and Computing, 16:203–213, 2006.

[9] W. D. Fisher. On grouping for maximum homogeneity. Journal of the American Statistical Association, 53:789–798, 1958.

[10] O. Gillet, S. Essid, and G. Richard. On the correlation of automatic audio and visual segmentation of music videos. IEEE Transactions on Circuits and Systems for Video Technology, 2007.

[11] S. M. Kay. Fundamentals of statistical signal processing: detection theory. Prentice-Hall, Inc., 1993.

[12] M. Lavielle. Using penalized contrasts for the change-point problem. Signal Processing, 85(8):1501–1510, 2005.

[13] M. Lavielle and E. Moulines. Least-squares estimation of an unknown number of shifts in a time series. Journal of Time Series Analysis, 21(1):33–59, 2000.

[14] E. Lebarbier. Detecting multiple change-points in the mean of a Gaussian process by model selection. Signal Processing, 85(4):717–736, 2005.

[15] C.-B. Lee. Estimating the number of change-points in a sequence of independent random variables. Statistics and Probability Letters, 25:241–248, 1995.

[16] E. Mammen and S. van de Geer. Locally adaptive regression splines. Annals of Statistics, 1997.

[17] P. Massart. A non-asymptotic theory for model selection. Pages 309–323. European Mathematical Society, 2005.

[18] N. Meinshausen and B. Yu. Lasso-type recovery of sparse representations for high-dimensional data. Preprint 2006.

[19] S. Rosset and J. Zhu. Piecewise linear regularized solution paths. Annals of Statistics, 35, 2007.

[20] J. J. K. Ó Ruanaidh and W. J. Fitzgerald. Numerical Bayesian Methods Applied to Signal Processing. Statistics and Computing. Springer, 1996.

[21] R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58(1):267–288, 1996.

[22] R. Tibshirani and P. Wang. Spatial smoothing and hot spot detection for CGH data using the fused lasso. Biostatistics, 9(1):18–29, 2008.

[23] P. Zhao and B. Yu. On model selection consistency of Lasso. Journal of Machine Learning Research, 7, 2006.