
117 nips-2010-Identifying graph-structured activation patterns in networks


Source: pdf

Author: James Sharpnack, Aarti Singh

Abstract: We consider the problem of identifying an activation pattern in a complex, large-scale network that is embedded in very noisy measurements. This problem is relevant to several applications, such as identifying traces of a biochemical spread by a sensor network, expression levels of genes, and anomalous activity or congestion in the Internet. Extracting such patterns is a challenging task, especially if the network is large (the pattern is very high-dimensional) and the noise is so excessive that it masks the activity at any single node. However, typically there are statistical dependencies in the network activation process that can be leveraged to fuse the measurements of multiple nodes and enable reliable extraction of high-dimensional noisy patterns. In this paper, we analyze an estimator based on the graph Laplacian eigenbasis, and establish the limits of mean square error recovery of noisy patterns arising from a probabilistic (Gaussian or Ising) model based on an arbitrary graph structure. We consider both deterministic and probabilistic network evolution models, and our results indicate that by leveraging the network interaction structure, it is possible to consistently recover high-dimensional patterns even when the noise variance increases with network size.
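The abstract's core idea, projecting noisy node measurements onto the smooth (low-frequency) eigenvectors of the graph Laplacian to recover a graph-structured pattern, can be illustrated with a minimal sketch. This is an illustrative toy example, not the authors' exact estimator; the function name, the path-graph example, and the choice of k components are assumptions made here for demonstration.

```python
import numpy as np

def laplacian_eigenbasis_denoise(adj, y, k):
    """Illustrative sketch: keep the k smoothest Laplacian eigen-components of y.

    adj : (n, n) symmetric adjacency matrix of the graph
    y   : (n,) noisy node measurements
    k   : number of low-frequency eigenvectors to retain (a tuning choice)
    """
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                       # combinatorial graph Laplacian L = D - A
    evals, evecs = np.linalg.eigh(lap)    # eigenpairs, eigenvalues in ascending order
    basis = evecs[:, :k]                  # low-frequency (smooth-on-the-graph) basis
    return basis @ (basis.T @ y)          # orthogonal projection of y onto that subspace

# Toy example: a path graph whose activation pattern is a contiguous block,
# observed under heavy per-node Gaussian noise.
rng = np.random.default_rng(0)
n = 100
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0   # chain: node i -- node i+1

pattern = np.zeros(n)
pattern[30:60] = 1.0                      # graph-structured activation
y = pattern + rng.normal(scale=1.0, size=n)

x_hat = laplacian_eigenbasis_denoise(adj, y, k=10)
print("noisy MSE:   ", np.mean((y - pattern) ** 2))
print("denoised MSE:", np.mean((x_hat - pattern) ** 2))
```

Because the activation is smooth with respect to the graph (constant on a connected block), most of its energy lies in the first few Laplacian eigenvectors, while the i.i.d. noise spreads evenly over all n of them; truncating to k components keeps most of the signal but only a k/n fraction of the noise.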


reference text

[1] F. Abramovich, Y. Benjamini, D. L. Donoho, and I. M. Johnstone, Adapting to unknown sparsity by controlling the false discovery rate, Annals of Statistics 34 (2006), no. 2, 584–653.

[2] Rie K. Ando and Tong Zhang, Learning on graph with Laplacian regularization, Advances in Neural Information Processing Systems (NIPS), 2006.

[3] M. Belkin and P. Niyogi, Semi-supervised learning on Riemannian manifolds, Machine Learning 56(1-3) (2004), 209–239.

[4] Mikhail Belkin, Irina Matveeva, and Partha Niyogi, Regularization and semi-supervised learning on large graphs, Conference on Learning Theory (COLT), 2004.

[5] Mikhail Belkin and Partha Niyogi, Laplacian eigenmaps for dimensionality reduction and data representation, Neural Computation 15 (2003), no. 6, 1373–1396.

[6] Mikhail Belkin and Partha Niyogi, Convergence of Laplacian eigenmaps, Advances in Neural Information Processing Systems (NIPS), 2006.

[7] B. Bollobás, Random graphs, Cambridge University Press, 2001.

[8] Luis E. Carvalho and Charles E. Lawrence, Centroid estimation in discrete high-dimensional spaces with applications in biology, PNAS 105 (2008), no. 9, 3209–3214.

[9] R. Coifman and M. Maggioni, Diffusion wavelets, Applied and Computational Harmonic Analysis 21 (2006), no. 1, 53–94.

[10] D. L. Donoho, I. M. Johnstone, J. C. Hoch, and A. S. Stern, Maximum entropy and the nearly black object, Journal of Royal Statistical Society, Series B 54 (1992), 41–81.

[11] P. Erdős and A. Rényi, On the evolution of random graphs, Publications of the Mathematical Institute of the Hungarian Academy of Sciences, 1960, pp. 17–61.

[12] Illés J. Farkas, Imre Derényi, Albert-László Barabási, and Tamás Vicsek, Spectra of real-world graphs: Beyond the semi-circle law, Physical Review E 64 (2001), 1–12.

[13] Bernard Friedman, Eigenvalues of composite matrices, Mathematical Proceedings of the Cambridge Philosophical Society 57 (1961), 37–49.

[14] M. Gavish, B. Nadler, and R. Coifman, Multiscale wavelets on trees, graphs and high dimensional data: Theory and applications to semi-supervised learning, 27th International Conference on Machine Learning (ICML), 2010.

[15] S. Jalan and J. N. Bandyopadhyay, Random matrix analysis of network Laplacians, Tech. Report cond-mat/0611735, Nov 2006.

[16] J. Jin and D. L. Donoho, Higher criticism for detecting sparse heterogeneous mixtures, Annals of Statistics 32 (2004), no. 3, 962–994.

[17] A. B. Lee, B. Nadler, and L. Wasserman, Treelets - an adaptive multi-scale basis for sparse unordered data, Annals of Applied Statistics 2 (2008), no. 2, 435–471.

[18] N. Meinshausen and P. Bühlmann, High dimensional graphs and variable selection with the lasso, Annals of Statistics 34 (2006), no. 3, 1436–1462.

[19] A. T. Ogielski and D. L. Stein, Dynamics on ultrametric spaces, Physical Review Letters 55 (1985), 1634–1637.

[20] J. Shi and J. Malik, Normalized cuts and image segmentation, IEEE Trans. Pattern Analysis and Machine Intelligence 22 (2000), 888–905.

[21] A. Singer, From graph to manifold Laplacian: the convergence rate, Applied and Computational Harmonic Analysis 21 (2006), no. 1, 135–144.

[22] A. Singh, R. Nowak, and R. Calderbank, Detecting weak but hierarchically-structured patterns in networks, 13th International Conference on Artificial Intelligence and Statistics (AISTATS), 2010.

[23] A. Smola and R. Kondor, Kernels and regularization on graphs, Conference on Learning Theory (COLT), 2003.

[24] Ulrike von Luxburg, A tutorial on spectral clustering, Statistics and Computing 17 (2007), no. 4, 395–416.

[25] M. Wainwright, P. Ravikumar, and J. D. Lafferty, High-dimensional graphical model selection using regularized logistic regression, Advances in Neural Information Processing Systems (NIPS), 2006.

[26] Duncan J. Watts and Steven H. Strogatz, Collective dynamics of ‘small-world’ networks, Nature 393 (1998), no. 6684, 440–442.

[27] Choujun Zhan, Guanrong Chen, and Lam F. Yeung, On the distribution of Laplacian eigenvalues versus node degrees in complex networks, Physica A 389 (2010), 1779–1788.