Paper: nips2013-291 (NIPS 2013)
Source: pdf
Author: Daniel S. Levine, Jonathan P. How
Abstract: We consider the sensor selection problem on multivariate Gaussian distributions where only a subset of latent variables is of inferential interest. For pairs of vertices connected by a unique path in the graph, we show that there exist decompositions of nonlocal mutual information into local information measures that can be computed efficiently from the output of message passing algorithms. We integrate these decompositions into a computationally efficient greedy selector in which the expense of quantification can be distributed across nodes in the network. Experimental results demonstrate the comparative efficiency of our algorithms for sensor selection in high-dimensional distributions. We additionally derive an online-computable performance bound based on augmentations of the relevant latent variable set that, when a valid augmentation exists, applies to any distribution with nuisances.
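For context, the greedy information-theoretic selector the abstract builds on (in the spirit of [6, 8]) can be sketched in a few lines. The sketch below is a dense-covariance baseline, not the paper's method: it scores each candidate sensor by the Gaussian mutual information I(x_R; x_A), evaluated through log-determinants of covariance submatrices at cubic cost per evaluation; the paper's contribution is precisely to replace such global computations with local measures obtained from Gaussian message passing. All identifiers here (gaussian_mi, greedy_select) are illustrative, not from the paper.

import numpy as np

def gaussian_mi(cov, set_a, set_b):
    # I(x_A; x_B) for a joint Gaussian equals h(x_A) + h(x_B) - h(x_A, x_B);
    # the (2*pi*e)^n factors cancel, leaving only log-determinant terms.
    a, b = list(set_a), list(set_b)
    _, ld_a = np.linalg.slogdet(cov[np.ix_(a, a)])
    _, ld_b = np.linalg.slogdet(cov[np.ix_(b, b)])
    _, ld_ab = np.linalg.slogdet(cov[np.ix_(a + b, a + b)])
    return 0.5 * (ld_a + ld_b - ld_ab)

def greedy_select(cov, relevant, candidates, budget):
    # Repeatedly add the candidate sensor with the largest marginal gain
    # in I(x_R; x_A), re-scoring the remaining pool after each pick.
    selected, remaining = [], list(candidates)
    for _ in range(budget):
        scores = [gaussian_mi(cov, relevant, selected + [c]) for c in remaining]
        selected.append(remaining.pop(int(np.argmax(scores))))
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 8))
    cov = A @ A.T + 0.1 * np.eye(8)  # random SPD covariance (toy example)
    # indices 0-1: relevant latents of interest; 4-7: candidate sensors
    print(greedy_select(cov, relevant=[0, 1], candidates=[4, 5, 6, 7], budget=2))

Running the example prints the two candidate indices whose observations are greedily most informative about the relevant latents x_0, x_1. Each slogdet call in this baseline touches the full covariance submatrix, which is the nonlocal expense the paper's path-based decompositions avoid.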
[1] C. M. Kreucher, A. O. Hero, and K. D. Kastella. An information-based approach to sensor management in large dynamic networks. Proc. IEEE, Special Issue on Modeling, Identification, & Control of Large-Scale Dynamical Systems, 95(5):978–999, May 2007.
[2] H.-L. Choi and J. P. How. Continuous trajectory planning of mobile sensors for informative forecasting. Automatica, 46(8):1266–1275, 2010.
[3] V. Chandrasekaran, N. Srebro, and P. Harsha. Complexity of inference in graphical models. In Proc. Uncertainty in Artificial Intelligence (UAI), 2008.
[4] D. Koller and N. Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
[5] F. R. Kschischang, B. J. Frey, and H.-A. Loeliger. Factor graphs and the sum-product algorithm. IEEE Transactions on Information Theory, 47(2):498–519, Feb 2001.
[6] A. Krause and C. Guestrin. Near-optimal nonmyopic value of information in graphical models. In Proc. Uncertainty in Artificial Intelligence (UAI), 2005.
[7] G. Nemhauser, L. Wolsey, and M. Fisher. An analysis of approximations for maximizing submodular set functions. Mathematical Programming, 14:265–294, 1978.
[8] J. L. Williams, J. W. Fisher III, and A. S. Willsky. Performance guarantees for information theoretic active inference. In M. Meila and X. Shen, editors, Proc. Eleventh Int. Conf. on Artificial Intelligence and Statistics, pages 616–623, 2007.
[9] T. M. Cover and J. A. Thomas. Elements of Information Theory. Wiley, 2nd edition, 2006.
[10] Y. Weiss and W. T. Freeman. Correctness of belief propagation in Gaussian graphical models of arbitrary topology. Neural Computation, 13(10):2173–2200, 2001.
[11] D. M. Malioutov, J. K. Johnson, and A. S. Willsky. Walk-sums and belief propagation in Gaussian graphical models. Journal of Machine Learning Research, 7:2031–2064, 2006.
[12] Y. Liu, V. Chandrasekaran, A. Anandkumar, and A. S. Willsky. Feedback message passing for inference in Gaussian graphical models. IEEE Transactions on Signal Processing, 60(8):4135–4150, Aug 2012.
[13] C. Ko, J. Lee, and M. Queyranne. An exact algorithm for maximum entropy sampling. Operations Research, 43:684–691, 1995.
[14] A. Krause and C. Guestrin. Optimal value of information in graphical models. Journal of Artificial Intelligence Research, 35:557–591, 2009.
[15] P. L. Erdős, M. A. Steel, L. A. Székely, and T. J. Warnow. A few logs suffice to build (almost) all trees: Part II. Theoretical Computer Science, 221:77–118, 1999.
[16] M. J. Choi, V. Y. F. Tan, A. Anandkumar, and A. S. Willsky. Learning latent tree graphical models. Journal of Machine Learning Research, 12:1771–1812, May 2011.
[17] CERN - European Organization for Nuclear Research. Colt, 1999.