
70 nips-2005-Fast Information Value for Graphical Models



Author: Brigham S. Anderson, Andrew W. Moore

Abstract: Calculations that quantify the dependencies between variables are vital to many operations with graphical models, e.g., active learning and sensitivity analysis. Previously, pairwise information gain calculation has involved a cost quadratic in network size. In this work, we show how to perform a similar computation with cost linear in network size. The loss function that allows this is of a form amenable to computation by dynamic programming. The message-passing algorithm that results is described and empirical results demonstrate large speedups without decrease in accuracy. In the cost-sensitive domains examined, superior accuracy is achieved.
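The pairwise quantity the abstract refers to can be illustrated with a minimal sketch. The code below (not the paper's linear-time algorithm, just the naive baseline) computes pairwise mutual information on a small hypothetical chain network A -> B -> C with made-up binary CPTs: each pair requires marginalizing the joint, which is what makes the all-pairs computation expensive as networks grow.

```python
import itertools
import math

# Hypothetical 3-node chain A -> B -> C with binary variables.
# The CPTs are illustrative, not taken from the paper.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}

# Full joint over (A, B, C), built by multiplying the CPTs.
joint = {}
for a, b, c in itertools.product([0, 1], repeat=3):
    joint[(a, b, c)] = p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

def mutual_information(i, j):
    """Naive pairwise MI between variables i and j (indices into the joint).

    Marginalizing the full joint separately for every pair is the
    source of the quadratic cost the paper sets out to avoid.
    """
    pij, pi, pj = {}, {}, {}
    for assignment, p in joint.items():
        xi, xj = assignment[i], assignment[j]
        pij[(xi, xj)] = pij.get((xi, xj), 0.0) + p
        pi[xi] = pi.get(xi, 0.0) + p
        pj[xj] = pj.get(xj, 0.0) + p
    return sum(p * math.log2(p / (pi[xi] * pj[xj]))
               for (xi, xj), p in pij.items() if p > 0)

# On a Markov chain, information decays with distance:
# A tells us more about its neighbor B than about C.
mi_ab = mutual_information(0, 1)
mi_ac = mutual_information(0, 2)
```

The ordering mi_ab > mi_ac is the data processing inequality at work, which is one reason a chain- or tree-structured message-passing scheme is a natural fit for propagating such quantities.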


References

Agostak, J. M., & Weiss, J. (1999). Active fusion for diagnosis guided by mutual information. Proceedings of the 2nd International Conference on Information Fusion.

Anderson, B. S., & Moore, A. W. (2005). Active learning for hidden Markov models: Objective functions and algorithms. Proceedings of the 22nd International Conference on Machine Learning.

Kjærulff, U., & van der Gaag, L. (2000). Making sensitivity analysis computationally efficient.

Kohavi, R., & Wolpert, D. H. (1996). Bias plus variance decomposition for zero-one loss functions. Machine Learning: Proceedings of the Thirteenth International Conference. Morgan Kaufmann.

Krishnamurthy, V. (2002). Algorithms for optimal scheduling and management of hidden Markov model sensors. IEEE Transactions on Signal Processing, 50, 1382–1397.

Laskey, K. B. (1995). Sensitivity analysis for probability assessments in Bayesian networks. IEEE Transactions on Systems, Man, and Cybernetics.

Murphy, K. (2005). Bayes Net Toolbox for Matlab. U. C. Berkeley. http://www.ai.mit.edu/~murphyk/Software/BNT/bnt.html