nips2008-152: reference knowledge graph by maker-knowledge-mining
Source: pdf
Authors: Joshua W. Robinson, Alexander J. Hartemink
Abstract: A principled mechanism for identifying conditional dependencies in time-series data is provided through structure learning of dynamic Bayesian networks (DBNs). An important assumption of DBN structure learning is that the data are generated by a stationary process, an assumption that is not true in many important settings. In this paper, we introduce a new class of graphical models called non-stationary dynamic Bayesian networks, in which the conditional dependence structure of the underlying data-generation process is permitted to change over time. Non-stationary dynamic Bayesian networks represent a new framework for studying problems in which the structure of a network is evolving over time. We define the non-stationary DBN model, present an MCMC sampling algorithm for learning the structure of the model from time-series data under different assumptions, and demonstrate the effectiveness of the algorithm on both simulated and biological data.
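The abstract only sketches the approach, so the following Python sketch is offered as a rough illustration, not the authors' implementation: it samples epoch-specific DBN structures with Metropolis-Hastings under strong simplifying assumptions, namely linear-Gaussian dynamics, a BIC score standing in for the paper's Bayesian marginal likelihood, transition times that are known in advance, and single-edge toggle moves (the paper additionally samples the number and placement of transitions, e.g. via reversible-jump MCMC [11]). All names below (epoch_score, total_score, mh_structure_sampler) are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def epoch_score(X, child, parents, start, end):
    # BIC of a linear-Gaussian regression of X[t, child] on X[t-1, parents],
    # using only transitions inside one epoch (assumes the epoch is non-trivial).
    y = X[start + 1:end, child]
    Z = X[start:end - 1, parents] if parents else np.empty((len(y), 0))
    Z = np.column_stack([np.ones(len(y)), Z])             # intercept column
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    sigma2 = max(resid @ resid / len(y), 1e-12)
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1.0)
    return loglik - 0.5 * Z.shape[1] * np.log(len(y))     # BIC penalty

def total_score(X, structures, breaks):
    # Non-stationary DBN score: one adjacency matrix per epoch,
    # with epochs delimited by the (assumed known) break points.
    bounds = [0, *breaks, X.shape[0]]
    s = 0.0
    for e, adj in enumerate(structures):
        for child in range(X.shape[1]):
            parents = list(np.flatnonzero(adj[:, child]))
            s += epoch_score(X, child, parents, bounds[e], bounds[e + 1])
    return s

def mh_structure_sampler(X, breaks, n_iter=5000):
    # Metropolis-Hastings over epoch-specific structures: propose toggling a
    # single directed edge in a randomly chosen epoch; the proposal is
    # symmetric, so acceptance depends only on the score difference.
    n_vars = X.shape[1]
    structures = [np.zeros((n_vars, n_vars), dtype=int)
                  for _ in range(len(breaks) + 1)]
    score = total_score(X, structures, breaks)
    for _ in range(n_iter):
        e = rng.integers(len(structures))
        i, j = rng.integers(n_vars, size=2)                # candidate edge i -> j
        proposal = [a.copy() for a in structures]
        proposal[e][i, j] ^= 1                             # add or delete the edge
        new_score = total_score(X, proposal, breaks)
        if np.log(rng.random()) < new_score - score:
            structures, score = proposal, new_score
    return structures, score

Given a (T, n_vars) array X of observations over time, a call such as mh_structure_sampler(X, breaks=[120, 240]) would return one adjacency matrix per epoch; the break points here are placeholders, since choosing them is part of what the paper's full model infers.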
[1] Nir Friedman, Michal Linial, Iftach Nachman, and Dana Pe’er. Using Bayesian networks to analyze expression data. In RECOMB 4, pages 127–135. ACM Press, 2000.
[2] V. Anne Smith, Jing Yu, Tom V. Smulders, Alexander J. Hartemink, and Erich D. Jarvis. Computational inference of neural information flow networks. PLoS Computational Biology, 2(11):1436–1449, 2006.
[3] Steve Hanneke and Eric P. Xing. Discrete temporal models of social networks. In Workshop on Statistical Network Analysis, ICML 23, 2006.
[4] Fan Guo, Steve Hanneke, Wenjie Fu, and Eric P. Xing. Recovering temporally rewiring networks: A model-based approach. In ICML 24, 2007.
[5] Makram Talih and Nicolas Hengartner. Structural learning with time-varying components: Tracking the cross-section of financial time series. Journal of the Royal Statistical Society B, 67(3):321–341, 2005.
[6] Xiang Xuan and Kevin Murphy. Modeling changing dependency structure in multivariate time series. In ICML 24, 2007.
[7] David Heckerman, Dan Geiger, and David Maxwell Chickering. Learning Bayesian networks: The combination of knowledge and statistical data. Machine Learning, 20(3):197–243, 1995.
[8] Claudia Tarantola. MCMC model determination for discrete graphical models. Statistical Modelling, 4(1):39–61, 2004.
[9] P. Krause. Learning probabilistic networks. The Knowledge Engineering Review, 13(4):321–351, 1998.
[10] Kevin Murphy. Learning Bayesian network structure from sparse data sets. Technical Report 990, Computer Science Department, University of California, Berkeley, 2001.
[11] Peter J. Green. Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711–732, 1995.
[12] M. Arbeitman, E. Furlong, F. Imam, E. Johnson, B. Null, B. Baker, M. Krasnow, M. Scott, R. Davis, and K. White. Gene expression during the life cycle of Drosophila melanogaster. Science, 297(5590):2270–2275, 2002.
[13] Wentao Zhao, Erchin Serpedin, and Edward R. Dougherty. Inferring gene regulatory networks from time series data using the minimum description length principle. Bioinformatics, 22(17):2129–2135, 2006.
[14] T. Sandmann, L. Jensen, J. Jakobsen, M. Karzynski, M. Eichenlaub, P. Bork, and E. Furlong. A temporal map of transcription factor activity: Mef2 directly regulates target genes at all stages of muscle development. Developmental Cell, 10(6):797–807, 2006.