acl acl2011 acl2011-295 acl2011-295-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Nikhil Garg; James Henderson
Abstract: We propose a generative model based on Temporal Restricted Boltzmann Machines for transition-based dependency parsing. The parse tree is built incrementally using a shift-reduce parse, and an RBM is used to model each decision step. The RBM at the current time step induces latent features with the help of temporal connections to the relevant previous steps, which provide context information. Our parser achieves labeled and unlabeled attachment scores of 88.72% and 91.65% respectively, which compare well with similar previous models and the state-of-the-art.
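To make the abstract's setup concrete, the following is a minimal Python sketch (not the authors' implementation) of a greedy shift-reduce loop in which an RBM-style free energy scores each candidate transition, with the hidden bias shifted by the previous step's hidden activations as a stand-in for the paper's temporal connections. The arc-standard transition system, the feature templates, the layer sizes N_FEATS and N_HIDDEN, the untrained random weights W, b, c, and the specific way temporal context enters (adding a scaled h_prev to the hidden bias) are all illustrative assumptions, not the paper's actual model.

# Illustrative sketch only: RBM free-energy scoring of shift-reduce decisions.
# All feature templates, dimensions, and weights below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_FEATS, N_HIDDEN = 50, 20                              # visible / hidden sizes (assumed)
W = rng.normal(scale=0.1, size=(N_FEATS, N_HIDDEN))     # visible-hidden weights (untrained)
b = np.zeros(N_FEATS)                                    # visible biases
c = np.zeros(N_HIDDEN)                                   # hidden biases

def free_energy(v, hidden_bias):
    """Standard RBM free energy: F(v) = -b.v - sum_j log(1 + exp(c_j + (v W)_j))."""
    return -v @ b - np.sum(np.logaddexp(0.0, hidden_bias + v @ W))

def features(stack, buffer, action_id):
    """Toy binary feature vector for a (parser state, candidate action) pair."""
    v = np.zeros(N_FEATS)
    v[action_id] = 1.0                      # which action is proposed
    v[3 + (len(stack) % 10)] = 1.0          # coarse stack-depth indicator
    v[13 + (len(buffer) % 10)] = 1.0        # coarse buffer-length indicator
    return v

def parse(tokens):
    """Greedy arc-standard loop; the lowest free energy picks each transition."""
    stack, buffer, arcs = [], list(range(len(tokens))), []
    h_prev = np.zeros(N_HIDDEN)             # stand-in for temporal context
    while buffer or len(stack) > 1:
        actions = []
        if buffer:
            actions.append("shift")
        if len(stack) >= 2:
            actions += ["left_arc", "right_arc"]
        # Temporal connections are approximated by biasing the hidden units
        # with the previous step's hidden activations.
        dyn_c = c + 0.1 * h_prev
        scores = [free_energy(features(stack, buffer, i), dyn_c)
                  for i, _ in enumerate(actions)]
        best = actions[int(np.argmin(scores))]          # lower free energy = better
        if best == "shift":
            stack.append(buffer.pop(0))
        elif best == "left_arc":
            dep = stack.pop(-2); arcs.append((stack[-1], dep))
        else:
            dep = stack.pop(); arcs.append((stack[-1], dep))
        # Mean hidden activations of the resulting state (action slot reused for simplicity).
        v = features(stack, buffer, 0)
        h_prev = 1.0 / (1.0 + np.exp(-(dyn_c + v @ W)))
    return arcs

print(parse("economic news had little effect".split()))

In the actual model the parameters would be learned (e.g., with contrastive divergence, as in Hinton, 2002), and, as the abstract states, each step's RBM is connected to the latent vectors of the relevant previous decision steps rather than only the immediately preceding one as in this sketch.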
References:
Y. Bengio, R. Ducharme, P. Vincent, and C. Janvin. 2003. A neural probabilistic language model. The Journal of Machine Learning Research, 3:1137–1155.
B. Bohnet. 2009. Efficient parsing of syntactic and semantic dependency structures. In Proceedings of the Thirteenth Conference on Computational Natural Language Learning: Shared Task, CoNLL '09, pages 67–72. Association for Computational Linguistics.
R. Collobert and J. Weston. 2008. A unified architecture for natural language processing: Deep neural networks with multitask learning. In Proceedings of the 25th International Conference on Machine Learning, pages 160–167. ACM.
J. Hajič, M. Ciaramita, R. Johansson, D. Kawahara, M. A. Martí, L. Màrquez, A. Meyers, J. Nivre, S. Padó, J. Štěpánek, et al. 2009. The CoNLL-2009 shared task: Syntactic and semantic dependencies in multiple languages. In Proceedings of the Thirteenth Conference on Computational Natural Language Learning: Shared Task, pages 1–18. Association for Computational Linguistics.
J. Hall, J. Nilsson, J. Nivre, G. Eryiğit, B. Megyesi, M. Nilsson, and M. Saers. 2007. Single malt or blended? A study in multilingual parser optimization. In Proceedings of the CoNLL Shared Task Session of EMNLP-CoNLL 2007, pages 933–939. Association for Computational Linguistics.
J. Henderson, P. Merlo, G. Musillo, and I. Titov. 2008. A latent variable model of synchronous parsing for syntactic and semantic dependencies. In Proceedings of the Twelfth Conference on Computational Natural Language Learning, pages 178–182. Association for Computational Linguistics.
G. E. Hinton, S. Osindero, and Y. W. Teh. 2006. A fast learning algorithm for deep belief nets. Neural Computation, 18(7):1527–1554.
G. E. Hinton. 2002. Training products of experts by minimizing contrastive divergence. Neural Computation, 14(8):1771–1800.
R. Johansson and P. Nugues. 2008. Dependency-based syntactic-semantic analysis with PropBank and NomBank. In Proceedings of the Twelfth Conference on Computational Natural Language Learning, pages 183–187. Association for Computational Linguistics.
D. Lin. 1998. An information-theoretic definition of similarity. In Proceedings of the 15th International Conference on Machine Learning, volume 1, pages 296–304.
R. McDonald, F. Pereira, K. Ribarov, and J. Hajič. 2005. Non-projective dependency parsing using spanning tree algorithms. In Proceedings of the Conference on Human Language Technology and Empirical Methods in Natural Language Processing, pages 523–530. Association for Computational Linguistics.
G. A. Miller, R. Beckwith, C. Fellbaum, D. Gross, and K. J. Miller. 1990. Introduction to WordNet: An on-line lexical database. International Journal of Lexicography, 3(4):235.
A. Mnih and G. Hinton. 2007. Three new graphical models for statistical language modelling. In Proceedings of the 24th International Conference on Machine Learning, pages 641–648. ACM.
J. Nivre and R. McDonald. 2008. Integrating graph-based and transition-based dependency parsers. In Proceedings of ACL-08: HLT, pages 950–958.
J. Nivre, J. Hall, and J. Nilsson. 2004. Memory-based dependency parsing. In Proceedings of CoNLL, pages 49–56.
J. Nivre, J. Hall, and J. Nilsson. 2006a. MaltParser: A data-driven parser-generator for dependency parsing. In Proceedings of LREC, volume 6.
J. Nivre, J. Hall, J. Nilsson, G. Eryiğit, and S. Marinov. 2006b. Labeled pseudo-projective dependency parsing with support vector machines. In Proceedings of the Tenth Conference on Computational Natural Language Learning, pages 221–225. Association for Computational Linguistics.
A. Ratnaparkhi. 1999. Learning to parse natural language with maximum entropy models. Machine Learning, 34(1):151–175.
R. Salakhutdinov and G. Hinton. 2009. Replicated softmax: An undirected topic model. Advances in Neural Information Processing Systems, 22.
R. Salakhutdinov, A. Mnih, and G. Hinton. 2007. Restricted Boltzmann machines for collaborative filtering. In Proceedings of the 24th International Conference on Machine Learning, page 798. ACM.
M. Surdeanu and C. D. Manning. 2010. Ensemble models for dependency parsing: Cheap and good? In Human Language Technologies: The 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics, pages 649–652. Association for Computational Linguistics.
M. Surdeanu, R. Johansson, A. Meyers, L. Màrquez, and J. Nivre. 2008. The CoNLL-2008 shared task on joint parsing of syntactic and semantic dependencies. In Proceedings of the Twelfth Conference on Computational Natural Language Learning, pages 159–177. Association for Computational Linguistics.
I. Sutskever, G. Hinton, and G. Taylor. 2008. The recurrent temporal restricted Boltzmann machine. In NIPS, volume 21.
G. W. Taylor, G. E. Hinton, and S. T. Roweis. 2007. Modeling human motion using binary latent variables. Advances in Neural Information Processing Systems, 19:1345.
I. Titov and J. Henderson. 2007a. Constituent parsing with incremental sigmoid belief networks. In Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics, volume 45, page 632.
I. Titov and J. Henderson. 2007b. Fast and robust multilingual dependency parsing with a generative latent variable model. In Proceedings of the CoNLL Shared Task Session of EMNLP-CoNLL, pages 947–951.