acl acl2011 acl2011-173 acl2011-173-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Hiroyuki Shindo; Akinori Fujino; Masaaki Nagata
Abstract: We propose a model that incorporates an insertion operator into Bayesian tree substitution grammars (BTSG). Tree insertion is helpful for modeling syntax patterns accurately with fewer grammar rules than BTSG. Experimental parsing results show that our model outperforms a standard PCFG and BTSG on a small dataset. On a large dataset, our model obtains results comparable to BTSG while using far fewer grammar rules.
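To make the contrast in the abstract concrete, below is a minimal, self-contained sketch of the two operations it refers to: substitution, as in tree substitution grammar, and TIG-style insertion of an auxiliary tree. This is only an illustration under simplifying assumptions, not the authors' probabilistic model or implementation; the bracketed-list tree encoding and the function names `substitute` and `insert` are hypothetical.

```python
# Illustrative sketch only: deterministic tree operations, no probabilities.
# Trees are nested lists [label, child1, ...]; a bare string is a frontier node.
FOOT = '*'  # marker for the foot node of an auxiliary (insertion) tree

def substitute(tree, site_label, initial_tree):
    """Replace the first frontier node labeled `site_label` with `initial_tree`."""
    if isinstance(tree, str):
        return (initial_tree, True) if tree == site_label else (tree, False)
    new_children, done = [], False
    for child in tree[1:]:
        if not done:
            child, done = substitute(child, site_label, initial_tree)
        new_children.append(child)
    return [tree[0], *new_children], done

def _plug_foot(aux, subtree):
    """Copy `aux`, replacing its foot marker with `subtree`."""
    if aux == FOOT:
        return subtree
    if isinstance(aux, str):
        return aux
    return [aux[0]] + [_plug_foot(child, subtree) for child in aux[1:]]

def insert(tree, aux_tree):
    """Adjoin `aux_tree` at the first internal node whose label matches its root."""
    if isinstance(tree, str):
        return tree, False
    if tree[0] == aux_tree[0]:
        return _plug_foot(aux_tree, tree), True
    new_children, done = [], False
    for child in tree[1:]:
        if not done:
            child, done = insert(child, aux_tree)
        new_children.append(child)
    return [tree[0], *new_children], done

# Substitution: an NP initial tree fills the NP substitution site.
s_tree = ['S', 'NP', ['VP', ['V', 'runs']]]
derived, _ = substitute(s_tree, 'NP', ['NP', ['N', 'John']])
# -> ['S', ['NP', ['N', 'John']], ['VP', ['V', 'runs']]]

# Insertion: a left auxiliary tree (VP (ADV quickly) VP*) wraps the existing VP.
derived, _ = insert(derived, ['VP', ['ADV', 'quickly'], FOOT])
# -> ['S', ['NP', ['N', 'John']], ['VP', ['ADV', 'quickly'], ['VP', ['V', 'runs']]]]
print(derived)
```

The example illustrates the point the abstract makes: with insertion available, a modifier such as the adverbial tree above need not be compiled into many larger substitution rules, which is why the combined grammar can stay smaller than a pure BTSG.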
References:
J. Chen, S. Bangalore, and K. Vijay-Shanker. 2006. Automated extraction of Tree-Adjoining Grammars from treebanks. Natural Language Engineering, 12(3):251–299.
D. Chiang. 2003. Statistical parsing with an automatically extracted Tree Adjoining Grammar, chapter 16, pages 299–316. CSLI Publications.
T. Cohn and P. Blunsom. 2010. Blocked inference in Bayesian tree substitution grammars. In Proceedings of the ACL 2010 Conference Short Papers, pages 225–230, Uppsala, Sweden, July. Association for Computational Linguistics.
T. Cohn, P. Blunsom, and S. Goldwater. 2011. Inducing tree-substitution grammars. Journal of Machine Learning Research. To appear.
M. Johnson and S. Goldwater. 2009. Improving nonparametric Bayesian inference: experiments on unsupervised word segmentation with adaptor grammars. In Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American Chapter of the Association for Computational Linguistics (HLT-NAACL), pages 317–325, Boulder, Colorado, June. Association for Computational Linguistics.
A. K. Joshi. 1985. Tree adjoining grammars: How much context-sensitivity is required to provide reasonable structural descriptions? In Natural Language Parsing: Psychological, Computational, and Theoretical Perspectives, pages 206–250.
K. Lari and S. J. Young. 1991. Applications of stochastic context-free grammars using the inside-outside algorithm. Computer Speech & Language, 5(3):237–257.
T. Matsuzaki, Y. Miyao, and J. Tsujii. 2005. Probabilistic CFG with latent annotations. In Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL), pages 75–82. Association for Computational Linguistics.
S. Petrov, L. Barrett, R. Thibaux, and D. Klein. 2006. Learning accurate, compact, and interpretable tree annotation. In Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics (COLING-ACL), pages 433–440, Sydney, Australia, July. Association for Computational Linguistics.
J. Pitman and M. Yor. 1997. The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator. The Annals of Probability, 25(2):855–900.
M. Post and D. Gildea. 2009. Bayesian learning of a tree substitution grammar. In Proceedings of the ACL-IJCNLP 2009 Conference Short Papers, pages 45–48, Suntec, Singapore, August. Association for Computational Linguistics.
Y. Schabes and R. C. Waters. 1995. Tree insertion grammar: a cubic-time, parsable formalism that lexicalizes context-free grammar without changing the trees produced. Computational Linguistics, 21(4):479–513.
Y. W. Teh. 2006a. A Bayesian interpretation of interpolated Kneser-Ney. Technical Report TRA2/06, School of Computing, National University of Singapore.
Y. W. Teh. 2006b. A hierarchical Bayesian language model based on Pitman-Yor processes. In Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics (COLING-ACL), pages 985–992.
F. Xia. 1999. Extracting tree adjoining grammars from bracketed corpora. In Proceedings of the 5th Natural Language Processing Pacific Rim Symposium (NLPRS), pages 398–403.