acl acl2010 acl2010-83 acl2010-83-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Wenbin Jiang; Qun Liu
Abstract: In this paper we describe an intuitive method for dependency parsing, where a classifier is used to determine whether a pair of words forms a dependency edge. We also propose an effective strategy for dependency projection, where the dependency relationships of word pairs in the source language are projected onto word pairs in the target language, yielding a set of classification instances rather than a complete tree. Experiments show that the classifier trained on the projected classification instances significantly outperforms previous projected dependency parsers. More importantly, when this classifier is integrated into a maximum spanning tree (MST) dependency parser, a clear improvement is obtained over the MST baseline.
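To make the projection step concrete, the following is a minimal Python sketch, not the authors' implementation: source-side dependency edges are mapped through a word alignment into positive and negative classification instances over target-language word pairs. The function names, data layout, and toy feature template are hypothetical; the paper's actual classifier and feature set are richer, and treating every unsupported ordered pair as a negative instance is a simplification made here for brevity.

```python
# Sketch of word-pair dependency projection (hypothetical names, simplified logic):
# source-side dependency edges are mapped through a word alignment to produce
# positive/negative classification instances over target-language word pairs.

from itertools import permutations

def project_instances(src_heads, alignment, tgt_words):
    """src_heads[i] = index of the head of source word i (-1 for the root).
    alignment = set of (src_index, tgt_index) pairs.
    Returns a list of ((tgt_head, tgt_dep), label) classification instances."""
    # map each source word to its aligned target words
    src2tgt = {}
    for s, t in alignment:
        src2tgt.setdefault(s, set()).add(t)

    # collect target word pairs whose source counterparts form a dependency edge
    positives = set()
    for dep, head in enumerate(src_heads):
        if head < 0:
            continue
        for th in src2tgt.get(head, ()):
            for td in src2tgt.get(dep, ()):
                if th != td:
                    positives.add((th, td))

    # every ordered target word pair becomes an instance; pairs supported by a
    # projected source edge are positive, all others negative (a simplification)
    instances = []
    for th, td in permutations(range(len(tgt_words)), 2):
        label = 1 if (th, td) in positives else 0
        instances.append(((th, td), label))
    return instances

def pair_features(tgt_words, head, dep):
    """Toy feature extractor over a candidate (head, dependent) pair."""
    return {
        "head_word=" + tgt_words[head]: 1.0,
        "dep_word=" + tgt_words[dep]: 1.0,
        "distance=%d" % (head - dep): 1.0,
    }
```

In the setting described by the abstract, a classifier trained on such instances scores candidate target-language edges, and those scores are then used within an MST dependency parser; the sketch above covers only instance generation and a toy feature template.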
Adam L. Berger, Stephen A. Della Pietra, and Vincent J. Della Pietra. 1996. A maximum entropy approach to natural language processing. Computational Linguistics.
Xavier Carreras, Mihai Surdeanu, and Lluis Marquez. 2006. Projective dependency parsing with perceptron. In Proceedings of CoNLL.
Xavier Carreras, Michael Collins, and Terry Koo. 2008. TAG, dynamic programming, and the perceptron for efficient, feature-rich parsing. In Proceedings of CoNLL.
Eugene Charniak and Mark Johnson. 2005. Coarse-to-fine n-best parsing and MaxEnt discriminative reranking. In Proceedings of ACL.
Michael Collins. 1996. A new statistical parser based on bigram lexical dependencies. In Proceedings of ACL.
Michael Collins. 2000. Discriminative reranking for natural language parsing. In Proceedings of ICML, pages 175–182.
Michael Collins. 2002. Discriminative training methods for hidden Markov models: Theory and experiments with perceptron algorithms. In Proceedings of EMNLP, pages 1–8, Philadelphia, USA.
Jason M. Eisner. 1996. Three new probabilistic models for dependency parsing: An exploration. In Proceedings of COLING, pages 340–345.
Kuzman Ganchev, Jennifer Gillenwater, and Ben Taskar. 2009. Dependency grammar induction via bitext projection constraints. In Proceedings of the 47th ACL.
Liang Huang and David Chiang. 2005. Better k-best parsing. In Proceedings of IWPT, pages 53–64.
Liang Huang. 2008. Forest reranking: Discriminative parsing with non-local features. In Proceedings of ACL.
Rebecca Hwa, Philip Resnik, Amy Weinberg, and Okan Kolak. 2002. Evaluating translational correspondence using annotation projection. In Proceedings of ACL.
Rebecca Hwa, Philip Resnik, Amy Weinberg, Clara Cabezas, and Okan Kolak. 2005. Bootstrapping parsers via syntactic projection across parallel texts. Natural Language Engineering, 11:311–325.
Wenbin Jiang and Qun Liu. 2009. Automatic adaptation of annotation standards for dependency parsing using projected treebank as source corpus. In Proceedings of IWPT.
Wenbin Jiang, Liang Huang, and Qun Liu. 2009. Automatic adaptation of annotation standards: Chinese word segmentation and POS tagging – a case study. In Proceedings of the 47th ACL.
Dan Klein and Christopher D. Manning. 2004. Corpus-based induction of syntactic structure: Models of dependency and constituency. In Proceedings of ACL.
Terry Koo, Xavier Carreras, and Michael Collins. 2008. Simple semi-supervised dependency parsing. In Proceedings of ACL.
Taku Kudo and Yuji Matsumoto. 2000. Japanese dependency structure analysis based on support vector machines. In Proceedings of EMNLP.
Yang Liu, Tian Xia, Xinyan Xiao, and Qun Liu. 2009. Weighted alignment matrices for statistical machine translation. In Proceedings of EMNLP.
Yajuan Lü, Sheng Li, Tiejun Zhao, and Muyun Yang. 2002. Learning Chinese bracketing knowledge based on a bilingual language model. In Proceedings of COLING.
Mitchell P. Marcus, Beatrice Santorini, and Mary Ann Marcinkiewicz. 1993. Building a large annotated corpus of English: The Penn Treebank. Computational Linguistics.
Ryan McDonald and Fernando Pereira. 2006. Online learning of approximate dependency parsing algorithms. In Proceedings of EACL, pages 81–88.
Ryan McDonald, Koby Crammer, and Fernando Pereira. 2005a. Online large-margin training of dependency parsers. In Proceedings of ACL, pages 91–98.
Ryan McDonald, Fernando Pereira, Kiril Ribarov, and Jan Hajič. 2005b. Non-projective dependency parsing using spanning tree algorithms. In Proceedings of HLT-EMNLP.
Joakim Nivre and Mario Scholz. 2004. Deterministic dependency parsing of English text. In Proceedings of COLING.
Joakim Nivre, Johan Hall, Jens Nilsson, Gulsen Eryigit, and Svetoslav Marinov. 2006. Labeled pseudo-projective dependency parsing with support vector machines. In Proceedings of CoNLL, pages 221–225.
Franz J. Och and Hermann Ney. 2000. Improved statistical alignment models. In Proceedings of ACL.
Franz Josef Och. 2003. Minimum error rate training in statistical machine translation. In Proceedings of ACL, pages 160–167.
David Smith and Jason Eisner. 2009. Parser adaptation and projection with quasi-synchronous grammar features. In Proceedings of EMNLP.
Kiyotaka Uchimoto, Satoshi Sekine, and Hitoshi Isahara. 1999. Japanese dependency structure analysis based on maximum entropy models. In Proceedings of EACL.
Vladimir N. Vapnik. 1998. Statistical Learning Theory. Wiley-Interscience.
Dekai Wu. 1997. Stochastic inversion transduction grammars and bilingual parsing of parallel corpora. Computational Linguistics.
Nianwen Xue, Fei Xia, Fu-Dong Chiou, and Martha Palmer. 2005. The Penn Chinese Treebank: Phrase structure annotation of a large corpus. Natural Language Engineering.
Hiroyasu Yamada and Yuji Matsumoto. 2003. Statistical dependency analysis using support vector machines. In Proceedings of IWPT.
Yue Zhang and Stephen Clark. 2008. Joint word segmentation and POS tagging using a single perceptron. In Proceedings of ACL.