Implementation of Dependency Parser Using Artificial Neural Network Methods

  • Nurul Izzah, Telkom University
  • Moch Arif Bijaksana, Telkom University
  • Arief Fatchul Huda, UIN Sunan Gunung Djati


In recent years, parsing has become very popular within the scope of NLP (Natural Language Processing) with the emergence of the Dependency Parser. However, almost all existing dependency parsers perform classification based on millions of sparse indicator features. These features not only generalize poorly, but also significantly limit parsing speed, so the resulting parser is not optimal. To overcome these problems, the sparse features are replaced with dense features to reduce sparsity between words. An Artificial Neural Network classification method is used to produce fast and compact parsing in a Transition-Based Dependency Parser, using two hyperparameter configurations. The datasets used in this study are Arabic, Chinese, English, and Indonesian. The evaluation shows higher results with the second hyperparameter configuration. In testing with English test data, the accuracy reaches 80.4% LAS (Labeled Attachment Score) and 83% UAS (Unlabeled Attachment Score); on the dev data, the accuracy is 81.1% LAS and 83.7% UAS, at a parsing speed of 98 sentences per second (sent/s).
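The transition-based approach the abstract refers to builds a parse tree by applying a sequence of transitions to a stack-and-buffer configuration. A minimal sketch of the standard arc-standard transition system is below; the transition sequence is scripted here for illustration, whereas the parser described in the abstract would use its neural network classifier to choose each transition. All names are illustrative and not taken from the paper.

```python
def parse(words, transitions):
    """Apply a sequence of SHIFT / LEFT_ARC / RIGHT_ARC transitions.
    Returns the set of (head, dependent) arcs over 1-based word indices,
    with index 0 denoting the artificial ROOT node."""
    stack = [0]                                   # start with ROOT on the stack
    buffer = list(range(1, len(words) + 1))       # word indices awaiting processing
    arcs = set()
    for t in transitions:
        if t == "SHIFT":                          # move next buffer item onto the stack
            stack.append(buffer.pop(0))
        elif t == "LEFT_ARC":                     # stack top heads the item below it
            dep = stack.pop(-2)
            arcs.add((stack[-1], dep))
        elif t == "RIGHT_ARC":                    # item below heads the stack top
            dep = stack.pop()
            arcs.add((stack[-1], dep))
    return arcs

# "He ate fish": ate(2) is the root; He(1) and fish(3) attach to it.
arcs = parse(["He", "ate", "fish"],
             ["SHIFT", "SHIFT", "LEFT_ARC", "SHIFT", "RIGHT_ARC", "RIGHT_ARC"])
# arcs == {(2, 1), (2, 3), (0, 2)}
```

The dense-feature idea in the abstract corresponds to representing the stack and buffer contents with word embeddings rather than sparse indicator features when the classifier scores these transitions.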

Keywords: Parsing, dependency parser, transition-based dependency parsing.
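The LAS and UAS figures reported in the abstract are the standard attachment-score metrics for dependency parsing. A hedged sketch of how they are typically computed is below; the function name and data layout are illustrative assumptions, not from the paper.

```python
def attachment_scores(gold, pred):
    """gold, pred: one (head_index, label) pair per token.
    UAS counts tokens whose predicted head matches the gold head;
    LAS additionally requires the dependency label to match."""
    assert len(gold) == len(pred)
    total = len(gold)
    uas = sum(gh == ph for (gh, _gl), (ph, _pl) in zip(gold, pred))
    las = sum(g == p for g, p in zip(gold, pred))
    return uas / total, las / total

gold = [(2, "nsubj"), (0, "root"), (2, "obj")]
pred = [(2, "nsubj"), (0, "root"), (2, "nmod")]   # head right, label wrong on token 3
uas, las = attachment_scores(gold, pred)
# uas == 1.0 (all heads correct), las == 2/3 (one label wrong)
```

Since LAS demands both the head and the label be correct, it is always less than or equal to UAS, which matches the reported pairs (80.4% LAS vs. 83% UAS on test; 81.1% vs. 83.7% on dev).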





How to Cite
Izzah, N., Bijaksana, M. A., & Huda, A. F. (2021). Implementation of Dependency Parser Using Artificial Neural Network Methods. Indonesia Journal on Computing (Indo-JC), 5(3), 15-22.