Structured Training for Neural Network Transition-Based Parsing

David Weiss, Chris Alberti, Michael Collins, Slav Petrov


Abstract

We present structured perceptron training for neural network transition-based dependency parsing. We learn the neural network representation using a gold corpus augmented by a large number of automatically parsed sentences. Given this fixed network representation, we learn a final layer using the structured perceptron with beam-search decoding. On the Penn Treebank, our parser reaches 94.26% unlabeled and 92.41% labeled attachment accuracy, which to our knowledge is the best accuracy on Stanford Dependencies to date. We also provide in-depth ablative analysis to determine which aspects of our model provide the largest gains in accuracy.
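The training recipe in the abstract — freeze a learned representation, then train a final linear layer with the structured perceptron under beam-search decoding — can be illustrated with a minimal sketch. This is not the paper's implementation: it uses a toy transition system where each "transition" assigns a label to one token, and hand-written indicator features stand in for the fixed neural network representation. The names `feats`, `beam_decode`, and `perceptron_train` are illustrative.

```python
def feats(tokens, i, action):
    """Indicator features for taking `action` at position i.
    Stands in for the paper's fixed network representation."""
    return [("word", tokens[i], action), ("bias", action)]

def score(w, fs):
    """Linear (perceptron) score of a feature set under weights w."""
    return sum(w.get(f, 0.0) for f in fs)

def beam_decode(w, tokens, actions, beam_size):
    """Beam search over transition sequences: keep the top-k
    partial sequences by cumulative score at each step."""
    beam = [([], 0.0)]  # (partial action sequence, cumulative score)
    for i in range(len(tokens)):
        candidates = []
        for seq, s in beam:
            for a in actions:
                candidates.append((seq + [a], s + score(w, feats(tokens, i, a))))
        candidates.sort(key=lambda x: -x[1])
        beam = candidates[:beam_size]
    return beam[0][0]

def perceptron_train(data, actions, beam_size=4, epochs=5):
    """Structured perceptron: when the beam-decoded sequence differs
    from gold, reward gold features and penalize predicted features."""
    w = {}
    for _ in range(epochs):
        for tokens, gold in data:
            pred = beam_decode(w, tokens, actions, beam_size)
            if pred != gold:
                for i, (g, p) in enumerate(zip(gold, pred)):
                    for f in feats(tokens, i, g):
                        w[f] = w.get(f, 0.0) + 1.0
                    for f in feats(tokens, i, p):
                        w[f] = w.get(f, 0.0) - 1.0
    return w
```

The paper additionally uses an early-update strategy (updating as soon as the gold sequence falls off the beam), which this sketch omits for brevity; a real parser would also use arc-standard or arc-eager transitions rather than per-token labels.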

Benchmarks

Benchmark                              Methodology     Metrics
dependency-parsing-on-penn-treebank    Weiss et al.    LAS: 92.06, POS: 97.3, UAS: 94.01