Unsupervised Dependency Parsing: Let's Use Supervised Parsers

Phong Le; Willem Zuidema

Abstract

We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called 'iterated reranking' (IR), starts with dependency trees generated by an unsupervised parser, and iteratively improves these trees using the richer probability models used in supervised parsing, which are in turn trained on these trees. Our system achieves 1.8% higher accuracy than the state-of-the-art parser of Spitkovsky et al. (2013) on the WSJ corpus.
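
The abstract only sketches the IR loop informally; the Python fragment below shows one plausible reading of it. The parser classes and their methods (parse, train, k_best_parses, score) are hypothetical stand-ins for illustration, not the authors' implementation.

```python
def iterated_reranking(sentences, unsupervised_parser, make_supervised_parser,
                       k=10, iterations=5):
    """Sketch of the iterated-reranking (IR) idea under assumed interfaces."""
    # Step 0: bootstrap with trees produced by an existing unsupervised parser.
    trees = [unsupervised_parser.parse(s) for s in sentences]

    for _ in range(iterations):
        # Train a richer, supervised-style probability model on the current
        # trees, treating them as if they were gold annotations.
        model = make_supervised_parser()
        model.train(sentences, trees)

        # Rerank: for each sentence, generate k candidate parses and keep the
        # one the newly trained model scores highest (keeping the current
        # tree as a fallback candidate).
        new_trees = []
        for sentence, tree in zip(sentences, trees):
            candidates = model.k_best_parses(sentence, k) + [tree]
            new_trees.append(max(candidates, key=model.score))
        trees = new_trees

    return trees
```

The key design point, as described in the abstract, is that the supervised model is never shown gold trees: it is trained on its own previous output, so each iteration can only improve the trees through the richer model's scoring.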

Benchmarks

Benchmark                                   Methodology          Metrics
unsupervised-dependency-parsing-on-penn     Iterated reranking   UAS: 66.2
