Shared Logistic Normal Distributions for Soft Parameter Tying in Unsupervised Grammar Induction

Shay Cohen, Noah A. Smith

Abstract

We present a family of priors over probabilistic grammar weights, called the shared logistic normal distribution. This family extends the partitioned logistic normal distribution, enabling factored covariance between the probabilities of different derivation events in the probabilistic grammar, providing a new way to encode prior knowledge about an unknown grammar. We describe a variational EM algorithm for learning a probabilistic grammar based on this family of priors. We then experiment with unsupervised dependency grammar induction and show significant improvements using our model for both monolingual learning and bilingual learning with a non-parallel, multilingual corpus.
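To make the core construction concrete: a logistic normal prior draws a Gaussian vector and softmax-transforms it into a probability vector, so covariance between Gaussian coordinates induces correlation between grammar event probabilities. The sketch below is a minimal illustration of that transform, not the paper's full shared/partitioned variant or its variational EM procedure; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def sample_logistic_normal(mu, cov, rng):
    """Draw a probability vector from a logistic normal distribution.

    A Gaussian vector eta ~ N(mu, cov) is mapped through a softmax,
    yielding multinomial weights whose correlations are controlled by cov.
    (Illustrative sketch; the paper's shared variant ties parameters by
    averaging several such Gaussian "experts" before the softmax.)
    """
    eta = rng.multivariate_normal(mu, cov)
    exp_eta = np.exp(eta - eta.max())  # subtract max for numerical stability
    return exp_eta / exp_eta.sum()

rng = np.random.default_rng(0)
mu = np.zeros(3)
cov = np.eye(3)
theta = sample_logistic_normal(mu, cov, rng)
# theta is a valid distribution over 3 derivation events
```

With a non-diagonal `cov`, probabilities of related derivation events rise and fall together, which is the mechanism the prior uses to encode soft parameter tying.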

Benchmarks

Benchmark: unsupervised-dependency-parsing-on-penn
Methodology: Shared Logistic Normal DMV
Metrics: UAS 41.4
