Multitask Parsing Across Semantic Representations

Daniel Hershcovich, Omri Abend, Ari Rappoport


Abstract

The ability to consolidate information of different types is at the core of intelligence, and has tremendous practical value in allowing learning for one task to benefit from generalizations learned for others. In this paper we tackle the challenging task of improving semantic parsing performance, taking UCCA parsing as a test case, and AMR, SDP and Universal Dependencies (UD) parsing as auxiliary tasks. We experiment on three languages, using a uniform transition-based system and learning architecture for all parsing tasks. Despite notable conceptual, formal and domain differences, we show that multitask learning significantly improves UCCA parsing in both in-domain and out-of-domain settings.
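The abstract describes a uniform learning architecture shared across parsing tasks, which is an instance of multitask learning with hard parameter sharing: a shared encoder is updated by every task, while each task keeps its own output layer. The paper's actual model is a transition-based neural parser; the sketch below is only a minimal, hypothetical illustration of the parameter-sharing pattern, using toy perceptron-style updates and invented data rather than anything from the paper.

```python
import random

# Hypothetical sketch of hard parameter sharing for multitask learning.
# A shared weight vector stands in for the shared encoder; each task
# additionally owns a task-specific weight vector (its "output layer").
# Toy data and the update rule are illustrative, not the paper's method.

class MultitaskModel:
    def __init__(self, tasks, dim):
        self.shared = [0.0] * dim                      # shared "encoder" weights
        self.heads = {t: [0.0] * dim for t in tasks}   # per-task output weights

    def score(self, task, features):
        # Combine shared and task-specific weights (hard parameter sharing).
        return sum((self.shared[i] + self.heads[task][i]) * f
                   for i, f in enumerate(features))

    def update(self, task, features, label, lr=0.1):
        # Perceptron-style update: on a mistake, both the shared weights
        # and the task's own head receive the training signal, so every
        # auxiliary task also improves the shared representation.
        pred = 1 if self.score(task, features) > 0 else -1
        if pred != label:
            for i, f in enumerate(features):
                self.shared[i] += lr * label * f
                self.heads[task][i] += lr * label * f

# Main task (UCCA) plus the auxiliary tasks named in the abstract.
tasks = ["ucca", "amr", "sdp", "ud"]
model = MultitaskModel(tasks, dim=4)

# Toy, linearly separable examples per task (invented for illustration).
data = {t: [([1.0, 0.0, 1.0, 0.0], 1),
            ([0.0, 1.0, 0.0, 1.0], -1)] for t in tasks}

# Alternate between tasks during training, as is common in MTL schedules.
random.seed(0)
for epoch in range(5):
    order = tasks[:]
    random.shuffle(order)
    for task in order:
        for features, label in data[task]:
            model.update(task, features, label)
```

Because the auxiliary tasks write into `model.shared`, the main task's scores benefit from updates it never made itself, which is the effect the paper measures for UCCA parsing.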

Benchmarks

Benchmark: ucca-parsing-on-semeval-2019-task-1
Methodology: Transition-based + MTL
Metrics:
  English-20K (open) F1: 68.4
  English-Wiki (open) F1: 73.5
