
Discourse Representation Structure Parsing with Recurrent Neural Networks and the Transformer Model

Jiangming Liu, Shay B. Cohen, Mirella Lapata


Abstract

We describe the systems we developed for Discourse Representation Structure (DRS) parsing as part of the IWCS-2019 Shared Task on DRS Parsing. Our systems are based on sequence-to-sequence modeling. To implement our model, we use OpenNMT-py, an open-source neural machine translation system implemented in PyTorch. We experimented with a variety of encoder-decoder models based on recurrent neural networks and the Transformer model. We conduct experiments on the standard benchmark of the Parallel Meaning Bank (PMB 2.2). Our best system achieves a score of 84.8% F1 in the DRS parsing shared task.
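The abstract describes the architecture only at a high level. As a rough illustration, below is a minimal sketch of a Transformer encoder-decoder for sequence-to-sequence DRS parsing in plain PyTorch, assuming the input sentence and the target DRS clauses have both been tokenized into integer ID sequences. The class name, hyperparameter values, and toy vocabulary sizes are illustrative assumptions, not the authors' actual OpenNMT-py configuration.

import torch
import torch.nn as nn

# Sketch of a Transformer encoder-decoder treating DRS parsing as
# sequence-to-sequence generation: source = sentence tokens, target =
# linearized DRS clause tokens. Hyperparameters below are assumptions.
class Seq2SeqDRSParser(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, d_model=512, nhead=8,
                 num_layers=6, dim_ff=2048, dropout=0.1, max_len=512):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, d_model)
        self.tgt_embed = nn.Embedding(tgt_vocab, d_model)
        self.pos_embed = nn.Embedding(max_len, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=dim_ff, dropout=dropout, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def add_pos(self, emb):
        # Learned positional embeddings added to token embeddings.
        pos = torch.arange(emb.size(1), device=emb.device)
        return emb + self.pos_embed(pos)

    def forward(self, src_ids, tgt_ids):
        src = self.add_pos(self.src_embed(src_ids))
        tgt = self.add_pos(self.tgt_embed(tgt_ids))
        # Causal mask: each target position attends only to earlier ones.
        mask = self.transformer.generate_square_subsequent_mask(
            tgt_ids.size(1)).to(src_ids.device)
        hidden = self.transformer(src, tgt, tgt_mask=mask)
        return self.out(hidden)  # (batch, tgt_len, tgt_vocab) logits

# Toy usage: a batch of 2 sentences with shifted DRS-clause targets.
model = Seq2SeqDRSParser(src_vocab=1000, tgt_vocab=800)
src = torch.randint(0, 1000, (2, 12))
tgt = torch.randint(0, 800, (2, 20))
print(model(src, tgt).shape)  # torch.Size([2, 20, 800])

In the shared-task systems themselves, training and decoding were driven through OpenNMT-py rather than hand-written model code; the sketch only mirrors the overall encoder-decoder shape the paper describes.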

Benchmarks

Benchmark                  | Methodology         | Metrics
drs-parsing-on-pmb-2-2-0   | Transformer seq2seq | F1: 87.1
