Discourse Representation Structure Parsing with Recurrent Neural Networks and the Transformer Model
Jiangming Liu, Shay B. Cohen, Mirella Lapata

Abstract
We describe the systems we developed for Discourse Representation Structure (DRS) parsing as part of the IWCS-2019 Shared Task on DRS Parsing. Our systems are based on sequence-to-sequence modeling. To implement our model, we use OpenNMT-py, an open-source neural machine translation system implemented in PyTorch. We experimented with a variety of encoder-decoder models based on recurrent neural networks and the Transformer model. We conduct experiments on the standard benchmark of the Parallel Meaning Bank (PMB 2.2). Our best system achieves a score of 84.8% F1 in the DRS parsing shared task.
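To make the modeling approach concrete, below is a minimal sketch of a Transformer encoder-decoder for sequence-to-sequence DRS parsing, written with PyTorch's built-in nn.Transformer. All names, vocabulary sizes, and hyperparameters here are illustrative assumptions; the paper's actual systems were built with OpenNMT-py, and its reported configurations may differ.

```python
# Minimal Transformer seq2seq sketch for DRS parsing: source word IDs in,
# linearized DRS clause token IDs out. Hyperparameters are illustrative
# assumptions, not the settings reported in the paper.
import torch
import torch.nn as nn

class Seq2SeqTransformer(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, d_model=512, nhead=8,
                 num_layers=6, dim_ff=2048, max_len=1024):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, d_model)
        self.tgt_embed = nn.Embedding(tgt_vocab, d_model)
        # Learned positional embeddings (the original Transformer used
        # sinusoidal encodings; either choice works for this sketch).
        self.pos_embed = nn.Embedding(max_len, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=dim_ff, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def add_pos(self, x):
        pos = torch.arange(x.size(1), device=x.device)
        return x + self.pos_embed(pos)

    def forward(self, src_ids, tgt_ids):
        # Causal mask: each target position attends only to earlier ones.
        tgt_mask = self.transformer.generate_square_subsequent_mask(
            tgt_ids.size(1)).to(src_ids.device)
        src = self.add_pos(self.src_embed(src_ids))
        tgt = self.add_pos(self.tgt_embed(tgt_ids))
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)  # (batch, tgt_len, tgt_vocab) logits

# Toy usage: 2 sentences mapped to linearized DRS clause sequences.
model = Seq2SeqTransformer(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (2, 12))
tgt = torch.randint(0, 1200, (2, 30))
print(model(src, tgt).shape)  # torch.Size([2, 30, 1200])
```

Swapping the Transformer for a recurrent encoder-decoder, as the paper also explores, only changes the backbone; the input and output sequences stay the same.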
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| drs-parsing-on-pmb-2-2-0 | Transformer seq2seq | F1: 87.1% |
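The F1 reported here and in the abstract is a clause-level matching score between predicted and gold DRSs. Below is a simplified sketch of such a score: it treats clauses as exact-match tuples and omits the search over variable renamings that the shared task's official evaluation performs, which is a deliberate simplification. The example clauses are hypothetical.

```python
# Simplified clause-level F1 between a predicted and a gold DRS, with
# clauses compared as exact tuples. The official evaluation additionally
# searches over variable renamings before matching; omitted here.
def clause_f1(pred_clauses, gold_clauses):
    pred, gold = set(pred_clauses), set(gold_clauses)
    matched = len(pred & gold)
    if matched == 0:
        return 0.0
    precision = matched / len(pred)
    recall = matched / len(gold)
    return 2 * precision * recall / (precision + recall)

# Toy example with hypothetical clauses in PMB-style clause format.
gold = [("b1", "REF", "x1"), ("b1", "work", '"v.01"', "e1"),
        ("b1", "Agent", "e1", "x1")]
pred = [("b1", "REF", "x1"), ("b1", "Agent", "e1", "x1")]
print(f"F1 = {clause_f1(pred, gold):.3f}")  # F1 = 0.800
```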