
Global-to-local Memory Pointer Networks for Task-Oriented Dialogue

Chien-Sheng Wu, Richard Socher, Caiming Xiong

Abstract

End-to-end task-oriented dialogue is challenging because knowledge bases are usually large, dynamic, and hard to incorporate into a learning framework. We propose global-to-local memory pointer (GLMP) networks to address this issue. Our model consists of a global memory encoder and a local memory decoder that share external knowledge. The encoder encodes the dialogue history, updates the global contextual representation, and generates a global memory pointer. The decoder first generates a sketch response with unfilled slots; it then uses the global memory pointer to filter the external knowledge down to relevant entries, and instantiates the slots via local memory pointers. We empirically show that our model improves copy accuracy and mitigates the common out-of-vocabulary problem. As a result, GLMP improves over previous state-of-the-art models on both the simulated bAbI Dialogue dataset and the human-human Stanford Multi-domain Dialogue dataset, under both automatic and human evaluation.
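The two-stage pointing mechanism described above can be sketched in a few lines of pure Python. This is an illustrative toy, not the authors' implementation: the KB tokens, logits, attention scores, and slot names are all made-up values chosen to show how the global pointer (a per-slot sigmoid gate over the external memory) filters the local pointer's attention before copying.

```python
# Toy sketch of GLMP's global-to-local memory pointing.
# All numbers and names below are hypothetical, for illustration only.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# A tiny external knowledge memory, flattened to copyable tokens.
kb_memory = ["starbucks", "coffee_shop", "5_miles", "pizza_hut", "restaurant"]

# Step 1: the global memory encoder reads the dialogue history and emits a
# global memory pointer G: one sigmoid gate per memory position, marking
# which KB entries are relevant to this dialogue (logits made up here).
relevance_logits = [2.0, 1.5, 1.0, -2.0, -1.5]
G = [1.0 / (1.0 + math.exp(-z)) for z in relevance_logits]

# Step 2: the local memory decoder first produces a sketch response
# containing unfilled slot placeholders instead of entity words.
sketch = ["the", "nearest", "@poi", "is", "@distance", "away"]

# Step 3: for each slot, a local memory pointer attends over the KB memory,
# but its raw attention is gated by G before the copy decision. Note that
# for "@poi" the raw attention prefers pizza_hut (3.0 > 2.0), yet the
# global gate (0.88 vs 0.12) redirects the copy to starbucks.
attention_per_slot = {
    "@poi":      [2.0, 0.5, 0.2, 3.0, 0.1],
    "@distance": [0.1, 0.2, 3.0, 0.1, 0.2],
}

def fill(sketch):
    out = []
    for tok in sketch:
        if tok.startswith("@"):
            gated = [a * g for a, g in zip(attention_per_slot[tok], G)]
            probs = softmax(gated)
            best = max(range(len(probs)), key=probs.__getitem__)
            out.append(kb_memory[best])  # copy the pointed-to KB token
        else:
            out.append(tok)
    return out

print(" ".join(fill(sketch)))  # → the nearest starbucks is 5_miles away
```

Without the global gate, the `@poi` slot would copy `pizza_hut`; gating the attention first is what the "global-to-local" order buys.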

Code Repositories

mangonihao/GLMP_Annotation (PyTorch)
LooperXX/DF-Net (PyTorch)
jasonwu0731/GLMP (official, PyTorch)
scoyer/fg2seq (PyTorch)

Benchmarks

Benchmark: task-oriented-dialogue-systems-on-kvret
Methodology: GLMP
Metrics: BLEU 14.79, Entity F1 59.97
