Multi-Turn Response Selection for Chatbots with Deep Attention Matching Network

Xiangyang Zhou, Lu Li, Daxiang Dong, Yi Liu, Ying Chen, Wayne Xin Zhao, Dianhai Yu, Hua Wu

Abstract

Humans generate responses by relying on semantic and functional dependencies, including coreference relations, among dialogue elements and their context. In this paper, we investigate matching a response with its multi-turn context using dependency information based entirely on attention. Our solution is inspired by the recently proposed Transformer in machine translation (Vaswani et al., 2017), and we extend the attention mechanism in two ways. First, we construct representations of text segments at different granularities solely with stacked self-attention. Second, we extract the truly matched segment pairs with attention across the context and response. We jointly introduce these two kinds of attention in one uniform neural network. Experiments on two large-scale multi-turn response selection tasks show that our proposed model significantly outperforms the state-of-the-art models.
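
The two attention operations described in the abstract can be illustrated with a short sketch. The code below is a minimal illustration and not the authors' implementation: the number of stacked layers, the residual connections, the tensor shapes, and the function names (`self_attention_stack`, `cross_attention_match`) are assumptions, and the final classifier (e.g., a small CNN) over the stacked matching matrices is omitted.

```python
# Minimal sketch of (1) stacked self-attention producing representations at
# several granularities and (2) cross-attention matching between an utterance
# and a response candidate. Shapes and layer count are illustrative assumptions.
import torch
import torch.nn.functional as F


def scaled_dot_product_attention(query, key, value):
    """Standard scaled dot-product attention (Vaswani et al., 2017)."""
    d_k = query.size(-1)
    scores = torch.matmul(query, key.transpose(-2, -1)) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, value)


def self_attention_stack(x, num_layers=3):
    """Stack self-attention to build multi-granularity representations.

    Returns [word-level input, 1st-layer output, ..., num_layers-th output].
    """
    granularities = [x]
    for _ in range(num_layers):
        x = scaled_dot_product_attention(x, x, x) + x  # residual connection (assumption)
        granularities.append(x)
    return granularities


def cross_attention_match(utterance_reprs, response_reprs):
    """Match each granularity of an utterance against the response.

    Produces one similarity matrix per granularity; stacking these over all
    utterances in the context yields a 3-D matching "image" that a downstream
    classifier would score (omitted here).
    """
    matrices = []
    for u, r in zip(utterance_reprs, response_reprs):
        u_cross = scaled_dot_product_attention(u, r, r)  # utterance attends to response
        r_cross = scaled_dot_product_attention(r, u, u)  # response attends to utterance
        matrices.append(torch.matmul(u_cross, r_cross.transpose(-2, -1)))
    return torch.stack(matrices, dim=-1)  # (len_u, len_r, num_granularities)


if __name__ == "__main__":
    # toy example: one utterance of 10 tokens, one response of 8 tokens, dim 32
    utterance = torch.randn(10, 32)
    response = torch.randn(8, 32)
    match = cross_attention_match(self_attention_stack(utterance),
                                  self_attention_stack(response))
    print(match.shape)  # torch.Size([10, 8, 4])
```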

Benchmarks

Benchmark: conversational-response-selection-on-douban-1 | Methodology: DAM
MAP: 0.550 | MRR: 0.601 | P@1: 0.427 | R10@1: 0.254 | R10@2: 0.410 | R10@5: 0.757

Benchmark: conversational-response-selection-on-rrs | Methodology: DAM
MAP: 0.511 | MRR: 0.534 | P@1: 0.347 | R10@1: 0.308 | R10@2: 0.457 | R10@5: 0.751

Benchmark: conversational-response-selection-on-ubuntu-1 | Methodology: DAM
R10@1: 0.767 | R10@2: 0.874 | R10@5: 0.969 | R2@1: 0.938
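
For reference, the Rn@k figures above are recall-style retrieval metrics: for each context the model scores n candidate responses, and the metric is the fraction of contexts whose ground-truth response is ranked within the top k. The snippet below is a minimal sketch of that computation under an assumed data layout (a list of (score, is_positive) pairs per context); it is not tied to any particular benchmark's evaluation script.

```python
# Minimal sketch of R_n@k: fraction of contexts whose positive response
# appears among the k highest-scored of its n candidates.
def recall_at_k(scored_candidates, k):
    """scored_candidates: list of lists of (score, is_positive) per context."""
    hits = 0
    for candidates in scored_candidates:
        ranked = sorted(candidates, key=lambda pair: pair[0], reverse=True)
        if any(is_positive for _, is_positive in ranked[:k]):
            hits += 1
    return hits / len(scored_candidates)


# toy example: two contexts with 10 candidates each (i.e., R10@k)
example = [
    [(0.9, True)] + [(0.1 * i, False) for i in range(9)],  # positive ranked 1st
    [(0.2, True), (0.5, False)] + [(0.1, False)] * 8,      # positive ranked 2nd
]
print(recall_at_k(example, 1))  # 0.5
print(recall_at_k(example, 2))  # 1.0
```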
