
Two-Level Supervised Contrastive Learning for Response Selection in Multi-Turn Dialogue

Wentao Zhang; Shuang Xu; Haoran Huang

Abstract

Selecting an appropriate response from many candidates, given the utterances in a multi-turn dialogue, is the key problem for a retrieval-based dialogue system. Existing work formalizes the task as matching between the utterances and a candidate response and trains the model with the cross-entropy loss. This paper applies contrastive learning to the problem by using the supervised contrastive loss. In this way, the learned representations of positive examples and those of negative examples are separated more distantly in the embedding space, and matching performance is enhanced. We further develop a new method for supervised contrastive learning, referred to as two-level supervised contrastive learning, and employ it for response selection in multi-turn dialogue. Our method exploits two techniques for supervised contrastive learning: sentence token shuffling (STS) and sentence re-ordering (SR). Experimental results on three benchmark datasets demonstrate that the proposed method significantly outperforms the contrastive learning baseline and the state-of-the-art methods for the task.
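
For readers unfamiliar with the objective the abstract refers to, below is a minimal PyTorch sketch of a SupCon-style supervised contrastive loss together with naive versions of the two augmentations named above (STS and SR). All function names and the exact augmentation details are illustrative assumptions; this is not the authors' code, and the paper's two-level formulation is not reproduced here.

```python
# Minimal sketch (not the authors' implementation): supervised contrastive loss
# over a batch of embeddings with class labels, plus naive versions of the two
# augmentations named in the abstract: sentence token shuffling (STS) and
# sentence re-ordering (SR).
import random
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """SupCon-style loss: pull same-label embeddings together, push others apart."""
    z = F.normalize(embeddings, dim=1)                     # (N, d) unit vectors
    sim = z @ z.T / temperature                            # pairwise similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, -1e9)                 # exclude self-pairs
    positives = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = positives.sum(dim=1).clamp(min=1)
    loss = -(log_prob * positives.float()).sum(dim=1) / pos_counts
    valid = positives.any(dim=1)                           # anchors with >=1 positive
    return loss[valid].mean()

def sentence_token_shuffle(tokens):
    """STS (assumed form): randomly permute the tokens within one utterance."""
    shuffled = tokens[:]
    random.shuffle(shuffled)
    return shuffled

def sentence_reorder(utterances):
    """SR (assumed form): randomly permute the order of utterances in a context."""
    reordered = utterances[:]
    random.shuffle(reordered)
    return reordered
```

In the paper's two-level scheme a loss of this form is presumably applied at more than one granularity; the sketch only illustrates the standard supervised contrastive objective and the two augmentations named in the abstract.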

Benchmarks

Benchmark: conversational-response-selection-on-e
Methodology: BERT-TL
Metrics:
R10@1: 0.927
R10@2: 0.974
R10@5: 0.997
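
The reported metrics are recall among 10 candidates: R10@k is the fraction of test contexts for which the single correct response is ranked within the top k of its 10 candidates. A minimal sketch of this computation, assuming one positive per candidate set (all names below are illustrative):

```python
# Minimal sketch of R10@k, assuming one correct response among 10 scored candidates.
def recall_at_k(scores, positive_index, k):
    """Return 1.0 if the true response is ranked in the top k by model score."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return 1.0 if positive_index in ranked[:k] else 0.0

def r10_at_k(all_scores, all_positive_indices, k):
    """Average recall_at_k over all contexts (R10@1, R10@2, R10@5 in the table above)."""
    hits = [recall_at_k(s, p, k) for s, p in zip(all_scores, all_positive_indices)]
    return sum(hits) / len(hits)
```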
