HLATR: Enhance Multi-stage Text Retrieval with Hybrid List Aware Transformer Reranking

Yanzhao Zhang; Dingkun Long; Guangwei Xu; Pengjun Xie

Abstract

Deep pre-trained language models (e.g., BERT) are effective at large-scale text retrieval tasks. Existing text retrieval systems with state-of-the-art performance usually adopt a retrieve-then-rerank architecture, due to the high computational cost of pre-trained language models and the large corpus size. Under such a multi-stage architecture, previous studies have mainly focused on optimizing a single stage of the framework to improve overall retrieval performance. However, how to directly couple multi-stage features for optimization has not been well studied. In this paper, we design Hybrid List Aware Transformer Reranking (HLATR) as a subsequent reranking module that incorporates features from both the retrieval and reranking stages. HLATR is lightweight and can be easily parallelized with existing text retrieval systems, so that the reranking process can be performed in a single, efficient pass. Empirical experiments on two large-scale text retrieval datasets show that HLATR can efficiently improve the ranking performance of existing multi-stage text retrieval methods.
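The abstract describes HLATR as a lightweight, list-aware Transformer that fuses retrieval-stage and reranking-stage features into a final score for each candidate. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes the reranking-stage feature is a per-candidate vector (e.g., a cross-encoder [CLS] embedding) and the retrieval-stage feature is the candidate's first-stage rank, injected as a learned embedding; all class names, dimensions, and the fusion scheme are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class HLATRSketch(nn.Module):
    """Minimal list-aware reranking sketch (illustrative, not the authors' code)."""

    def __init__(self, rerank_dim=768, model_dim=256, num_layers=4,
                 num_heads=4, max_list_len=1000):
        super().__init__()
        # Project reranking-stage features (e.g. cross-encoder [CLS] vectors)
        # down to a lightweight model dimension.
        self.input_proj = nn.Linear(rerank_dim, model_dim)
        # Retrieval-stage feature: the candidate's first-stage rank,
        # injected as a learned position-style embedding (an assumption here).
        self.rank_embedding = nn.Embedding(max_list_len, model_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=model_dim, nhead=num_heads, batch_first=True)
        # A small Transformer encoder attends over the whole candidate list,
        # which is what makes the final scores "list aware".
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.score_head = nn.Linear(model_dim, 1)

    def forward(self, rerank_features, retrieval_ranks, padding_mask=None):
        # rerank_features: (batch, list_len, rerank_dim) per-candidate reranker vectors
        # retrieval_ranks: (batch, list_len) integer ranks from the retrieval stage
        # padding_mask:    (batch, list_len) True where a slot is padding (optional)
        x = self.input_proj(rerank_features) + self.rank_embedding(retrieval_ranks)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        return self.score_head(x).squeeze(-1)  # (batch, list_len) relevance scores


# Hypothetical usage: rescore the top-100 candidates for two queries.
model = HLATRSketch()
feats = torch.randn(2, 100, 768)         # reranking-stage vectors (dummy data)
ranks = torch.arange(100).expand(2, -1)  # retrieval-stage ranks 0..99
scores = model(feats, ranks)             # new list-aware scores, shape (2, 100)
```

In practice such a module would be trained with a listwise ranking loss over each candidate list; the specific loss, dimensions, and rank-embedding scheme above are illustrative assumptions rather than details taken from the paper.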

Benchmarks

Benchmark                      | Methodology | Metrics
passage-re-ranking-on-ms-marco | HLATR       | MRR: 0.42
