Enhancing Sentence Embedding with Generalized Pooling

Qian Chen; Zhen-Hua Ling; Xiaodan Zhu


Abstract

Pooling is an essential component of a wide variety of sentence representation and embedding models. This paper explores generalized pooling methods to enhance sentence embedding. We propose vector-based multi-head attention that includes the widely used max pooling, mean pooling, and scalar self-attention as special cases. The model benefits from properly designed penalization terms to reduce redundancy in multi-head attention. We evaluate the proposed model on three different tasks: natural language inference (NLI), author profiling, and sentiment classification. The experiments show that the proposed model achieves significant improvement over strong sentence-encoding-based methods, resulting in state-of-the-art performances on four datasets. The proposed approach can be easily implemented for more problems than we discuss in this paper.
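For intuition, the pooling mechanism described in the abstract can be sketched roughly as follows. This is a minimal illustration rather than the authors' implementation: it assumes a PyTorch formulation, the names VectorMultiHeadPooling and redundancy_penalty are hypothetical, and the overlap penalty shown is only one plausible instantiation of a redundancy-reducing term; the paper's actual penalization terms may differ.

```python
# Minimal sketch (not the authors' code) of vector-based multi-head
# attention pooling. Assumes PyTorch; class and function names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VectorMultiHeadPooling(nn.Module):
    """Pool a sequence of encoder states into one vector per attention head.

    Each head assigns a weight to every dimension of every time step
    (vector-based attention) and normalizes over time with a softmax,
    so mean pooling, max pooling, and scalar self-attention correspond
    to particular weight patterns.
    """

    def __init__(self, hidden_dim, num_heads=4, attn_dim=64):
        super().__init__()
        self.heads = nn.ModuleList([
            nn.Sequential(
                nn.Linear(hidden_dim, attn_dim),
                nn.ReLU(),
                nn.Linear(attn_dim, hidden_dim),  # one score per hidden dimension
            )
            for _ in range(num_heads)
        ])

    def forward(self, h):
        # h: (batch, seq_len, hidden_dim), e.g. the outputs of a BiLSTM encoder
        pooled, alphas = [], []
        for head in self.heads:
            alpha = F.softmax(head(h), dim=1)      # normalize over time steps
            alphas.append(alpha)
            pooled.append((alpha * h).sum(dim=1))  # dimension-wise weighted sum
        # Concatenate the per-head vectors into the sentence embedding.
        return torch.cat(pooled, dim=-1), alphas


def redundancy_penalty(alphas):
    # One plausible redundancy penalty: discourage different heads from
    # placing weight on the same positions and dimensions. The paper's exact
    # penalization terms may differ; this form is an illustrative assumption.
    penalty = 0.0
    for i in range(len(alphas)):
        for j in range(i + 1, len(alphas)):
            penalty = penalty + (alphas[i] * alphas[j]).sum(dim=(1, 2)).mean()
    return penalty


# Usage: pool a batch of 8 sequences of length 20 with 600-d encoder outputs.
pool = VectorMultiHeadPooling(hidden_dim=600, num_heads=4)
embedding, attn = pool(torch.randn(8, 20, 600))   # embedding: (8, 2400)
extra_loss = redundancy_penalty(attn)             # added to the task loss, scaled
```

In this sketch, uniform weights over time reduce a head to mean pooling, weight concentrated on the per-dimension maxima approaches max pooling, and sharing a single weight across all dimensions of a time step gives scalar self-attention, which is the sense in which the abstract treats these as special cases.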


Benchmarks

Benchmark: natural-language-inference-on-snli
  Methodology: 600D BiLSTM with generalized pooling
  Metrics: Test Accuracy: 86.6%; Train Accuracy: 94.9%; Parameters: 65M

Benchmark: sentiment-analysis-on-yelp-fine-grained
  Methodology: BiLSTM with generalized pooling
  Metrics: Error: 33.45
