Nyströmformer: A Nyström-Based Algorithm for Approximating Self-Attention

Yunyang Xiong, Zhanpeng Zeng, Rudrasis Chakraborty, Mingxing Tan, Glenn Fung, Yin Li, Vikas Singh

Abstract

Transformers have emerged as a powerful tool for a broad range of natural language processing tasks. A key component that drives the impressive performance of Transformers is the self-attention mechanism that encodes the influence or dependence of other tokens on each specific token. While beneficial, the quadratic complexity of self-attention on the input sequence length has limited its application to longer sequences -- a topic being actively studied in the community. To address this limitation, we propose Nyströmformer -- a model that exhibits favorable scalability as a function of sequence length. Our idea is based on adapting the Nyström method to approximate standard self-attention with $O(n)$ complexity. The scalability of Nyströmformer enables application to longer sequences with thousands of tokens. We perform evaluations on multiple downstream tasks on the GLUE benchmark and IMDB reviews with standard sequence length, and find that our Nyströmformer performs comparably, or in a few cases, even slightly better, than standard self-attention. On longer sequence tasks in the Long Range Arena (LRA) benchmark, Nyströmformer performs favorably relative to other efficient self-attention methods. Our code is available at https://github.com/mlpen/Nystromformer.
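
To make the approximation concrete, here is a minimal NumPy sketch of Nyström-approximated softmax attention. This is not the authors' implementation (see the linked repository for that): the function name nystrom_attention is illustrative, the landmark count m is assumed to divide the sequence length in this simplified version, and an exact np.linalg.pinv stands in for the paper's iterative pseudoinverse approximation. Landmarks are formed as segment means of the queries and keys, as in the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def nystrom_attention(Q, K, V, m=8):
    """Nyström approximation of softmax(Q K^T / sqrt(d)) V.

    Q, K, V: (n, d) arrays; m: number of landmarks (must divide n here).
    Never materializes the full n x n attention matrix.
    """
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    # Landmarks: segment means of queries and keys (paper's choice).
    Q_l = Q.reshape(m, n // m, d).mean(axis=1)   # (m, d)
    K_l = K.reshape(m, n // m, d).mean(axis=1)   # (m, d)
    F = softmax(Q @ K_l.T * scale)               # (n, m)
    A = softmax(Q_l @ K_l.T * scale)             # (m, m)
    B = softmax(Q_l @ K.T * scale)               # (m, n)
    # Paper approximates the pseudoinverse iteratively; pinv is used for brevity.
    return F @ np.linalg.pinv(A) @ (B @ V)       # (n, d)

# Quick comparison against exact attention on random inputs.
rng = np.random.default_rng(0)
n, d = 512, 64
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
approx = nystrom_attention(Q, K, V, m=64)
exact = softmax(Q @ K.T / np.sqrt(d)) @ V
print(np.abs(approx - exact).mean())
```

With m fixed, every intermediate product is n x m, m x m, or m x d, so both time and memory grow linearly in the sequence length n rather than quadratically.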

Benchmarks

Benchmark                            | Methodology   | Metric
-------------------------------------|---------------|----------------
Natural Language Inference on QNLI   | Nyströmformer | Accuracy: 88.7%
Semantic Textual Similarity on MRPC  | Nyströmformer | F1: 88.1%
Sentiment Analysis on IMDb           | Nyströmformer | Accuracy: 93.2%
Sentiment Analysis on SST-2 (binary) | Nyströmformer | Accuracy: 91.4%
