ETC: Encoding Long and Structured Inputs in Transformers

Joshua Ainslie, Santiago Ontanon, Chris Alberti, Vaclav Cvicek, Zachary Fisher, Philip Pham, Anirudh Ravula, Sumit Sanghai, Qifan Wang, Li Yang

Abstract

Transformer models have advanced the state of the art in many Natural Language Processing (NLP) tasks. In this paper, we present a new Transformer architecture, Extended Transformer Construction (ETC), that addresses two key challenges of standard Transformer architectures, namely scaling input length and encoding structured inputs. To scale attention to longer inputs, we introduce a novel global-local attention mechanism between global tokens and regular input tokens. We also show that combining global-local attention with relative position encodings and a Contrastive Predictive Coding (CPC) pre-training objective allows ETC to encode structured inputs. We achieve state-of-the-art results on four natural language datasets requiring long and/or structured inputs.
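To make the global-local attention pattern concrete, below is a minimal NumPy sketch of the attention masks it implies: long-to-long attention restricted to a sliding local window, plus unrestricted attention to and from a small set of global tokens. The function names, the `local_radius` parameter, and the example sizes are illustrative assumptions rather than the paper's code, and relative position encodings and the CPC pre-training objective are omitted.

```python
# Minimal sketch of ETC-style global-local attention masks (illustrative only).
import numpy as np

def make_etc_masks(n_global: int, n_long: int, local_radius: int):
    """Boolean masks for the four attention pieces in ETC's global-local attention:
    global-to-global, global-to-long, long-to-global, and long-to-long,
    where long-to-long is limited to a +/- local_radius window."""
    g2g = np.ones((n_global, n_global), dtype=bool)  # global tokens attend to all global tokens
    g2l = np.ones((n_global, n_long), dtype=bool)    # global tokens attend to all long tokens
    l2g = np.ones((n_long, n_global), dtype=bool)    # long tokens attend to all global tokens
    idx = np.arange(n_long)
    l2l = np.abs(idx[:, None] - idx[None, :]) <= local_radius  # local sliding window
    return g2g, g2l, l2g, l2l

def masked_attention(q, k, v, mask):
    """Standard scaled dot-product attention restricted by a boolean mask."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Example: 4 global tokens summarizing a 16-token long input, local radius 2.
rng = np.random.default_rng(0)
d = 8
xg, xl = rng.normal(size=(4, d)), rng.normal(size=(16, d))
g2g, g2l, l2g, l2l = make_etc_masks(4, 16, local_radius=2)
# Long-token update: attend to all global tokens plus a local window of long tokens.
long_out = masked_attention(
    xl, np.concatenate([xg, xl]), np.concatenate([xg, xl]),
    np.concatenate([l2g, l2l], axis=1),
)
print(long_out.shape)  # (16, 8)
```

Because only the global part of the attention is quadratic, cost grows linearly in the long input length for a fixed number of global tokens and a fixed local radius.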

Benchmarks

Benchmark: question-answering-on-conditionalqa
Methodology: ETC-Pipeline
Metrics:
  Conditional (answers): 39.4 / 41.8
  Conditional (w/ conditions): 2.5 / 3.4
  Overall (answers): 35.6 / 39.8
  Overall (w/ conditions): 26.9 / 30.8
