Enhancing the Transformer with Explicit Relational Encoding for Math Problem Solving

Imanol Schlag, Paul Smolensky, Roland Fernandez, Nebojsa Jojic, Jürgen Schmidhuber, Jianfeng Gao

Abstract

We incorporate Tensor-Product Representations within the Transformer in order to better support the explicit representation of relation structure. Our Tensor-Product Transformer (TP-Transformer) sets a new state of the art on the recently-introduced Mathematics Dataset containing 56 categories of free-form math word-problems. The essential component of the model is a novel attention mechanism, called TP-Attention, which explicitly encodes the relations between each Transformer cell and the other cells from which values have been retrieved by attention. TP-Attention goes beyond linear combination of retrieved values, strengthening representation-building and resolving ambiguities introduced by multiple layers of standard attention. The TP-Transformer's attention maps give better insights into how it is capable of solving the Mathematics Dataset's challenging problems. Pretrained models and code will be made available after publication.
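
The abstract describes TP-Attention only at a high level. The sketch below illustrates one way such a mechanism could be realized in PyTorch: each head's retrieved value (the "filler") is bound to a learned relation (role) vector by element-wise multiplication, going beyond the plain linear combination of standard attention. The module name, shapes, and projection layout here are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of a TP-Attention layer (batch-first tensors).
# Assumption: the relation vector is a learned linear map of the layer input,
# bound to each head's attention output by element-wise multiplication.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class TPAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Standard query/key/value projections plus an extra relation projection.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.r_proj = nn.Linear(d_model, d_model)  # relation / role vectors
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b, t, _ = x.shape

        def split(h):
            # (b, t, d_model) -> (b, n_heads, t, d_head)
            return h.view(b, t, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v, r = map(split, (self.q_proj(x), self.k_proj(x),
                                 self.v_proj(x), self.r_proj(x)))
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        attn = F.softmax(scores, dim=-1)
        filler = attn @ v        # standard attention output ("filler")
        bound = filler * r       # TP binding: element-wise product with the relation vector
        out = bound.transpose(1, 2).reshape(b, t, self.n_heads * self.d_head)
        return self.out_proj(out)
```

Usage would mirror an ordinary attention layer, e.g. `layer = TPAttention(512, 8); y = layer(torch.randn(2, 10, 512))`; the binding step is the only departure from standard multi-head attention.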

Code Repositories

ischlag/TP-Transformer (official, PyTorch)
jlrussin/interpret-math-transformer (PyTorch)

Benchmarks

Benchmark: question-answering-on-mathematics-dataset
Methodology: TP-Transformer
Metrics: Accuracy 0.8192
