Cell-aware Stacked LSTMs for Modeling Sentences

Jihun Choi, Taeuk Kim, Sang-goo Lee

Abstract

We propose a method of stacking multiple long short-term memory (LSTM) layers for modeling sentences. In contrast to the conventional stacked LSTMs where only hidden states are fed as input to the next layer, the suggested architecture accepts both hidden and memory cell states of the preceding layer and fuses information from the left and the lower context using the soft gating mechanism of LSTMs. Thus the architecture modulates the amount of information to be delivered not only in horizontal recurrence but also in vertical connections, from which useful features extracted from lower layers are effectively conveyed to upper layers. We dub this architecture Cell-aware Stacked LSTM (CAS-LSTM) and show from experiments that our models bring significant performance gain over the standard LSTMs on benchmark datasets for natural language inference, paraphrase detection, sentiment classification, and machine translation. We also conduct extensive qualitative analysis to understand the internal behavior of the suggested approach.
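The following is a minimal PyTorch sketch of the idea described in the abstract, not the authors' reference implementation. It assumes the lower layer's cell state is fused into the current layer's cell through an additional sigmoid gate (here called the lower-context gate), alongside the standard input, forget, and output gates; the exact parameterization, gate sharing, and initialization used in the paper may differ.

# Illustrative sketch of one cell-aware stacked LSTM step (assumptions noted above).
import torch
import torch.nn as nn

class CASLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # One affine map producing five blocks: input gate, forget gate (left
        # context), lower-context gate, output gate, and the candidate cell value.
        self.linear = nn.Linear(input_size + hidden_size, 5 * hidden_size)
        self.hidden_size = hidden_size

    def forward(self, x, h_prev, c_prev, c_lower):
        # x:       hidden state of the layer below at the current time step
        # h_prev:  previous hidden state of this layer (left context)
        # c_prev:  previous cell state of this layer (left context)
        # c_lower: cell state of the layer below (lower context)
        z = self.linear(torch.cat([x, h_prev], dim=-1))
        i, f, l, o, g = z.chunk(5, dim=-1)
        i, f, l, o = (torch.sigmoid(t) for t in (i, f, l, o))
        g = torch.tanh(g)
        # Fuse left (c_prev) and lower (c_lower) context with soft gates.
        c = f * c_prev + l * c_lower + i * g
        h = o * torch.tanh(c)
        return h, c

In a full stack, the first layer would fall back to a standard LSTM cell (it has no lower cell state), and a bidirectional variant such as the Bi-CAS-LSTM reported in the benchmarks below would run this cell in both directions and concatenate the hidden states.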

Benchmarks

Benchmark | Methodology | Metric
natural-language-inference-on-snli | 300D 2-layer Bi-CAS-LSTM | % Test Accuracy: 87
paraphrase-identification-on-quora-question | Bi-CAS-LSTM | Accuracy: 88.6
sentiment-analysis-on-sst-2-binary | Bi-CAS-LSTM | Accuracy: 91.3
sentiment-analysis-on-sst-5-fine-grained | Bi-CAS-LSTM | Accuracy: 53.6
