On Tree-Based Neural Sentence Modeling

Haoyue Shi; Hao Zhou; Jiaze Chen; Lei Li

Abstract

Neural networks with tree-based sentence encoders have shown better results on many downstream tasks. Most existing tree-based encoders adopt syntactic parse trees as the explicit structural prior. To study the effectiveness of different tree structures, we replace the parse trees with trivial trees (i.e., binary balanced trees, left-branching trees, and right-branching trees) in the encoders. Though trivial trees contain no syntactic information, those encoders achieve competitive or even better results on all ten downstream tasks we investigated. This surprising result indicates that explicit syntax guidance may not be the main contributor to the superior performance of tree-based neural sentence modeling. Further analysis shows that tree modeling gives better results when crucial words are closer to the final representation. Additional experiments give more clues on how to design an effective tree-based encoder. Our code is open-source and available at https://github.com/ExplorerFreda/TreeEnc.
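The three "trivial" tree structures mentioned in the abstract can be sketched concretely. The snippet below (an illustrative sketch, not the authors' implementation) builds each structure over a token sequence, representing a tree as either a leaf token or a pair of subtrees; the `depth` helper shows why the balanced tree keeps every word closest to the root.

```python
# Sketch of the three trivial tree structures from the abstract.
# A tree is a token (leaf) or a 2-tuple of subtrees (internal node).

def left_branching(tokens):
    """((w1, w2), w3), ... — fold the sequence left to right."""
    tree = tokens[0]
    for tok in tokens[1:]:
        tree = (tree, tok)
    return tree

def right_branching(tokens):
    """(w1, (w2, (w3, ...))) — fold the sequence right to left."""
    tree = tokens[-1]
    for tok in reversed(tokens[:-1]):
        tree = (tok, tree)
    return tree

def balanced(tokens):
    """Binary balanced tree: split the span in half recursively."""
    if len(tokens) == 1:
        return tokens[0]
    mid = (len(tokens) + 1) // 2
    return (balanced(tokens[:mid]), balanced(tokens[mid:]))

def depth(tree):
    """Number of compositions between the root and the deepest leaf."""
    if not isinstance(tree, tuple):
        return 0
    return 1 + max(depth(tree[0]), depth(tree[1]))
```

For a sentence of n tokens, the left- and right-branching trees have depth n-1, while the balanced tree has depth ceil(log2 n), which relates to the paper's observation that results improve when crucial words sit closer to the final representation.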

Code Repositories

ExplorerFreda/TreeEnc (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metric
sentiment-analysis-on-amazon-review-full | Gumbel+bi-leaf-RNN | Accuracy: 49.7
sentiment-analysis-on-amazon-review-polarity | Gumbel+bi-leaf-RNN | Accuracy: 88.1
text-classification-on-ag-news | Balanced+bi-leaf-RNN | Error: 7.9
text-classification-on-dbpedia | Balanced+bi-leaf-RNN | Error: 1.2
