Protoformer: Embedding Prototypes for Transformers

Ashkan Farhangi, Ning Sui, Nan Hua, Haiyan Bai, Arthur Huang, Zhishan Guo

Abstract

Transformers have been widely applied to text classification. Unfortunately, real-world data contain anomalies and noisy labels that pose challenges for state-of-the-art Transformers. This paper proposes Protoformer, a novel self-learning framework for Transformers that can leverage problematic samples for text classification. Protoformer features a selection mechanism for embedding samples that allows us to efficiently extract and utilize anomaly prototypes and difficult class prototypes. We demonstrate these capabilities on datasets with diverse textual structures (e.g., Twitter, IMDB, ArXiv). We also apply the framework to several models. The results indicate that Protoformer can improve current Transformers in various empirical settings.
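The abstract only sketches the selection mechanism, so the following is a minimal illustration of one plausible reading, not the paper's exact procedure: embed texts with a Transformer encoder, cluster the embeddings, then treat samples closest to a cluster centroid as class prototypes and samples farthest from their centroid as anomaly prototypes. The model name, pooling strategy, cluster count, and prototype counts below are all illustrative assumptions.

```python
# Hedged sketch of embedding-prototype selection (assumptions noted above).
import numpy as np
import torch
from sklearn.cluster import KMeans
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumption: any encoder could be used


def embed(texts, batch_size=16):
    """Return mean-pooled Transformer embeddings for a list of strings."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModel.from_pretrained(MODEL_NAME).eval()
    chunks = []
    with torch.no_grad():
        for i in range(0, len(texts), batch_size):
            batch = tokenizer(texts[i:i + batch_size], padding=True,
                              truncation=True, return_tensors="pt")
            hidden = model(**batch).last_hidden_state      # (B, T, H)
            mask = batch["attention_mask"].unsqueeze(-1)   # (B, T, 1)
            chunks.append(((hidden * mask).sum(1) / mask.sum(1)).numpy())
    return np.vstack(chunks)


def select_prototypes(embeddings, n_clusters=3, n_proto=5):
    """Pick class prototypes (nearest to a centroid) and anomaly
    prototypes (farthest from their assigned centroid)."""
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(embeddings)
    # Distance of each sample to the centroid of its own cluster.
    dist = np.linalg.norm(embeddings - km.cluster_centers_[km.labels_], axis=1)
    class_protos = np.argsort(dist)[:n_proto]      # most representative
    anomaly_protos = np.argsort(dist)[-n_proto:]   # least representative
    return class_protos, anomaly_protos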

Benchmarks

Benchmark                          Methodology    Metrics
text-classification-on-arxiv-10    Protoformer    Accuracy: 0.794
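The reported metric appears to be plain top-1 classification accuracy, i.e., 0.794 means 79.4% of test documents receive the correct class label. A minimal check, assuming `y_pred` and `y_true` are arrays of predicted and gold labels:

```python
import numpy as np

def accuracy(y_pred, y_true):
    """Fraction of predictions that exactly match the gold labels."""
    return float(np.mean(np.asarray(y_pred) == np.asarray(y_true)))
```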
