PatentBERT: Patent Classification with Fine-Tuning a pre-trained BERT Model

Jieh-Sheng Lee; Jieh Hsiang

Abstract

In this work we focus on fine-tuning a pre-trained BERT model and applying it to patent classification. When applied to a large dataset of over two million patents, our approach outperforms the previous state of the art, a CNN with word embeddings. In addition, we use patent claims alone, without the other parts of the patent documents. Our contributions include: (1) a new state-of-the-art method for patent classification based on fine-tuning a pre-trained BERT model, (2) a large dataset, USPTO-3M, at the CPC subclass level, with SQL statements that can be used by future researchers, and (3) evidence that patent claims alone are sufficient for the classification task, in contrast to conventional wisdom.
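The core modeling idea — replacing BERT's single-label softmax head with independent per-label sigmoids, so one patent can carry several CPC subclasses at once — can be sketched in plain Python. This is an illustrative sketch of the standard multi-label objective, not the paper's exact implementation; the function names and threshold are assumptions.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def multilabel_bce(logits, targets):
    """Binary cross-entropy averaged over CPC subclass labels.

    Multi-label classification treats each label as an independent
    binary decision (sigmoid + BCE), rather than a softmax over
    mutually exclusive classes.
    """
    loss = 0.0
    for z, y in zip(logits, targets):
        p = sigmoid(z)
        loss += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return loss / len(logits)

def predict(logits, threshold=0.5):
    """Emit every label whose sigmoid probability clears the threshold."""
    return [1 if sigmoid(z) >= threshold else 0 for z in logits]

# A patent whose classifier head outputs these logits would be
# assigned the first and third subclass:
print(predict([2.0, -1.0, 0.3]))  # [1, 0, 1]
```

In practice the logits would come from a classification head on top of BERT's pooled [CLS] representation of the claim text; only the head and threshold logic are shown here.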

Code Repositories

Benchmarks

Benchmark | Methodology | Metrics
multi-label-text-classification-on-uspto-3m | BERT | F1: 66.83%
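Multi-label leaderboards like this one typically report micro-averaged F1, which pools true positives, false positives, and false negatives across every (patent, label) pair before computing precision and recall; whether this benchmark uses micro or macro averaging is an assumption. A minimal sketch:

```python
def micro_f1(predictions, references):
    """Micro-averaged F1 for multi-label classification.

    predictions/references: lists of binary label vectors,
    one vector per sample. Counts are pooled over all
    (sample, label) pairs before precision/recall.
    """
    tp = fp = fn = 0
    for pred, ref in zip(predictions, references):
        for p, r in zip(pred, ref):
            if p == 1 and r == 1:
                tp += 1
            elif p == 1 and r == 0:
                fp += 1
            elif p == 0 and r == 1:
                fn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Two samples, three labels each: 2 true positives, 1 false positive.
print(micro_f1([[1, 0, 1], [0, 1, 0]], [[1, 0, 0], [0, 1, 0]]))  # 0.8
```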
