HittER: Hierarchical Transformers for Knowledge Graph Embeddings

Sanxing Chen, Xiaodong Liu, Jianfeng Gao, Jian Jiao, Ruofei Zhang, Yangfeng Ji

Abstract

This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model to jointly learn Entity-relation composition and Relational contextualization based on a source entity's neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity and the top block aggregates the relational information from outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and the source entity itself. Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets. We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
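The two-block design described in the abstract can be illustrated with a highly simplified sketch: a bottom block that composes each (entity, relation) pair in the source entity's neighborhood into a feature vector, and a top block that aggregates those features with attention against the source entity. This is not the paper's implementation (which uses full Transformer blocks; see microsoft/HittER for the official PyTorch code) — the composition and aggregation functions below are illustrative stand-ins.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def entity_relation_compose(entity_vec, relation_vec):
    # Bottom-block stand-in: fuse one (entity, relation) pair into a
    # single feature vector. HittER uses a Transformer block here;
    # elementwise addition is only a placeholder.
    return [e + r for e, r in zip(entity_vec, relation_vec)]

def relational_contextualize(source_vec, neighbor_feats):
    # Top-block stand-in: aggregate the composed neighbor features with
    # scaled dot-product attention keyed on the source entity embedding.
    d = len(source_vec)
    scores = [dot(source_vec, f) / math.sqrt(d) for f in neighbor_feats]
    weights = softmax(scores)
    return [sum(w * f[i] for w, f in zip(weights, neighbor_feats))
            for i in range(d)]

# Toy usage: a 2-d source entity with two (entity, relation) neighbors.
source = [1.0, 0.0]
neighbors = [([0.5, 0.5], [0.1, -0.1]),
             ([0.0, 1.0], [0.2, 0.2])]
feats = [entity_relation_compose(e, r) for e, r in neighbors]
context = relational_contextualize(source, feats)
print(len(context))  # contextualized vector has the source's dimensionality
```

The hierarchy matters because the bottom block sees each neighbor in isolation, while the top block decides how much each composed neighbor should influence the source entity's final representation.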

Code Repositories

- seeyourmind/tkgelib (PyTorch)
- zjunlp/relphormer (PyTorch)
- microsoft/HittER (PyTorch, official)

Benchmarks

Benchmark                     Methodology  Hits@1  Hits@3  Hits@10  MRR
link-prediction-on-fb15k-237  HittER       0.279   0.409   0.558    0.373
link-prediction-on-wn18rr     HittER       0.462   0.516   0.584    0.503
