Breaking Through the 80% Glass Ceiling: Raising the State of the Art in Word Sense Disambiguation by Incorporating Knowledge Graph Information
Michele Bevilacqua, Roberto Navigli

Abstract
Neural architectures are the current state of the art in Word Sense Disambiguation (WSD). However, they make limited use of the vast amount of relational information encoded in Lexical Knowledge Bases (LKBs). We present Enhanced WSD Integrating Synset Embeddings and Relations (EWISER), a neural supervised architecture that is able to tap into this wealth of knowledge by embedding information from the LKB graph within the neural architecture, and to exploit pretrained synset embeddings, enabling the network to predict synsets that are not in the training set. As a result, we set a new state of the art on almost all the evaluation settings considered, also breaking through, for the first time, the 80% ceiling on the concatenation of all the standard all-words English WSD evaluation benchmarks. On multilingual all-words WSD, we report state-of-the-art results by training on nothing but English.
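The abstract highlights two ingredients: pretrained synset embeddings plugged into the output layer, and propagation of synset scores over the LKB graph. The snippet below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' released code; the class name `StructuredLogits`, the propagation rule `scores + A·scores`, and all shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn


class StructuredLogits(nn.Module):
    """Sketch of an EWISER-style output layer (illustrative, not the official code)."""

    def __init__(self, synset_embeddings: torch.Tensor, adjacency: torch.Tensor):
        # synset_embeddings: (num_synsets, hidden_dim) pretrained vectors,
        #                    used to initialize the output projection.
        # adjacency:         (num_synsets, num_synsets) sparse 0/1 matrix of
        #                    LKB edges used to spread scores across the graph.
        super().__init__()
        num_synsets, hidden_dim = synset_embeddings.shape
        self.out = nn.Linear(hidden_dim, num_synsets, bias=False)
        with torch.no_grad():
            self.out.weight.copy_(synset_embeddings)
        # Keep the graph fixed; only the dense projection is trained.
        self.register_buffer("adjacency", adjacency)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, hidden_dim) contextual vectors for the target words.
        scores = self.out(hidden)                              # (batch, num_synsets)
        # Propagate each synset's score to its graph neighbours and add it
        # back to the original logits ("structured logits").
        propagated = torch.sparse.mm(self.adjacency, scores.t()).t()
        return scores + propagated


if __name__ == "__main__":
    num_synsets, dim = 5, 8
    emb = torch.randn(num_synsets, dim)
    # Toy adjacency: synset 0's score is propagated to synsets 1 and 2.
    idx = torch.tensor([[1, 2], [0, 0]])
    adj = torch.sparse_coo_tensor(idx, torch.ones(2), (num_synsets, num_synsets))
    layer = StructuredLogits(emb, adj)
    print(layer(torch.randn(3, dim)).shape)  # torch.Size([3, 5])
```

Because the output weights are (frozen or fine-tuned) synset embeddings rather than randomly initialized rows, synsets never seen in the training data can still receive non-trivial scores, which is the property the abstract credits for predicting unseen synsets.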
Benchmarks
Word Sense Disambiguation (supervised), F1 scores:

| Model | SemEval 2007 | SemEval 2013 | SemEval 2015 | Senseval 2 | Senseval 3 |
|---|---|---|---|---|---|
| EWISER | 71.0 | 78.9 | 79.3 | 78.9 | 78.4 |
| EWISER + WNGC | 75.2 | 80.7 | 81.8 | 80.8 | 79.0 |