Chengyue Gong; Di He; Xu Tan; Tao Qin; Liwei Wang; Tie-Yan Liu

Abstract
Continuous word representation (a.k.a. word embedding) is a basic building block in many neural network-based models used in natural language processing tasks. Although it is widely accepted that words with similar semantics should be close to each other in the embedding space, we find that word embeddings learned in several tasks are biased towards word frequency: the embeddings of high-frequency and low-frequency words lie in different subregions of the embedding space, and the embedding of a rare word and a popular word can be far from each other even if they are semantically similar. This makes learned word embeddings ineffective, especially for rare words, and consequently limits the performance of these neural network models. In this paper, we develop a neat, simple yet effective way to learn *FRequency-AGnostic word Embedding* (FRAGE) using adversarial training. We conduct comprehensive studies on ten datasets across four natural language processing tasks: word similarity, language modeling, machine translation, and text classification. Results show that with FRAGE, we achieve higher performance than the baselines in all tasks.
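To make the idea concrete, the adversarial setup can be sketched as a two-player game: a discriminator learns to tell high-frequency from low-frequency word embeddings, while the embeddings are trained to fool it in addition to minimizing the main task loss. The following is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the `FrequencyDiscriminator` module, the 20% popularity threshold, the adversarial weight `lambda_adv`, and the placeholder task loss are all hypothetical choices for illustration.

```python
# Minimal sketch of FRAGE-style adversarial training (illustrative only;
# names, thresholds, and the flipped-label adversarial variant are
# assumptions, and the paper's actual implementation may differ).
import torch
import torch.nn as nn

class FrequencyDiscriminator(nn.Module):
    """Predicts whether an embedding belongs to a high-frequency word."""
    def __init__(self, emb_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(emb_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, emb: torch.Tensor) -> torch.Tensor:
        return self.net(emb).squeeze(-1)  # raw logits

emb_dim, vocab_size = 64, 1000
embedding = nn.Embedding(vocab_size, emb_dim)
disc = FrequencyDiscriminator(emb_dim)
bce = nn.BCEWithLogitsLoss()

# Assume word ids are sorted by corpus frequency; treat the top 20% as
# "popular" and the rest as "rare" (an assumed threshold).
is_popular = (torch.arange(vocab_size) < vocab_size // 5).float()

opt_model = torch.optim.Adam(embedding.parameters(), lr=1e-3)
opt_disc = torch.optim.Adam(disc.parameters(), lr=1e-3)
lambda_adv = 0.1  # weight of the adversarial term, an assumption

for step in range(100):
    word_ids = torch.randint(0, vocab_size, (32,))
    emb = embedding(word_ids)
    labels = is_popular[word_ids]

    # 1) Train the discriminator to classify popular vs. rare embeddings.
    loss_d = bce(disc(emb.detach()), labels)
    opt_disc.zero_grad()
    loss_d.backward()
    opt_disc.step()

    # 2) Train the embeddings to fool the discriminator (flipped labels),
    #    jointly with the task loss (a placeholder here; in practice the
    #    LM, NMT, or classification objective).
    task_loss = emb.pow(2).mean()  # stand-in for the real task loss
    loss_adv = bce(disc(emb), 1.0 - labels)
    loss = task_loss + lambda_adv * loss_adv
    opt_model.zero_grad()
    loss.backward()
    opt_model.step()
```

The key design point the abstract describes is that the frequency signal is removed adversarially rather than by reweighting: once the discriminator can no longer separate rare from popular embeddings, the two groups occupy the same region of the embedding space.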
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| language-modelling-on-penn-treebank-word | FRAGE + AWD-LSTM-MoS + dynamic eval | Params: 22M; Test perplexity: 46.54; Validation perplexity: 47.38 |
| language-modelling-on-wikitext-2 | FRAGE + AWD-LSTM-MoS + dynamic eval | Params: 35M; Test perplexity: 39.14; Validation perplexity: 40.85 |
| machine-translation-on-iwslt2015-german | Transformer with FRAGE | BLEU score: 33.97 |
| machine-translation-on-wmt2014-english-german | Transformer Big with FRAGE | BLEU score: 29.11 |