Latent Predictor Networks for Code Generation

Wang Ling; Edward Grefenstette; Karl Moritz Hermann; Tomáš Kočiský; Andrew Senior; Fumin Wang; Phil Blunsom

Abstract

Many language generation tasks require the production of text conditioned on both structured and unstructured inputs. We present a novel neural network architecture which generates an output sequence conditioned on an arbitrary number of input functions. Crucially, our approach allows both the choice of conditioning context and the granularity of generation, for example characters or tokens, to be marginalised, thus permitting scalable and effective training. Using this framework, we address the problem of generating programming code from a mixed natural language and structured specification. We create two new data sets for this paradigm derived from the collectible trading card games Magic the Gathering and Hearthstone. On these, and a third preexisting corpus, we demonstrate that marginalising multiple predictors allows our model to outperform strong benchmarks.
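
The abstract's central mechanism, scoring each output symbol as a mixture over several predictors whose latent choice is marginalised out, can be made concrete with a short sketch. The snippet below is a minimal illustration in PyTorch under assumed names (marginal_token_logprob, predictor_logits, token_logprob_per_predictor), not the authors' released code: each predictor, for example a word-level softmax, a character-level model, or a copy mechanism over structured input fields, supplies its own log-probability for the observed symbol, and the latent choice of predictor is summed out in log space.

import torch
import torch.nn.functional as F

def marginal_token_logprob(predictor_logits: torch.Tensor,
                           token_logprob_per_predictor: torch.Tensor) -> torch.Tensor:
    # log p(token) = log sum_k p(predictor = k) * p(token | predictor = k)
    # Normalise the decoder's scores over the available predictors.
    log_choice = F.log_softmax(predictor_logits, dim=-1)
    # Marginalise the latent predictor choice with a log-sum-exp,
    # which stays numerically stable and fully differentiable.
    return torch.logsumexp(log_choice + token_logprob_per_predictor, dim=-1)

# Hypothetical example with three predictors
# (token softmax, character-level model, copy pointer).
scores = torch.tensor([0.2, -1.0, 0.5])       # decoder's score for each predictor
per_pred = torch.tensor([-2.3, -4.1, -0.7])   # each predictor's log p(token | predictor)
print(marginal_token_logprob(scores, per_pred))  # marginal log-likelihood of the token

Because the predictor choice is summed out rather than supervised, no alignment between output symbols and predictors is needed during training, which matches the abstract's claim that marginalisation permits scalable and effective training.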

Code Repositories

deepmind/card2code (official)
davidgolub/QuestionGeneration (PyTorch)

Benchmarks

Benchmark                    Methodology                                  Accuracy   BLEU Score
Code Generation on Django    LPN (Ling et al., 2016)                      62.3       77.6
Code Generation on Django    Phrasal Statistical MT (Ling et al., 2016)   31.5       47.6
