Counterpoint by Convolution

Cheng-Zhi Anna Huang, Tim Cooijmans, Adam Roberts, Aaron Courville, Douglas Eck

Abstract

Machine learning models of music typically break up the task of composition into a chronological process, composing a piece of music in a single pass from beginning to end. In contrast, human composers write music in a nonlinear fashion, scribbling motifs here and there, often revisiting choices previously made. In order to better approximate this process, we train a convolutional neural network to complete partial musical scores, and explore the use of blocked Gibbs sampling as an analogue to rewriting. Neither the model nor the generative procedure is tied to a particular causal direction of composition. Our model is an instance of orderless NADE (Uria et al., 2014), which allows more direct ancestral sampling. However, we find that Gibbs sampling greatly improves sample quality, which we demonstrate to be due to some conditional distributions being poorly modeled. Moreover, we show that even the cheap approximate blocked Gibbs procedure of Yao et al. (2014) yields better samples than ancestral sampling, based on both log-likelihood and human evaluation.
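The blocked Gibbs procedure described above can be sketched in a few lines: repeatedly mask out a block of cells in the score and resample them from the model's conditional distributions. The sketch below is a toy illustration, not the paper's implementation; the `predict` function here is a hypothetical stand-in that returns uniform distributions, whereas Coconet would use its convolutional network to condition on the unmasked context.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_PITCHES = 4  # toy pitch vocabulary

def predict(pianoroll, mask):
    """Hypothetical stand-in for the trained model: given a pianoroll with
    masked cells, return per-cell distributions over pitches. A real model
    would condition on the visible (unmasked) cells; here we use uniform."""
    return np.full(pianoroll.shape + (NUM_PITCHES,), 1.0 / NUM_PITCHES)

def blocked_gibbs(pianoroll, num_steps=50, block_fraction=0.5):
    """Approximate blocked Gibbs in the style of Yao et al. (2014):
    mask a random block of cells, then resample all masked cells at once,
    each independently from its conditional distribution."""
    pianoroll = pianoroll.copy()
    for step in range(num_steps):
        # Shrink the masked block over time, down to roughly single cells.
        frac = max(block_fraction * (1 - step / num_steps),
                   1.0 / pianoroll.size)
        mask = rng.random(pianoroll.shape) < frac
        if not mask.any():
            continue
        probs = predict(pianoroll, mask)
        # Resample every masked cell from the model's conditionals.
        pianoroll[mask] = [rng.choice(NUM_PITCHES, p=p) for p in probs[mask]]
    return pianoroll

# 4 voices (as in Bach chorales) by 8 timesteps, random initialization.
roll = rng.integers(0, NUM_PITCHES, size=(4, 8))
sample = blocked_gibbs(roll)
print(sample.shape)  # (4, 8)
```

Because every masked cell is resampled independently given only the visible cells, each step is cheap (one model evaluation) at the cost of ignoring dependencies within the block — the approximation the abstract contrasts with exact ancestral sampling.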

Code Repositories

- czhuang/coconet (official, TensorFlow)
- lukewys/coconet-pytorch (PyTorch, mentioned in GitHub)
- kevindonoghue/coconet-pytorch (PyTorch, mentioned in GitHub)
- prentrodgers/coconet-pytorch-csound (PyTorch, mentioned in GitHub)

Benchmarks

Benchmark                        Methodology   Metrics
Music Modeling on JSB Chorales   CoCoNet       NLL: 2.22
