AI-generated jazz? It’s apparently a thing now:
DeepJazz is a 2016 project by Princeton computer science student Ji-Sung Kim that spews out piano solo variations on Pat Metheny’s “And Then I Knew.” The model was built from a MIDI file of the original Pat Metheny track, using the Keras and Theano machine learning libraries and a long short-term memory (LSTM) recurrent neural network. Recurrent neural networks (RNNs) are popular in today’s AI composition because they process a sequence one step at a time, looping each step’s internal state back in as input to the next, so every note the network generates can depend on the notes that came before it.
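If you’re curious what that “looping” actually looks like, here’s a minimal NumPy sketch of a single LSTM step fed a toy five-note melody. This is illustrative only: the function, the dimensions, and the melody are my own made-up example, not code from the DeepJazz project, which uses Keras rather than hand-rolled NumPy.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. The hidden state h and cell state c loop back
    in at the next step -- this recurrence is what lets the network
    condition each prediction on everything it has heard so far."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b           # pre-activations for all four gates
    i = sigmoid(z[0:n])                  # input gate: how much new info to write
    f = sigmoid(z[n:2 * n])              # forget gate: how much old memory to keep
    o = sigmoid(z[2 * n:3 * n])          # output gate: how much memory to expose
    g = np.tanh(z[3 * n:4 * n])          # candidate values to write
    c = f * c_prev + i * g               # updated cell state (long-term memory)
    h = o * np.tanh(c)                   # updated hidden state (short-term output)
    return h, c

# Feed a toy "melody" (a sequence of one-hot note vectors) through the cell.
rng = np.random.default_rng(0)
n_notes, n_hidden = 8, 16
W = rng.normal(scale=0.1, size=(4 * n_hidden, n_notes))
U = rng.normal(scale=0.1, size=(4 * n_hidden, n_hidden))
b = np.zeros(4 * n_hidden)

melody = np.eye(n_notes)[[0, 2, 4, 5, 7]]   # five notes as one-hot rows
h = np.zeros(n_hidden)
c = np.zeros(n_hidden)
for x in melody:
    h, c = lstm_step(x, h, c, W, U, b)      # state from step t feeds step t + 1
```

Play the same notes in a different order and the final state comes out different: the network remembers sequence, not just content, which is why this architecture gets used for melody generation at all.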
Problem is, it’s…not very good.
Maybe it’s just me, but improvisation doesn’t seem like the sort of thing that could be done by AI, mostly because it’s so personal. I mean, if all it took was an understanding of music theory and chord progressions and scales and all that, there’d be a lot more Coltranes out there.
“Crescent”: John Coltrane, tenor saxophone; McCoy Tyner, piano; Jimmy Garrison, bass; Elvin Jones, drums (1963)
But there aren’t. That’s because improvisation requires reason, reflection, and a musical vocabulary established over decades of study. It’s also because a great solo is about balancing silence, repetition, and cohesiveness with conceptual and thematic development. And that’s just the fundamentals: a guitar teacher I once had stressed the importance of tension and release through resolution, which requires an understanding not only of the material, but also of your audience. And what about the other musicians? Improvisation doesn’t happen in a vacuum. Without the collective energy of the other musicians on stage, it’s just noodling – which is pretty much what we get from DeepJazz.
So color me skeptical. Sure, it’s a pretty impressive feat. But to quote Samuel Johnson, it’s rather “like a dog’s walking on his hind legs. It is not done well; but you are surprised to find it done at all.”