The Future: What's happening in Artificial Intelligence and Music at Queen Mary University of London

A few months ago I joined the Advisory Board for the Queen Mary University of London Centre for Doctoral Training (CDT) in Artificial Intelligence & Music (AIM). Queen Mary is genuinely one of the world's leading centres for this area of research, which is crucial to the future of music and the creative industries. AIM's research focuses on three broad themes: music understanding, intelligent instruments and interfaces, and computational creativity.

I'm delighted to play my part in this, however small. It has already been a rewarding journey, and one with a lot of learning.

Last week we had a Board Meeting that included a number of student showcases, which were impressive and inspiring. I decided to write this post to share a flavour of the work going on there, just a year and a half into the life of this new Centre for Doctoral Training.


David Foster is working on a paper titled "Automatic identification, reproduction and synthesis of jazz performances". He is using machine learning to interpret and reproduce a specific musician's style, with recurrent neural networks predicting the next note in a sequence. Imagine being able to select a score and choose to play back the saxophone part in the style of Charlie Parker, John Coltrane or Shabaka Hutchings! This is interesting research, not only because jazz is arguably the greatest artform in the world (yes, I welcome your challenge on this!), but also because it is currently underrepresented in music information retrieval (MIR) research.
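To give a flavour of the next-note idea, here is a toy sketch (emphatically not David's actual system): a single recurrent step in numpy, with random untrained weights, that consumes MIDI pitches one at a time and returns a probability distribution over the next pitch. A real model would be trained on transcriptions of a specific player's solos.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = 128   # MIDI pitch range used as a toy note vocabulary
HIDDEN = 32

# Hypothetical, randomly initialised weights; training would fit these
# to a corpus of one musician's performances.
W_xh = rng.normal(scale=0.1, size=(HIDDEN, VOCAB))
W_hh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
W_hy = rng.normal(scale=0.1, size=(VOCAB, HIDDEN))

def rnn_step(note, h):
    """One recurrent step: consume a note, update the hidden state,
    and return a softmax distribution over candidate next notes."""
    x = np.zeros(VOCAB)
    x[note] = 1.0                      # one-hot encode the incoming MIDI pitch
    h = np.tanh(W_xh @ x + W_hh @ h)   # new hidden state carries the context
    logits = W_hy @ h
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()               # normalise into probabilities
    return probs, h

h = np.zeros(HIDDEN)
for note in [60, 62, 64]:              # feed in C, D, E (middle-C octave)
    probs, h = rnn_step(note, h)
print(int(probs.argmax()))             # most likely next pitch (arbitrary here, since untrained)
```

The point of the sketch is the shape of the problem: the hidden state `h` accumulates the phrase so far, so the same note can yield different predictions depending on what preceded it.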

Lele Liu's research is on "Joint multi-pitch detection and score transcription for polyphonic piano music". She is using scores from the MuseScore website as training data, played back in the Native Instruments Kontakt Player using four different piano sounds. I learned about a number of interesting challenges from her short talk: for example, deciding what to assign to the left-hand versus right-hand staff when automatically transcribing polyphonic music, and the challenge of outputting the score in a visually elegant manner, for which I learned of a very cool project called LilyPond.
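One small building block underneath any pitch-detection stage is turning a detected frequency into a note. This is not Lele's pipeline, just the standard equal-temperament conversion (A4 = 440 Hz = MIDI note 69) that transcription systems of this kind rely on:

```python
import math

def freq_to_midi(freq_hz: float) -> int:
    """Map a detected frequency to the nearest MIDI note number,
    using the equal-temperament formula with A4 = 440 Hz = MIDI 69."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_name(midi: int) -> str:
    """Render a MIDI number as a note name with octave (middle C = C4)."""
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

print(midi_to_name(freq_to_midi(440.0)))    # A4
print(midi_to_name(freq_to_midi(261.63)))   # C4 (middle C)
```

The hard part of the research, of course, is everything this snippet skips: detecting several overlapping pitches at once, quantising them in time, and laying them out as a readable two-staff score.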


Andrea Martelloni's work is on "Percussive Fingerstyle Guitar through the Lens of NIME: an Interview Study". NIME stands for "new interfaces for musical expression" - so we're moving from software to hardware, a fascinating area for innovation in music. I loved his presentation, a qualitative deep dive into a scene of guitar playing called "percussive fingerstyle", which uses the instrument for percussion and melody in parallel - check this out for a primer. The goal was to understand how these players use the current form factor and what could be reinvented to create a new interface. I'm excited to see where this one goes.


Other presentations covered "Automatic Music Transcription (AMT) using Deep Learning" (Mary Pilataki-Manika), which looks at the challenge of multi-instrument transcription, among other things, and finally "Musical Smart City: Perspectives on Ubiquitous Sonification" (Pedro Sarmento). In his research, Pedro sonified nitrogen dioxide (NO2) pollution levels on Mile End Road, applying them to Bach's Air on the G String. It is fascinating to mull over how data from our smart cities can be used and interpreted sonically.

This is just a subset of the research. We will continue to have presentations from the students, and to discuss how the CDT can play an important role in researching and representing the breadth of topics that matter to the future of the music and creative industries.

CALLS TO ACTION:

  1. I'd love to hear opinions and advice from people who know this subject, to help me make the most of my role on the Advisory Board: recommended reading, the areas of AI & music you're excited by, what worries you, and any topics you think we should be discussing in the Board Meetings.

  2. The Centre is always interested in conversations with new industry partners, so hit me up if you think this is relevant and you want to find out more.
