Ben Hayes
I am a PhD student in Artificial Intelligence and Music at Queen Mary University of London’s Centre for Digital Music. My research currently focuses on differentiable digital signal processing for deep-learning-based control of audio synthesisers and effects. I am particularly interested in how we might overcome the optimisation pathologies that currently limit this methodology. My broader research interests include audio synthesis, symmetry, meta-learning, timbre perception, and cross-modal interactions. I am jointly supervised by Dr Charalampos Saitis and Dr György Fazekas, and am very grateful to be funded by the UKRI Centre for Doctoral Training in Artificial Intelligence and Music.
Previously, I was Music Lead at the award-winning AI-driven generative music startup Jukedeck, and a research intern with ByteDance’s Speech, Audio & Music Intelligence (SAMI) team. I also make music, and have taught undergraduate Electronic and Produced Music at the Guildhall School of Music and Drama.
I am particularly open to collaborations with musicians and artists looking to apply artificial intelligence to their work, engineers interested in building new musical tools, and researchers working on related topics.
You can contact me at b.j.hayes (at) qmul.ac.uk