Ben Hayes

I am a PhD student in Artificial Intelligence and Music at Queen Mary University of London. My research bridges neural audio synthesis, the psychoacoustics of musical timbre, and applications of deep generative models to audio. I am very grateful to be funded by the UKRI Centre for Doctoral Training in Artificial Intelligence and Music, and am a member of the Centre for Digital Music.

Previously, I was Music Lead at Jukedeck, the award-winning AI-driven generative music startup, which was acquired by ByteDance. I also make music.

You can contact me at b.j.hayes (at)

selected publications

  1. ISMIR
    Neural Waveshaping Synthesis
    In Proceedings of the 22nd International Society for Music Information Retrieval Conference 2021
  2. ICMPC
    Perceptual and semantic scaling of FM synthesis timbres: Common dimensions and the role of expertise
    In 16th International Conference on Music Perception and Cognition 2021
  3. DMRN
    Perceptual Similarities in Neural Timbre Embeddings
    Hayes, Ben, Brosnahan, Luke, Saitis, Charalampos, and Fazekas, George
    In DMRN+15: Digital Music Research Network One-Day Workshop 2020
  4. Timbre
    There’s More to Timbre than Musical Instruments: Semantic Dimensions of FM Sounds
    Hayes, Ben, and Saitis, Charalampos
    In Proceedings of the 2nd International Conference on Timbre 2020
  5. Timbre
    Evidence for Timbre Space Robustness to an Uncontrolled Online Stimulus Presentation
    Zacharakis, Asterios, Hayes, Ben, Saitis, Charalampos, and Pastiadis, Konstantinos
    In Proceedings of the 2nd International Conference on Timbre 2020