Dimitris Menexopoulos
Queen Mary University of London
iGGi PG Researcher
Dimitris Menexopoulos is a versatile music composer, sound designer, audio technologist, and multi-instrumentalist from Thessaloniki, Greece.
With an academic background in Geoscience, Electronic Production, and Information Experience Design, he draws on a broad spectrum of knowledge across Art, Science, and Technology in his work. He has released two solo albums (Perpetuum Mobile - 2017, Phenomena - 2014), two EPs (Modern Catwalk Music - 2022, 40 EP - 2020), and two published soundtracks (Iolas Wonderland - 2021, The Village - 2019), and has performed internationally. His collaborations include work with electronic musician Robert Rich (Vestiges - 2016), director Shekhar Kapur (Brides of the Well - 2018), and film composer George Kallis (The Last Warrior: Root of Evil - 2021, Cliffs of Freedom - 2019), among others.
As a designer, he has presented work at prominent venues including the Barbican Centre (Nesta FutureFest - 2019, with Akvile Terminaite), Somerset House (24 Hours in Uchronia with Helga Schmid - 2020), and Christie's London (Christie's Lates - 2023, with Scarlett Yang). His current research focuses on Graphics-Based Procedural Sound Design for Games as well as Innovative Music Composition and Performance Systems.
His original scientific publications and devices have been presented at prestigious events in Japan (AES 6th International Conference on Audio for Games - 2024), Spain (AES Europe - 2024), the UK (Iklectik - 2020), France (IRCAM - 2020, 2019), and the USA (Mass MoCA - 2019).
A description of Dimitris' research:
Procedural content generation supports the creation of rich and varied games, but audio design has not kept pace with this innovation. The visual aspects of every asset in a scene may be procedurally rendered, yet audio developers still rely mostly on pre-recorded samples to carry out their tasks. However, much of the information needed to produce convincing audiovisual interactions is already available in the engine: the size, shape, material, and movement of assets are all data that can drive audio algorithms directly. This research explores how animation information available in the game engine can be used to generate, in real time, the sound effects produced when objects interact.
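As a rough illustration of this idea, and not a description of Dimitris' actual system, the Python sketch below maps three properties a game engine already tracks for each asset (a material label, a physical size, and an impact velocity) onto a simple modal-synthesis impact sound. The material table, mode ratios, and velocity-to-gain mapping are all illustrative assumptions.

```python
# A minimal sketch, assuming hypothetical material parameters: per-asset
# properties a game engine already tracks (material, size, impact velocity)
# drive a simple modal-synthesis impact sound. Standard library only.

import math
import struct
import wave

SAMPLE_RATE = 44100

# Illustrative material table: (base frequency in Hz for a unit-size object,
# damping coefficient controlling how quickly the modes decay).
MATERIALS = {
    "wood":  (220.0, 8.0),
    "metal": (440.0, 2.0),
    "glass": (880.0, 4.0),
}

def impact_sound(material, size, velocity, duration=1.0):
    """Return mono float samples for one impact.

    Larger objects ring lower (frequency scales with 1/size) and harder
    impacts are louder. Three inharmonic modes give a rough struck-object
    timbre; the ratios are loosely based on a struck bar.
    """
    base_freq, damping = MATERIALS[material]
    freq = base_freq / max(size, 1e-3)
    gain = min(velocity / 10.0, 1.0)      # crude velocity-to-gain mapping
    mode_ratios = (1.0, 2.76, 5.40)
    num_samples = int(duration * SAMPLE_RATE)
    samples = []
    for i in range(num_samples):
        t = i / SAMPLE_RATE
        s = 0.0
        for k, ratio in enumerate(mode_ratios):
            envelope = math.exp(-damping * (k + 1) * t)  # higher modes die faster
            s += envelope * math.sin(2.0 * math.pi * freq * ratio * t) / (k + 1)
        samples.append(gain * s / len(mode_ratios))
    return samples

def write_wav(path, samples):
    """Write float samples in [-1, 1] as a 16-bit mono WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples))

# Example: a small metal object struck at moderate speed.
write_wav("impact.wav", impact_sound("metal", size=0.5, velocity=4.0))
```

In an actual engine, these parameters would arrive from the physics and animation systems at collision time rather than being hard-coded, and synthesis would run in the audio callback rather than offline to a file.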