LightSynth is a real-time generative music visualisation app developed in Unity that responds to MIDI messages to create visuals that are tightly synchronised with the musical performance. It is a work in progress, but can already create some expressive visuals that I use in live electronic music performances under the moniker INVADR.
Using MIDI rather than audio means that precise note and timing information can trigger visual responses in real time. MIDI also lets the visuals respond individually to different parts of the music.
LightSynth can respond individually to MIDI notes and CC (control change) messages from multiple channels. This means that different visual effects and animations can be mapped to each part of the music. For example, notes played by a synth can be mapped onto a ring of spheres, with each note representing a different colour, whilst the kick drum makes all elements on screen ‘bounce’.
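As an illustration of the note-to-colour mapping idea (a minimal sketch, not the app's actual code — the function name and note range are assumptions; LightSynth itself is written in Unity):

```python
import colorsys

def note_to_rgb(note, low=36, high=84):
    """Map a MIDI note number in [low, high] to an RGB colour by hue."""
    hue = (note - low) / (high - low)   # 0.0 at the bottom of the range, 1.0 at the top
    hue = min(max(hue, 0.0), 1.0)       # clamp notes outside the range
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

# The lowest note in the range maps to red; the hue sweeps round as pitch rises.
print(note_to_rgb(36))  # → (1.0, 0.0, 0.0)
```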
I’ve created a system of ‘Modifiers’ which change aspects of the visuals on screen, such as position, scale and colour. These modifiers can be combined and then assigned to ‘Message Inputs’. ‘Message Inputs’ can be configured to listen to particular MIDI channels, message types and note ranges. The Modifiers are then mapped onto the note, velocity, hold duration and time elapsed since the message was received. Building up scenes composed of many Message Inputs and stacks of Modifiers allows for rich and varied visual responses.
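The Message Input / Modifier structure described above could be sketched roughly as follows. This is a hedged illustration in Python, not LightSynth’s actual code (the app is built in Unity): the class and field names are hypothetical, and the ‘bounce’ modifier is just one plausible example of a scale modifier driven by velocity that decays with elapsed time.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class MidiMessage:
    channel: int
    note: int
    velocity: int
    timestamp: float  # seconds, when the message was received

@dataclass
class MessageInput:
    """Listens for messages on a particular channel and note range."""
    channel: int
    note_range: range
    modifiers: List[Callable] = field(default_factory=list)

    def matches(self, msg: MidiMessage) -> bool:
        return msg.channel == self.channel and msg.note in self.note_range

    def apply(self, msg: MidiMessage, now: float, state: Dict) -> Dict:
        """Run the stacked Modifiers, each seeing note, velocity and elapsed time."""
        elapsed = now - msg.timestamp
        for mod in self.modifiers:
            state = mod(msg, elapsed, state)
        return state

def bounce(msg, elapsed, state):
    # Scale pops with velocity, then decays back to 1.0 over one second.
    state["scale"] = 1.0 + (msg.velocity / 127) * max(0.0, 1.0 - elapsed)
    return state

# A kick drum on channel 10 (notes 35-36 in General MIDI) drives the bounce.
kick_input = MessageInput(channel=10, note_range=range(35, 37), modifiers=[bounce])
msg = MidiMessage(channel=10, note=36, velocity=127, timestamp=0.0)
if kick_input.matches(msg):
    state = kick_input.apply(msg, now=0.5, state={})
    print(state["scale"])  # → 1.5 (halfway through the decay)
```

Stacking further modifiers on the same Message Input — or adding more Message Inputs listening to other channels — is what builds up the richer scenes described above.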