This is one of a series of experiments using AI. I have improved the system somewhat since the capture below, and I am now exploring new ways to apply the technology.
Mixing procedurally generated visuals with archive footage and post-processing. Audio reactivity and expressive hands-on control using a MIDI controller deliver a live performance (a sketch of this kind of MIDI mapping follows below).
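As a rough illustration of the hands-on control side, here is a minimal sketch of mapping MIDI CC messages to visual parameters. It assumes the mido library and a generic controller; the CC numbers and parameter names are placeholders, not the actual mapping used in the performance.

```python
# Minimal MIDI-to-parameter mapping sketch (assumed: mido, generic controller).
# CC numbers and parameter names below are illustrative placeholders.
import mido

# Hypothetical mapping: MIDI CC number -> visual parameter name
CC_MAP = {
    1: "feedback_amount",   # mod wheel
    21: "noise_scale",
    22: "archive_blend",    # blend between generative layer and archive footage
}

def cc_to_unit(value: int) -> float:
    """Scale a 0-127 MIDI CC value to the 0.0-1.0 range visual params typically expect."""
    return value / 127.0

def run(port_name=None):
    # Open the default MIDI input if no port name is given.
    with mido.open_input(port_name) as port:
        for msg in port:
            if msg.type == "control_change" and msg.control in CC_MAP:
                param = CC_MAP[msg.control]
                print(f"{param} -> {cc_to_unit(msg.value):.3f}")

if __name__ == "__main__":
    run()
```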
This interdisciplinary collaboration with dance artists at BLOK Manchester was part of an embodied looping exploration. The system used TouchDesigner for generative visuals and Ableton Live as the audio engine, communicating via OSC.
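To give a sense of the OSC link between the audio and visual engines, here is a minimal sketch using the python-osc package. The port, host, and address names are assumptions for illustration, not the project's actual configuration; in TouchDesigner the receiving port would be set on an OSC In CHOP or DAT.

```python
# Minimal sketch of an audio-engine -> TouchDesigner OSC link (assumed: python-osc).
# Host, port, and OSC addresses below are illustrative placeholders.
from pythonosc.udp_client import SimpleUDPClient

TD_HOST = "127.0.0.1"   # TouchDesigner running on the same machine
TD_PORT = 7000          # must match the port of an OSC In CHOP/DAT in TouchDesigner

client = SimpleUDPClient(TD_HOST, TD_PORT)

def send_audio_features(rms: float, beat: bool) -> None:
    """Forward per-frame audio features so the visuals can react to the music."""
    client.send_message("/audio/rms", rms)         # continuous loudness envelope
    client.send_message("/audio/beat", int(beat))  # 1 on a detected beat, else 0

# Example: one frame of analysis data
send_audio_features(rms=0.42, beat=True)
```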
Summary of project
This immersive arts project was born out of my research into how to elicit the emotion of positive awe in an audience. It aimed to do this by…
With some prior experience in audio-visual production and performance, I wanted to explore this area further through practical tests.
I have previous experience as a DJ in live environments… manipulating