Movement as an Interface for Audiovisual Performance
This project explores how human movement can function as an interface for audiovisual performance. Using a webcam and a YOLO-based pose detection model, the system tracks the performer’s body and extracts joint positions in real time.
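As a rough illustration, the tracking loop might look like the following Python sketch. It assumes the ultralytics package and its pretrained yolov8n-pose checkpoint; the variable names are illustrative, not the project's actual code.

```python
# A minimal sketch of the pose-tracking loop, assuming the ultralytics
# package and the pretrained yolov8n-pose model (hypothetical stand-ins
# for the project's real setup).
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n-pose.pt")   # COCO-keypoint pose model (17 joints)
cap = cv2.VideoCapture(0)         # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    if result.keypoints is not None:
        for person in result.keypoints.xy:   # one (17, 2) tensor per detected person
            joints = person.tolist()         # [[x, y], ...] pixel coordinates per joint
            # the joint list is then handed to the OSC/MIDI mapping stage

cap.release()
```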
The extracted joint positions and their motion values are translated into OSC and MIDI signals that control visual parameters in Processing and MadMapper, as well as musical parameters in Ableton Live. Instead of analysing the dancer as a single object, the system focuses on relationships between individual joints and their movement dynamics.
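A hedged sketch of this mapping stage is below, assuming the python-osc and mido libraries; the OSC address, port, MIDI bus name, and CC number are placeholders rather than the project's actual configuration.

```python
# A sketch of the OSC/MIDI mapping stage, assuming python-osc and mido;
# the address, port, MIDI bus name, and CC number are placeholders.
import math
import mido
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 8000)       # e.g. a Processing/MadMapper listener
midi = mido.open_output("IAC Driver Bus 1")    # virtual port routed into Ableton Live

def map_wrist_relation(joints, prev_joints, dt):
    """Send the wrist-to-wrist distance over OSC and wrist speed as a MIDI CC."""
    left, right = joints[9], joints[10]        # COCO keypoint indices for the wrists
    osc.send_message("/pose/wrist_distance", math.dist(left, right))
    speed = math.dist(joints[9], prev_joints[9]) / dt
    cc_value = max(0, min(127, int(speed * 0.5)))   # clamp to the 0-127 MIDI range
    midi.send(mido.Message("control_change", control=1, value=cc_value))
```

Pairwise distances and per-joint velocities like these are one simple way to express relationships between joints rather than reducing the body to a single tracked point.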
The project investigates how gestures can generate audiovisual structures and how computational systems can extend the expressive possibilities of live performance.
Tools
Python · YOLO Pose Detection · Jupyter Notebook · OSC · MIDI · Processing · MadMapper · Ableton Live

Processing Experiments
Studies exploring how motion data can generate responsive visual structures
