Arduino for Musicians
Arduino can be incredibly useful for musicians, especially when you're interested in creating more natural and responsive interactions. In my case, I use it as an audio interface, a movement tracker, and even as a standalone musical instrument.
I like to think of it as handmade live electronics, because, just like all my work, it’s reactive: it listens and responds to whatever the performer is doing in real time.
I program the Arduino myself and connect it to Max/MSP using OSC (Open Sound Control), allowing me to build flexible systems that translate movement, gestures, or interactions into musical events. This setup enables me to create performances where physical action directly shapes sound in real time.
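To give a concrete idea of that Arduino-to-Max/MSP link, here is a minimal sketch of the kind of thing I mean, assuming CNMAT's OSC library for Arduino and SLIP-encoded serial over USB; the /sensor/0 address, the analog pin, and the baud rate are placeholders rather than my actual mapping. On the Max side, the stream arriving through the [serial] object still needs SLIP decoding and OSC parsing (CNMAT's externals, for example, provide objects for this).

    // Minimal example: read one analog sensor and send its value to Max/MSP
    // as an OSC message over SLIP-encoded USB serial.
    // Assumptions: CNMAT OSC library installed; sensor wired to pin A0.
    #include <OSCMessage.h>
    #include <SLIPEncodedSerial.h>

    SLIPEncodedSerial SLIPSerial(Serial);

    void setup() {
      SLIPSerial.begin(115200);          // baud rate must match the Max patch
    }

    void loop() {
      int reading = analogRead(A0);      // raw sensor value, 0-1023

      OSCMessage msg("/sensor/0");       // placeholder OSC address
      msg.add((int32_t)reading);

      SLIPSerial.beginPacket();
      msg.send(SLIPSerial);              // write the encoded message
      SLIPSerial.endPacket();
      msg.empty();                       // free the message for reuse

      delay(10);                         // roughly 100 messages per second
    }

In practice the scaling, smoothing, and mapping of those values to musical parameters happens in Max/MSP, which keeps the Arduino code simple and the mappings easy to change between pieces.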
Here are three examples of how I use Arduino in my practice.

In the first, I integrated solenoids into mechanical percussion instruments, which can be automated or made responsive to a performer's movements. My Virtual Musician system also learns from gestures to activate the solenoids, generating evolving rhythmic patterns that build complex, engaging textures (a minimal sketch of the solenoid-triggering side appears below). This setup was first presented on September 6, 2026.

In the second, I placed sensors on my arms and mapped their readings to parameters of a granular synthesizer. This lets me shape sound textures through gesture, turning the performance into an embodied, dynamic interaction between movement and sound.

In the third, the Arduino functions as a custom-built interface that lets me trigger and manipulate sound in real time.
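For the solenoid percussion, the Arduino side can be as simple as pulsing a driver circuit when a sensor crosses a threshold. The sketch below is only an illustration of that idea: the pin numbers, threshold, and pulse length are placeholders, the solenoid is assumed to sit behind a transistor or MOSFET driver, and in my actual setup the triggering decisions come from Max/MSP and the Virtual Musician system rather than a fixed threshold.

    // Minimal solenoid-trigger example.
    // Assumptions: solenoid driver circuit on pin 9, gesture sensor on A0.
    const int SOLENOID_PIN = 9;
    const int SENSOR_PIN   = A0;
    const int THRESHOLD    = 400;          // tune to the sensor and gesture
    const unsigned long PULSE_MS = 15;     // short pulse so the coil stays cool

    void setup() {
      pinMode(SOLENOID_PIN, OUTPUT);
      digitalWrite(SOLENOID_PIN, LOW);
    }

    void loop() {
      if (analogRead(SENSOR_PIN) > THRESHOLD) {
        digitalWrite(SOLENOID_PIN, HIGH);  // strike
        delay(PULSE_MS);
        digitalWrite(SOLENOID_PIN, LOW);   // release
        delay(100);                        // simple pause between hits
      }
    }

The short, fixed pulse is the important design choice: the solenoid only ever gets brief bursts of current, which protects the coil and keeps each strike percussive rather than sustained.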