For my final project, I want to create a motion-responsive interactive installation. A user's movements will be sensed by sensors connected to an Arduino, such as an ultrasonic distance sensor, and sent to p5.js, which will process the input and generate dynamic visuals. I haven't yet decided on the exact type of art to display; it could be abstract particle systems, animated images, or other digital creations. The key is that the visuals will respond in real time to the user's motion, making the experience immersive and engaging.
The inspiration behind this project is a recent trip to teamLab Phenomena, where I experienced installations that react to human gestures: moving through a space could make digital flowers bloom or change the colors of projected visuals. I want to recreate that sense of wonder and interaction in my own project, so the user feels connected to the digital environment.
Communication in this project will be bidirectional:
- Arduino → p5.js: The user's gestures are sensed by the Arduino and drive the visuals, controlling the motion, shape, size, or color of the on-screen elements (see the Arduino sketch after this list).
- p5.js → Arduino: When the visuals reach certain thresholds, for instance when a cluster of particles grows large or an animated image reaches a boundary, p5.js will send meaningful signals back to the Arduino. These signals could trigger LEDs or small motors to give tactile feedback, making the interaction feel alive and responsive (see the p5.js sketch below). This way, the user sees the effect on screen and feels it physically, creating a continuous loop of action and response.