Final Project Proposal: ExpressNotes
For my final project, I am creating ExpressNotes, an interactive audiovisual system that combines sound, visuals, and physical input to inspire real-time artistic expression. The system uses an Arduino connected to seven push buttons and a potentiometer. Each button represents a musical note in the A-G scale. When the user presses a button, the Arduino detects the interaction and sends the corresponding note information to a p5.js sketch running on a computer. The potentiometer allows the user to control the volume of the sounds being played. In response to these inputs, p5.js generates synchronized piano sounds and matching visual effects, forming a dynamic and engaging artistic experience that evolves with each interaction.
ExpressNotes is designed to turn the user into both a musician and a visual artist. Each button press is more than a sound; it becomes a trigger for a burst of visual energy. The system fosters expressive exploration by offering real-time feedback. For example, pressing the “A” button might create a ripple of soft blue circles along with a gentle piano tone, while pressing “F” could unleash rotating magenta shapes accompanied by a brighter sound. Over time, the visual canvas builds in complexity and motion, reflecting the rhythm and emotion of the user’s choices.
The Arduino is programmed to listen for button presses and read the current position of the potentiometer. When a button is pressed, it sends a message such as “note:C” over the serial port to the p5.js program. It also continuously reads the potentiometer value, which ranges from 0 to 1023, and sends volume updates as messages like “volume:750”. The Arduino serves solely as a sender; it does not receive any data back from p5.js.
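The firmware described above could be sketched as follows. This is a minimal illustration, not the final code: the pin assignments (buttons on digital pins 2-8 with internal pull-ups, potentiometer on A0), the baud rate, and the change threshold for volume updates are all assumptions.

```
// Assumed wiring: seven buttons on pins 2-8 (internal pull-ups, pressed = LOW),
// potentiometer wiper on A0. These pin choices are illustrative.
const int NOTE_PINS[7] = {2, 3, 4, 5, 6, 7, 8};
const char NOTE_NAMES[7] = {'A', 'B', 'C', 'D', 'E', 'F', 'G'};

bool wasPressed[7] = {false};
int lastVolume = -1;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 7; i++) {
    pinMode(NOTE_PINS[i], INPUT_PULLUP);
  }
}

void loop() {
  // Send "note:X" once per press (edge detection, so holding a button
  // does not flood the serial port with repeated messages).
  for (int i = 0; i < 7; i++) {
    bool pressed = (digitalRead(NOTE_PINS[i]) == LOW);
    if (pressed && !wasPressed[i]) {
      Serial.print("note:");
      Serial.println(NOTE_NAMES[i]);
    }
    wasPressed[i] = pressed;
  }

  // Send "volume:NNN" only when the 0-1023 reading changes noticeably.
  int volume = analogRead(A0);
  if (abs(volume - lastVolume) > 4) {
    Serial.print("volume:");
    Serial.println(volume);
    lastVolume = volume;
  }
  delay(10); // crude debounce and rate limit
}
```

Sending note messages only on the press edge keeps the protocol simple on the p5.js side: every “note:” line corresponds to exactly one button press.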
On the software side, p5.js listens to the serial port and interprets the messages from the Arduino. When it receives a note message, it plays the corresponding piano note and triggers a unique visual effect associated with that note. When it receives a volume message, it maps the 0-1023 potentiometer reading to a 0.0-1.0 amplitude and updates the volume of the audio accordingly. Each note is associated with a different visual response to create a richer and more immersive experience. For instance, the “A” note may create blue ripples, “B” may release golden triangle bursts, “C” may trigger expanding red squares, “D” might produce spiraling cyan patterns, “E” could generate flowing green waves, “F” may show rotating magenta shapes, and “G” could spark star-like white particles.
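The parsing and mapping step could look like the sketch below. The function names (`parseSerialMessage`, `potToAmplitude`, and the commented-out `playNoteAndVisual`/`outputVolume` helpers) are illustrative assumptions; the logic is written as plain functions so it works regardless of which serial library the p5.js sketch uses.

```javascript
// Parse one line from the serial port into a { type, value } object.
// Expected inputs look like "note:C" or "volume:750".
function parseSerialMessage(line) {
  const [type, value] = line.trim().split(":");
  if (type === "note") return { type: "note", value };
  if (type === "volume") return { type: "volume", value: Number(value) };
  return null; // ignore malformed or partial messages
}

// Map the 0-1023 potentiometer reading to a 0.0-1.0 amplitude,
// equivalent to p5's map(raw, 0, 1023, 0, 1) with clamping.
function potToAmplitude(raw) {
  return Math.min(Math.max(raw / 1023, 0), 1);
}

// In the full sketch, each parsed message would drive the audio and visuals,
// e.g. (hypothetical helper names):
//   if (msg.type === "note")   playNoteAndVisual(msg.value);
//   if (msg.type === "volume") outputVolume(potToAmplitude(msg.value));
```

Keeping the parser separate from the drawing code also makes it easy to ignore malformed lines, which can occur when the serial connection is first opened mid-message.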
When the user interacts with the buttons and the volume knob, they create a personal audiovisual composition. The experience is fluid, immediate, and emotionally engaging. Each session is unique, and the visual display reflects the rhythm, timing, and feeling of the musical input. The evolving artwork becomes a living canvas of user expression.
In a nutshell, ExpressNotes demonstrates the creative potential of combining physical sensors with generative audiovisual programming. It allows users to explore sound and visual art intuitively, transforming simple button presses into an expressive performance. The project encourages artistic freedom, emotional engagement, and a playful connection between sound and image, making it a compelling example of interactive media design.