I’m thinking of building a “Gesture-Based Musical Interface” for my final project: an interactive system that lets users create music through hand movements. By combining Arduino’s sensing capabilities with p5.js’s graphics and sound processing, the project will transform physical gestures into a live audio-visual performance, giving users an intuitive way to interact with technology and art. I’ll use two ultrasonic sensors to detect hand movement along the X and Y axes, a speaker to output the generated musical notes, and a webcam to visualize the movements in p5.js.
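To make the sensing side concrete, here is a minimal Arduino sketch of how this could work, assuming two HC-SR04-style ultrasonic sensors and a piezo speaker; the pin numbers, the 5–50 cm sensing window, and the C4–C6 pitch range are all placeholder choices, not final design decisions:

```cpp
// Hypothetical wiring: two HC-SR04 ultrasonic sensors and a piezo speaker.
// Pin numbers are placeholders; adjust to match the actual circuit.
const int TRIG_X = 2;
const int ECHO_X = 3;
const int TRIG_Y = 4;
const int ECHO_Y = 5;
const int SPEAKER = 9;

// Read one HC-SR04: send a 10 µs trigger pulse, then time the echo.
long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000); // 30 ms timeout if no echo
  return duration / 58;                          // µs -> cm for the HC-SR04
}

void setup() {
  pinMode(TRIG_X, OUTPUT);
  pinMode(ECHO_X, INPUT);
  pinMode(TRIG_Y, OUTPUT);
  pinMode(ECHO_Y, INPUT);
  Serial.begin(9600);
}

void loop() {
  long x = readDistanceCm(TRIG_X, ECHO_X); // hand position, left-right
  long y = readDistanceCm(TRIG_Y, ECHO_Y); // hand position, up-down

  // Only play when a hand is within range (5-50 cm is an arbitrary window).
  if (x > 5 && x < 50) {
    // Map X distance to pitch, roughly C4 (262 Hz) up to C6 (1047 Hz).
    int pitch = map(x, 5, 50, 262, 1047);
    tone(SPEAKER, pitch, 50);
  } else {
    noTone(SPEAKER);
  }

  // Stream both readings so the p5.js sketch can visualize them.
  Serial.print(x);
  Serial.print(',');
  Serial.println(y);

  delay(50); // ~20 readings per second
}
```

The sketch streams the two distances over serial as comma-separated pairs, so the p5.js side only needs to parse two numbers per line (for example via the Web Serial API or a serial-bridge library) and can layer its own visuals over the webcam feed.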