The core idea of this project is to develop an interactive sound visualizer that not only generates music from keyboard input but also visualizes that music with unique, noise-driven shapes on the screen. Each key press corresponds to a different note, and the shapes evolve in real time to reflect the pitch, volume, and timbre of the sounds being produced. The project aims to blur the line between music creation and visual art, offering users a multi-sensory experience of composing and visualizing music simultaneously.
Users interact with the application primarily through the keyboard, where each key triggers a distinct musical note or sound. The visual component consists of evolving shapes whose characteristics (e.g., size, color, noise level) change in response to the music. The proposed enhancement involves integrating hand detection technology, potentially through a webcam and machine learning models, to allow users to manipulate sound characteristics (like volume and pitch) and visual aspects (like shape complexity) with hand movements and gestures.
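To make the keyboard-to-note idea concrete, here is a minimal sketch of one way keys could be bound to frequencies using equal temperament. The key layout and base octave are assumptions for illustration, not the project's final binding; in the actual sketch the resulting frequency would be fed to a p5.sound oscillator.

```javascript
// Hypothetical key layout: one piano-style octave on the home row.
// The semitone offsets follow a C major scale plus the octave.
const KEY_TO_SEMITONE = { a: 0, s: 2, d: 4, f: 5, g: 7, h: 9, j: 11, k: 12 };

// Convert a pressed key to a frequency in Hz using equal temperament:
// each semitone multiplies frequency by 2^(1/12), anchored at A4 = 440 Hz.
// baseMidi = 60 means the 'a' key plays middle C (C4).
function keyToFrequency(key, baseMidi = 60) {
  const semitone = KEY_TO_SEMITONE[key];
  if (semitone === undefined) return null; // unmapped key: no note
  const midi = baseMidi + semitone;
  return 440 * Math.pow(2, (midi - 69) / 12);
}
```

For example, `keyToFrequency('a')` yields roughly 261.63 Hz (middle C), and `keyToFrequency('k')` an octave above it.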
The main visual component is the ShapeVisualizer class:
class ShapeVisualizer {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.angle = 0; // time offset that animates the noise field
  }

  draw(size, alpha, weight) {
    strokeWeight(weight);
    noFill();
    push();
    translate(this.x, this.y);
    beginShape();
    for (let a = 0; a < TWO_PI; a += 0.02) {
      // Sample 2D Perlin noise around a circle so the outline closes seamlessly
      let xoff = map(cos(a), -1, 1, 0, 5) + this.angle;
      let yoff = map(sin(a), -1, 1, 0, 5) + this.angle;
      let r = map(noise(xoff, yoff), 0, 1, size * 0.5, size);
      let x = r * cos(a);
      let y = r * sin(a);
      // Hue cycles over time; assumes colorMode(HSB, 360, 100, 100, 255)
      let hue = (this.angle * 50) % 360;
      stroke(hue, 100, 100, alpha);
      vertex(x, y);
    }
    endShape(CLOSE);
    pop();
    this.angle += 0.01; // advance the noise offset each frame
  }
}
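The `size`, `alpha`, and `weight` arguments to `draw()` need to come from the audio analysis. A minimal sketch of that mapping is below; the input/output ranges are assumptions chosen for illustration (e.g. a loudness level in [0, 1] as returned by p5.sound's `p5.Amplitude.getLevel()`), not tuned values from the project.

```javascript
// Clamped linear interpolation, analogous to p5's map() with bounds.
function mapRange(v, inMin, inMax, outMin, outMax) {
  const t = Math.min(1, Math.max(0, (v - inMin) / (inMax - inMin)));
  return outMin + t * (outMax - outMin);
}

// Translate audio features into ShapeVisualizer.draw() arguments.
// level: loudness in [0, 1]; freq: note frequency in Hz.
// Louder notes produce bigger, more opaque shapes;
// higher pitches produce thinner outlines. All ranges are illustrative.
function audioToShapeParams(level, freq) {
  return {
    size: mapRange(level, 0, 1, 40, 200),
    alpha: mapRange(level, 0, 1, 50, 255),
    weight: mapRange(freq, 100, 1000, 4, 1),
  };
}
```

In the p5.js draw loop this would be used as `const p = audioToShapeParams(amp.getLevel(), currentFreq); visualizer.draw(p.size, p.alpha, p.weight);`.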
In the future, I want to develop this into a fully interactive synthesizer, ideally controlled by hand motion as described above.
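One way the hand-motion control could work is sketched below. It assumes some detection library (e.g. a webcam-based pose model) supplies a normalized hand position in [0, 1] for both axes; the specific control mappings (hand height controls volume, horizontal position bends pitch by up to an octave) are illustrative assumptions, not a committed design.

```javascript
// Map a normalized hand position (nx, ny in [0, 1], origin at top-left,
// as typically reported by webcam-based hand detectors) to sound controls.
function handToControls(nx, ny) {
  const clamp = (v) => Math.min(1, Math.max(0, v));
  return {
    // Raising the hand (smaller ny) increases volume.
    volume: clamp(1 - ny),
    // Horizontal position bends pitch: center = unchanged,
    // far right = one octave up (x2), far left = one octave down (x0.5).
    pitchMultiplier: Math.pow(2, 2 * (clamp(nx) - 0.5)),
  };
}
```

The multiplier would scale the frequency of whatever note the keyboard is holding, so gestures modulate rather than replace the keyboard input.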