Week 12: Final Project Draft

Concept:

The proposed project creates an interactive hand-gesture-based drawing system that enables users to draw on a digital canvas using hand movements. By combining P5.js, ML5.js Handpose, and Arduino, the project bridges physical gestures with digital creativity. The system uses hand tracking to allow natural interactions and integrates real-time physical feedback through LEDs and a vibration motor, offering an immersive experience.

The primary goal is to deliver a tactile and visual interface where users feel a sense of creation and engagement as their gestures directly influence both the drawing and the physical environment. The interplay between the digital canvas and physical feedback makes the drawing system feel innovative and intuitive.

 

Arduino Program

The Arduino system responds to commands received from P5.js, controlling physical feedback components:

  • NeoPixel LEDs display colors that reflect the brush dynamics, such as the current color selected for drawing. Faster hand movements can increase brightness or trigger lighting effects.
  • Vibration motors provide tactile feedback during specific actions, such as clearing the canvas or activating special effects.

The Arduino continuously listens to P5.js via serial communication, mapping the incoming data (e.g., RGB values, movement speed) to appropriate hardware actions.
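The mapping described above can be sketched as follows. This is written in JavaScript for clarity (on the board itself it would be Arduino C++), and the comma-separated message format and the speed-to-brightness range are assumptions, not part of the original design:

```javascript
// Sketch of the Arduino-side mapping logic, expressed in JavaScript.
// Assumes a hypothetical comma-separated serial message: "R,G,B,speed".
function parseBrushMessage(line) {
  const [r, g, b, speed] = line.trim().split(",").map(Number);
  return { r, g, b, speed };
}

// Map movement speed (assumed 0-50 px/frame) to NeoPixel brightness (0-255),
// with a floor of 40 so the LEDs never go fully dark while drawing.
function speedToBrightness(speed, maxSpeed = 50) {
  const clamped = Math.min(Math.max(speed, 0), maxSpeed);
  return Math.round(40 + (clamped / maxSpeed) * (255 - 40));
}

const msg = parseBrushMessage("255,0,0,25");
console.log(msg.r, msg.g, msg.b, speedToBrightness(msg.speed));
```

On the Arduino, the parsed color and brightness would then be handed to the NeoPixel library, and the speed value compared against a threshold to trigger the vibration motor.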

P5.js Program

The P5.js program uses ML5.js Handpose to track the user's hand and fingers. The tracked position—primarily the palm center or fingertip—is mapped to the digital canvas for drawing.
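The coordinate mapping could look like the sketch below. The video/canvas dimensions and the mirroring step are assumptions; Handpose reports keypoints in video-pixel coordinates:

```javascript
// Minimal sketch: map a tracked keypoint from webcam coordinates to the
// drawing canvas. Mirroring x makes the stroke follow the user's hand
// like a mirror, which usually feels more natural.
function videoToCanvas(pt, video = { w: 640, h: 480 }, canvas = { w: 1280, h: 720 }) {
  return {
    x: (1 - pt.x / video.w) * canvas.w, // mirrored horizontally
    y: (pt.y / video.h) * canvas.h,
  };
}

console.log(videoToCanvas({ x: 320, y: 240 })); // video center -> canvas center
```

Inside the p5.js `draw()` loop, this function would be called on each Handpose prediction before rendering the brush stroke.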

As the user moves their hand, the P5.js canvas renders brush strokes in real time. Brush properties such as size and color dynamically adjust based on gestures or key inputs. For instance:

  • Moving the palm creates brush strokes at the detected location.
  • Pressing specific keys changes brush size (+/-) or color (C).
  • Gestures like a quick swipe could trigger special visual effects.
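The key-input behavior in the list above can be sketched as a small state update. The palette, step size, and size limits here are assumptions for illustration:

```javascript
// Sketch of the key handling: '+'/'-' adjust brush size, 'c' cycles color.
const PALETTE = ["#ff0044", "#00aaff", "#ffee00", "#ffffff"]; // assumed palette

function applyKey(brush, key) {
  const next = { ...brush };
  if (key === "+") next.size = Math.min(brush.size + 4, 64);       // grow, capped
  else if (key === "-") next.size = Math.max(brush.size - 4, 2);   // shrink, floored
  else if (key.toLowerCase() === "c")
    next.colorIndex = (brush.colorIndex + 1) % PALETTE.length;     // cycle palette
  return next;
}

let brush = { size: 8, colorIndex: 0 };
brush = applyKey(brush, "+");
console.log(brush.size, PALETTE[brush.colorIndex]); // 12 "#ff0044"
```

In p5.js this would be driven from `keyPressed()`, keeping the brush state in one object so the same values can later be serialized for the Arduino.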

The program communicates with Arduino to send brush-related data such as the selected color and movement intensity. This ensures the physical environment (e.g., LED lighting) mirrors the user’s actions on the canvas.
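The outgoing message might be built per frame as below. The "R,G,B,speed" wire format is a hypothetical protocol, and movement intensity is approximated as the distance the tracked point moved since the previous frame:

```javascript
// Sketch of the p5.js-side serial message, assuming a hypothetical
// comma-separated "R,G,B,speed\n" protocol shared with the Arduino.
function buildBrushMessage(color, prev, curr) {
  // Movement intensity: pixels moved since the last frame.
  const speed = Math.round(Math.hypot(curr.x - prev.x, curr.y - prev.y));
  return `${color.r},${color.g},${color.b},${speed}\n`;
}

console.log(buildBrushMessage({ r: 0, g: 128, b: 255 }, { x: 0, y: 0 }, { x: 30, y: 40 }));
// → "0,128,255,50\n"
```

The string would then be written to the serial port (e.g., via the p5.serialport library or the Web Serial API) once per frame, so the LEDs track the brush with minimal lag.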

Interaction Flow

  1. The system starts with a webcam feed processed by ML5.js Handpose.
  2. P5.js tracks hand movements and maps positions to the canvas.
  3. Users draw by moving their hand, with brush strokes appearing at the tracked position.
  4. Real-time brush properties, such as color and size, are sent to Arduino.
  5. Arduino reflects these changes through LEDs and tactile feedback:
    • The LEDs light up with the same brush color.
    • Vibration occurs during specific gestures or interactions, providing physical confirmation of actions.
  6. The user experiences synchronized digital and physical interactions, offering a sense of control and creativity.
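The gesture-triggered vibration in step 5 could be detected as in the sketch below. The speed threshold and frame window are assumed values that would need tuning against real Handpose data:

```javascript
// Sketch of swipe detection: if the hand moves faster than a threshold for a
// few consecutive frames, emit a one-shot "VIBRATE" command for the Arduino.
function makeSwipeDetector(threshold = 40, frames = 3) {
  let streak = 0;
  return function update(speed) {
    streak = speed > threshold ? streak + 1 : 0;
    // Fire exactly once, when the streak first reaches the required length.
    return streak === frames ? "VIBRATE" : null;
  };
}

const detect = makeSwipeDetector();
[10, 50, 55, 60, 12].forEach((s) => {
  const cmd = detect(s);
  if (cmd) console.log(cmd); // fires once when the swipe is confirmed
});
```

Requiring several fast frames in a row filters out single-frame tracking jitter, so the motor only buzzes on a deliberate swipe.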

Goals and Outcomes

The project aims to deliver an interactive tool that provides:

  1. A dynamic and intuitive drawing experience using hand gestures.
  2. Real-time physical feedback through LEDs and vibration motors.
  3. A visually and physically engaging system that bridges the digital and physical worlds.

Current Code:
