Project Proposal – Stefania Petre

Hey there!

For this final, I have imagined stepping into a world where art and technology intertwine to create something truly special.

Envision an interactive space that transforms physical gestures into a cascade of colors on a digital canvas. My project is an invitation to express creativity through motion, as participants use a paintbrush to navigate a symphony of shapes and hues. This is not just an artwork but an experience that captures the essence of movement and translates it into a personalized digital masterpiece.

Arduino Part:

At the interaction’s core lies the Arduino Uno, paired with a ZX Distance and Gesture Sensor. The sensor is adept at tracking the proximity of the paintbrush and the subtle hand gestures of the artist.

  • Input: Proximity data and gesture commands from the ZX Sensor.
  • Output: Serial communication to relay the sensor data to the computer running the P5 sketch.
  • Data to P5: Real-time Z-axis data for proximity (distance of the hand or brush from the sensor) and X-axis data for lateral movement, along with gesture detections (swipes, taps).
  • From P5: Instructions may be sent back to calibrate gesture sensitivity or toggle the sensor’s active state.
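To make the serial hand-off above concrete, here is a minimal sketch of how the P5 side could parse each reading. The message format (one comma-separated line per reading, with Z, X, and a gesture field) is my assumption for now; the real protocol will be settled during integration.

```javascript
// Parse one serial line from the Arduino into a reading object.
// Assumed message format: "Z:<0-255>,X:<0-255>,G:<gesture name or NONE>"
// (placeholder format, not final).
function parseSensorLine(line) {
  const reading = { z: null, x: null, gesture: "NONE" };
  for (const part of line.trim().split(",")) {
    const [key, value] = part.split(":");
    if (key === "Z") reading.z = Number(value);
    else if (key === "X") reading.x = Number(value);
    else if (key === "G") reading.gesture = value;
  }
  return reading;
}

// Example: a swipe detected at mid-range proximity.
parseSensorLine("Z:120,X:64,G:RIGHT_SWIPE");
// → { z: 120, x: 64, gesture: "RIGHT_SWIPE" }
```

Keeping the parser a pure function like this makes it easy to test without the sensor plugged in.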

P5JS Part:

On the digital front, P5 will act as the canvas and palette, responsive and mutable. It will process the incoming data from the Arduino, converting it into an array of colors and movements on the screen.

  • Receiving Data: Interpretation of proximity and gesture data from the Arduino.
  • Processing Movements: Real-time mapping of hand movements to strokes and splashes of color, varying in intensity and spread on the digital canvas.
  • Visual Feedback: Dynamic visual changes to represent the flow and dance of the user’s movements.
  • To Arduino: Signals for adjusting the ZX Sensor settings based on real-time performance and user experience feedback.
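As a sketch of the "processing movements" step, the mapping could work like this: the closer the hand (smaller Z), the larger the stroke, while lateral X position places the stroke across the canvas and shifts the hue. The 0–240 sensor range and the specific output ranges are assumptions to be confirmed against the hardware.

```javascript
// Map a sensor reading to brush-stroke parameters.
// Assumed sensor range: X and Z both 0-240 (to be confirmed).
const SENSOR_MAX = 240;

// Linear re-map, equivalent to p5's map() for in-range values.
function remap(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

function strokeFromReading(reading, canvasWidth) {
  return {
    // Lateral hand position places the stroke across the canvas.
    x: remap(reading.x, 0, SENSOR_MAX, 0, canvasWidth),
    // The closer the hand (smaller Z), the larger the stroke.
    size: remap(reading.z, 0, SENSOR_MAX, 60, 5),
    // Hue follows lateral movement for a shifting palette.
    hue: remap(reading.x, 0, SENSOR_MAX, 0, 360),
  };
}
```

In the actual P5 draw() loop these values would feed ellipse() or a custom brush, and detected swipes could clear the canvas or switch palettes.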

So far, this is what the painting should look like:

[mock-up image of the digital painting]

Development and User Testing:

The ZX Distance and Gesture Sensor is now integrated with the Arduino. The immediate objective is establishing a smooth data flow to the P5 sketch. By user testing next week, the system should respond to hand motions with corresponding visual changes on screen. This first prototype will be key to assessing user interaction, specifically how natural and satisfying it feels to paint in midair. User testing will be documented through several methods: gathering feedback from participants, recording interactions on video, and analysing the responsiveness of the system.
