Week 14 – Final Project

Flock Together: Gesture-Controlled Boids

Concept:

This interactive project brings the beautiful patterns of flocking behavior to life through a blend of digital and physical interactions. Drawing inspiration from Craig Reynolds’ classic Boids algorithm, the simulation creates an emergent collection of entities that move with collective “intelligence” (just like birds). What makes this implementation special is how it places control directly in your hands (literally).

Using hand tracking technology, you can shape the flock by pinching different fingers against your thumb: switch between triangles, circles, squares, and stars with your left hand, or adjust the flocking parameters with your right. The experience extends beyond the screen through an Arduino connection, where turning a physical knob changes the speed of the entire flock, and buttons let you add or remove boids in groups. The result is a meditative yet playful experience that bridges the digital and physical worlds, inviting you to explore how simple rules can create complex, beautiful patterns that respond intuitively to your gestures and touch.

Implementation Details:

I put together a fun web demo of Craig Reynolds’ Boids algorithm that you can actually mess with using your hands, your mouse, or even an Arduino knob. Behind the scenes there’s a Flock class full of little boids that follow separation, alignment, and cohesion rules you can tweak on the fly. You can choose triangles, circles, squares, or stars, twist a physical potentiometer (via p5.webserial.js) to speed them up from half-speed to 5x, or press buttons that send “ADD” and “REMOVE” commands to add or remove 5 boids at a time. In the browser, I use ML5.js’s HandPose model to spot pinch gestures: pinch with your left hand to swap shapes, pinch with your right to dial in the steering forces. The video runs offscreen so everything stays buttery smooth. The canvas automatically resizes to fit your window, shows a chill gradient background that subtly reacts to your hardware tweaks, and displays connection status and boid count. If hand tracking hiccups or the Arduino disconnects, the sketch reconnects to the last-used port (or you can click to reconnect), so the flock party never has to stop 🙂

Interaction Design:

The application uses ML5’s handpose detection to implement a natural gestural interface:

  1. Left Hand Controls Shape Selection:
    • Index finger + thumb pinch: Triangle shape (shape 0)
    • Middle finger + thumb pinch: Circle shape (shape 1)
    • Ring finger + thumb pinch: Square shape (shape 2)
    • Pinky finger + thumb pinch: Star shape (shape 3)
  2. Right Hand Controls Flocking Parameters:
    • Middle finger + thumb pinch: Increases separation force (from 1.5 to 8.0)
    • Ring finger + thumb pinch: Increases cohesion force (from 1.0 to 2.0)
    • Pinky finger + thumb pinch: Increases alignment force (from 1.0 to 2.0)

The gesture system uses a distance threshold of 20 pixels between finger and thumb to trigger actions, making the interaction intuitive and responsive.

Physical Hardware Integration:

The project incorporates an Arduino with physical controls:

  1. Potentiometer Input:
    • Controls the movement speed of the boids (mapped from 0.5 to 5.0)
    • Values from Arduino (0-1023) are normalized for smooth speed control
  2. Button Controls (sent as serial commands):
    • “ADD” command: Adds 5 new boids to the simulation
    • “REMOVE” command: Removes 5 boids from the simulation

Serial Communication:

The serial communication in this project connects the web application with an Arduino microcontroller using the p5.webserial.js library. In sketch.js, the app creates a serial port (createSerial()) in the setupSerial() function and attempts to reconnect to previously used ports. When connected (indicated by the green indicator), the application receives two types of data: commands like “ADD” and “REMOVE” that control the boid population (adding or removing 5 boids at a time), and analog values from a potentiometer (0-1023) that control the speed of the boids. The potentiometer value is normalized (mapped to a range of 0.5-5) and applied to the flock’s movement speed via the flock.updateSpeed() method. Users can also manually connect to the Arduino by clicking on the screen if the connection indicator shows red. This serial link allows the physical hardware to influence the digital flocking simulation in real time.

Schematic:

Arduino Code:

// Pin definitions
const int BUTTON_ADD_PIN = 2;     // Button to add boids
const int BUTTON_REMOVE_PIN = 3;  // Button to remove boids
const int POT_PIN = A0;           // Potentiometer for boid speed (and background tint)

// Variables to keep track of button states
int buttonAddState = HIGH;         // Current state of add button (assume pulled high)
int lastButtonAddState = HIGH;     // Previous state of add button
int buttonRemoveState = HIGH;      // Current state of remove button
int lastButtonRemoveState = HIGH;  // Previous state of remove button

// Variables for debouncing
unsigned long lastDebounceTime = 0;  
const unsigned long debounceDelay = 50;  // Debounce time in milliseconds

// Variable to store pot value
int potValue = 0;
int lastPotValue = -1;  // Store last pot value to detect significant changes

void setup() {
  // Initialize serial communication at 9600 bps
  Serial.begin(9600);
  
  // Set button pins as inputs with pullup resistors
  pinMode(BUTTON_ADD_PIN, INPUT_PULLUP);
  pinMode(BUTTON_REMOVE_PIN, INPUT_PULLUP);
  
  // No need to set analog pin mode for potentiometer
}

void loop() {
  // Read button states (LOW when pressed, HIGH when released due to pullup)
  int readingAdd = digitalRead(BUTTON_ADD_PIN);
  int readingRemove = digitalRead(BUTTON_REMOVE_PIN);
  
  // Check if add button state changed
  if (readingAdd != lastButtonAddState) {
    lastDebounceTime = millis();
  }
  
  // Check if remove button state changed
  if (readingRemove != lastButtonRemoveState) {
    lastDebounceTime = millis();
  }
  
  // Wait for debounce time to pass
  if ((millis() - lastDebounceTime) > debounceDelay) {
    // Update button states if they've changed
    if (readingAdd != buttonAddState) {
      buttonAddState = readingAdd;
      
      // If button is pressed (LOW), send command to add boids
      if (buttonAddState == LOW) {
        Serial.println("ADD");
      }
    }
    
    if (readingRemove != buttonRemoveState) {
      buttonRemoveState = readingRemove;
      
      // If button is pressed (LOW), send command to remove boids
      if (buttonRemoveState == LOW) {
        Serial.println("REMOVE");
      }
    }
  }
  
  // Read potentiometer value (0-1023)
  potValue = analogRead(POT_PIN);
  
  // Only send pot value if it changed significantly (to reduce serial traffic)
  if (abs(potValue - lastPotValue) > 10) {
    Serial.println(potValue);
    lastPotValue = potValue;
  }
  
  // Update last button states
  lastButtonAddState = readingAdd;
  lastButtonRemoveState = readingRemove;
  
  // Small delay to stabilize readings
  delay(10);
}

p5.js Code:

sketch.js:

let handPose;                            // ml5 HandPose model
const baseWidth = 1440;                  // reference canvas width
const baseHeight = 900;                  // reference canvas height
const shape = 0;                         // initial shape type (0–triangle)
                                           // 0 - triangle, 1 - circle, 2 - square, 3 - stars

let flock;                               // Flock instance
let video;                               // video capture
let port;                                // serial port
let serialConnected = false;             // serial connection flag
let potentiometerValue = 0;              // analog input from Arduino

// draw a star shape at (x,y)
function star(x, y, radius1, radius2, npoints) {
    let angle = TWO_PI / npoints;
    let halfAngle = angle / 2.0;
    beginShape();
    for (let a = 0; a < TWO_PI; a += angle) {
        // outer vertex
        vertex(x + cos(a) * radius2, y + sin(a) * radius2);
        // inner vertex
        vertex(x + cos(a + halfAngle) * radius1, y + sin(a + halfAngle) * radius1);
    }
    endShape(CLOSE);
}

class Flock {
    constructor() {
        this.boids = [];
        this.numBoids = 100;
        this.shape = shape;
        this.speedMultiplier = 1;
        this.separationWeight = 1.5;
        this.cohesionWeight = 1.0;
        this.alignmentWeight = 1.0;

        // initialize boids at random positions
        for (let i = 0; i < this.numBoids; i++) {
            this.boids.push(new Boid(random(width), random(height), this.shape));
        }
    }

    run() {
        // update each boid's behavior and render
        for (let boid of this.boids) {
            boid.run(this.boids, this.separationWeight, this.cohesionWeight, this.alignmentWeight);
        }
    }

    updateShape(shape) {
        this.shape = shape;
        // apply new shape to all boids
        this.boids.forEach(boid => boid.shape = shape);
    }

    updateSpeed(multiplier) {
        // constrain speed multiplier and update maxSpeed
        this.speedMultiplier = constrain(multiplier, 0.5, 5);
        this.boids.forEach(boid => boid.maxSpeed = 3 * this.speedMultiplier);
    }

    updateSeparation(weight) {
        // adjust separation weight
        this.separationWeight = constrain(weight, 0.5, 8);
    }

    updateCohesion(weight) {
        // adjust cohesion weight
        this.cohesionWeight = constrain(weight, 0.5, 3);
    }

    updateAlignment(weight) {
        // adjust alignment weight
        this.alignmentWeight = constrain(weight, 0.5, 3);
    }

    addBoid(boid) {
        // add a new boid
        this.boids.push(boid);
    }

    removeRandomBoid() {
        // remove one random boid if any exist
        if (this.boids.length > 0) {
            this.boids.splice(floor(random(this.boids.length)), 1);
        }
    }
}

class Boid {
    constructor(x, y, shape) {
        this.position = createVector(x, y);       // current location
        this.velocity = createVector(random(-1, 1), random(-1, 1));
        this.acceleration = createVector(0, 0);
        this.shape = shape;                       // shape type
        this.maxSpeed = 3;                        // top speed
        this.maxForce = 0.05;                     // steering limit
        this.r = 5;                               // radius for drawing
    }

    run(boids, separationWeight, cohesionWeight, alignmentWeight) {
        // flocking behavior, movement, boundary wrap, and draw
        this.flock(boids, separationWeight, cohesionWeight, alignmentWeight);
        this.update();
        this.borders();
        this.render();
    }

    applyForce(force) {
        // accumulate steering force
        this.acceleration.add(force);
    }

    flock(boids, separationWeight, cohesionWeight, alignmentWeight) {
        // calculate each flocking component
        let alignment = this.align(boids).mult(alignmentWeight);
        let cohesion  = this.cohere(boids).mult(cohesionWeight);
        let separation = this.separate(boids).mult(separationWeight);

        this.applyForce(alignment);
        this.applyForce(cohesion);
        this.applyForce(separation);
    }

    update() {
        // apply acceleration, limit speed, move, reset accel
        this.velocity.add(this.acceleration);
        this.velocity.limit(this.maxSpeed);
        this.position.add(this.velocity);
        this.acceleration.mult(0);
    }

    render() {
        // draw boid with correct shape and rotation
        let theta = this.velocity.heading() + radians(90);
        push();
        translate(this.position.x, this.position.y);
        rotate(theta);
        noStroke();
        fill(127);

        if (this.shape === 1) {
            circle(0, 0, this.r * 2);
        } else if (this.shape === 2) {
            square(-this.r, -this.r, this.r * 2);
        } else if (this.shape === 3) {
            star(0, 0, this.r, this.r * 2.5, 5);
        } else {
            // default triangle
            beginShape();
            vertex(0, -this.r * 2);
            vertex(-this.r, this.r * 2);
            vertex(this.r, this.r * 2);
            endShape(CLOSE);
        }

        pop();
    }

    borders() {
        // wrap around edges
        if (this.position.x < -this.r) this.position.x = width + this.r;
        if (this.position.y < -this.r) this.position.y = height + this.r;
        if (this.position.x > width + this.r) this.position.x = -this.r;
        if (this.position.y > height + this.r) this.position.y = -this.r;
    }

    separate(boids) {
        // steer away from close neighbors
        let perception = 25;
        let steer = createVector();
        let total = 0;

        boids.forEach(other => {
            let d = p5.Vector.dist(this.position, other.position);
            if (other !== this && d < perception) {
                let diff = p5.Vector.sub(this.position, other.position).normalize().div(d);
                steer.add(diff);
                total++;
            }
        });

        if (total > 0) steer.div(total);
        if (steer.mag() > 0) {
            steer.setMag(this.maxSpeed).sub(this.velocity).limit(this.maxForce);
        }
        return steer;
    }

    align(boids) {
        // steer to match average heading
        let perception = 50;
        let sum = createVector();
        let total = 0;

        boids.forEach(other => {
            let d = p5.Vector.dist(this.position, other.position);
            if (other !== this && d < perception) {
                sum.add(other.velocity);
                total++;
            }
        });

        if (total > 0) {
            sum.div(total).setMag(this.maxSpeed).sub(this.velocity).limit(this.maxForce);
        }
        return sum;
    }

    cohere(boids) {
        // steer toward average position
        let perception = 50;
        let sum = createVector();
        let total = 0;

        boids.forEach(other => {
            let d = p5.Vector.dist(this.position, other.position);
            if (other !== this && d < perception) {
                sum.add(other.position);
                total++;
            }
        });

        if (total > 0) {
            sum.div(total).sub(this.position).setMag(this.maxSpeed).sub(this.velocity).limit(this.maxForce);
        }
        return sum;
    }
}

function preload() {
    handPose = ml5.handPose();             // load handpose model
}

function setup() {
    createCanvas(windowWidth, windowHeight);     // full-window canvas
    video = createCapture(VIDEO);                // start video
    video.size(640, 480);
    video.style('transform', 'scale(-1, 1)');    // mirror view
    video.hide();
    handPose.detectStart(video, handleHandDetection); // begin hand detection
    flock = new Flock();                         // init flock

    setupSerial();                               // init Arduino comms
}

function draw() {
    drawGradientBackground();                    // dynamic background
    flock.run();                                 // update and render boids

    // read and handle serial input
    if (port && port.available() > 0) {
        let data = port.readUntil("\n")?.trim();
        if (data === "REMOVE") {
            for (let i = 0; i < 5; i++) flock.removeRandomBoid();
        } else if (data === "ADD") {
            for (let i = 0; i < 5; i++) flock.addBoid(new Boid(random(width), random(height), flock.shape));
        } else if (data && !isNaN(parseInt(data))) {
            // ignore empty or non-numeric reads so a partial line can't set the speed to NaN
            potentiometerValue = parseInt(data);
            let norm = map(potentiometerValue, 0, 1023, 0, 1);
            flock.updateSpeed(map(norm, 0, 1, 0.5, 5));
        }
    }

    // display connection status and stats
    fill(serialConnected ? color(0,255,0,100) : color(255,0,0,100));
    noStroke();
    ellipse(25, 25, 15);
    fill(255);
    textSize(12);
    text(serialConnected ? "Arduino connected" : "Arduino disconnected", 40, 30);
    text("Boids: " + flock.boids.length, 40, 50);
    text("Potentiometer: " + potentiometerValue, 40, 70);
}

function windowResized() {
    resizeCanvas(windowWidth, windowHeight);   // adapt canvas size
}

function mouseDragged() {
    // add boid at drag position
    flock.addBoid(new Boid(mouseX, mouseY, flock.shape));
}

function mousePressed() {
    // connect to Arduino on click
    if (!serialConnected) {
        port.open('Arduino', 9600);
        serialConnected = true;
    }
}

function setupSerial() {
    port = createSerial();                     // create serial instance
    let usedPorts = usedSerialPorts();         // recall last port
    if (usedPorts.length > 0) {
        port.open(usedPorts[0], 9600);
        serialConnected = true;
    }
}

function drawGradientBackground() {
    // vertical gradient whose tint shifts subtly with the potentiometer
    let norm = map(potentiometerValue, 0, 1023, 0, 1);
    let c1 = color(0, 0, 0);
    let c2 = lerpColor(color(50, 50, 100), color(80, 60, 160), norm);
    for (let y = 0; y < height; y++) {
        stroke(lerpColor(c1, c2, map(y, 0, height, 0, 1)));
        line(0, y, width, y);
    }
}

handGestures.js:

function handleHandDetection(results) {
    detectedHands = results;
    if (detectedHands.length === 0) return;

    let leftHandData = null;
    let rightHandData = null;

    // Identify left/right hands (using handedness or X position fallback)
    detectedHands.forEach(hand => {
        if (hand.handedness === 'Left') {
            leftHandData = hand;
        } else if (hand.handedness === 'Right') {
            rightHandData = hand;
        } else if (hand.keypoints[0].x > video.width / 2) {
            leftHandData = hand;
        } else {
            rightHandData = hand;
        }
    });

    if (leftHandData) handleShapeSelection(leftHandData);
    if (rightHandData) handleFlockingParameters(rightHandData);
}

function handleShapeSelection(hand) {
    const kp = hand.keypoints;
    const d = (i) => dist(kp[i].x, kp[i].y, kp[4].x, kp[4].y);

    // Pinch gestures select shape
    if (d(8) < 20) {
        flock.updateShape(0); // index-thumb
    } else if (d(12) < 20) {
        flock.updateShape(1); // middle-thumb
    } else if (d(16) < 20) {
        flock.updateShape(2); // ring-thumb
    } else if (d(20) < 20) {
        flock.updateShape(3); // pinkie-thumb
    }
}

function handleFlockingParameters(hand) {
    const kp = hand.keypoints;
    const pinch = (i) => dist(kp[i].x, kp[i].y, kp[4].x, kp[4].y) < 20;

    // Gesture-controlled forces; reset when not pinched
    flock.updateSeparation(pinch(12) ? 8   : 1.5); // middle-thumb
    flock.updateCohesion  (pinch(16) ? 2.0 : 1.0); // ring-thumb
    flock.updateAlignment (pinch(20) ? 2.0 : 1.0); // pinkie-thumb
}

What I am proud of:

I’m particularly proud of creating a multi-modal interactive system that merges computer vision, physical computing, and algorithmic art into a cohesive experience. The hand gesture interface allows intuitive control over both the visual appearance (shapes) and behavioral parameters (separation, cohesion, alignment) of the flocking simulation, bringing the mathematical beauty of emergent systems to life through natural interactions. The integration of Arduino hardware extends the experience beyond the screen, creating a tangible connection with the digital entities. I’m especially pleased with how the interaction design supports both casual exploration and more intentional control: users can quickly grasp the basic functionality through simple pinch gestures while still having access to nuanced parameter adjustments that reveal the underlying complexity of the flocking algorithm.

Future improvements:

Looking ahead, I see several exciting opportunities to enhance this project. Implementing machine learning to adapt flocking parameters based on user behavior could create a more personalized experience. Adding audio feedback that responds to the flock’s collective movement patterns would create a richer multi-sensory experience. The visual aesthetics could be expanded with procedurally generated textures and particle effects that respond to Arduino sensor data. From a technical perspective, optimizing the flocking algorithm with spatial partitioning would allow for significantly more boids without performance issues. Finally, developing a collaborative mode where multiple users could interact with the same flock through different input devices would transform this into a shared creative experience, opening up possibilities for installation art contexts where audience members collectively influence the behavior of the digital ecosystem.
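
As a concrete starting point for the spatial-partitioning idea, here is a minimal, hypothetical sketch (not part of the current code) of a uniform grid the Flock class could consult so each boid only compares itself against nearby boids instead of the whole array; the class name and cell size are illustrative assumptions.

// Hypothetical uniform-grid index for neighbor lookups (not in the current sketch)
class SpatialGrid {
    constructor(cellSize) {
        this.cellSize = cellSize;   // roughly the largest perception radius (e.g. 50)
        this.cells = new Map();     // "col,row" -> array of boids in that cell
    }

    rebuild(boids) {
        // called once per frame before flock.run()
        this.cells.clear();
        for (let b of boids) {
            let key = floor(b.position.x / this.cellSize) + "," + floor(b.position.y / this.cellSize);
            if (!this.cells.has(key)) this.cells.set(key, []);
            this.cells.get(key).push(b);
        }
    }

    neighbors(boid) {
        // gather boids from the 3x3 block of cells around this boid
        let col = floor(boid.position.x / this.cellSize);
        let row = floor(boid.position.y / this.cellSize);
        let result = [];
        for (let dc = -1; dc <= 1; dc++) {
            for (let dr = -1; dr <= 1; dr++) {
                let bucket = this.cells.get((col + dc) + "," + (row + dr));
                if (bucket) result.push(...bucket);
            }
        }
        return result;
    }
}

Flock.run() would then rebuild the grid once per frame and pass grid.neighbors(boid) into separate(), align(), and cohere() in place of the full boids array, turning the per-frame cost from O(n²) toward roughly O(n) for evenly spread flocks.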

 

Week 13 – User Testing

Concept

For my final project, I developed an interactive flocking simulation that users can control through hand gestures captured via their webcam. The project uses computer vision and machine learning to detect and interpret hand positions, allowing users to manipulate a swarm of entities (called “boids”) by making specific hand gestures.

The core concept was to create an intuitive and embodied interaction between the user and a digital ecosystem. I was inspired by the natural behaviors of flocks of birds, schools of fish, and swarms of insects, and wanted to create a system where users could influence these behaviors through natural movements.


User Testing Insights

During user testing, I observed people interacting with the system without providing any instructions. Here’s what I discovered:
  • Most users initially waved at the screen, trying to understand how their movements affected the simulation
  • Users quickly discovered that specific hand gestures (pinching fingers) changed the shape of the swarming elements
  • Some confusion occurred about the mapping between specific gestures and shape outcomes
  • Users enjoyed creating new boids by dragging the mouse, which added an additional layer of interactivity
Areas where users got confused:
  • Initially, people weren’t sure if the system was tracking their whole body or just their hands
  • Some users attempted complex gestures that weren’t part of the system
  • The difference between the thumb-to-ring finger and thumb-to-pinkie gestures wasn’t immediately obvious
What worked well:
  • The fluid motion of the boids created an engaging visual experience
  • The responsiveness of the gesture detection felt immediate and satisfying
  • The changing shapes provided clear feedback that user input was working
  • The ability to add boids with mouse drag was intuitive

Interaction (p5 side for now):

The P5.js sketch handles the core simulation and multiple input streams:

  1. Flocking Algorithm:
    • Three steering behaviors: separation (avoidance), alignment (velocity matching), cohesion (position averaging)
    • Adjustable weights for each behavior to change flock characteristics
    • Four visual representations: triangles (default), circles, squares, and stars
  2. Hand Gesture Recognition:
    • Uses ML5.js with HandPose model for real-time hand tracking
    • Left hand controls shape selection:
      • Index finger + thumb pinch: Triangle shape
      • Middle finger + thumb pinch: Circle shape
      • Ring finger + thumb pinch: Square shape
      • Pinky finger + thumb pinch: Star shape
    • Right hand controls flocking parameters:
      • Middle finger + thumb pinch: Increases separation force
      • Ring finger + thumb pinch: Increases cohesion force
      • Pinky finger + thumb pinch: Increases alignment force
  3. Serial Communication with Arduino:
    • Receives and processes three types of data:
      • Analog potentiometer values to control speed
      • “ADD” command to add boids
      • “REMOVE” command to remove boids
    • Provides visual indicator of connection status
  4. User Interface:
    • Visual feedback showing connection status, boid count, and potentiometer value
    • Dynamic gradient background that subtly responds to potentiometer input
    • Click-to-connect functionality for Arduino communication

Technical Implementation

The project uses several key technologies:
  • p5.js for rendering and animation
  • ML5.js for hand pose detection
  • Flocking algorithm based on Craig Reynolds’ boids simulation
  • Arduino integration (work in progress) for additional physical controls

Demo:

Arduino Integration (Work in Progress)
The Arduino component of this project is currently under development. The planned functionality includes:
  • Button Controls: Multiple buttons to add/remove boids for the simulation
  • Potentiometer: A rotary control to adjust the speed of the boids in real-time, giving users tactile control over the simulation’s pace and energy

Week 12 – Finalized Idea

Concept:

My project explores the fascinating intersection between physical interaction and emergent systems through a digital flocking simulation. Inspired by Craig Reynolds’ “Boids” algorithm, I’m creating an interactive experience where users can manipulate a flock of virtual entities using both hand gestures and physical controls. The goal is to create an intuitive interface that allows people to “conduct” the movement of the flock, experiencing how simple rules create complex, mesmerizing patterns.

The simulation displays a collection of geometric shapes (triangles, circles, squares, and stars) that move according to three core flocking behaviors: separation, alignment, and cohesion. Users can influence these behaviors through hand gestures detected by a webcam and physical controls connected to an Arduino.

Arduino Integration Design

The Arduino component of my project will create a tangible interface for controlling specific aspects of the flocking simulation:

  1. Potentiometer Input:
    • Function: Controls the movement speed of all entities in the flock
    • Implementation: Analog reading from potentiometer (0-1023)
    • Communication: Raw values sent to P5 via serial communication
    • P5 Action: Values mapped to speed multiplier (0.5x to 5x normal speed)
  2. Button 1 – “Add” Button:
    • Function: Adds new entities to the simulation
    • Implementation: Digital input with debouncing
    • Communication: Sends “ADD” text command when pressed
    • P5 Action: Creates 5 new boids at random positions
  3. Button 2 – “Remove” Button:
    • Function: Removes entities from the simulation
    • Implementation: Digital input with debouncing
    • Communication: Sends “REMOVE” text command when pressed
    • P5 Action: Removes 5 random boids from the simulation

The Arduino code will continuously monitor these inputs and send the appropriate data through serial communication at 9600 baud. I plan to implement debouncing for the buttons to ensure clean signals and reliable operation.

P5.js Implementation Design

The P5.js sketch handles the core simulation and multiple input streams:

  1. Flocking Algorithm:
    • Three steering behaviors: separation (avoidance), alignment (velocity matching), cohesion (position averaging)
    • Adjustable weights for each behavior to change flock characteristics
    • Four visual representations: triangles (default), circles, squares, and stars
  2. Hand Gesture Recognition:
    • Uses ML5.js with HandPose model for real-time hand tracking
    • Left hand controls shape selection:
      • Index finger + thumb pinch: Triangle shape
      • Middle finger + thumb pinch: Circle shape
      • Ring finger + thumb pinch: Square shape
      • Pinky finger + thumb pinch: Star shape
    • Right hand controls flocking parameters:
      • Middle finger + thumb pinch: Increases separation force
      • Ring finger + thumb pinch: Increases cohesion force
      • Pinky finger + thumb pinch: Increases alignment force
  3. Serial Communication with Arduino:
    • Receives and processes three types of data:
      • Analog potentiometer values to control speed
      • “ADD” command to add boids
      • “REMOVE” command to remove boids
    • Provides visual indicator of connection status
  4. User Interface:
    • Visual feedback showing connection status, boid count, and potentiometer value
    • Dynamic gradient background that subtly responds to potentiometer input
    • Click-to-connect functionality for Arduino communication

Current Progress

So far, I’ve implemented the core flocking algorithm in P5.js and set up the hand tracking system using ML5.js. The boids respond correctly to the three steering behaviors, and I can now switch between different visual representations.

I’ve also established the serial communication framework between P5.js and Arduino using the p5.webserial.js library. The system can detect previously used serial ports and automatically reconnect when the page loads.

For the hand gesture recognition, I’ve successfully implemented the basic detection of pinch gestures between the thumb and different fingers. The system can now identify which hand is which (left vs. right) and apply different actions accordingly.

Next steps include:

  1. Finalizing the Arduino circuit with the potentiometer and two buttons
  2. Implementing proper debouncing for the buttons
  3. Refining the hand gesture detection to be more reliable
  4. Adjusting the flocking parameters for a more visually pleasing result
  5. Adding more visual feedback and possibly sound responses

The most challenging aspect so far has been getting the hand detection to work reliably, especially distinguishing between left and right hands consistently. I’m still working on improving this aspect of the project.

I believe this project has exciting potential not just as a technical demonstration, but as an exploration of how we can create intuitive interfaces for interacting with complex systems. By bridging physical controls and gesture recognition, I hope to create an engaging experience that allows users to develop an intuitive feel for how emergent behaviors arise from simple rules.

Week 11 – Final Idea Proposal

Concept:
For my final project, I want to develop an interactive system that combines emotion-aware machine learning, environmental sensing, and generative art. The core idea is to create an ambient environment that responds dynamically to human emotions and gestures. Using ML5.js, I aim to analyze real-time inputs like facial expressions (via FaceAPI) or body movements (via PoseNet) from a webcam feed, translating these into evolving light patterns using Arduino-controlled LEDs and abstract visualizations in p5.js.

Challenges:
First, I need to determine how to reliably connect ML5.js (running in a browser) with Arduino, maybe through the WebSerial API, while ensuring real-time synchronization between emotion detection, visualization, and lighting. Latency concerns me, as even minor delays could break the immersive experience. Second, mapping emotional states (e.g., “happy” or “angry”) to artistic outputs feels subjective; I’ll need to experiment with parameter mappings (color, motion, sound).

Reading Response 8 – Design Meets Disability (Week 11)

Graham Pullin’s Design Meets Disability made me reflect on how design often operates in rigid categories, medical vs. fashionable, functional vs. expressive, and how these binaries fail people with disabilities. The reading’s core idea, that assistive tools deserve the same creative energy as mainstream products, feels both radical and obvious. Why shouldn’t a wheelchair or hearing aid reflect personal style? Glasses, after all, evolved from clunky necessities to fashion statements, proving that utility and aesthetics can coexist.

But I also doubted the practicality of this vision. While stylish prosthetics or colorful hearing aids could challenge stigma, design alone can’t dismantle systemic barriers like inaccessible infrastructure or high costs. As one sample response noted, a sleek wheelchair might still be seen as “other” in a world built for stairs. And prioritizing aesthetics risks alienating those who need simplicity or affordability—like how designer eyewear can become an unaffordable luxury. Still, Pullin isn’t arguing for style over function but for expanding what’s possible. His examples, like voice synthesizers with emotional nuance, show that “inclusive design” isn’t about trends but dignity: tools should empower users to feel seen, not hidden.

What sticks with me is the tension between individuality and universality. While choice is empowering, too many options can overwhelm. Yet Pullin’s call for diversity in design, discreet or bold, high-tech or minimalist, mirrors the broader disability experience: there’s no single “right” way to navigate the world. Maybe the goal isn’t to make assistive tools “mainstream” but to normalize their presence in design conversations. After all, glasses weren’t normalized by hiding them but by celebrating their versatility. Disability isn’t a niche market – it’s a lens (pun intended) through which we can rethink design for everyone.

Week 11 – Serial Communication

Arduino and p5.js:

Zayed and Zein

Exercise 1:

Arduino Code:

void setup(){
  Serial.begin(9600);
}

void loop(){
  int pot = analogRead(A0);
  // Less responsive: smaller output range
  int xPos = map(pot, 0, 1023, 100, 300); 
  Serial.println(xPos);
  delay(100);  // longer delay = slower updates
}

Challenges:

It was difficult to make the ball move gradually; this was an issue with the p5.js sketch, so we added a smoothing factor to make the movement feel more natural.
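
Something along these lines is what the smoothing looked like on the p5.js side (a minimal sketch with illustrative names, not the exact code we submitted): the target position comes straight from the serial line and the drawn position eases toward it with lerp().

let targetX = 200;   // last x position received from the Arduino
let ballX = 200;     // x position actually drawn

function draw() {
  background(220);
  if (port.available() > 0) {                  // port opened via p5.webserial elsewhere
    let line = port.readUntil("\n").trim();
    if (line.length > 0) targetX = int(line);
  }
  // smoothing factor: 0.1 means the ball covers ~10% of the remaining distance each frame
  ballX = lerp(ballX, targetX, 0.1);
  circle(ballX, height / 2, 40);
}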

 

Exercise 2:

Arduino Code:

// Arduino: LED brightness via Serial input
const int ledPin = 6; 
const unsigned long BAUD = 9600;

void setup() {
  Serial.begin(BAUD);
  while (!Serial) ;        // wait for Serial Monitor
  pinMode(ledPin, OUTPUT);
  Serial.println("LED Brightness Control");
  Serial.println("Send a number 0–255, then <Enter>:");
}

void loop() {
  // only proceed if we have a full line
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    line.trim();           // remove whitespace

    if (line.length() > 0) {
      int b = line.toInt();  
      b = constrain(b, 0, 255);

      analogWrite(ledPin, b);
      Serial.print("▶ Brightness set to ");
      Serial.println(b);
    }
  }
}

// Fade smoothly to a target brightness (helper; not called in loop() above)
// Typical values: speed = 5, delayMs = 10
void fadeTo(int target, int speed, int delayMs) {
  static int curr = 0;  // brightness last written by this helper

  while (curr != target) {
    if (target > curr) {
      curr = min(curr + speed, target);   // step up without overshooting
    } else {
      curr = max(curr - speed, target);   // step down without overshooting
    }
    analogWrite(ledPin, curr);
    delay(delayMs);
  }
}

Challenges:

For this exercise the main challenge was making the gradient slider animation feel intuitive. We also spent a lot of time debating whether a rainbow slider was cool or not. We decided that it was.
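
The p5.js side of this exercise isn’t reproduced here, but a stripped-down version of what the slider does could look like the sketch below (setup names are assumptions and the rainbow styling is omitted): it simply writes the slider value as a line of text, which the Arduino reads with readStringUntil('\n').

let port, slider;
let lastSent = -1;

function setup() {
  createCanvas(400, 100);
  port = createSerial();              // p5.webserial
  slider = createSlider(0, 255, 0);
}

function mousePressed() {
  if (!port.opened()) port.open(9600);  // WebSerial needs a user gesture to pick the port
}

function draw() {
  background(230);
  let b = slider.value();
  if (port.opened() && b !== lastSent) {
    port.write(b + "\n");             // one brightness value per line
    lastSent = b;
  }
  text("Brightness: " + b, 10, 50);
}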

Exercise 3:

For this exercise we decided to build on the example provided and improve the physics of the wind as well as the animation of the ball itself. To keep track of the sensor values and make sure we receive consistent results, we included variables to track the potentiometer and the bounce state (1 sets the LED high, 0 sets it low); the mass is just an added feature.

Challenges:

We faced plenty of issues, including making the wind seem more realistic and making the interaction smoother. Our first approach to this challenge was to implement debouncing to discard events that occurred too close together during the runtime interval. We also had to be particularly careful about how often the p5.js sketch asks for a potentiometer reading, relative to how often the frames displaying the visuals are drawn.

Arduino Code:

const int LED_PIN = 9;    // LED pin
const int P_PIN = A0;     // Potentiometer pin

void setup() {
  Serial.begin(9600);
  pinMode(LED_PIN, OUTPUT);
  digitalWrite(LED_PIN, LOW); // Initial LED state

  // Handshake: Wait for p5.js to send a start signal
  while (Serial.available() <= 0) {
    Serial.println("INIT"); // Send "INIT" until p5.js responds
    delay(100);             // Avoid flooding
  }
}

void loop() {
  // Check for incoming serial data
  if (Serial.available() > 0) {
    char incoming = Serial.read(); // Read a single character
    if (incoming == 'B') {        // p5.js sends 'B' followed by 0 or 1
      while (Serial.available() <= 0); // Wait for the value
      int bounce = Serial.read() - '0'; // Convert char '0' or '1' to int
      digitalWrite(LED_PIN, bounce);    // Set LED (0 = LOW, 1 = HIGH)
    }
    else if (incoming == 'R') {       // p5.js requests potentiometer reading
      int potValue = analogRead(P_PIN); // Read potentiometer (0-1023)
      Serial.println(potValue);         // Send value
    }
    // Clear any remaining buffer (e.g., newlines)
    while (Serial.available() > 0) {
      Serial.read();
    }
  }
}
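
For context, the p5.js counterpart (not shown in this post) speaks the same single-character protocol; a rough fragment of how it could look is below. The names (port, wind, bounceChanged) are placeholders borrowed from the gravity-wind example this exercise builds on, not the exact sketch.

function pollArduino() {
  if (port.available() > 0) {
    let line = port.readUntil("\n").trim();
    if (line === "INIT") {
      port.write("R");                          // any byte ends the handshake; also requests a reading
    } else if (line.length > 0) {
      wind.x = map(int(line), 0, 1023, -1, 1);  // potentiometer value drives the wind force
    }
  }
  if (frameCount % 10 === 0) {
    port.write("R");                            // don't poll faster than the Arduino can answer
  }
}

function bounceChanged(isBouncing) {
  port.write(isBouncing ? "B1" : "B0");         // LED high on bounce, low otherwise
}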

 

ONE VIDEO FOR ALL 3 EXERCISES IN RESPECTIVE ORDER

 

Reading Response 7 – Future of Interaction Design (Week 10)

Reading 1: A Brief Rant on the Future of Interaction Design
Bret Victor’s “A Brief Rant on the Future of Interaction Design” challenges the prevailing notion that touchscreens represent the pinnacle of user interface design. He critiques the “Pictures Under Glass” paradigm, highlighting how it neglects the tactile and manipulative capabilities of human hands. Victor emphasizes that our hands are not just for pointing and tapping but are essential tools for feeling and manipulating our environment. He argues that current interfaces fail to leverage these capabilities, leading to a diminished user experience. This perspective prompts a reevaluation of how we design technology, suggesting that future interfaces should engage more deeply with our physical senses to create more intuitive and effective tools.

Reading 2: Follow-Up
In the follow-up responses to his rant, Victor addresses common criticisms and clarifies his intentions. He acknowledges that his piece was meant to highlight a problem rather than provide a solution, aiming to inspire further research into more tactile and dynamic interfaces. Victor compares the current state of technology to early black-and-white photography—revolutionary at the time but lacking in certain dimensions. He encourages exploration into areas like deformable materials and haptic holography, emphasizing the need for interfaces that can be seen, felt, and manipulated. I feel like this response reinforces the idea that while current technologies have their merits, there is significant room for innovation that more fully engages our human capabilities.

Week 10 – Musical Instrument

Concept:

We decided to use two digital switches (push buttons) to play two different notes, with the LDR effectively acting as an analogue volume adjustment mechanism. The video demonstrates how the LDR changes the volume: when less light reaches the LDR, only a very faint sound is produced.

Demo (Circuit):

Demo (Video):

Arduino Code:

// Define pins for the buttons and the speaker
int btnOnePin = 2;
int btnTwoPin = 3;
int speakerPin = 10;

void setup() {
  // Initialize both button pins as inputs with built-in pull-up resistors
  pinMode(btnOnePin, INPUT_PULLUP);
  pinMode(btnTwoPin, INPUT_PULLUP);
  
  // Configure the speaker pin as an output
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  // Check if the first button is pressed
  if (digitalRead(btnOnePin) == LOW) {
    tone(speakerPin, 262);  // Play a tone at 262 Hz
  }
  // Check if the second button is pressed
  else if (digitalRead(btnTwoPin) == LOW) {
    tone(speakerPin, 530);  // Play a tone at 530 Hz
  }
  // No button is pressed
  else {
    noTone(speakerPin);     // Turn off the speaker
  }
}

Challenges:

The initial concept started out with a Light Dependent Resistor and the Piezo speaker/buzzer. We faced issues because the readings from the LDR did not behave as expected, the sound produced was off, and the change in the music was not distinct enough.

We also faced challenges with the programming, as the noise production was inconsistent. We fixed this by adjusting the mapping of the notes to produce more distinct frequencies for each push button (red vs. yellow), at 262 Hz and 530 Hz respectively.

Done by: Zayed Alsuwaidi (za2256) and Zein Mukhanov (zm2199)

Week 9 – Analog Input & Output

Concept:

For my Arduino Sensor LED Control, I wanted to combine analog and digital inputs in a single interactive system. A potentiometer (analog) not only drives the brightness of one LED but also “unlocks” a digital LED toggle. Only when the potentiometer is turned into a specific window (300–700) does pressing the pushbutton flip the digital LED on or off. To make the fade more dramatic, the potentiometer value is squared before mapping to PWM, so small turns near the high end feel much brighter.

Setup:

  • Potentiometer: Reads an analog value (0–1023) on pin A0. Its value controls the analog LED’s brightness and gates the button’s functionality.
  • Pushbutton: Connected to pin 2 with a pull-down resistor. When pressed, it toggles the digital LED only if the potentiometer value is between 300 and 700.
  • Digital LED: On pin 10, turns on or off based on the button toggle (when enabled by the potentiometer).
  • Analog LED: On PWM pin 9, its brightness is set by a squared mapping of the potentiometer value, creating a non-linear fade effect (brighter at higher values).

Creative Element: The requirement for the potentiometer to be in a specific range to enable the button adds an interactive challenge, making the system feel like a “lock” that must be “unlocked” by dialing in the right potentiometer position.

Arduino Code:

const int potPin = A0;        // Potentiometer analog input
const int buttonPin = 2;      // Pushbutton digital input
const int digitalLedPin = 10;  // Digital LED output
const int analogLedPin = 9;  // Analog LED (PWM) output

int buttonState = 0;          // Current button state
int lastButtonState = 0;      // Previous button state
bool ledState = false;        // Digital LED state (on/off)

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(digitalLedPin, OUTPUT);
  pinMode(analogLedPin, OUTPUT);
}

void loop() {
  // Read sensors
  int potValue = analogRead(potPin);  // 0–1023
  buttonState = digitalRead(buttonPin);

  // Control digital LED
  if (buttonState == HIGH && lastButtonState == LOW) {
    // Button pressed, check if potValue is in range (300–700)
    if (potValue >= 300 && potValue <= 700) {
      ledState = !ledState;  // Toggle LED state
      digitalWrite(digitalLedPin, ledState ? HIGH : LOW);
    }
  }
  lastButtonState = buttonState;  // Update button state

  // Control analog LED (non-linear brightness)
  float normalized = potValue / 1023.0;  // Normalize to 0–1
  int brightness = 255 * (normalized * normalized);  // Square for non-linear effect
  analogWrite(analogLedPin, brightness);
}

Schematic:

Demo:

Challenges

  • Range calibration: Finding the sweet-spot (300–700) took a few tests—too narrow and the button felt unresponsive; too wide and the “lock” felt trivial.

  • Button bounce: Without debouncing, sometimes a single press registered multiple toggles. I ended up adding a small delay in code to ignore rapid changes.

  • Non-linear fade tweaking: The square mapping made lower values almost invisible. I had to play with exponent and mapping constants so the fade curve felt smooth.

 

Reading Response 6 – Physical Computing & Interactivity (Week 9)

Reading 1: Physical Computing’s Greatest Hits (and misses)
When I read through the list of projects, I felt a mix of excitement and a bit of doubt. I love seeing new ideas for sensors, LEDs, and simple mechanical parts. At the same time, I wondered how many of these projects really make people stop and play. The drum glove idea made me smile, but I also thought: will users want to keep playing after the first try? The remote hug example hit me hard. It showed how a simple sensor could carry emotion across a room. I pictured my own version, maybe using a small motor to squeeze a pillow. That felt more personal than flashing lights. I also liked the “fields of grass” sensors. It made me think about how we touch the world around us. I realized that each project is more than a gadget. It is a way to connect our bodies with code. This reading reminded me to pick one idea, try it out, and see how people react. I don’t need the flashiest tech. I just need to build something that invites a smile or a question. And then I need to watch and learn from their moves.

Reading 2: Making Interactive Art: Set the Stage, Then Shut Up and Listen
When I read this piece, I realized that sometimes I talk too much about my projects. I write long instructions or guides. But here, the author says to set things up and then be quiet. That idea felt strange at first. I’m used to explaining every detail. I worry people will miss the point if I don’t. But I also saw how much freedom it gives to users. They can try things their own way. I thought about a demo I did for my capstone last month. I spent five minutes showing how to use the application I have built. After that, no one tried new ideas. Next time, I will just ask the users to open the app and try it out on their own, no directions. I will step back and watch. I will take notes on what people try first and what they ignore. I will let them find their own path. I feel nervous but also curious. This reading taught me that good design is a conversation, not a lecture. If I can listen, I will learn more than if I just tell. I feel excited to see what I will learn. I think this small change will make my work more alive. I’m ready to try this and see what surprises come up.