
Final Project – together we grow

Concept

For my final project, I created an interactive artwork that celebrates friendship and togetherness. The main idea is that a tree branch grows in from each side of the screen towards the other; once they connect, a flower blooms. Consistent with the theme of togetherness, the project requires two people to hold hands, and the strength of their grip influences how fast the branches grow: a force-sensitive resistor (FSR) is sewn into a glove, which one person puts on before holding the other person’s hand. With their free hands, each person places a finger on a temperature sensor, and the reading from each sensor is mapped to a color (darkish pink = warm, light yellow = cool); the two colors are lerped together to generate a live color gradient for the background.

Video demonstration

IM showcase user interaction

Implementation

I wanted the user interface to be minimal but pretty, so the homepage, the instructions page and the main sketch are all designed with minimal text and a soft color palette.

homepage

instructions page

main sketch (after the branches have connected together)

At the end of the sketch, after the branches connect, a quote about friendship and togetherness is randomly chosen from a text file and appears at the bottom of the screen.

initial hand sketches

Interaction design

As the project is about friendship and togetherness, the interaction is designed to require two people’s participation. I initially used alligator clips to connect the temperature sensors, but they felt frail against the hands and disconnected easily, so I soldered wires onto both temperature sensors and taped them to a piece of cardboard to keep them in place. As for the FSR, my very first idea was to place it inside a soft, thin object that the users could put between their palms before holding hands. I could not think of a suitable and meaningful object for the FSR, and I realized it would be strange to ask people to put a random object between their clasped hands. The best option seemed to be a glove with the FSR attached inside it. I tested different ways of holding hands and found that the point of strongest contact when holding another person’s hand is the purple area in the picture below, so that is where I sewed the FSR in place with thread. During the IM showcase, some people held hands very faintly, so I had to jump in and demonstrate how they should hold hands to ensure that force was being applied to the FSR. It made me realize that an image of a proper hand hold on the instructions page might have helped.

  • Arduino code
    int fsrPin = A0;
    int tempSensorPin = A1;
    int tempSensor2Pin = A2;
    int fsrReading;    // variable to store the FSR reading
    float fsrVoltage;     // variable to store the voltage value
    float tempSensorVoltage;
    float tempSensorReading;
    float temperature;
    float tempSensor2Voltage;
    float tempSensor2Reading;
    float temperature2;
    
    void setup() {
      // Start serial communication so we can send data
      // over the USB connection to our p5js sketch
      Serial.begin(9600);
    
      // We'll use the builtin LED as a status output.
      // We can't use the serial monitor since the serial connection is
      // used to communicate to p5js and only one application on the computer
      // can use a serial port at once.
      pinMode(LED_BUILTIN, OUTPUT);
    }
    
    void loop() {
      // read the FSR value (0–1023)
      fsrReading = analogRead(fsrPin); 
      fsrVoltage = fsrReading * (5.0 / 1023.0);
    
      tempSensorReading = analogRead(tempSensorPin);
      // convert the analog reading to a voltage (0–5V)
      tempSensorVoltage = tempSensorReading * (5.0 / 1023.0);
      temperature = (tempSensorVoltage - 0.5) * 100.0;
    
      tempSensor2Reading = analogRead(tempSensor2Pin);
      // convert the analog reading to a voltage (0–5V)
      tempSensor2Voltage = tempSensor2Reading * (5.0 / 1023.0);
      temperature2 = (tempSensor2Voltage - 0.5) * 100.0;
    
      // send the FSR reading and 2 temperatures to p5.js
      Serial.print(fsrReading);
      Serial.print(","); 
      Serial.print(temperature); 
      Serial.print(","); 
      Serial.println(temperature2); 
    
      // blink the built-in LED to show activity
      digitalWrite(LED_BUILTIN, HIGH);
      delay(100);
      digitalWrite(LED_BUILTIN, LOW);
    
      delay(200);
    }
  • Circuit
  • p5.js code

Link to fullscreen sketch

// interface variables
let homepageImg, backButtonImg, infoButtonImg, infoButtonWhiteImg, gloveImg, sensorImg;
let ambienceSound, endingSound;
let backButtonX = 20;
let backButtonY = 20;
let backButtonW = 20;
let backButtonH = 20;
let infoButtonX = 920;
let infoButtonY = 20;
let infoButtonW = 20;
let infoButtonH = 20;
let fontNunito, fontNunitoLight;
// transparency variable for the fading start text
let alphaValue = 0;
// variable used for the fade effect of the start text
let fadeDirection = 1;
let c;
let startMsgColor = "#c27e85";
let titleColor = "#745248";
let instructionsBoxColor = "#738059";
let backgroundColor = "#F0EBE5";
let vizOver = true;
let vizStarted = false;
let instructionsOn = false;
let endingSoundPlayed = false;
let ambienceSoundPlaying = false;
let homePage;
let viz;

// illustration variables
let branchGrowth = 0.1;
let maxGrowth;
let leafGrowth;
// for tracking if the branches have connected at the center
let connected = false;
// for storing the coordinates of left branch points
let leftPoints = [];
// for storing the coordinates of right branch points
let rightPoints = [];
let leafImg;
let leafImgFlipped;
let flowerImg;
let heartImg;
let temperature_1;
let temperature_2;
let quotes = [];
let chosenQuote = "";

// arduino variables
let fsrValue = 0;
let temperature = 0;
let temperature2 = 0;
let arduinoConnected = false;

// map the sensor readings to a different range
function mapVals(sensorType, val) {
  // map temperature to colors
  if (sensorType == "temperature") {
    // lowest reading on the temp sensor was 18
    // highest reading on the temp sensor was 29
    // so these are mapped to a number on a wider scale
    // to ensure that the background color changes are more visible/stronger
    return map(val, 18, 30, 1, 38);
  // map fsr reading to branchGrowth
  } else if (sensorType == "fsr") {
    // bound the branch growth to maximum 0.5 to make sure that
    // the visualization doesn't end too quickly
    return map(val, 0, 1023, 0.1, 0.5);
  }
}

function preload() {
  homepageImg = loadImage("/assets/homepage.png");
  backButtonImg = loadImage("/assets/left.png");
  infoButtonImg = loadImage("/assets/information.png");
  infoButtonWhiteImg = loadImage("/assets/information_white.png");
  fontNunito = loadFont("/fonts/Nunito-Medium.ttf");
  fontNunitoLight = loadFont("/fonts/Nunito-Light.ttf");
  leafImg = loadImage("/assets/nature.png");
  leafImgFlipped = loadImage("/assets/nature_flipped.png");
  flowerImg = loadImage("/assets/flower.png");
  gloveImg = loadImage("/assets/glove.png");
  sensorImg = loadImage("/assets/sensor.png");
  ambienceSound = loadSound("/sounds/ambience_long.mp3");
  endingSound = loadSound("/sounds/ending.mp3");
  quotes = loadStrings("/assets/quotes.txt");
  heartImg = loadImage("/assets/heart.png");
}

function setup() {
  createCanvas(960, 540);
  background(0);

  homePage = new HomePage();
  viz = new Visualization();

  // each branch can only grow for a length that's half of the screen
  maxGrowth = width / 2;
  chosenQuote = random(quotes);
}

class HomePage {
  constructor() {
  }

  display() {
    image(homepageImg, 0, 0, width, height);
    image(infoButtonImg, infoButtonX, infoButtonY, infoButtonW, infoButtonH);
    // fade effect on the "Press Enter to start" text
    // by varying the transparency
    alphaValue += fadeDirection * 3;
    if (alphaValue >= 255 || alphaValue <= 0) {
      fadeDirection *= -1;
    }

    push();
    textAlign(CENTER);
    textFont(fontNunito);
    textSize(16);
    c = color(startMsgColor);
    c.setAlpha(alphaValue);
    fill(c);
    // ask user to select serial port first
    if (!arduinoConnected) {
        text("press SPACE to select serial port", width / 2, 500);
    // once serial port selected, show a different text
    } else {
        text("press ENTER to start", width / 2, 500);
    }
    pop();
  }

  showInstructions() {
    this.display();
    let c = color(instructionsBoxColor);
    c.setAlpha(245);
    fill(c);
    noStroke();
    rect(0, 0, width, height);
    image(
      infoButtonWhiteImg,
      infoButtonX,
      infoButtonY,
      infoButtonW,
      infoButtonH
    );
    
    // instructions text
    push();
    textAlign(CENTER);
    textFont(fontNunito);
    textSize(16);
    fill(color(backgroundColor));
    text("h o w    i t    w o r k s", width / 2, 100);
    text(
      "1. have a friend next to you and stand facing each other",
      width / 2,
      200
    );
    text("2. put a finger each on the temperature sensor", width / 2, 250);
    text(
      "3. with the free hands: one person wears the glove and holds the friend's hand",
      width / 2,
      300
    );
    text("4. watch the tree branches bloom!", width / 2, 350);
    pop();
  }
}

class Visualization {
  constructor() {
  }
  
  start() {
    // play the ambience sound if it's not playing
    if (!ambienceSoundPlaying) {
      ambienceSound.play();
      // use a boolean variable to ensure it's only played once
      ambienceSoundPlaying = true;
    }
    
    // stop the sound if visualization is over
    if (connected || vizOver) {
      ambienceSound.stop();
    }
        
    noStroke();
    
    // map the temp sensor readings
    temperature_1 = mapVals("temperature", temperature);
    temperature_2 = mapVals("temperature", temperature2);
    // map each person's temperature to colors
    let color1 = mapTempToColor(temperature_1);  // color for person 1
    let color2 = mapTempToColor(temperature_2);  // color for person 2

    // smooth gradient blending between the two temperatures
    for (let x = 0; x < width; x++) {
      // lerp factor based on x position (left to right transition)
      let lerpFactor = map(x, 0, width, 0, 0.5);

      // blend between color1 and color2
      let col = lerpColor(color1, color2, lerpFactor);

      // apply the color vertically to the canvas
      stroke(col);
      line(x, 0, x, height);  // vertical lines for smooth gradient
    }

    // go back to homepage button
    image(backButtonImg, backButtonX, backButtonY, backButtonW, backButtonH);
    
    // map fsr reading
    let growthSpeed = mapVals("fsr", fsrValue);
    // use the mapped fsr value to grow the branch
    branchGrowth += growthSpeed;
    
    if (branchGrowth < maxGrowth) {
      // introduce noise to left branch
      let noiseOffsetLeftBranch = branchGrowth * 0.01;
      // introduce noise to right branch
      // slightly change the noise offset so that branches don't grow symmetrically and instead looks different
      let noiseOffsetRightBranch = branchGrowth * 0.01 + 1000;
      
      // generate x, y coordinates for points for both branches
      let yLeft = height / 2 + map(noise(noiseOffsetLeftBranch), 0, 1, -40, 40);
      let yRight =
        height / 2 + map(noise(noiseOffsetRightBranch), 0, 1, -40, 40);
      let xLeft = branchGrowth;
      let xRight = width - branchGrowth;
      
      // once the branches are nearing the center, reduce the noise and make them remain near the horizontal middle of the canvas to ensure they connect everytime
      let easingZone = 30;
      if (branchGrowth > maxGrowth - easingZone) {
        let amt = map(branchGrowth, maxGrowth - easingZone, maxGrowth, 0, 1);
        yLeft = lerp(yLeft, height / 2, amt);
        yRight = lerp(yRight, height / 2, amt);
      }
      
      // add the points to the array
      leftPoints.push({
        pos: createVector(xLeft, yLeft),
        // randomly decide if the leaf will be growing on the top or on the bottom
        flip: int(random(2))
});      
      rightPoints.push({
        pos: createVector(xRight, yRight),
        flip: int(random(2))
});     
    } else if (!connected) {
      connected = true;
      
      // play the ending sound if the branches have connected
      if (!endingSoundPlayed) {
        endingSound.play();
        endingSoundPlayed = true;
      }
    }

    // draw branches
    push();
    strokeWeight(3);
    stroke(110, 70, 40); // brown
    noFill();

    beginShape();
    // draw the left branch
    for (let ptObj of leftPoints) {
      let pt = ptObj.pos;
      vertex(pt.x, pt.y);
      
      // for every 75th x coordinate, draw an image of a leaf
      // position of leaf is stored in the flip attribute
      if (int(pt.x) % 75 == 0) {
        if (ptObj.flip === 0) {
          image(leafImgFlipped, pt.x, pt.y - 2, 40, 40);
        } else {
          image(leafImg, pt.x, pt.y - 37, 40, 40);
        }
      }
    }
    endShape();
    
    // draw the right branch
    beginShape();
    for (let ptObj of rightPoints) {
      let pt = ptObj.pos;
      vertex(pt.x, pt.y);

      if (int(pt.x) % 75 == 0) {
        if (ptObj.flip === 0) {
          image(leafImgFlipped, pt.x, pt.y - 2, 40, 40);
        } else {
          image(leafImg, pt.x, pt.y - 37, 40, 40);
        }
      }
    }
    endShape();
    pop();
    
    let leftEnd = leftPoints[leftPoints.length - 1].pos;
    let rightEnd = rightPoints[rightPoints.length - 1].pos;

    let d = dist(leftEnd.x, leftEnd.y, rightEnd.x, rightEnd.y);
    
    // determine if the branches have connected by finding the distance between the 2 branch end points and checking if it's less than 5
    if (d < 5) {
      push();
      rectMode(CENTER);
      image(flowerImg, leftEnd.x - 15, (leftEnd.y - 30), 80, 80);
      
      // console.log(chosenQuote);
      
      // show the quote
      if (chosenQuote !== "") {
        textAlign(CENTER, CENTER);
        textFont(fontNunito);
        textSize(20);
        fill(titleColor);
        text(chosenQuote, width / 2, height - 80);
      }
      
      // heart image at the bottom of the quote
      image(heartImg, width / 2, height - 60, 40, 40);
      pop();
    }
  }
}

function draw() {
  if (instructionsOn) {
    homePage.showInstructions();
  } else if (!vizStarted || vizOver) {
    homePage.display();
  } else {
    viz.start();
  }
  // print(mouseX + "," + mouseY);
}

// map temperature to color which will be used to control the color of the background
function mapTempToColor(temp) {
  let coolColor = color("#F0EBE5");
  let warmColor = color("#dea0a6"); 
  
  // lerp the light yellow color with the dark pink color based on the temperature
  return lerpColor(coolColor, warmColor, map(temp, 1, 38, 0, 1));
}

function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 3) {
      fsrValue = int(fromArduino[0]);
      temperature = int(fromArduino[1]);
      temperature2 = int(fromArduino[2]);
    }
    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    // let sendToArduino = left + "," + right + "\n";
    // writeSerial(sendToArduino);
  }
}

function keyPressed() {
  // if the enter key is pressed, reset state and start the visualization
  if (keyCode === ENTER) {
    if (vizOver && !instructionsOn) {
      // reset visualization state
      branchGrowth = 20;
      leftPoints = [];
      rightPoints = [];
      connected = false;
      ambienceSoundPlaying = false;
      endingSoundPlayed = false;
      chosenQuote = random(quotes);
      vizStarted = true;
      vizOver = false;
    }
  }
  
  // if space is pressed, setup serial communication
  if (key == " ") {
    setUpSerial();
    arduinoConnected = true;
  }
}

function mousePressed() {
  // if mouse is pressed on the < button, go back to the homepage
  if (
    mouseX >= backButtonX &&
    mouseX <= backButtonX + backButtonW &&
    mouseY >= backButtonY &&
    mouseY <= backButtonY + backButtonH
  ) {
    vizOver = true;
    ambienceSound.stop();
    vizStarted = false; // Reset visualization state when returning to homepage
    console.log("back button pressed");
  // if mouse is pressed on the information button, show the instructions page
  } else if (
    mouseX >= infoButtonX &&
    mouseX <= infoButtonX + infoButtonW &&
    mouseY >= infoButtonY &&
    mouseY <= infoButtonY + infoButtonH
  ) {
  ) {
    instructionsOn = !instructionsOn;
    console.log("info button pressed");
  }
}
  • Communication between Arduino and p5.js

Three readings from the Arduino (two temperature sensors and one FSR) are sent to p5.js over serial as a comma-separated “X,Y,Z” line. The p5.js code then maps these values to background color and growth speed respectively. I could not think of meaningful data to send from p5.js to the Arduino that wouldn’t just be data for the sake of having two-way communication. I considered using an LCD to show text (names of the users, reading values, etc.) or a buzzer to play music, but as the professor mentioned in our chat on the initial evaluation day, these things can be done on the screen as well and don’t specifically need to be implemented in hardware. Thus, I decided to keep using the Arduino as input only, and I think it worked well in the end.

Reflection and Proud Moments

I like how the p5.js interface and the hardware interface ended up looking from a visual standpoint. I put effort into choosing the visual assets, fonts and colors used across the sketch to create a coherent visual experience that is simple yet pleasing to look at, and I was very happy when people complimented it! I decorated the cardboard casing covering the Arduino to match the artwork by drawing leaves and flowers, and I think that turned out well too.

For the p5.js artwork, it was tricky to get the two branches to connect every time. The way the branch drawing works is that a variable called branchGrowth increases every frame, at a speed determined by the FSR reading. For each new value of branchGrowth, the code calculates a new point for the left and right branches. To ensure the branches look organic rather than like straight lines, Perlin noise (noise(…)) is used to create smooth vertical variation. Since each point is generated somewhat randomly, the two branches always ended up at different heights unless the last segment of each branch was smoothed out and kept near the horizontal middle. To solve this, I eased the y values of both branches toward the center as they approached each other, which keeps the growth looking organic while guaranteeing that the branches connect every time.
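
The core of that fix is only a few lines; below is a minimal, self-contained version of the easing logic, distilled from the full sketch above with simplified names, for clarity:

// Distilled from the full sketch: compute the next y position of a branch tip.
// noise() provides the organic wobble; within the last `easingZone` pixels the
// y value is lerped toward height/2 so the two branches always meet.
function nextBranchY(branchGrowth, maxGrowth, noiseOffset) {
  const easingZone = 30;
  let y = height / 2 + map(noise(noiseOffset), 0, 1, -40, 40);
  if (branchGrowth > maxGrowth - easingZone) {
    // amt goes from 0 at the start of the zone to 1 at the center
    let amt = map(branchGrowth, maxGrowth - easingZone, maxGrowth, 0, 1);
    y = lerp(y, height / 2, amt);
  }
  return y;
}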

Overall, I had a lot of fun working on this project, and it was a genuinely rewarding experience watching people’s reactions and seeing how they interacted with the piece (including finding some loopholes!). I am glad I stuck with the interactive artwork idea and the theme of togetherness, because I got to see some cute interactions between friends who tried out my project <3

Future Improvements

If I were to develop the project further, I would explore adding more physical components (beyond sensors) that people could interact with; one example could be a physical flower that blooms, via a servo motor, when the branches meet. I would also improve the instructions page, because people didn’t easily notice the info button, and even when they did see it and pressed it, they tended to take a quick glance and head straight into starting the sketch.

Final Project

Concept & Goals

Inspiration: Give a plant a “voice” through digital animation and sound, fostering empathy and care.

The idea was to monitor ambient light and human touch on the plant, translate those signals into a friendly digital avatar’s mood, color-changing lamp, moving leaf, and background music.

    • Goals:

      1. Awareness of plant wellbeing via playful tech.

      2. Interaction through capacitive touch (DIY sensor) and light sensing.

      3. Empathy by giving the plant a way to “talk back.”

Video

Avatar

Setup

Hardware Overview

1. Capacitive Touch Sensor (DIY)

    • Pins: D4 (send) → 1 MΩ resistor → D2 (receive + foil/copper tape).

    • Library: Paul Badger’s CapacitiveSensor, downloaded for the Arduino sketch.

    • Assembly:

      1. Connect a 1 MΩ resistor between pin 4 and pin 2.

      2. Attach copper tape to the receiving leg (pin 2) and wrap gently around the plant’s leaves.

      3. In code: CapacitiveSensor capSensor(4, 2);

2. Photoresistor (Ambient Light)

    • Pins: LDR → +5 V; 10 kΩ → GND; junction → A0.

    • Function: Reads 0–1023, mapped to 0–255 to control lamp intensity.

3. Push-Button (Music Control)

    • Pins: Button COM → D7, NO → GND (using INPUT_PULLUP).

4. Mood LEDs

    • Pins:

      • Green LED1: D12 → 330 Ω → LED → GND

      • Red LED2: D13 → 330 Ω → LED → GND

    • Behavior:

      • Red LED on when touch > high threshold (indicates that the plant does not like the touch).

      • Green LED on when touch < low threshold (the plant is calm and likes the touch).

Arduino Code

#include <CapacitiveSensor.h>

// Capacitive sensor: send→ 4, receive→ 2
CapacitiveSensor capSensor(4, 2);

// Mood LEDs
const int ledPin   = 12;
const int ledPin2  = 13;

// Photoresistor
const int photoPin = A0;

// Push-button + button LED
const int buttonPin    = 7;  // COM→7, NO→GND
const int buttonLedPin = 8;  // LED mirroring the music state (pin assumed; not stated in the write-up)


// Hysteresis thresholds
const long thresholdHigh = 40;
const long thresholdLow  = 20;

// Debounce
const unsigned long debounceDelay = 50;
unsigned long lastDebounceTime    = 0;
int           lastButtonReading   = HIGH;

// State trackers
bool musicOn = false;
bool led1On  = false;

void setup() {
  Serial.begin(9600);
  delay(100);

  // Let p5.js know we start OFF
  Serial.println("MUSIC_OFF");

  // Mood LEDs
  pinMode(ledPin,   OUTPUT);
  pinMode(ledPin2,  OUTPUT);

  // Capacitive sensor raw
  capSensor.set_CS_AutocaL_Millis(0);

  // Button LED off
  pinMode(buttonLedPin, OUTPUT);
  digitalWrite(buttonLedPin, LOW);

  // Push-button with pull-up
  pinMode(buttonPin, INPUT_PULLUP);
}

void loop() {
  // Button toggle (only prints MUSIC_ON / MUSIC_OFF) 
  int reading = digitalRead(buttonPin);
  if (reading != lastButtonReading) {
    lastDebounceTime = millis();
  }
  if (millis() - lastDebounceTime > debounceDelay) {
    static int buttonState = HIGH;
    if (reading != buttonState) {
      buttonState = reading;
      if (buttonState == LOW) {            // you pressed
        musicOn = !musicOn;
        Serial.println(musicOn ? "MUSIC_ON" : "MUSIC_OFF");
        digitalWrite(buttonLedPin, musicOn ? HIGH : LOW);
      }
    }
  }
  lastButtonReading = reading;

  // Capacitive
  long sensorValue = capSensor.capacitiveSensor(30);
  Serial.println(String("TOUCH:") + sensorValue);

  // Mood LED hysteresis 
  if (!led1On && sensorValue > thresholdHigh) {
    led1On = true;
  } else if (led1On && sensorValue < thresholdLow) {
    led1On = false;
  }
  digitalWrite(ledPin,  led1On ? HIGH : LOW);
  digitalWrite(ledPin2, led1On ? LOW  : HIGH);

  // Photoresistor
  int raw       = analogRead(photoPin);
  int mappedVal = map(raw, 0, 1023, 0, 255);
  Serial.println(String("LAMP:") + mappedVal);

  delay(50);
}

Serial messages:

    • MUSIC_ON / MUSIC_OFF (button)

    • TOUCH:<value> (capacitive)

    • LAMP:<0–255> (light)

P5js Code

let port;
const baudrate = 9600;
let connectButton;
let bgMusic;
let interactionStarted = false;
let isTouched = false;
let lampBrightness = 0;
let musicOn = false;       // mirrors the Arduino MUSIC_ON / MUSIC_OFF messages
let audioUnlocked = false; // set once the browser AudioContext has been resumed


let plankCount = 6;
let cam;
let myFont;
let waveAngle = 0;
let isWaving = false;
let waveStartTime = 0;



function preload() {
  myFont = loadFont('CalligraphyFLF.ttf');
  bgMusic = loadSound('musica.mp3');  // load  MP3
}
// Works for clicks, touches, and fullscreen events
const unlockAudio = () => {
  if (!audioUnlocked) {
    getAudioContext().resume().then(() => {
      console.log('AudioContext unlocked');
      audioUnlocked = true;
      if (musicOn && bgMusic.isLoaded()) bgMusic.loop();
    });
  }
};
// Mouse/touch unlock
window.addEventListener('mousedown', unlockAudio);
window.addEventListener('touchstart', unlockAudio);
// Also unlock on fullscreen change
document.addEventListener('fullscreenchange', unlockAudio);

function setup() {
  createCanvas(windowWidth, windowHeight, WEBGL);

  connectButton = createButton("Connect to Arduino");
  connectButton.position(20, 20);
  connectButton.mousePressed(() => {
    port.open(baudrate);
  });

  port = createSerial();
  const used = usedSerialPorts();
  if (used.length > 0) {
    port.open(used[0], baudrate);
  } else {
    console.warn("No previously used serial ports found.");
  }

  setInterval(onSerialData, 50);   

  textFont(myFont);
  textSize(36);
  textAlign(CENTER, CENTER);

  cam = createCamera();
  const fov = PI / 3;
  const cameraZ = (height / 2.0) / tan(fov / 2.0);
  cam.setPosition(0, 0, cameraZ);
  cam.lookAt(0, 0, 0);
  frameRate(60);
}
function onSerialData() { //serial data
  if (!port || !port.opened()) return;
  while (port.available() > 0) {
    const raw = port.readUntil('\n');
    if (!raw) break;
    const line = raw.trim();
    console.log('Received:', line);

    if (line.startsWith('LAMP:')) {
      lampBrightness = int(line.split(':')[1]);
    } else if (line.startsWith('TOUCH:')) {
      const t = int(line.split(':')[1]);
      // ignore baseline zero readings
      if (t > 0) {
        isTouched = true;
        isWaving = true;
        waveStartTime = millis();
      }
    } else if (line === 'START') {
      interactionStarted = true;
    } else if (line === 'MUSIC_ON') {
      musicOn = true;
      if (bgMusic.isLoaded()) {
        console.log('MUSIC_ON → play');
        bgMusic.play();
      }
    } else if (line === 'MUSIC_OFF') {
      musicOn = false;
      if (bgMusic.isPlaying()) {
        console.log('MUSIC_OFF → stop');
        bgMusic.stop();
      }
    }
  }
}

//Draw
function draw() {
  background(210, 180, 140);
  ambientLight(10);
  pointLight(255, 200, 150, 200, -300, 300);
  pointLight(255, 150, 100, 400, -300, 300);

  drawWarmGradientBackground();
  drawWarmLamp();
  drawWoodDeck();
  drawWoodenBase();
  drawShadow();
  drawPot();
  drawBody();
  drawFace();
  drawPetiole();
  drawInstructionFrame();

  if (!interactionStarted) {
    drawInstructionFrame();
  } else {
    drawScene();
    pointLight(lampBrightness, lampBrightness * 0.8, 100, 200, -300, 300);
  }

  if (isWaving) {
    waveAngle = sin((millis() - waveStartTime) / 2000) * 0.5;

    if (millis() - waveStartTime > 400) {
      isWaving = false;
      waveAngle = 0;
    }
  }

  push();
  rotateZ(waveAngle);
  drawLeaf(0, -140, 0, 1, 4);
  pop();

  push();
  rotateZ(-waveAngle);
  drawLeaf(0, -140, 0, -1, 4);
  pop();

}

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
  const fov = PI / 3;
  const cameraZ = (height / 2.0) / tan(fov / 2.0);
  cam.setPosition(0, 0, cameraZ);
}


function drawWarmLamp() {
  push();
  translate(250, -250, 0);

  // glow — modulate alpha by lampBrightness
  for (let r = 300; r > 50; r -= 20) {
    push();
    noStroke();
    // fade alpha between 0 and 150
    let a = map(lampBrightness, 0, 255, 20, 150);
    fill(255, 180, 90, a);
    ellipse(0, 10, r, r * 1.2);
    pop();
  }

  // stand
  fill(100, 70, 50);
  translate(0, 200, 0);
  cylinder(5, 400);

  // base
  translate(0, 200, 0);
  fill(80, 60, 40);
  ellipse(0, 0, 80, 16);

  // lampshade (inverted cone)
  push();
  translate(0, -400, 0);
  fill(225, 190, 150);
  cone(50, -100, 24);
  pop();

  pop();
}

function drawWoodDeck() {
  push();
    rotateX(HALF_PI);
    translate(0, -2, -170);
    // static plank color
    fill(160, 100, 60);
    stroke(100, 60, 40);
    const plankHeight = -50;
    for (let i = 0; i < plankCount; i++) {
      box(width, plankHeight, 10);
      translate(0, plankHeight, 0);
    }
  pop();
}

function drawWoodenBase() {
  push();
    rotateX(HALF_PI);
    translate(0, 150, -200);
    // static plank color
    fill(160, 100, 60);
    stroke(100, 60, 40);
    const baseCount = 8;
    const plankWidth = (width * 1.2) / baseCount;
    for (let i = 0; i < baseCount; i++) {
      push();
        translate(-width * 0.6 + i * plankWidth + plankWidth / 2, 24, 0);
        box(plankWidth, 400, 20);
      pop();
    }
  pop();
}


function drawShadow() {
  push();
  translate(0, 150, -10);
  rotateX(HALF_PI);
  noStroke();
  fill(80, 70, 60, 30);
  ellipse(0, 0, 190, 30);
  pop();
}

function drawPot() {
  push();
  translate(0, 100, 0);
  fill(170, 108, 57);
  stroke(120, 70, 40);
  strokeWeight(1);
  cylinder(60, 80);
  translate(0, -50, 0);
  fill(190, 120, 70);
  cylinder(80, 20);
  pop();
}

function drawBody() {
  push();
  translate(0, 20, 0);
  noStroke();
  fill(150, 255, 150);
  sphere(75);
  translate(0, -90, 0);
  sphere(70);

  // highlights
  fill(255, 255, 255, 90);
  translate(-30, -10, 50);
  sphere(10);

  pop();
}

function drawPetiole() {
  push();
  translate(0, -110, 20);  // start from top of head
  rotateX(radians(200));  // slight backward tilt
  fill(100, 200, 100);
  noStroke();
  cylinder(8, 100);       
  pop();
}


function drawLeaf(x, y, z, flip = 1, scaleFactor = 1) {
  push();
  translate(x, y, z);
  rotateZ(flip * QUARTER_PI); // tilt outwards
  rotateX(HALF_PI - QUARTER_PI * 0.5); // add a slight backward curve
  scale(flip * scaleFactor, scaleFactor); // flip + scale


  fill(100, 200, 100);
  stroke(60, 160, 80);
  strokeWeight(1);

  beginShape();
  vertex(0, 0);
  bezierVertex(-35, -90, -70, -10, 0, 0); // left curve
  endShape(CLOSE);

  // Center vein
  stroke(90, 150, 80);
  strokeWeight(2);
  line(0, 0, -40, -29);
  pop();
}

function drawFace() {
  push();
  translate(0, -100, 70);

  stroke(10);
  strokeWeight(2);
  noFill();
  arc(-20, 0, 20, 10, 0, PI);
  arc(20, 0, 20, 10, 0, PI);

  // blush
  noStroke();
  fill(255, 200, 200, 100);
  ellipse(-35, 10, 15, 8);
  ellipse(35, 10, 15, 8);

  // smile
  stroke(30);
  noFill();
  arc(0, 15, 30, 10, 0, PI);

  pop();
}
function drawWarmGradientBackground() {
  push();
  translate(0, 0, -500); // move far behind the scene
  noStroke();
  beginShape();
  fill(250, 230, 200); // top (warm cream)
  vertex(-width, -height);
  vertex(width, -height);
  fill(210, 170, 130); // bottom (warm brownish-orange)
  vertex(width, height);
  vertex(-width, height);
  endShape(CLOSE);
  pop();
}


function drawInstructionFrame() {
  push();
    // move to a spot on the back wall
    translate(-width * 0.25, -height * 0.25, -490);

    // outer frame (landscape)
    fill(120, 80, 40);
    box(430, 300, 10);

    // inner “paper” canvas inset
    push();
      translate(0, 0, 7);
      fill(255, 245, 220);
      box(390, 260, 2);
    pop();

    // text
    push();
      translate(0, 0, 12);
      fill(60, 40, 30);
      textSize(40);
      textAlign(CENTER, CENTER);
      text("Make the Plant Happy\n- Press to play Music\n- Control the Lighting\n- Pet the plant", 0, 0);
    pop();
  pop();
}

Functionality Flow

    1. Startup

      • Arduino sends MUSIC_OFF. p5.js opens port, waits for START (button press on avatar).

    2. Interaction

      • Touch: Plant touch → TOUCH:<value> → leaf animation.

      • Light: Ambient changes → LAMP:<0–255> → lamp glow intensity.

      • Music: Physical push-button → MUSIC_ON/MUSIC_OFF → background music.

Challenges & Solutions

    • Serial Fragmentation: Split messages were mis-parsed → switched to single println("TOUCH:"+value) calls.

    • Sensor Hysteresis: Capacitive values vary → implemented high/low thresholds to avoid flicker.

    • Full-Screen Behavior: Music wouldn’t play in full screen → tied audio unlock to fullscreenchange event.

Final Project Documentation

I initiated this project to emulate the card-scanning excitement of Yu-Gi-Oh duel disks, in which tapping cards summons monsters and spells. Users present one or more RFID tags, each representing a cowboy, an astronaut or an alien, to an MFRC522 reader connected to an Arduino Uno. The system then allocates a five-second selection window before launching one of three interactive mini-games in p5.js: Stampede, Shooter or Cookie Clicker. In Stampede you take the helm of a lone rider hurtling through a hazardous space canyon, dodging bouncing rocks and prickly cacti that can slow you down or shove you backwards, all while a herd of cosmic cows closes in on your tail. Shooter throws two players into a tense standoff: each pilot manoeuvres left and right, firing lasers at their opponent and scrambling down shields to block incoming beams until one side breaks. Cookie Clicker is pure, frenzied fun: each participant pounds the mouse on a giant on-screen cookie for ten frantic seconds, racing to rack up the most clicks before time runs out. All visual feedback appears on a browser canvas, and audio loops accompany each game.

 

 

Components

The solution comprises four principal components:

  • RFID Input Module: An MFRC522 reader attached to an Arduino Uno captures four-byte UIDs from standard MIFARE tags.
  • Serial Bridge: The Arduino transmits single-character selection codes (‘6’, ‘7’ or ‘8’) at 9600 baud over USB and awaits simple score-report messages in return.
  • P5.js Front End: A browser sketch employs the WebSerial API to receive selection codes, manage global state and asset loading, display a five-second combo bar beneath each character portrait, and execute the three mini-game modules.
  • Mechanical Enclosure: Laser-cut plywood panels, secured with metal L-brackets, form a cuboid housing; a precision slot allows the 16×2 LCD module to sit flush with the front panel.

Hardware Integration

The MFRC522 reader’s SDA pin connects to Arduino digital pin D10 and its RST pin to D9, while the SPI lines (MOSI, MISO, SCK) share the hardware bus. In firmware, the reader is instantiated via “MFRC522 reader(SS_PIN, RST_PIN);” and a matchUID() routine compares incoming tags against the three predefined UID arrays.

Integrating a standard 16×2 parallel-interface LCD alongside the RFID module proved significantly more troublesome. As soon as “lcd.begin(16, 2)”  was invoked in setup(), RFID reads ceased altogether. Forum guidance indicated that pin conflicts between the LCD’s control lines and the RC522’s SPI signals were the most likely culprit. A systematic pin audit revealed that the LCD’s Enable and Data-4 lines overlapped with the RFID’s SS and MISO pins. I resolved this by remapping the LCD to use digital pins D2–D5 for its data bus and D6–D7 for RS/Enable, updating both the wiring and the constructor call in the Arduino sketch.

P5.js Application and Mini-Games

The browser sketch orchestrates menu navigation, character selection and execution of three distinct game modules within a single programme.

A single “currentState” variable (0–3) governs menu, Stampede, Shooter and Cookie Clicker modes. A five-second “combo” timer begins upon the first tag read, with incremental progress bars drawn beneath each portrait to visualise the window. Once the timer elapses, the sketch evaluates the number of unique tags captured and transitions to the corresponding game state.
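
Since the sketch itself is not reproduced in this post, here is a minimal p5.js sketch of that selection flow; names such as comboStart, tagsSeen and drawComboBars are illustrative assumptions rather than the project’s actual identifiers:

// currentState: 0 = menu, 1 = Stampede, 2 = Shooter, 3 = Cookie Clicker
let currentState = 0;
let comboStart = null;          // millis() timestamp of the first tag read
let tagsSeen = new Set();       // unique selection codes ('6', '7', '8')
const COMBO_WINDOW = 5000;      // five-second selection window

function registerTag(code) {
  if (comboStart === null) comboStart = millis();   // first read opens the window
  tagsSeen.add(code);
}

function updateCombo() {
  if (comboStart === null) return;
  const elapsed = millis() - comboStart;
  drawComboBars(elapsed / COMBO_WINDOW);            // progress bars under the portraits
  if (elapsed >= COMBO_WINDOW) {
    // the number of unique tags decides which game state follows the menu
    currentState = constrain(tagsSeen.size, 1, 3);
    comboStart = null;
    tagsSeen.clear();
  }
}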

Merging three standalone games into one sketch turned out to be quite the headache. Each mini-game had its own globals (things like score, stage and bespoke input handlers), which clashed as soon as I tried to switch states. To sort that out, I prefixed every variable with its game name (stampedeScore, sh_p1Score, cc_Players), wrapped them in module-specific functions and kept the global namespace clean.

The draw loop needed a rethink, too. Calling every game’s draw routine in sequence resulted in stray graphics popping up when they shouldn’t. I restructured draw() into a clear state machine: only the active module’s draw function runs each frame. That meant stripping out stray background() calls and rogue translate()s from the individual games so they couldn’t bleed into one another.
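
In outline, the resulting draw loop is nothing more than a dispatcher (the per-module draw function names here are illustrative):

function draw() {
  // only the active module touches the canvas on any given frame
  if (currentState === 0)      drawMenu();
  else if (currentState === 1) drawStampede();
  else if (currentState === 2) drawShooter();
  else if (currentState === 3) drawCookieClicker();
}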

Finally, unifying input was tricky. I built a single handleInput function that maps RFID codes (‘6’, ‘7’, ‘8’) and key presses to abstract commands (move, shoot, click), then sends them to whichever module is active. A bit of debouncing logic keeps duplicate actions at bay (especially critical during that five-second combo window), so you always get predictable, responsive controls.
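
A hedged sketch of that dispatcher is shown below; the debounce interval and the exact code-to-command pairing are assumptions, and the per-module input functions are illustrative names:

const DEBOUNCE_MS = 250;   // assumed debounce window
let lastInputTime = 0;

function handleInput(code) {
  // drop repeats that arrive too quickly (important during the combo window)
  if (millis() - lastInputTime < DEBOUNCE_MS) return;
  lastInputTime = millis();

  // translate raw RFID codes / key presses into abstract commands
  const command = { '6': 'move', '7': 'shoot', '8': 'click' }[code];
  if (!command) return;

  // forward the command to whichever module is active
  if (currentState === 1)      stampedeInput(command);
  else if (currentState === 2) shInput(command);
  else if (currentState === 3) ccInput(command);
}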

The enclosure is constructed from laser-cut plywood panels, chosen for both their sustainability and structural rigidity, and finished internally with a white-gloss plastic backing to evoke a sleek, modern aesthetic. Metal L-brackets fasten each panel at right angles, avoiding bespoke fasteners and allowing for straightforward assembly or disassembly. A carefully dimensioned aperture in the front panel accommodates the 16×2 LCD module so that its face sits perfectly flush with the surrounding wood, maintaining clean lines.

Switching between the menu and the individual mini-games initially caused the sketch to freeze on several occasions. Timers from the previous module would keep running, arrays retained stale data and stray transformations lingered on the draw matrix. To address this, I introduced dedicated cleanup routines (resetStampede(), shCleanup() and ccCleanup()) that execute just before currentState changes. Each routine clears its game’s specific variables, halts any looping audio and calls resetMatrix() (alongside any required style resets) so that the next module starts with a pristine canvas.
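
Each cleanup routine follows the same shape; a sketch of one of them is below (only resetStampede(), stampedeScore and resetMatrix() come from the write-up; the array and sound names are placeholders):

function resetStampede() {
  stampedeScore = 0;                                     // clear module-specific state
  stampedeObstacles = [];                                // placeholder for the module's arrays
  if (stampedeMusic.isPlaying()) stampedeMusic.stop();   // halt looping audio
  resetMatrix();                                         // drop lingering transformations
  // ...plus any fill/stroke/textAlign resets the module changed
}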

Audio behaviour also demanded careful attention. In early versions, rapidly switching from one game state to another led to multiple tracks playing at once or to music cutting out abruptly, leaving awkward silences. I resolved these issues by centralising all sound control within a single audio manager. Instead of scattering stop() and loop() calls throughout each game’s code, the manager intercepts state changes and victory conditions, fading out the current track and then initiating the next one in a controlled sequence. The result is seamless musical transitions that match the user’s actions without clipping or overlap.
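
A minimal version of such an audio manager, using p5.sound’s volume ramp for the fades (the fade length and track variables are assumptions):

const audioManager = {
  current: null,
  play(track, fadeMs = 500) {
    if (this.current === track) return;       // already playing the right track
    const old = this.current;
    if (old && old.isPlaying()) {
      old.setVolume(0, fadeMs / 1000);        // ramp the outgoing track down
      setTimeout(() => old.stop(), fadeMs);   // then stop it
    }
    this.current = track;
    track.setVolume(0);
    track.loop();
    track.setVolume(1, fadeMs / 1000);        // ramp the incoming track up
  },
};

State-change and victory handlers would then call something like audioManager.play(shooterTheme) instead of scattering stop() and loop() calls through the modules.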

The enclosure underwent its own process of refinement. My first plywood panels, cut on a temperamental laser cutter, frequently misaligned: the slot for the LCD would be too tight to insert the module or so loose that it rattled. After three iterative cuts, I tweaked the slot width, adjusted the alignment tabs and introduced a white-gloss plastic backing. This backing not only conceals the raw wood edges but also delivers a polished, Apple-inspired look. Ultimately, the panels now fit together snugly around the LCD and each other, creating a tool-free assembly that upholds the project’s premium aesthetic.

Future Plans

Looking ahead, the system lends itself readily to further enhancement through the addition of new mini-games. For instance, there could be a puzzle challenge or a rhythm-based experience which leverages the existing state-framework; each new module would simply plug into the central logic, reusing the asset-loading and input-dispatch infrastructure already in place.

Beyond additional games, implementing networked multiplayer via WebSockets or a library such as socket.io would open the possibility of remote matches and real-time score sharing, transforming the project from a local-only tabletop experience into an online arena. Finally, adapting the interface for touch input would enable smooth operation on tablets and smartphones, extending the user base well beyond desktop browsers.

Conclusion

Working on this tabletop arcade prototype has been both challenging and immensely rewarding. I navigated everything from the quirks of RFID timing and serial communications to the intricacies of merging three distinct games into a single p5.js sketch, all while refining the plywood enclosure for a polished finish. Throughout the “Introduction to Interactive Media” course, I found each obstacle, whether in hardware, code or design, to be an opportunity to learn and to apply creative problem-solving. I thoroughly enjoyed the collaborative atmosphere and the chance to experiment across disciplines; I now leave the class not only with a functional prototype but with a genuine enthusiasm for future interactive projects.

 

Final Project Documentation

Concept:

“Olive Tree Stories” is an interactive storytelling project inspired by the deep cultural roots of olive trees in Palestinian heritage. I wanted my project to be meaningful and leave a strong impact on the user, so I drew inspiration from my own culture. The installation uses a plastic olive tree with three symbolic leaves. Each leaf has a hidden touch sensor made of aluminum foil and sponge, which activates when its red mark is pressed gently. When a leaf is pressed, the corresponding olive lights up on the tree, a short audio piece (poem or song) starts playing, and a related image on screen enlarges. I wanted the overall experience to evoke the act of picking an olive and hearing a story passed down through generations. My main goal for this project is to preserve memory and heritage through touch, sound, and visuals.

Interaction Design:
– The user interacts with the leaves of an olive tree.
– Each leaf has a red marking, made using two layers of aluminum foil separated by sponge.
– When the red marking is pressed:
– A green olive lights up on the tree (via LED).
– One of two audio stories tied to that olive is randomly selected and begins playing.
– A corresponding visual (image) on screen zooms in smoothly.
– After the story ends:
– The screen returns to its main menu.
– The olive light turns off.

I wanted this natural interaction design to draw people in gently, as they literally touch a memory through the leaves of the olive tree.

Link to process photos + Link to extra designs made

Arduino Code:

The Arduino reads the three foil-and-sponge touch sensors as digital inputs (with internal pull-ups) and lights up the corresponding LEDs. It also sends a serial message (“1”, “2”, or “3”) to indicate which olive was touched, and uses an ultrasonic sensor to light a NeoPixel strip when a visitor approaches the tree.

#include <Adafruit_NeoPixel.h>

const int touchPin = 6;
const int touchLedPin = 3;

const int touchPin2 = 10;
const int touchLedPin2 = 11;

const int touchPin3 = 7;
const int touchLedPin3 = 12;

const int trigPin = 8;
const int echoPin = 9;

#define STRIP_PIN 5
#define NUMPIXELS 50
Adafruit_NeoPixel strip(NUMPIXELS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  pinMode(touchPin, INPUT_PULLUP);
  pinMode(touchLedPin, OUTPUT);

  pinMode(touchPin2, INPUT_PULLUP);
  pinMode(touchLedPin2, OUTPUT);

  pinMode(touchPin3, INPUT_PULLUP);
  pinMode(touchLedPin3, OUTPUT);

  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);

  strip.begin();
  strip.show();
  Serial.begin(9600);
}

void loop() {
  int touchState = digitalRead(touchPin);
  digitalWrite(touchLedPin, touchState == LOW ? HIGH : LOW);
  if (touchState == LOW) {
    Serial.println("1");
    delay(300);
  }

  int touchState2 = digitalRead(touchPin2);
  digitalWrite(touchLedPin2, touchState2 == LOW ? HIGH : LOW);
  if (touchState2 == LOW) {
    Serial.println("2");
    delay(300);
  }

  int touchState3 = digitalRead(touchPin3);
  digitalWrite(touchLedPin3, touchState3 == LOW ? HIGH : LOW);
  if (touchState3 == LOW) {
    Serial.println("3");
    delay(300);
  }

  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  long duration = pulseIn(echoPin, HIGH, 30000);
  int distance = (duration == 0) ? 999 : duration * 0.034 / 2;

  if (distance < 100) {
    for (int i = 0; i < NUMPIXELS; i++) {
      strip.setPixelColor(i, strip.Color(60, 90, 30));
    }
  } else {
    for (int i = 0; i < NUMPIXELS; i++) {
      strip.setPixelColor(i, 0);
    }
  }

  strip.show();
  delay(50);
}

p5.js Code: 

– Loads 6 images and 5 audio files
– Maps each box to one audio
– Randomizes which box is triggered per olive
– Communicates with Arduino using Web Serial
– Displays interactive visuals with zoom animation and audio syncing

Arduino – p5.js Communication:
– Serial connection via Web Serial API
– Arduino sends “1”, “2”, or “3” when a sensor is triggered
– p5.js receives the value, randomly selects one of two mapped boxes, and plays audio/image
– Serial connection is opened through a “Connect to Arduino” button
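
Since the p5.js sketch is described but not shown, here is a minimal, hedged sketch of the receive-and-trigger logic from the two lists above; the boxes array, the olivePairs mapping, the drawMainMenu() helper and the readSerial-style callback are illustrative assumptions, not the project’s exact code:

let boxes = [];          // { img, sound } pairs, populated in preload()
let activeBox = null;    // the box currently zoomed in
let zoom = 0;            // 0 = main menu, 1 = fully zoomed

// each olive ("1", "2", "3") is tied to two candidate boxes
const olivePairs = { "1": [0, 1], "2": [2, 3], "3": [4, 5] };

function readSerial(data) {
  const olive = trim(data);
  if (!olivePairs[olive] || activeBox !== null) return;   // ignore while a story plays
  const idx = random(olivePairs[olive]);                  // pick one of the two boxes
  activeBox = boxes[idx];
  activeBox.sound.play();
  activeBox.sound.onended(() => { activeBox = null; zoom = 0; });  // back to the menu
}

function draw() {
  background(245);
  if (activeBox) {
    zoom = lerp(zoom, 1, 0.05);                           // smooth zoom-in
    imageMode(CENTER);
    const s = lerp(200, width, zoom);
    image(activeBox.img, width / 2, height / 2, s, s);
  } else {
    drawMainMenu();                                       // thumbnails / title screen
  }
}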

 

Link to video demonstration 

 

Highlights / Proud Moments:

I was very proud of my use of leaves as soft sensors because I believe it adds a layer of tactile beauty while giving the installation poetic significance. I also liked my use of the red markings on each leaf, as they serve as a subtle visual guide, blending naturally into the overall aesthetic of the tree while inviting interaction. When a user presses a marked leaf, the seamless synchronisation between light and audio brings the experience to life, creating an emotionally resonant response. I knew my goal was met when many users told me that they really appreciated the meaningful aspect of my project. Overall, I think my project connects technology and tradition and turns a symbolic object into a responsive storytelling medium.

Challenges/Future Improvements: 

One of the main challenges I faced was my initial idea to make the olives themselves the touchpoints that trigger the audio. I struggled to fit a reliable sensor inside the olives’ small, spherical shape, so I solved the problem by shifting the interaction to the leaves, which offered more surface area and flexibility. I also learned the importance of saving my code often and debugging as I go. Looking ahead, future improvements could include integrating haptic feedback into the olives, adding more stories with replay options, and even including narration subtitles for accessibility. Overall, I am extremely proud of how my project turned out and the responses it got from many users.

Final Project Documentation

Concept 

My concept is a semi-thriller game built around two key sensors: a joystick and a pulse sensor. It was inspired by my siblinghood. Growing up, I would often eat my sister’s snacks, of course without her permission. This would give me a thrill akin to what someone would feel playing hide-and-seek with a murderer (I’m exaggerating here; I actually really love my sister haha). I settled on making it “slime-themed” since I wanted it to be fun and gender-neutral, inclusive of all players (people who grew up with older brothers, cousins, etc.); a slime character is certainly non-specific enough, I think.

How it works

Well, when you first start the game, you are greeted with eerie background music and a start screen with two options: “how to play” and “start game”. If you select “how to play” with the joystick, you’ll be shown a comprehensive manual on how to play the game. If you press start, you’re taken to the game dialog, which you can click through with the joystick button. After that, the game begins! At the top right of the screen there is an icon of slime chips with a number on top: this is the number of chips you need to eat without getting caught and losing all your hearts. The number of hearts you have is at the top left. If the sister turns around while you’re eating the chips (i.e. the two hands are touching; you move the right hand, the one holding the chips, left and right with the joystick), she gets mad and you lose a heart. Here are the game’s p5.js visuals:

(startscreen) (how to play) (dialog screen) (game started and sister turned around) (game started and sister caught you!!)(you won!!)

Implementation

1. Interaction Design:

I initially wanted to use a flex sensor as a pseudo “joystick”, but I realized that wouldn’t be intuitive or even enjoyable for the average user, likely because it is so unfamiliar in games like these. So I decided to borrow a joystick from the IM lab, alongside a pulse sensor. I wanted the game to take real-time input from the user and imitate the scenario of my heart racing as a kid when stealing my sister’s food. So, every 1.5 to 2.5 seconds, the game checks whether the user’s heartbeat reading ‘jumps’ by more than 70 (imitating nervousness). If so, or with a 25% random chance, the sister turns around. I did this to simulate the tension of being caught, as the player’s nervousness (a heartbeat jump) makes the slime sibling more likely to notice you, with a bit of randomness. I added chip-eating sound effects among some other sounds to reflect what is happening on the screen.
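
In isolation, that check looks roughly like the following (a simplified sketch of the logic described above, not the exact game code; pulseValue and the rightHand fields match the globals declared in the sketch further down):

// simplified sketch: decide whether the sister turns around
function maybeTurnSister() {
  const now = millis();
  // only check every ~1.5–2.5 seconds
  if (now - rightHand.lastTurnCheck < random(1500, 2500)) return;
  rightHand.lastTurnCheck = now;

  // "nervousness": did the pulse reading jump by more than 70 since the last check?
  const jumped = abs(pulseValue - rightHand.lastPulseValue) > 70;
  rightHand.lastPulseValue = pulseValue;

  // turn around on a nervous spike, or with a 25% random chance
  if (jumped || random() < 0.25) {
    rightHand.sisterTurned = true;
    rightHand.sisterTurnStart = now;   // she turns back after maxTurnDuration
  }
}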

2. Arduino Code:

My Arduino code controls three pairs of colored LEDs (red, yellow, green) based on commands received from the computer over serial. It also reads the joystick’s X-axis, the pulse sensor, and the joystick button (I added print statements for their values for debugging and monitoring). Only one color is lit at a time, and the code ignores repeated color commands, to make sure a state is reflected clearly for the player. I also added a helper function that turns all LEDs off and another that lights the same color LED on both sides of the box. Here is the code:

const int joyXPin   = A0;    // joystick x-axis
const int pulsePin  = A1;    // pulse sensor
const int selPin    = 8;     // joystick “select” button (digital)
const int baudRate  = 9600;

// LED pins
const int RED_L  = 7,  RED_R  = 5;
const int YEL_L  = 10, YEL_R  = 4;
const int GRN_L  = 12, GRN_R  = 2;

const int threshold = 50;    // dead-zone around center (512)

// remembering last LED state
String currentCmd = "";

void setup() {
  Serial.begin(baudRate);
  pinMode(selPin, INPUT_PULLUP);
  
  // LED pins
  pinMode(RED_L, OUTPUT);
  pinMode(RED_R, OUTPUT);
  pinMode(YEL_L, OUTPUT);
  pinMode(YEL_R, OUTPUT);
  pinMode(GRN_L, OUTPUT);
  pinMode(GRN_R, OUTPUT);

  // start with all off
  clearAllLeds();
}

void loop() {
  // checking for new command from p5.js
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();

    // only accept valid colors
    if (cmd == "RED" || cmd == "YELLOW" || cmd == "GREEN") {
      // if different than last, update
      if (cmd != currentCmd) {
        currentCmd = cmd;
        applyLeds(currentCmd);
      }
    }
    // echo back for debugging
    Serial.print("Received: ");
    Serial.println(cmd);
  }

  // existing joystick/pulse prints
  int xRaw    = analogRead(joyXPin);
  int pulseRaw = analogRead(pulsePin);
  bool selPressed = (digitalRead(selPin) == LOW);

  int dirX;
  if (xRaw < 512 - threshold)      dirX = 0;
  else if (xRaw > 512 + threshold) dirX = 1;
  else                              dirX = -1;

  Serial.print("DIRX:");   Serial.print(dirX);
  Serial.print("  PULSE:");Serial.print(pulseRaw);
  Serial.print("  SEL:");  Serial.println(selPressed ? 1 : 0);
}

// helper to turn all LEDs off
void clearAllLeds() {
  digitalWrite(RED_L, LOW); digitalWrite(RED_R, LOW);
  digitalWrite(YEL_L, LOW); digitalWrite(YEL_R, LOW);
  digitalWrite(GRN_L, LOW); digitalWrite(GRN_R, LOW);
}

// helper to set exactly one color on both sides
void applyLeds(const String &cmd) {
  clearAllLeds();
  if (cmd == "RED") {
    digitalWrite(RED_L, HIGH);
    digitalWrite(RED_R, HIGH);
  } 
  else if (cmd == "YELLOW") {
    digitalWrite(YEL_L, HIGH);
    digitalWrite(YEL_R, HIGH);
  } 
  else if (cmd == "GREEN") {
    digitalWrite(GRN_L, HIGH);
    digitalWrite(GRN_R, HIGH);
  }
}

3. Schematic of Project:

4. p5.js Code:

The game features multiple states (start menu, gameplay, how-to-play screen) managed through a state machine architecture. When playing, users control a hand to grab chips while avoiding detection from the sister character who randomly turns around. The sister’s turning behavior is influenced by both randomness and the player’s actual heartbeat measured through a pulse sensor.

Key features in the p5.js include:

  • Dialog system for narrative progression
  • Character animation with multiple emotional states
  • Collision detection between hands
  • Health and scoring systems (hearts and chip counter)
  • Two endings (happy and bad) based on gameplay outcome

This is the code:
    let gameState = 'start'; // 'start', 'game', or 'howToPlay'
    let startImg;
    let bgImg;
    let howToPlayImg; 
    let bgMusic;
    let angry;
    
    // serial communication variables
    let rightX = 0;
    let pulseValue = 0;  
    let connectButton; // button to initiate serial connection
    
    // joystick control variables for menu navigation
    let currentSelection = 0; // 0 = start game, 1 = how to play
    let selPressed = false;
    let prevSelPressed = false; // to detect rising edge
    let backButtonSelected = false; // tracking if back button is selected in how to play screen
    
    // hand objects
    let leftHand = {
      img: null,
      x: 0,
      y: 0,
      width: 150,
      height: 150,
      visible: false
    };
    
    let rightHand = {
      img: null,
      x: 0,
      y: 0,
      width: 150,
      height: 150,
      visible: false,
      currentImg: 1, // tracking current hand image (1, 2, or 3)
      animationStarted: false, // tracking if animation has started
      animationTimer: 0, // timer for animation
      lastLedState: "",
      sisterTurned: false, // tracking if sister is turned around
      sisterTurnStart: 0,      // ← when she last flipped to “turned”
      maxTurnDuration: 1700,   // ← max ms she can remain turned (1.7 s)
      lastTurnCheck: 0, // timer for checking sister's state
      heartDeducted: false, // flag to track if heart was deducted in current overlap
      lastSisterState: false, // tracking previous sister state to detect changes
      sisterAngry: false, // flag to track if sister is angry
      lastPulseValue: 0 // tracking last pulse value for heartbeat detection
    };
    
    let chipCounter = {
      img: null,
      x: 0,
      y: 0,
      width: 80,
      height: 80,
      count: 10
    };
    
    let hearts = {
      img: null,
      count: 5, 
      width: 30,
      height: 30,
      spacing: 10,
      x: 20, 
      y: 20 
    };
    
    let gameOver = false;
    let gameEndingType = ''; // 'good' or 'bad' to track which ending to show
    
    let dialogSounds = {};
    
    let startButton = {
      x: 0,
      y: 0,
      width: 150, 
      height: 30,
      text: 'start game',
      isHovered: false
    };
    
    let howToPlayButton = {
      x: 0,
      y: 0,
      width: 180, 
      height: 30,
      text: 'how to play',
      isHovered: false
    };
    
    let backButton = { 
      x: 0,
      y: 0,
      width: 100,
      height: 40,
      text: 'Back',
      isHovered: false
    };
    
    let instrumentSerifFont;
    let currentDialog = null; 
    
    let characters = {};
    
    function preload() {
      startImg = loadImage('Images/start.png');
      bgImg = loadImage('Images/bg.png');
      howToPlayImg = loadImage('Images/howtoplay.png'); 
      instrumentSerifFont = loadFont('InstrumentSerif-Regular.ttf');
      bgMusic = loadSound('Audios/bg.mp3');
      leftHand.img = loadImage('Images/left_hand.png');
      rightHand.img = loadImage('Images/right_hand1.png');
      rightHand.img2 = loadImage('Images/right_hand2.png');
      rightHand.img3 = loadImage('Images/right_hand3.png');
      chipCounter.img = loadImage('Images/slimechips.png');
      hearts.img = loadImage('Images/heart.png');
      happyEndingImg = loadImage('Images/happy_ending.png');
      badEndingImg = loadImage('Images/bad_ending.png');
    
      dialogSounds['chipbag.mp3'] = loadSound('Audios/chipbag.mp3');
      dialogSounds['chip_crunch.mp3'] = loadSound('Audios/chip_crunch.mp3');
      dialogSounds['angry.mp3'] = loadSound('Audios/angry.mp3');
    
      characters['sister_normal'] = new Character('sister_normal', 'Sister', 'Images/sister_normal.png');
      characters['sister_speaking'] = new Character('sister_speaking', 'Sister', 'Images/sister_speaking.png');
      characters['sister_turned'] = new Character('sister_turned', 'Sister', 'Images/sister_turned.png');
      characters['sister_upset'] = new Character('sister_upset', 'Sister', 'Images/sister_upset.png');
      characters['sister_angry'] = new Character('sister_angry', 'Sister', 'Images/sister_angry.png');
      characters['slime'] = new Character('slime', 'me', 'Images/slime.png');
      characters['slime_speaking'] = new Character('slime_speaking', 'me', 'Images/slime_speaking.png');
      for (let key in characters) {
        characters[key].preload();
      }
    }
    
    async function setUpSerialWithErrorHandling() { // handling errors with serial
      try {
        await setUpSerial();
      } catch (error) {
        console.log("Serial connection failed: " + error.message);
      }
    }
    
    function setup() {
      createCanvas(627, 447);
      noCursor(); 
    
      // playing background music
      bgMusic.setVolume(0.5); 
      bgMusic.loop(); 
    
      textFont(instrumentSerifFont);
    
      connectButton = createButton('connect serial');
      connectButton.position(25, 25);
      connectButton.style('font-size', '18px');
      connectButton.style('font-weight', 'bold');
      connectButton.style('color', '#000');
      connectButton.style('background-color', '#fff');
      connectButton.style('border', '2px solid #000');
      connectButton.style('border-radius', '8px');
      connectButton.style('padding', '8px 16px');
      connectButton.style('cursor', 'pointer');
    
      connectButton.mouseOver(() => {
        connectButton.style('background-color', '#000');
        connectButton.style('color', '#fff');
      });
      connectButton.mouseOut(() => {
        connectButton.style('background-color', '#fff');
        connectButton.style('color', '#000');
      });
    
      connectButton.mousePressed(setUpSerialWithErrorHandling); // using setUpSerial from p5.web-serial.js
    
    
      positionStartScreenButtons();
    
      positionBackButton();
    
      rightHand.x = width - 10;
      rightHand.y = height/2 - rightHand.height/2;
    
      chipCounter.x = width - chipCounter.width - 20;
      chipCounter.y = 20;
    
      const sampleDialog = [
        { speaker: 'sister_speaking', line: "Hey! Have you seen my chips?" },
        { speaker: 'slime', line: "Nope, I haven't seen them." , sound: "chipbag.mp3"},
        { speaker: 'sister_upset', line: "Hmmm... are you sure?" },
        { speaker: 'slime_speaking', line: "Yep...! Haven't seen them! I'll let you know if I see them!" },
        { speaker: 'sister_speaking', line: "Grrr! If I catch you with them..I'm going to kill you!" }
      ];
      currentDialog = new Dialog(sampleDialog); // passing the new structure to the Dialog constructor
    }
    
    function readSerial(data) {
      let val = trim(data);
      if (!val) return;
    
      let parts = val.split(" ");
    
      // parsing DIRX for both game movement and menu navigation
      let dirPart = parts.find(p => p.startsWith("DIRX:"));
      if (dirPart) {
        let dir = int(dirPart.split(":")[1]);
    
        // setting rightX for in-game hand movement
        if (dir === 0) {
          rightX = width * 0.25;  // move left
          
          // for menu navigation, select the left button (start game)
          if (gameState === 'start') {
            currentSelection = 0; // selecting start game button
          } 
          // for how to play screen, select the back button when moving left
          else if (gameState === 'howToPlay') {
            backButtonSelected = true;
          }
        } else if (dir === 1) {
          rightX = width * 0.75;  // move right
          
          // for menu navigation, select the right button (how to play)
          if (gameState === 'start') {
            currentSelection = 1; // selecting how to play button
          }
          // for how to play screen, deselect the back button when moving right
          else if (gameState === 'howToPlay') {
            backButtonSelected = false;
          }
        } else if (dir === -1) {
          rightX = width / 2;     // center (neutral)
        }
    
        console.log("Joystick Direction:", dir);
      }
    
      // parsing SEL for button selection
      let selPart = parts.find(p => p.startsWith("SEL:"));
      if (selPart) {
        let sel = int(selPart.split(":")[1]);
        prevSelPressed = selPressed;
        selPressed = (sel === 1);
        
        // if SEL button was just pressed (rising edge detection)
        if (selPressed && !prevSelPressed) {
          // handling button selection based on current screen
          if (gameState === 'start') {
            if (currentSelection === 0) {
              startGame();
            } else if (currentSelection === 1) {
              showHowToPlay();
            }
          } else if (gameState === 'howToPlay') {
            // only going back if the back button is selected
            if (backButtonSelected) {
              goBackToStart();
            }
          } else if (gameState === 'game' && currentDialog && !currentDialog.isComplete) {
            currentDialog.advanceDialog();
          }
        }
      }
    
      // parsing PULSE
      let pulsePart = parts.find(p => p.startsWith("PULSE:"));
      if (pulsePart) {
        pulseValue = int(pulsePart.split(":")[1]);
        console.log("Pulse Reading:", pulseValue);
      }
    }
    
    
    // helper that mirrors mouseClicked() logic but uses joystick coords
    function joystickClick(x,y){
      if(gameState==='start'){
        // is it on the “start” button?
        if(x > startButton.x && x < startButton.x+startButton.width &&
           y > startButton.y && y < startButton.y+startButton.height){
          startGame();
        }
        // how-to-play button?
        else if(x > howToPlayButton.x && x < howToPlayButton.x+howToPlayButton.width &&
                y > howToPlayButton.y && y < howToPlayButton.y+howToPlayButton.height){
          showHowToPlay();
        }
      }
    }
    
    
    function draw() {
      background(255); 
      
      switch (gameState) {
        case 'start':
          drawStartScreen();
          updateButtonHover(startButton);
          updateButtonHover(howToPlayButton);
          break;
        case 'game':
          drawGameScreen();
          updateButtonHover(backButton);
          break;
        case 'howToPlay':
          drawHowToPlayScreen();
          updateButtonHover(backButton);
          break;
      }
    
      if (!gameOver) {
        noCursor();
      }
    }
    
    function drawStartScreen() {
      image(startImg, 0, 0, width, height);
      
      // updating button hover state based on joystick selection instead of mouse
      if (gameState === 'start') {
        // highlighting the currently selected button based on joystick input
        startButton.isHovered = (currentSelection === 0);
        howToPlayButton.isHovered = (currentSelection === 1);
      }
      
      drawButton(startButton);
      drawButton(howToPlayButton);
      
      // hiding back button on start screen
      backButton.x = -backButton.width; // positioning off-screen
    }
    
    function drawGameScreen() {
      image(bgImg, 0, 0, width, height);
      
      // drawing dialog if it exists and isn’t complete
      if (currentDialog && !currentDialog.isComplete) {
        currentDialog.display();
        return;
      }
      
      // if dialog is complete but gameover, show ending
      if (gameOver) {
        drawGameOverScreen();
        return;
      }
    
      // --- sister character ---
      let sisterWidth  = 180;
      let sisterHeight = 200;
      let sisterX = width/2 - sisterWidth/2;
      let sisterY = height/2 - sisterHeight/2 - 30;
      
      let sisterImg = characters['sister_turned'].img;
      if (rightHand.sisterAngry)        sisterImg = characters['sister_angry'].img;
      else if (rightHand.sisterTurned)  sisterImg = characters['sister_upset'].img;
      
      // possibly flip her based on time+random+heartbeat
      if (!rightHand.lastTurnCheck 
          || millis() - rightHand.lastTurnCheck > random(1500, 2500)) {
            
        rightHand.lastTurnCheck = millis();
        let heartbeatTrigger = (rightHand.lastPulseValue 
          && Math.abs(pulseValue - rightHand.lastPulseValue) > 70);
        rightHand.lastPulseValue = pulseValue;
        
        if (random() < 0.25 || heartbeatTrigger) {
          rightHand.sisterTurned = !rightHand.sisterTurned;
          if (rightHand.sisterTurned) {
            rightHand.sisterTurnStart = millis();
          }
          rightHand.sisterAngry   = false;
          rightHand.heartDeducted = false;
        }
      }
      
      image(sisterImg, sisterX, sisterY, sisterWidth, sisterHeight);
    
      // enforcing max “turned” time
      if (rightHand.sisterTurned 
          && millis() - rightHand.sisterTurnStart > rightHand.maxTurnDuration) {
        rightHand.sisterTurned   = false;
        rightHand.sisterAngry    = false;
        rightHand.heartDeducted  = false;
      }
      
      // --- Collision ---
      leftHand.x = width/2 - leftHand.width - 50;
      leftHand.y = height/2 + leftHand.height/2;
      
      // compute & smooth right hand
      let targetX = constrain(rightX, width*0.1, width*0.9);
      rightHand.x += (targetX - rightHand.x)*0.2;
      rightHand.y  = height/2 + rightHand.height/2;
      
      let handsOverlap = checkHandCollision(leftHand, rightHand);
      let ledState = "";
      if (handsOverlap && rightHand.sisterTurned && !rightHand.heartDeducted) {
        // CAUGHT → angry → RED
        hearts.count--;
        rightHand.heartDeducted = true;
        rightHand.sisterAngry   = true;
        ledState = "RED";
        dialogSounds['angry.mp3'].play();
    
        if (hearts.count <= 0) {
          gameOver = true;
          gameEndingType = 'bad';
        }
      }
    
      else if (handsOverlap && rightHand.sisterTurned && rightHand.heartDeducted) {
        ledState = "RED";
      }
      else if (!rightHand.sisterTurned) { // sister is not turned
        // SAFE while she isn't watching → GREEN
        ledState = "GREEN";
      }
      else if (rightHand.sisterTurned && !handsOverlap) { // sister is turned but hands are not overlapping
        // she is watching but you haven't been caught → YELLOW
        ledState = "YELLOW";
      }
    
      if (ledState && ledState !== rightHand.lastLedState) {
        writeSerial(ledState + "\n");
        rightHand.lastLedState = ledState;
      }
      
      // --- Eating Animation & Chips ---
      if (handsOverlap && !rightHand.animationStarted) {
        rightHand.animationStarted = true;
        rightHand.animationTimer   = millis();
        if (rightHand.currentImg === 3 && chipCounter.count > 0) {
          chipCounter.count--;
          if (chipCounter.count <= 0) {
            gameOver = true;
            gameEndingType = 'good';
          }
        }
        rightHand.currentImg = 1;
        dialogSounds['chip_crunch.mp3'].play();
      }
      else if (handsOverlap && rightHand.animationStarted) {
        let t = millis() - rightHand.animationTimer;
        if (rightHand.currentImg === 1 && t > 300) rightHand.currentImg = 2;
        if (rightHand.currentImg === 2 && t > 500) rightHand.currentImg = 3;
        if (rightHand.currentImg === 3 && t > 800) {
          rightHand.animationTimer = millis();
          rightHand.currentImg = 1;
          if (chipCounter.count > 0) {
            chipCounter.count--;
            if (chipCounter.count <= 0) {
              gameOver = true;
              gameEndingType = 'good';
            }
          }
          dialogSounds['chip_crunch.mp3'].play();
        }
      }
      else if (!handsOverlap && rightHand.animationStarted) {
        rightHand.animationStarted = false;
        dialogSounds['chip_crunch.mp3'].stop();
      }
      
      // drawing hands
      let handImg = rightHand.currentImg===1 ? rightHand.img
                  : rightHand.currentImg===2 ? rightHand.img2
                  : rightHand.img3;
      image(handImg, rightHand.x, rightHand.y, rightHand.width, rightHand.height);
      image(leftHand.img, leftHand.x, leftHand.y, leftHand.width, leftHand.height);
    
      image(chipCounter.img, chipCounter.x, chipCounter.y, chipCounter.width, chipCounter.height);
      push();
        fill(255); stroke(0); strokeWeight(2);
        textSize(24); textAlign(CENTER, CENTER);
        text(chipCounter.count, chipCounter.x+chipCounter.width/2, chipCounter.y+chipCounter.height/2);
      pop();
      for (let i=0; i<hearts.count; i++) {
        image(hearts.img, hearts.x+(hearts.width+hearts.spacing)*i, hearts.y, hearts.width, hearts.height);
      }
    }
    
    // function to check collision between hand objects
    function checkHandCollision(hand1, hand2) {
      // rectangular collision detection
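      // the +10 below nudges hand1's effective left edge inward by 10px, slightly tightening the hitbox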
      return (
        hand1.x + 10 < hand2.x + hand2.width &&
        hand1.x + hand1.width > hand2.x &&
        hand1.y < hand2.y + hand2.height &&
        hand1.y + hand1.height > hand2.y
      );
    }
    
    function drawHowToPlayScreen() {
        background(50); 
        let imgWidth = width * 0.9; // Use 90% of screen width
        let imgHeight = height * 0.7; // Use 70% of screen height
        image(howToPlayImg, width/2 - imgWidth/2, height/2 - imgHeight/2 + 30, imgWidth, imgHeight);
        
        fill(255);
    
        textSize(35);
        textAlign(CENTER, CENTER);
        text("How to Play", width/2, height/2 - imgHeight/2 - 15);
    
        backButton.isHovered = backButtonSelected;
        
        drawButton(backButton);
        
        startButton.x = -startButton.width; // Position off-screen
        howToPlayButton.x = -howToPlayButton.width; // Position off-screen
    }
    
    
    function startGame() {
      gameState = 'game';
      console.log("Game Started!");
    
      if (bgMusic.isPlaying()) {
        bgMusic.stop();
      }
    
      if (currentDialog) {
          currentDialog.currentLineIndex = 0;
          currentDialog.isComplete = false;
      }
    
      positionBackButton();
    }
    
    function showHowToPlay() {
      gameState = 'howToPlay';
      console.log("Showing How to Play");
       positionBackButton();
    }
    
    function goBackToStart() {
        gameState = 'start';
        console.log("Going back to Start Screen");
        positionStartScreenButtons();
    }
    
    
    // helper function to position buttons on the start screen
    function positionStartScreenButtons() {
        textSize(20); 
        startButton.width = textWidth(startButton.text) + 40; 
        howToPlayButton.width = textWidth(howToPlayButton.text) + 40; 
        startButton.height = 40; 
        howToPlayButton.height = 40; 
    
    
        let buttonY = height * 0.65; // Moved buttons higher (was 0.75)
    
        let buttonSpacing = 20;
        let totalButtonWidth = startButton.width + howToPlayButton.width + buttonSpacing;
        let leftShift = 15; 
        startButton.x = width / 2 - totalButtonWidth / 2 - leftShift;
        howToPlayButton.x = startButton.x + startButton.width + buttonSpacing;
    
        startButton.y = buttonY;
        howToPlayButton.y = buttonY;
    }
    
    function positionBackButton() {
        textSize(20); 
        backButton.width = textWidth(backButton.text) + 40;
        backButton.height = 40; 
        backButton.x = 20; 
        backButton.y = 20; 
    }
    
    
    // function to draw a button
    function drawButton(button) {
         if (button.x + button.width > 0 && button.x < width) {
             push(); // Save current drawing settings
    
            if (button.isHovered) {
                fill(255, 0, 0); 
            } else {
                fill(0); 
            }
    
            rect(button.x, button.y, button.width, button.height, 10); // 10 is corner radius
    
            fill(255);
            textSize(22);
            textAlign(CENTER, CENTER);
    
            text(button.text, button.x + button.width / 2, button.y + button.height / 2 - 5);
    
            pop(); // restoring drawing settings
         }
    }
    
    // helper function to update button hover state
    function updateButtonHover(button) {
        // only update hover if the button is likely visible on screen
        if (button.x + button.width > 0 && button.x < width) {
            if (mouseX > button.x && mouseX < button.x + button.width &&
                mouseY > button.y && mouseY < button.y + button.height) {
                button.isHovered = true;
                cursor(HAND); // Change cursor on hover
            } else {
                button.isHovered = false;
                // only set cursor back to ARROW if NO button is hovered
                 let anyButtonHovered = false;
                 if (gameState === 'start') {
                     if (startButton.isHovered || howToPlayButton.isHovered) anyButtonHovered = true;
                 } else if (gameState === 'howToPlay' || gameState === 'game') {
                     if (backButton.isHovered) anyButtonHovered = true;
                 }
                 if (!anyButtonHovered) {
                     cursor(ARROW);
                 }
            }
        }
    }
    
    // helper function to update button hover state with joystick
    function updateButtonHoverWithJoystick(button, joyX, joyY) {
      if (button.x + button.width > 0 && button.x < width) {
        if (joyX > button.x && joyX < button.x + button.width &&
            joyY > button.y && joyY < button.y + button.height) {
          button.isHovered = true;
        } else {
          button.isHovered = false;
        }
      }
    }
    
    function mouseClicked() {
      switch (gameState) {
        case 'start':
          // Check for button clicks on the start screen
          if (mouseX > startButton.x && mouseX < startButton.x + startButton.width &&
              mouseY > startButton.y && mouseY < startButton.y + startButton.height) {
            startGame();
          } else if (mouseX > howToPlayButton.x && mouseX < howToPlayButton.x + howToPlayButton.width &&
                     mouseY > howToPlayButton.y && mouseY < howToPlayButton.y + howToPlayButton.height) {
            showHowToPlay();
          }
          break;
        case 'game':
          // handling dialog clicks in the game screen
          if (currentDialog && !currentDialog.isComplete) {
            if (mouseX > currentDialog.boxX && mouseX < currentDialog.boxX + currentDialog.boxWidth &&
                mouseY > currentDialog.boxY && mouseY < currentDialog.boxY + currentDialog.boxHeight) {
              currentDialog.advanceDialog();
            } else if (mouseX > backButton.x && mouseX < backButton.x + backButton.width && 
                       mouseY > backButton.y && mouseY < backButton.y + backButton.height) {
                goBackToStart(); // Go back if back button is clicked
            }
          } else { 
               if (mouseX > backButton.x && mouseX < backButton.x + backButton.width &&
                       mouseY > backButton.y && mouseY < backButton.y + backButton.height) {
                goBackToStart(); // Go back if back button is clicked
               }
          }
          break;
        case 'howToPlay':
            if (mouseX > backButton.x && mouseX < backButton.x + backButton.width &&
                mouseY > backButton.y && mouseY < backButton.y + backButton.height) {
                goBackToStart(); // Go back to start screen
            }
            break;
      }
    }
    
    
    class Character {
      constructor(key, displayName, imagePath) {
        this.key = key;
        this.displayName = displayName;
        this.imagePath = imagePath;
        this.img = null;
      }
      preload() {
        this.img = loadImage(this.imagePath);
      }
    }
    
    class Dialog {
      constructor(dialogLines) {
        // dialogLines: array of strings like 'sister_normal: "Hey!"' or just dialog text
        this.lines = dialogLines.map(lineObj => {
          if (typeof lineObj === 'string') {
            let match = lineObj.match(/^(\w+):\s*"([\s\S]*)"$/);
            if (match) {
              return { speaker: match[1], line: match[2] };
            } else {
              return { speaker: 'sister_normal', line: lineObj };
            }
          } else if (typeof lineObj === 'object') {
            return {
              speaker: lineObj.speaker || 'sister_normal',
              line: lineObj.line || '',
              sound: lineObj.sound || null 
            };
          }
        });
        this.currentLineIndex = 0;
        this.isComplete = false;
        this.boxWidth = width * 0.8;
        this.boxHeight = 100;
        this.boxX = (width - this.boxWidth) / 2;
        this.boxY = height - this.boxHeight - 20;
        this.textSize = 18;
        this.textMargin = 15;
        this.lineHeight = this.textSize * 1.2;
      }
    
      display() {
        if (this.isComplete) {
          return;
        }
        fill(0, 0, 0, 200);
        rect(this.boxX, this.boxY, this.boxWidth, this.boxHeight, 10);
    
        let currentLineObj = this.lines[this.currentLineIndex];
        let charKey = currentLineObj.speaker || 'sister_normal';
        
        let displayedCharKey = charKey; // either a slime key or one of the sister variants
        
        let charObj = characters[displayedCharKey] || characters['sister_normal'];
        
        // the player character (displayName 'me', i.e. the slime) is drawn smaller and lower than the sister
        let isPlayer = (charObj.displayName === 'me');
        let charImgSize = isPlayer ? 150 : 220; 
        
        if (charObj && charObj.img) {
          let centerX = width / 2 - charImgSize / 2;
          
          let centerY = isPlayer 
            ? height / 2 - charImgSize / 2 - 20  
            : height / 2 - charImgSize / 2 - 45; 
          
          image(charObj.img, centerX, centerY, charImgSize, charImgSize);
          
          fill(255);
          textSize(18);
          textAlign(CENTER, TOP);
          text(charObj.displayName, width / 2, centerY + charImgSize + 5);
        }
    
        fill(255);
        textSize(this.textSize);
        textAlign(LEFT, TOP);
        let textY = this.boxY + this.textMargin;
        text(currentLineObj.line, this.boxX + this.textMargin, textY, this.boxWidth - this.textMargin * 2);
    
        if (this.currentLineIndex < this.lines.length - 1) {
          fill(255, 150);
          textSize(14);
          textAlign(RIGHT, BOTTOM);
          text("Click to continue...", this.boxX + this.boxWidth - this.textMargin, this.boxY + this.boxHeight - 5);
        }
      }
    
      advanceDialog() {
        if (this.isComplete) {
          return;
        }
        
        this.currentLineIndex++;
        
        if (this.currentLineIndex >= this.lines.length) {
          this.isComplete = true;
          console.log("Dialog complete!");
          return;
        }
        
        let currentLineObj = this.lines[this.currentLineIndex];
        if (currentLineObj.sound && dialogSounds[currentLineObj.sound]) {
          dialogSounds[currentLineObj.sound].play();
        }
      }
    }
    
    function drawGameOverScreen() {
      push();
      fill(0, 0, 0, 180); // dark overlay with 70% opacity
      rect(0, 0, width, height);
      
      let endingImg;
      if (gameEndingType === 'good') {
        endingImg = happyEndingImg;
      } else {
        endingImg = badEndingImg;
      }
      
      let imgWidth = 300;
      let imgHeight = 200;
      image(endingImg, width/2 - imgWidth/2, height/2 - 150, imgWidth, imgHeight);
      
      fill(255);
      textSize(40);
      textAlign(CENTER, CENTER);
      if (gameEndingType === 'good') {
        text("YOU WON!", width/2, height/2 - 30);
      }
      else {
        text("YOU LOST!", width/2, height/2 - 30);
      }
      textSize(24);
      text("Press R to restart", width/2, height/2 + 80);
      pop();
    }
    
    function keyPressed() {
      // Check if 'R' key is pressed to restart the game when game over
      if (key === 'r' || key === 'R') {
        if (gameState === 'game' && gameOver) {
          resetGame();
        }
      }
    }
    
    function resetGame() {
      hearts.count = 5;
      chipCounter.count = 10;
      gameOver = false;
      
      rightHand.sisterTurned = false;
      rightHand.animationStarted = false;
      rightHand.currentImg = 1;
      rightHand.heartDeducted = false;
    }

     

5. Communication between Arduino and p5.js:

The Arduino and the p5.js sketch exchange data over a 9600 baud serial link, as we did in the sample homework two weeks back. On each loop, the Arduino reads the joystick X-axis (A0), the pulse sensor (A1), and the select button (pin 8), then emits a line of the form “DIRX:<-1|0|1> PULSE:<0–1023> SEL:<0|1>” (this is the specific format I chose, partly because it made debugging easier). The p5.js sketch listens for these lines, parses out the numeric values for direction, heart rate, and button state, and uses them to drive menu selection, character motion, and game logic. When the sketch determines that the physical LED indicators should change color (based on the game state), it sends back a single-word command, “RED\n”, “YELLOW\n” or “GREEN\n”, via writeSerial(), and the Arduino responds by calling applyLeds() to set the appropriate digital outputs. I’m glad that I was able to implement bidirectional serial communication, where the Arduino not only sends information to the p5 sketch but also receives commands back from it.
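
For reference, here is a minimal sketch of what the Arduino side of this exchange could look like, based only on the description above (joystick X on A0, pulse on A1, select button on pin 8, an applyLeds() helper, and the DIRX/PULSE/SEL line format). The LED pin numbers, the joystick thresholds, and the body of applyLeds() are assumptions for illustration rather than the exact code used in the project:

// minimal sketch of the Arduino side of the protocol described above
// (LED pins and joystick thresholds are assumed values)
const int JOY_X_PIN  = A0;
const int PULSE_PIN  = A1;
const int SEL_PIN    = 8;
const int RED_PIN    = 9;   // assumed
const int YELLOW_PIN = 10;  // assumed
const int GREEN_PIN  = 11;  // assumed

void applyLeds(const String &state) {
  // light exactly one LED depending on the command sent by p5.js
  digitalWrite(RED_PIN,    state == "RED"    ? HIGH : LOW);
  digitalWrite(YELLOW_PIN, state == "YELLOW" ? HIGH : LOW);
  digitalWrite(GREEN_PIN,  state == "GREEN"  ? HIGH : LOW);
}

void setup() {
  Serial.begin(9600);
  Serial.setTimeout(50); // don't block long while waiting for LED commands
  pinMode(SEL_PIN, INPUT_PULLUP);
  pinMode(RED_PIN, OUTPUT);
  pinMode(YELLOW_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
}

void loop() {
  // map the raw joystick X reading to -1 (neutral), 0 (left) or 1 (right)
  int joyX = analogRead(JOY_X_PIN);
  int dir = -1;
  if (joyX < 300)      dir = 0;
  else if (joyX > 700) dir = 1;

  int pulse = analogRead(PULSE_PIN);
  int sel   = (digitalRead(SEL_PIN) == LOW) ? 1 : 0; // pressed = 1 with INPUT_PULLUP

  // one line per loop: "DIRX:<-1|0|1> PULSE:<0-1023> SEL:<0|1>"
  Serial.print("DIRX:");   Serial.print(dir);
  Serial.print(" PULSE:"); Serial.print(pulse);
  Serial.print(" SEL:");   Serial.println(sel);

  // read back single-word LED commands ("RED", "YELLOW", "GREEN") from p5.js
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();
    if (cmd.length() > 0) applyLeds(cmd);
  }

  delay(50);
}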

Reflection

I was really proud of how seamlessly the physical box integrated into the game environment; it didn’t feel tacked on, but like a natural extension of the game world I created. I’m also proud that I soldered the LEDs to the wires, because I was really afraid of soldering in the beginning; after a couple of tries on a sample piece, I was able to solder all 12 legs. In the future, I’d look to add a custom cursor that follows the joystick movements and actually wire up the Y-axis input so users can move that cursor freely around the screen. During playtesting, at least two people instinctively pushed the stick up and down, and of course nothing happened (since I hadn’t implemented Y at all), which led to a couple of awkward pauses. Enabling full two-axis control would make the interaction more intuitive and keep the game flow smooth.

Images from Showcase:

THANK YOU PROFESSOR & EVERYONE FOR AN AMAZING CLASS FOR MY LAST SEMESTER!

User Testing + Final Project

User Testing:

During user testing, the visuals alone weren’t clear in indicating what the project was for. The user mentioned that instructions on how to interact weren’t needed, but also that the purpose of the project wasn’t immediately clear.

 

For technical improvements I adjusted the interval averaging to 8 samples, which improved stability but slightly delayed responsiveness. I also tested different tempoMul ranges (originally 0.2–3) and settled on 0.5–2 to keep the music and visuals within a comfortable range.

User Testing Video

For my final project, I developed an interactive audiovisual experience that transforms a user’s heart rate into dynamic music and visuals, creating a biofeedback-driven art piece. Built using p5.js, the project integrates the Web Serial API to read pulse data from an Arduino-based heart rate sensor, generating a musical chord progression (Cmaj7-Am7-Fmaj7-G7) and WebGL visuals (swirling ellipses and a point field) that respond to the calculated BPM (beats per minute). Initially, I proposed a STEM-focused feature to educate users about heart rate, but I pivoted to make the music and visuals adjustable via mouse movements, allowing users to fine-tune the tempo and visual intensity interactively.

The heart rate sensor sends pulse data to p5.js, which calculates BPM to adjust the music’s tempo (chord changes, bass, kick drum) and visual animation speed. Mouse X position controls a tempo multiplier (tempoMul), scaling both music and visuals, while BPM directly influences animation speed and audio effects. The visuals were inspired by p5.js data visualization examples from class, particularly those using WebGL to create dynamic, responsive patterns. The project aims to create a meditative, immersive experience where users see and hear their heartbeat in real-time, with interactive control over the output.

Hardware:
  • Heart Rate Sensor: I used a PulseSensor connected to an Arduino Uno, wired directly to analog pin A0 without a resistor to simplify the circuit. The sensor is not attached to anything, so it can be freely picked up and interacted with.
  • Fabrication: I fit the Arduino cable and heart rate sensor through the cardboard SparkFun box. I avoided a finger strap due to wire fragility and inconsistent pressure, opting for a loose finger placement method.
  • Challenges: Direct wiring without a resistor may have increased noise in the pulse signal, requiring software filtering in the Arduino code. Loose finger contact sometimes caused erratic readings, so I adjusted the threshold and added a refractory period to stabilize detection.

The p5.js sketch reads serial data from the Arduino, calculates BPM, and updates music and visuals. Initially, I tried processing raw analog values in p5.js, but noise made it unreliable. After extensive debugging (around 10 hours), I modified the Arduino code to send pre-processed BPM estimates as integers (30–180 range), which streamlined p5.js logic. The mouse-driven tempoMul (mapped from mouse X) scales chord timing, note durations, and visual motion, replacing the STEM feature with an interactive control mechanism.

A significant challenge was balancing real-time BPM updates with smooth visualization. The visuals use the latest BPM, which can make animations appear jumpy if BPM changes rapidly. I averaged BPM over 8 intervals to smooth transitions, but this introduced a slight lag, requiring careful tuning. Serial communication also posed issues: the Web Serial API occasionally dropped connections, so I added robust error handling and a “Connect & Fullscreen” button for reconnection.

Arduino Code (Heart Rate Sensor):

#define PULSE_PIN A0
void setup() {
  Serial.begin(9600);
  pinMode(PULSE_PIN, INPUT);
}
void loop() {
  int pulseValue = analogRead(PULSE_PIN);
  if (pulseValue > 400 && pulseValue < 800) { // Basic threshold
    Serial.println(pulseValue);
  } else {
    Serial.println("0");
  }
  delay(10);
}
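
The snippet above only forwards raw readings that fall inside a basic window. The write-up also mentions tuning THRESHOLD and REFRACTORY values and eventually sending pre-processed BPM estimates as integers in the 30–180 range; below is a rough sketch of what that beat-detection stage could look like. The specific constants and variable names are assumptions for illustration, not the project’s final code:

// rough sketch of threshold + refractory beat detection (constants are assumed)
#define PULSE_PIN A0

const int THRESHOLD = 550;             // assumed rising-edge threshold on the raw signal
const unsigned long REFRACTORY = 300;  // assumed minimum gap between beats, in ms

unsigned long lastBeat = 0;
bool aboveThreshold = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(PULSE_PIN);
  unsigned long now = millis();

  // a beat is a rising edge through the threshold, outside the refractory window
  if (!aboveThreshold && raw > THRESHOLD && (now - lastBeat) > REFRACTORY) {
    aboveThreshold = true;
    if (lastBeat > 0) {
      unsigned long interval = now - lastBeat; // ms between consecutive beats
      int bpm = (int)(60000UL / interval);     // beats per minute from the interval
      bpm = constrain(bpm, 30, 180);           // clamp to the range the p5.js sketch expects
      Serial.println(bpm);                     // send the pre-processed BPM estimate
    }
    lastBeat = now;
  } else if (aboveThreshold && raw < THRESHOLD) {
    aboveThreshold = false; // signal fell back below the threshold; ready for the next beat
  }

  delay(5);
}
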
Images

 

The seamless integration of heart rate data into both music and visuals is incredibly rewarding. Seeing the ellipses swirl faster and hearing the chords change in sync with my heartbeat feels like a direct connection between my body and the art. I’m also proud of overcoming the Arduino noise issues by implementing software filtering and averaging, which made the BPM calculation robust despite the direct wiring.

Challenges Faced:
  • Arduino Code: The biggest hurdle was getting reliable pulse detection without a resistor. The direct wiring caused noisy signals, so I spent hours tuning the THRESHOLD and REFRACTORY values in the Arduino code to filter out false positives.
  • BPM Calculation: Calculating BPM in p5.js required averaging intervals (intervals array) to smooth out fluctuations, but this introduced a trade-off between responsiveness and stability. I used a rolling average of 8 intervals, but rapid BPM changes still caused slight visual jumps.
  • Visualization Balance: The visuals update based on the latest BPM, which can make animations feel abrupt if the heart rate spikes. I tried interpolating BPM changes, but this slowed responsiveness, so I stuck with averaging to balance real-time accuracy and smooth motion.
  • p5.js Visualization: Adapting WebGL examples from class to respond to BPM was tricky. The math for scaling ellipse and point field motion (visualSpeed = (bpm / 60) * tempoMul) required experimentation to avoid jittery animations while staying synchronized with the music.
  • Serial Stability: The Web Serial API occasionally dropped connections, especially if the Arduino was disconnected mid-session. Robust error handling and the reconnect button mitigated this, but it required significant testing.
Possible Improvements:
  • Smoother BPM Transitions: Implement linear interpolation to gradually transition between BPM values, reducing visual jumps while maintaining real-time accuracy.
  • Dynamic Color Mapping: Map BPM to the hue of the ellipses or points (e.g., blue for low BPM, red for high), enhancing the data visualization aspect.
  • Audio Feedback: Add a subtle pitch shift to the pad or bass based on BPM to make tempo changes more audible.
  • Sensor Stability: Introduce a clip-on sensor design to replace loose finger placement, improving contact consistency without fragile wires.
Reflection + Larger Picture:

This project explores the intersection of biofeedback, art, and interactivity, turning an invisible biological signal (heart rate) into a tangible audiovisual experience. It highlights the potential of wearable sensors to create personalized, immersive art that responds to the user’s physical state. The data visualization component, inspired by p5.js examples from class (e.g., particle systems and dynamic patterns), emphasizes how abstract data can be made expressive and engaging. Beyond art, the project has applications in mindfulness, where users can regulate their heart rate by observing its impact on music and visuals, fostering a deeper connection between the body and mind.

Week 13 — Final Project Progress

User Testing

So, I tried letting people play the game. I realized that most of them didn’t really look at the instructions. At first, I let them figure it out on their own, and after the first round they usually managed to understand how to play, which was nice. However, starting with the third user, I made more of an effort to explain things minimally, answer questions when they came up, or suggest they check the “How to Play” manual.

People really liked the design I was going for, so I kept that aesthetic and focused on making it pop and align with the theme I envisioned. One thing I noticed is that the wires could be better hidden with blue tape (to match the box), so I’ll likely add that along with other cosmetic improvements. I also want to tweak the win conditions, like changing how hearts are deducted or how chips are counted, because I felt it was a little too easy to win. I’d also like to add an LED that reflects the state of the game for clearer feedback.

One issue I noticed was that players didn’t fully understand the joystick controls, especially since there’s no visible cursor. I had to explain to at least two people that the joystick only moves left and right, not up and down. To help with this, I plan to make the “How to Play” manual more detailed so users can understand the controls better from the start.

Video 

(I know the video is after I added the LEDs..imagine them without because those were actually last minute additions + also for some reason I can’t attach the video as media so I’ll add a google drive link :>)

https://drive.google.com/file/d/1lNbEfx_gUZIfRPWL0JcJXUbqzv9c68sE/view?usp=sharing

 

Week 14: Final Project Documentation

Source

The full source code and accompanying documentation are available on GitHub.

Concept

This project is an interactive media piece that combines audio, visuals, and interactive elements to share the personal stories of students who faced repercussions for their politics. Using both digital and physical interfaces, users can explore these narratives at their own pace and in their own manner.

My motivation for choosing this cause was to highlight how similar these students’ situations are to our own as fellow international students, and to elicit empathy from the audience.

How It Works

The implementation consists of three main components working together:

1. Client Interface: A browser-based application built with p5.js that handles the visualization, audio playback, and user interaction.
2. Server Backend: A FastAPI backend that serves content and manages WebSocket connections for real-time control.
3. Physical Controls: An Arduino-based controller that provides tangible ways to interact with the digital content.

When a user accesses the application, they’re presented with a multimedia canvas that displays images related to a student’s story while playing their audio narrative. As the audio plays, transcribed text appears dynamically on screen, visualizing the spoken words. Users can control the experience either through on-screen interactions or using the physical Arduino controller.

Interaction Design

The interaction design prioritizes intuitive exploration and meaningful engagement:

Starting the Experience: Users simply click anywhere on the screen to begin
Content Navigation: Pressing buttons or clicking on screen transitions to the next story
Playback Controls:
– A physical switch pauses/resumes the audio
– Potentiometers adjust volume and playback speed
– On-screen interactions mirror these controls for users without the physical interface

The design deliberately minimizes explicit instructions, encouraging exploration and discovery. As found during user testing, providing just a small amount of context greatly improved the experience while still allowing for personal discovery. Users who spent more time interacting with the piece often discovered deeper layers of meaning as they engaged with multiple stories.

Arduino Implementation

The Arduino controller serves as a physical interface to the digital content, creating a more tangible connection to the stories. The circuit includes:

– A toggle switch for pausing/resuming content
– Two push buttons for transitioning between stories
– Three potentiometers for adjusting various playback parameters

The Arduino code continuously monitors these inputs and sends structured data to the server whenever a change is detected. The Arduino communicates with the server over a serial connection, sending a compact binary structure containing the state of all controls.
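
As a rough illustration of that idea, the sketch below packs the switch, button, and potentiometer states into a fixed-size struct and writes it over serial only when something changes. The pin assignments, field layout, and start-of-frame byte are assumptions for illustration; the actual project may frame its data differently:

// rough sketch: pack the control state into a small struct and send it only on change
// (pins, field layout and the 0xAA frame marker are assumed for illustration)
#include <string.h>  // for memset / memcmp

const int SWITCH_PIN   = 2;
const int BUTTON_A_PIN = 3;
const int BUTTON_B_PIN = 4;
const int POT_PINS[3]  = {A0, A1, A2};

struct ControlState {
  uint8_t  toggle;   // pause/resume switch
  uint8_t  buttonA;  // previous story
  uint8_t  buttonB;  // next story
  uint16_t pots[3];  // e.g. volume, playback speed, spare
};

ControlState lastSent;

void setup() {
  Serial.begin(9600);
  pinMode(SWITCH_PIN,   INPUT_PULLUP);
  pinMode(BUTTON_A_PIN, INPUT_PULLUP);
  pinMode(BUTTON_B_PIN, INPUT_PULLUP);
  memset(&lastSent, 0, sizeof(lastSent));
}

void loop() {
  ControlState s;
  memset(&s, 0, sizeof(s)); // clear any padding so memcmp is reliable
  s.toggle  = (digitalRead(SWITCH_PIN)   == LOW);
  s.buttonA = (digitalRead(BUTTON_A_PIN) == LOW);
  s.buttonB = (digitalRead(BUTTON_B_PIN) == LOW);
  for (int i = 0; i < 3; i++) s.pots[i] = analogRead(POT_PINS[i]);

  // only transmit when something actually changed since the last frame
  if (memcmp(&s, &lastSent, sizeof(s)) != 0) {
    Serial.write(0xAA);                            // assumed start-of-frame marker
    Serial.write((const uint8_t *)&s, sizeof(s));  // raw struct bytes
    lastSent = s;
  }

  delay(20);
}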

Future Improvements

Based on user testing and reflection, several areas could be enhanced in future iterations:

1. Introductory Context: Adding a brief introduction screen would help orient users.
2. More Intuitive Transitions: Some users were initially uncertain about how to navigate between stories.
3. Additional Narratives: Expanding the collection of stories would create a more comprehensive experience.

Final Project Documentation

Concept

I created this project to introduce viewers to four regions in the Northern Caucasus: Adyghea, Dagestan, Ingushetia, and Chechnya.

For the physical object, I pivoted and decided to build a house with four distinct doors, each representing one of the regions. Each door is embedded with a color-coded button, inviting the user to press it and explore that region.

  • Red Button – Ingushetia

  • Green Button – Dagestan

  • Blue Button – Chechnya

  • Yellow Button – Adyghea

This interaction was designed to make the user feel like a traveler stepping into a new cultural experience. Upon pressing a door button, users are “teleported” to a selected region. Each region features:

  • Must-See Spots: Highlighted landscapes, canyons and mountains

  • Traditions: Rituals, dances, or cultural practices—some of which are interactive

  • Taste of Region: Food visuals

Code

P5: Link to the no-Arduino version

Arduino code:

const int buttonPin1 = 4;  // Dagestan button
const int buttonPin2 = 2;  // Adyghea button
const int buttonPin3 = 5;  // Checnya button
const int buttonPin4 = 3;  // Ingushetia button

int buttonState1 = 0;
int buttonState2 = 0;
int buttonState3 = 0;
int buttonState4 = 0;

int lastButtonState1 = HIGH;
int lastButtonState2 = HIGH;
int lastButtonState3 = HIGH;
int lastButtonState4 = HIGH;

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin1, INPUT_PULLUP);
  pinMode(buttonPin2, INPUT_PULLUP);
  pinMode(buttonPin3, INPUT_PULLUP);
  pinMode(buttonPin4, INPUT_PULLUP);
}

void loop() {
  // Read button states
  buttonState1 = digitalRead(buttonPin1);
  buttonState2 = digitalRead(buttonPin2);
  buttonState3 = digitalRead(buttonPin3);
  buttonState4 = digitalRead(buttonPin4);

  // Check for button presses and only send once on press
  if (buttonState1 == LOW && lastButtonState1 == HIGH) {
    Serial.println("DAGESTAN");
  }    
  
  if (buttonState2 == LOW && lastButtonState2 == HIGH) {
    Serial.println("ADYGHEA");
  }
  
  if (buttonState3 == LOW && lastButtonState3 == HIGH) {
    Serial.println("CHECNYA");
  }
  
  if (buttonState4 == LOW && lastButtonState4 == HIGH) {
    Serial.println("INGUSHETIA");
  }
  
  // Save current button states for next comparison
  lastButtonState1 = buttonState1;
  lastButtonState2 = buttonState2;
  lastButtonState3 = buttonState3;
  lastButtonState4 = buttonState4;
  
  delay(50); // Debounce delay
}

Reflections:

This project cost me a lot of all-nighters, confusion, and frustration. At the same time, it taught me so much in both technical and non-technical aspects and became one of the most rewarding learning experiences I’ve had so far.

Originally, my idea was to create a compass-based interaction, but I struggled to get the p5.js and Arduino integration to work properly. After a whole night of troubleshooting and brainstorming, I pivoted the concept and fully committed to building a physical house. To bring that idea to life, I had to learn Adobe Illustrator from scratch to design the house structure, prepare it for laser cutting, and assemble it manually. I also learned how to solder and wire hardware components, connecting each button to the Arduino and ensuring smooth communication with the p5.js interface.

Even though my project wasn’t a game or something instantly engaging for everyone, I was quite surprised by the positive reactions at the showcase. Many visitors had never heard of the Caucasus before, yet they showed genuine curiosity to learn about its culture through an interactive format. Several people commented on the effort it must have taken to build the physical installation and to research and create the digital content for each region.

Overall, this project and the entire class had their highs and lows, but I truly loved the creative process, from just having an idea to actually executing it. Even when it seemed impossible at times, the process pulled me into a state of creative flow, and I enjoyed it.

Week 13: Work on Final Project

User Testing

For my project, I created an interactive media piece that tells the stories of students who faced consequences due to political protests. The user testing aimed to see how intuitive and engaging the experience was without any instructions.

I found that providing a small amount of context greatly improved the user experience. Users could scan a QR code to access additional multimedia content, which helped them engage more deeply. While some users were initially uncertain, a brief introduction to the controls and background information made them feel more comfortable exploring the project.

I also noticed that users who spent more time interacting with the piece took away more meaningful insights. They often stood and engaged with it for several minutes, discovering new layers of meaning as they went along.

To improve the work in the future, I may add an introduction screen with some helpful guidance, as well as provide a bit more context for what I was attempting to convey.