Week 14: Final Game Day

Concept:

Growing up, I heard a lot about the Royal Bengal Tiger of Bangladesh, our national symbol of bravery and strength. The tigers live in the Sundarbans, the largest mangrove forest in the world.

In Bangladesh, we have myths about the Sundarbans: its history of man-eating tigers, and a spirit/deity named “Bonbibi” who protects humans from the tigers. An amazing fact I learned while researching this project is that both Muslims and Hindus in that area believe in Bonbibi, which sits uneasily with the concept of Islam. Yet she is thought to be the daughter of Prophet Abraham. I did not want to miss the chance to introduce people to the Sundarbans and this legendary myth.

Inspirations:

While working on this project, I listened to the song “Bonobibi” by Coke Studio Bangladesh, which tells her story and connects it to Bangladeshi life on a very deep level. Another inspiration was Amitav Ghosh’s “The Hungry Tide”. I personally try to treasure these amazing parts of my culture and keep them close to my identity and personality. I am not sure how much of the story I managed to tell through my game, but the project was a way, not only for the IM course but also for me, to explore how to share my culture with a diverse body of students and faculty.

Project Interaction Archive:

Project Box:
Inner Circuit
Right Side View
Top View Of the Project Box


Player Interaction: IM Showcase 2024

INTERACTION DESIGN:

Arduino for Physical Interactions
The Arduino serves as the core hardware interface. It reads data from two ultrasonic sensors, a potentiometer, and a physical button, and sends it to the p5.js sketch via serial communication to manage gameplay.

The project is divided into two games. To start either game, the player needs to press a physical button. The button is connected to Arduino pin 7; when the Arduino reads a press, it notifies the p5.js sketch over serial so the game can start. The Arduino code detects presses by listening for a change of state (HIGH → LOW or LOW → HIGH) rather than the level itself. Since a button is the most familiar way to start a game, I used one for my final project as well.
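The edge-detection idea can be sketched on its own. This is an illustrative helper, not code from the project: it assumes a reading of `true` means the pin reads LOW (pressed, with INPUT_PULLUP wiring) and emits a message only on transitions, mirroring the `button:pressed` / `button:released` messages the Arduino sends over serial.

```javascript
// Illustrative sketch (not the project's actual code): report a button
// event only on a state change, not continuously while the button is held.
// reading: true = pressed (pin reads LOW with INPUT_PULLUP), false = released.
function makeButtonWatcher() {
  let wasPressed = false;
  return function (reading) {
    if (reading && !wasPressed) {
      wasPressed = true;
      return "button:pressed";  // transition into the pressed state
    }
    if (!reading && wasPressed) {
      wasPressed = false;
      return "button:released"; // transition out of the pressed state
    }
    return null;                // no change: send nothing
  };
}

// Only transitions produce messages:
const watch = makeButtonWatcher();
console.log(watch(true));  // "button:pressed"
console.log(watch(true));  // null (still held)
console.log(watch(false)); // "button:released"
```

This is why holding the button down does not spam the serial line: the message fires once per transition.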

Ultrasonic sensors detect hand proximity for selecting words based on their color. Two sensors were connected, one for yellow and one for white. I set the threshold to 15 cm, so anything closer than 15 cm registers as a touch, and if the obstacle is detected on the correct sensor, the score goes up. To decide which sensor is dedicated to which color, I mapped each ultrasonic sensor to a specific word color in the code: the sensor on the left detects yellow, while the one on the right detects white. This mapping is fixed by the hardware connections and the logic in the Arduino and p5.js code.

The Arduino identifies which sensor detects an obstacle by reading the distance from each sensor. If the left sensor’s reading drops below the 15 cm threshold, it corresponds to yellow; if the right sensor detects proximity, it corresponds to white. The data is then sent to p5.js via serial communication, which matches the color of the displayed word against the sensor input to determine whether the interaction is correct.
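The matching step can be sketched as a small helper. This is a hypothetical function, not the project’s actual p5.js code; it assumes the 15 cm threshold, the left-sensor = yellow / right-sensor = white mapping described above, and a reading of -1 when the sensor times out (out of range):

```javascript
const THRESHOLD_CM = 15; // hands closer than this count as a "touch"

// Given both sensor distances (cm) and the color of the word on screen,
// decide whether the player touched the correct sensor. A reading of -1
// means the sensor timed out (nothing in range), so it never matches.
function isCorrectTouch(distYellow, distWhite, wordColor) {
  const yellowTouched = distYellow > 0 && distYellow < THRESHOLD_CM;
  const whiteTouched = distWhite > 0 && distWhite < THRESHOLD_CM;
  if (wordColor === "yellow") return yellowTouched;
  if (wordColor === "white") return whiteTouched;
  return false;
}

console.log(isCorrectTouch(10, 40, "yellow")); // true  (left sensor, yellow word)
console.log(isCorrectTouch(40, 10, "yellow")); // false (wrong sensor)
console.log(isCorrectTouch(-1, 10, "white"));  // true
```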

For the second game, collecting flowers, the potentiometer lets the player move the basket horizontally. The potentiometer is connected to an analog pin on the Arduino, which reads its position as a value from 0 to 1023 (in practice, mine topped out around 990). This raw input is then mapped to the screen width in p5.js, so the basket’s movement follows the player’s adjustments smoothly. When the basket aligns with a falling flower, the game detects a collision and increases the score.
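In p5.js this mapping is a single map() call; the arithmetic behind it is shown below in plain JavaScript. `mapRange` is a stand-in for p5’s `map()`, the 1280 px canvas width is an example value, and the 990 ceiling is just my observed reading:

```javascript
// Plain-JS equivalent of p5.js map(): rescale value from [inMin, inMax]
// to [outMin, outMax] with a linear interpolation.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

const screenWidth = 1280; // example canvas width

// Nominal 10-bit range: the middle of the pot lands mid-screen.
console.log(mapRange(512, 0, 1023, 0, screenWidth)); // ≈ 640

// If the pot tops out near 990 in practice, mapping against 990 instead
// of 1023 lets the basket actually reach the right edge of the canvas:
console.log(Math.min(mapRange(990, 0, 990, 0, screenWidth), screenWidth)); // 1280
```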

 

Arduino Code:

 // Arduino Code for Sundarbans Challenge Game

// Define Button Pin
#define BUTTON_PIN 7

// Define Ultrasonic Sensor Pins
const int trigPinYellow = 9;
const int echoPinYellow = 10;
const int trigPinWhite = 11;
const int echoPinWhite = 12;

// Define Potentiometer Pin
const int potPin = A0;

// Variables to track button state
bool buttonPressed = false;

void setup() {
  Serial.begin(9600);
  pinMode(BUTTON_PIN, INPUT_PULLUP);

  // Initialize Ultrasonic Sensor Pins
  pinMode(trigPinYellow, OUTPUT);
  pinMode(echoPinYellow, INPUT);
  pinMode(trigPinWhite, OUTPUT);
  pinMode(echoPinWhite, INPUT);
}

void loop() {
  // Read Button State
  bool currentButtonState = digitalRead(BUTTON_PIN) == LOW; // Button pressed when LOW

  if (currentButtonState && !buttonPressed) {
    buttonPressed = true;
    Serial.println("button:pressed");
  } else if (!currentButtonState && buttonPressed) {
    buttonPressed = false;
    Serial.println("button:released");
  }

  // Read distances from both ultrasonic sensors
  long distanceYellow = readUltrasonicDistance(trigPinYellow, echoPinYellow);
  long distanceWhite = readUltrasonicDistance(trigPinWhite, echoPinWhite);

  // Read potentiometer value
  int potValue = analogRead(potPin); // 0 - 1023

  // Send data in "pot:<value>,ultra1:<value>,ultra2:<value>" format
  Serial.print("pot:");
  Serial.print(potValue);
  Serial.print(",ultra1:");
  Serial.print(distanceYellow);
  Serial.print(",ultra2:");
  Serial.println(distanceWhite);

  delay(100); // Update every 100ms
}

long readUltrasonicDistance(int trigPin, int echoPin) {
  // Clear the Trigger Pin
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);

  // Send a 10 microsecond HIGH pulse to Trigger
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Read the Echo Pin and calculate the duration
  long duration = pulseIn(echoPin, HIGH, 30000); // Timeout after 30ms

  // Calculate distance in centimeters
  long distance = duration * 0.034 / 2;

  // Handle out-of-range measurements
  if (duration == 0) {
    distance = -1; // Indicate out of range
  }

  return distance;
}

Description of Code:

The button, connected to pin 7, uses an internal pull-up resistor to detect when it’s pressed, sending a signal (button:pressed) to the computer. Two ultrasonic sensors measure distance: one for yellow and one for white. These sensors send a pulse and listen for its echo to calculate how close a hand is, allowing players to interact with words based on their color.
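The sketch converts the echo time to centimeters with duration * 0.034 / 2: sound travels roughly 0.034 cm per microsecond, and the measured duration covers the round trip to the obstacle and back, so it is halved. A quick check of the arithmetic, written in plain JavaScript to mirror the Arduino helper:

```javascript
// Echo duration (µs) -> distance (cm), mirroring readUltrasonicDistance():
// sound travels ~0.034 cm per microsecond, and the pulse makes a round
// trip, so the one-way distance is half the total.
function durationToCm(durationMicros) {
  if (durationMicros === 0) return -1; // pulseIn timeout: out of range
  return (durationMicros * 0.034) / 2;
}

console.log(durationToCm(882)); // ≈ 15 cm, right at the game's threshold
console.log(durationToCm(0));   // -1 (out of range)
```

So an echo of roughly 880 µs is the boundary where the game starts counting a hand as present.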

The potentiometer, connected to A0, controls the basket in the flower-collecting game by adjusting its position smoothly. All data — button states, distances, and potentiometer readings — is sent to the computer via serial communication in a structured format, based on the serial template file provided by the professor, and is read as serial data in p5.js.

Schematic Diagram: Tinkercad

P5.js Sketch:

P5.js Code with description commented:
// sketch.js 

// ----------------------
// 1. Global Variable Declarations
// ----------------------
let page = "start"; // Initial page
let howToPlayButton, mythologyButton, connectButton, fullscreenButton, backButton; // Buttons
let timer = 45; // Total game time in seconds
let countdown = timer;
let timerStart = false;
let gameOver = false;
let gameResult = "";
let score = 0;
let getReadyBg;
let endgame;



let gameStartInitiated = false; // Flag for transitioning from gameStart to game

// Typewriter effect variables
let typewriterText = "Press the green button to start your adventure!";
let currentText = ""; 
let textIndex = 0;  
let typewriterSpeed = 50; 




// Word display variables
let words = [
  "The", "Sundarbans", "is", "home", "to", "the",
  "man-eater", "Bengal", "tiger.", "Local", "honey", 
  "hunters,", "the", "Mowallis,", "brave", "its", "dangers",
  "to", "collect", "honey,", "often", "facing", "tiger", 
  "encounters.", "Myths", "surround", "these", "tigers,", 
  "people", "can", "only", "stay", "safe", "if", "they",
  "remember", "Bonbibi,", "the", "spirit."
];


let currentWordIndex = 0;
let wordInterval = 8000; // first 5 words appear every 8s
let fastInterval = 6000; // subsequent words every 6s
let wordDisplay = "";
let wordColor = "white";

// Basket game variables
let basketX, basketY;
let flowers = [];
let vehicles = []; // Water particles for aesthetic effect in basket game


let textToDisplay = "The Sundarbans, is home to the man-eater Bengal tiger. Local honey hunters, the Mowallis, brave its dangers to collect honey, often facing tiger encounters. Myths surround these tigers, people can only stay safe if they remember Bonbibi, the spirit.";


// Background Text for mythology and gameStart
let backgroundText = "In the Sundarbans myth, Bonbibi, the guardian goddess of the forest, defends the local woodcutters and honey gatherers from the malevolent tiger god, Dakshin Rai. The story highlights a young boy, Dukhey, who becomes a victim of a pact between his greedy employer and Dakshin Rai, aiming to sacrifice him to the tiger god. Bonbibi intervenes, rescuing Dukhey and ensuring his safe return, emphasizing her role as a protector and mediator between the natural world and human interests. This myth underscores the intricate balance between exploitation and conservation of the Sundarbans' resources.";

// Variables for serial communication
let latestData = "waiting for data";

// Images
let startBg, nightBg, bonbibi;

// Typewriter interval ID
let typewriterIntervalID;

// Variable to store the setTimeout ID for word transitions
let wordTimeoutID;

// Debugging Mode
let debugMode = false; // Set to false when using actual sensors

// Feedback Variables
let feedbackMessage = "";
let feedbackColor;
let feedbackTimeout;
let wordTouched = false;

// ----------------------
// 2. Function Declarations
// ----------------------

function setPage(newPage) {
  page = newPage;

  if (page === "start") {
    
    if (!bgMusic.isPlaying()) {
      bgMusic.loop(); // Loop the music
      bgMusic.setVolume(0.5); // Adjust volume (0.0 to 1.0)
    }
    
  
    connectButton.show();
    howToPlayButton.show();
    mythologyButton.show();
    fullscreenButton.show(); // Show fullscreen button on Start page
    backButton.hide(); // Hide Back button on Start page
    
  } else if (page === "howToPlay" || page === "mythology") {
    connectButton.hide();
    howToPlayButton.hide();
    mythologyButton.hide();
    fullscreenButton.hide(); // Hide fullscreen on these pages
    backButton.show(); // Show Back button on these pages
  } else if (page === "gameStart") { 
    console.log("Get Ready! Transitioning to game...");
    setTimeout(() => {
      setPage("game"); // Transition to game page after 18 seconds
    }, 18000);
    
    connectButton.hide();
    howToPlayButton.hide();
    mythologyButton.hide();
    fullscreenButton.hide();
    backButton.hide(); // hide Back button on new page
    
  }  else if (page === "secondGameIntro") {
    connectButton.hide();
    howToPlayButton.hide();
    mythologyButton.hide();
    fullscreenButton.hide();
    backButton.hide();
  } else {
    connectButton.hide();
    howToPlayButton.hide();
    mythologyButton.hide();
    fullscreenButton.hide();
    backButton.hide();
  }
}

function toggleFullscreen() {
  let fs = fullscreen();
  fullscreen(!fs);
  console.log(fs ? "Exited fullscreen mode." : "Entered fullscreen mode.");
}

function styleButtons(buttonColors = {}) {
  const defaults = {
    connect: '#855D08',
    howToPlay: '#B77607',
    mythology: '#8C5506',
    fullscreen: '#E1D8D8',
    back: '#555555'
  };

  const colors = { ...defaults, ...buttonColors };

  let buttonStyles = `
    color: rgb(255,255,255);
    font-size: 15px;
    border: none;
    border-radius: 5px;
    padding: 8px 20px;
    text-align: center;
    text-decoration: none;
    display: inline-block;
    margin: 5px;
    cursor: pointer;
    transition: all 0.3s ease;
    box-shadow: 0 4px 6px rgba(0,0,0,0.1);
  `;

  connectButton.style(buttonStyles + `background-color: ${colors.connect};`);
  howToPlayButton.style(buttonStyles + `background-color: ${colors.howToPlay};`);
  mythologyButton.style(buttonStyles + `background-color: ${colors.mythology};`);
  fullscreenButton.style(buttonStyles + `background-color: ${colors.fullscreen};`);
  backButton.style(buttonStyles + `background-color: ${colors.back};`);

  [connectButton, howToPlayButton, mythologyButton, fullscreenButton, backButton].forEach((btn, index) => {
    const baseColor = Object.values(colors)[index];
    btn.mouseOver(() => {
      btn.style('background-color', shadeColor(baseColor, -10));
      btn.style('transform', 'translateY(-2px)');
      btn.style('box-shadow', '0 6px 8px rgba(0,0,0,0.15)');
    });
    btn.mouseOut(() => {
      btn.style('background-color', baseColor);
      btn.style('transform', 'translateY(0px)');
      btn.style('box-shadow', '0 4px 6px rgba(0,0,0,0.1)');
    });
  });
}

function shadeColor(color, percent) {
  let f = parseInt(color.slice(1),16),t=percent<0?0:255,p=percent<0?percent*-1:percent,
  R=f>>16,G=(f>>8)&0x00FF,B=f&0x0000FF;
  return '#'+(0x1000000+(Math.round((t-R)*p)+R)*0x10000+(Math.round((t-G)*p)+G)*0x100+(Math.round((t-B)*p)+B)).toString(16).slice(1);
}


// All preload
//--------------
function preload() {
  startBg = loadImage('start.png', () => console.log("start.png loaded."), () => console.error("Failed to load start.png."));
  nightBg = loadImage('village.png', () => console.log("night.png loaded."), () => console.error("Failed to load night.png."));
  bonbibi = loadImage('bonbibi.png');
  endgame = loadImage('endgame.png');
  
    bgMusic = loadSound('start.mp3', () => console.log("Music loaded!"), () => console.error("Failed to load music."));

}

// setup
function setup() {
  createCanvas(windowWidth, windowHeight);
  textAlign(CENTER,CENTER);

  backButton = createButton('Back');
  backButton.position(width/2 - backButton.width/2, height-100);
  backButton.mousePressed(() => setPage("start"));
  backButton.hide();

  createButtons();
  styleButtons();
  setPage("start");

  feedbackColor = color(255);

  typewriterIntervalID = setInterval(() => {
    if(textIndex < typewriterText.length) {
      currentText += typewriterText[textIndex];
      textIndex++;
    }
  }, typewriterSpeed);
}

// draw
function draw() {

{
  if(page === "start" && startBg && startBg.width > 0) {
    imageMode(CORNER);
    image(startBg,0,0,width,height);
  } else if((page === "gameStart"||page==="game"||page==="won"||page==="lost"||page==="secondGameIntro"||page==="basketGame") && nightBg && nightBg.width>0) {
    imageMode(CORNER);
    image(nightBg,0,0,width,height);
  } else {
    background(30);
  }

  switch(page) {
    case "start":
      handleStartPage();
      break;
    case "gameStart":
      handleGameStartPage();
      break;
    case "game":
      handleGamePage();
      break;
    case "basketGame":
      handleBasketGame();
      break;
    case "howToPlay":
      handleHowToPlay();
      break;
    case "mythology":
      drawMythologyPage();
      break;
    case "secondGameIntro":
      handleSecondGameIntro();
      break;
    case "won":
    case "lost":
      handleEndPage();
      break;
    default:
      break;
  }

  if(feedbackMessage!==""){
    push();
    let boxWidth=width*0.3;
    let boxHeight=50;
    let boxX=width/2 - boxWidth/2;
    let boxY=height-80;

    fill('#F2D9A4');
    noStroke();
    rect(boxX,boxY,boxWidth,boxHeight,10);

    fill(feedbackColor);
    textSize(24);
    text(feedbackMessage,width/2,boxY+boxHeight/2);
    pop();
  }
}
}

//------------
// createButtons

function createButtons() {
  connectButton = createButton('Connect to Serial');
  connectButton.position(width/2 - connectButton.width/2, height/2 -220);
  connectButton.mousePressed(()=>{
    setUpSerial();
    connectButton.hide();
    console.log("Serial connection initiated.");
  });

  howToPlayButton = createButton('How to Play');
  howToPlayButton.position(width/2 - howToPlayButton.width/2, height/2 +205);
  howToPlayButton.mousePressed(()=>{
    setPage("howToPlay");
    console.log("Navigated to How to Play page.");
  });

  mythologyButton = createButton('Read the Mythology');
  mythologyButton.position(width/2 - mythologyButton.width/2, height/2 +255);
  mythologyButton.mousePressed(()=>{
    setPage("mythology");
    console.log("Navigated to Mythology page.");
  });

  fullscreenButton = createButton('Fullscreen');
  fullscreenButton.mousePressed(toggleFullscreen);

  positionButtons();

  if(page==="start"){
    connectButton.show();
    howToPlayButton.show();
    mythologyButton.show();
    fullscreenButton.show();
  } else {
    connectButton.hide();
    howToPlayButton.hide();
    mythologyButton.hide();
    fullscreenButton.hide();
  }
}

// positionButtons
function positionButtons(){
  let centerX=width/2;
  connectButton.position(centerX - connectButton.width/2, height/2 -220);
  howToPlayButton.position(centerX - howToPlayButton.width/2, height/2 +225);
  mythologyButton.position(centerX - mythologyButton.width/2, height/2 +265);
  fullscreenButton.position(centerX - fullscreenButton.width/2, height/2 +305);
}


function setupBasketGameElements(){
  basketY=height-50;
  basketX=width/2;
  score=0;
  countdown=timer;
  gameOver=false;

  flowers=[];
  vehicles=[];

  for(let x=0;x<=width;x+=10){
    for(let y=-200;y<height;y+=10){
      let v=new Vehicle(x,y);
      vehicles.push(v);
    }
  }

  noCursor();
}

// Page Handling Functions
function handleStartPage(){
  fill(255);
  textSize(48);
  stroke(0);
  strokeWeight(2);
  text("Myths Of Sundarban", width/2, height/13);

  noStroke();
  drawInstructionBox(currentText);
}

function handleGameStartPage() {
  background(0); // Dark background
  fill('#F7E4BA');
  textSize(36);
  stroke(0);
  strokeWeight(2);
  text("Get Ready To Play Using One Hand!", width / 2, height / 2 - 50);

  fill(255);
  textSize(24);
  text("Words will appear in yellow or white.\n", width / 2, height / 2 + 50);
  
  fill(255);
  textSize(24);
  text("There are two ultrasonic sensors: one for yellow and one for white", width / 2, height / 2 + 100);
  
  fill(255);
  textSize(24);
  text("If the word is yellow, place your hand in front of the yellow sensor.\n"+
    "If the word is white, place your hand in front of the white sensor.\n"+
    "Respond Quickly and Accurately to score points.", width / 2, height / 2 + 195);
  
  
  fill(255);
  textSize(24);
  text("IF YOU SCORE 20, YOU WILL JUMP TO THE NEXT LEVEL!", width / 2, height / 2 + 270);
}


function handleGamePage(){
  displayTimerAndScore();
  displaySensorData();

  if (wordDisplay !== "") {
  push();
  textAlign(CENTER, CENTER);
  textSize(64);

  // Choose a background color that contrasts well with yellow and white text
  let backgroundColor = wordColor === "yellow" ? color(0, 0, 0, 100) : color(0, 0, 0, 150); // Semi-transparent dark background
  
  // Border color to match the glowing effect
  let borderColor = wordColor === "yellow" ? color(255, 204, 0) : color(255); // Yellow border for yellow text, white border for white text
  
  // Calculate the width of the text to adjust the background size
  let textWidthValue = textWidth(wordDisplay);
  
  // Draw background rectangle with a border
  fill(backgroundColor);
  noStroke();
  let padding = 20; // Padding around the text
  rectMode(CENTER);
  rect(width / 2, height / 2, textWidthValue + padding, 80); // Adjust rectangle width based on text width
  
  // Draw the border around the background
  stroke(borderColor);
  strokeWeight(6);
  noFill();
  rect(width / 2, height / 2, textWidthValue + padding, 80); // Same size as the background
  
  // Draw the glowing text on top
  fill(wordColor === "yellow" ? color(255, 255, 0) : color(255)); // Glow color for text
  strokeWeight(4);
  stroke(wordColor === "yellow" ? color(255, 255, 0) : color(255)); // Glow color for text
  text(wordDisplay, width / 2, height / 2);
  pop();
}

}

function handleBasketGame(){
 
  background("#094F6D");
  
  noCursor();

  displayTimerAndScore();
  displaySensorData();

  for(let i=0;i<vehicles.length;i++){
    vehicles[i].update();
    vehicles[i].show();
  }

  for(let i=flowers.length-1;i>=0;i--){
    flowers[i].update();
    flowers[i].show();

    if(dist(flowers[i].pos.x,flowers[i].pos.y,basketX,basketY)<40){
      score++;
      flowers.splice(i,1);
    }

    if(flowers[i] && flowers[i].pos.y>height){
      flowers.splice(i,1);
    }
  }

  if(frameCount%30===0){
    let f=new Flower(random(50,width-50),-50,random(30,60),random(TWO_PI),floor(random(6)));
    flowers.push(f);
  }

  drawBasket();
  
 if (score >= 20 && !gameOver) {
    gameOver = true;
    gameResult = "Congratulations! You Win!";
    setPage("won"); // Transition to the winning page
    console.log("Player reached 20 flowers. Game won!");
    return; // Stop further game logic
  }
}

function handleEndPage() {
    background(50); // Set a dark background or whatever fits your game design
   image(endgame, 0, 0, width, height);
  
    textSize(48);
    textAlign(CENTER, CENTER);

    // Calculate rectangle and text positions
    let rectWidth = 800;
    let rectHeight = 100;
    let rectX = (width / 2);
    let rectY = (height / 2) - (rectHeight / 2);

    // Draw a background rectangle for the message
    fill(255, 204, 0); // Bright yellow color for both win and lose
    rect(rectX, rectY, rectWidth, rectHeight, 20); // Rounded corners with a radius of 20

    // Check the game result and set text and colors accordingly
    if (gameResult === "Congratulations! You Win!") {
        fill(0); // Black text for winning message
        text("Congratulations! You Win!", width / 2, height / 2-30);
    } else if (gameResult === "Time's Up! You Lose!") {
        fill(0); // Black text for losing message
        text("Time's Up! You Lose!", width / 2, height / 2-30);
    }

    // Additional UI elements
    textSize(24);
    fill(255); // White color for secondary text
    text("Press 'R' to Restart", width / 2, rectY + rectHeight + 40);
}


function handleHowToPlay(){
  clear();
  background(34,139,34);
  fill(255);
  textSize(32);
  text("How to Play:",width/2,height/5);

  textSize(24);
  text(
    "Words will appear in yellow or white.\n"+
    "If the word is yellow, use the yellow sensor.\n"+
    "If the word is white, use the white sensor.\n"+
    "Respond quickly and accurately to score points.",
    width/2,height/2
  );

  backButton.show();
}

function drawMythologyPage(){
  clear();
  background(34,139,34);
  fill(255);
  textSize(28);
  textAlign(LEFT,TOP);

  text(backgroundText,50,50,width-100,height-100);

  backButton.show();
  textAlign(CENTER,CENTER);
}

function handleSecondGameIntro() {
  // Change background image for this page
  if (bonbibi && bonbibi.width > 0) {
    imageMode(CORNER);
    image(bonbibi, 0, 0, width, height);
  } else {
    background('#7A3B0C'); // Fallback background color
  }

  // Box styling
  let boxWidth = width * 0.8; // Box width (80% of screen width)
  let boxHeight = height * 0.4; // Box height (40% of screen height)
  let boxX = (width - boxWidth) / 2; // Center horizontally
  let boxY = (height - boxHeight) / 2; // Center vertically

  noStroke();
  fill(0, 150); // Semi-transparent background for the box
  rect(boxX, boxY, boxWidth, boxHeight, 10); // Draw the box with rounded corners

  // Text inside the box
  fill(255);
  textSize(24);
  textAlign(CENTER, CENTER);

  // Split the text into vertical lines and display it
  let instructions = [
    "Collect flowers for Bonobibi."," Move basket by rotating the potentiometer.",
    "Press the button to start the basket game."
  ];
  let lineSpacing = 35; // Space between each line
  let textY = boxY + boxHeight / 2 - (instructions.length * lineSpacing) / 2;

  instructions.forEach((line, index) => {
    text(line, width / 2, textY + index * lineSpacing);
  });

  // Title text at the top
  textSize(36);
  stroke(2);
  strokeWeight(4);
  text("Prepare for the Next Challenge!", width / 2, boxY - 40);

  // Hide the back button
  backButton.hide();
}


// drawInstructionBox
function drawInstructionBox(textContent){
  textSize(18);

  let boxWidth=width*0.4;
  let boxHeight=60;
  let boxX=width/2 - boxWidth/2;
  let boxY=height/1.5 - boxHeight/12;

  noStroke();
  fill('rgb(165,88,8)');
  rect(boxX,boxY,boxWidth,boxHeight,10);

  fill(255);
  text(textContent,width/2,boxY+boxHeight/2);
}

function displayTimerAndScore(){
  push();
  textAlign(CENTER,CENTER);
  textSize(24);
  noStroke();
  fill(0,150);
  rectMode(CENTER);
  rect(width/2,50,220,60,10);
  fill(255);
  text("Time: "+countdown+"s | Score: "+score,width/2,50);
  pop();
}

function displaySensorData(){
  push();
  textAlign(LEFT,CENTER);
  textSize(16);
  noStroke();
  fill(0,150);
  rectMode(CORNER);
  rect(20,height-60,320,40,10);
  fill(255);
  text("Latest Data: "+latestData,40,height-40);
  pop();
}

function setupGameElements(){
  currentWordIndex=0;
  wordDisplay="";
  wordColor="white";
  countdown=timer;
  gameOver=false;
  gameResult="";
  score=0;

  wordInterval= 8000;
  console.log("Game elements reset.");

  gameStartInitiated=false;
}

function setNextWord() {
  if (gameOver) return;

  if (score > 20) {
    gameOver = true;
    gameResult = "Congratulations! You Win the First Game!";
    console.log("Score exceeded 20. Transitioning to second game intro...");
    setPage("secondGameIntro");
    return;
  }
  

  // Set the next word and its color
  wordDisplay = words[currentWordIndex];
  wordColor = random(["yellow", "white"]); // Assign color randomly
  currentWordIndex++;
  wordTouched = false; // Reset interaction flag

  // Schedule the next word transition
  wordTimeoutID = setTimeout(setNextWord, currentWordIndex < 5 ? 8000 : 6000);
}


let timerInterval;

function startTimer() {
  // Prevent multiple intervals
  if (timerStart) return;

  timerStart = true;
  clearInterval(timerInterval); // Clear any existing timer interval

  timerInterval = setInterval(() => {
    if (countdown > 0) {
      countdown--; // Decrement the timer
      console.log(`Timer: ${countdown}s`);
    } else {
      clearInterval(timerInterval); // Stop the timer when it reaches zero
      timerStart = false;
      if (!gameOver) {
        gameOver = true;
        gameResult = "Time's Up! You Lose!";
        setPage("lost"); // Go to the game over page
        console.log("Timer ended. Time's up!");
      }
    }
  }, 1000);
}

//--------------
// Debug with keyboard
//---------------

function keyPressed(){
  if(debugMode){
    if(key==='Y'||key==='y'){
      let simulatedData="5,15";
      console.log("Simulated Yellow Sensor Activation:",simulatedData);
      readSerial(simulatedData);
    }

    if(key==='W'||key==='w'){
      let simulatedData="15,5";
      console.log("Simulated White Sensor Activation:",simulatedData);
      readSerial(simulatedData);
    }

    if(key==='B'||key==='b'){
      let simulatedData="ButtonPressed";
      console.log("Simulated Button Press:",simulatedData);
      readSerial(simulatedData);
    }

    if(key==='P'||key==='p'){
      let simulatedData="600";
      console.log("Simulated Potentiometer Activation:",simulatedData);
      readSerial(simulatedData);
    }
  } else {
    if(key==='r'||key==='R'){
      if(page==="secondGameIntro"||page==="won"){
        setupGameElements();
        setPage("start");
        console.log("Game restarted to Start page.");
        currentText="";
        textIndex=0;

        clearInterval(typewriterIntervalID);

        typewriterIntervalID=setInterval(()=>{
          if(textIndex<typewriterText.length){
            currentText+=typewriterText[textIndex];
            textIndex++;
          }
        },typewriterSpeed);
      } else {
        setupGameElements();
        setPage("start");
        console.log("Game restarted to Start page.");
        currentText="";
        textIndex=0;

        clearInterval(typewriterIntervalID);

        typewriterIntervalID=setInterval(()=>{
          if(textIndex<typewriterText.length){
            currentText+=typewriterText[textIndex];
            textIndex++;
          }
        },typewriterSpeed);
      }
    }
  }
}
//------------


//----------
//Window Resize Code
//------------
function windowResized(){
  resizeCanvas(windowWidth,windowHeight);
  positionButtons();
  backButton.position(width/2 - backButton.width/2, height-100);
}

function calcTextSize(baseSize){
  return min(windowWidth,windowHeight)/800 * baseSize;
}

function updateTextSizes(){
  // Can be expanded if needed
}


// Class construction for drifting water particles
//---------------------
class Vehicle {
  constructor(x,y){
    this.pos=createVector(x,y);
    this.vel=createVector(0,random(1,3));
    this.acc=createVector(0,0);
  }

  update(){
    this.vel.add(this.acc);
    this.pos.add(this.vel);
    this.acc.mult(0);

    if(this.pos.y>height){
      this.pos.y=0;
      this.pos.x=random(width);
    }
  }

  show(){
    stroke(173,216,230,150);
    strokeWeight(2);
    point(this.pos.x,this.pos.y);
  }
}

let gameSpeed = 2.8; // Default speed multiplier for the game

class Flower {
  constructor(x, y, size, rotation, type) {
    this.pos = createVector(x, y);
    this.size = size;
    this.rotation = rotation;
    this.type = type;
    this.speed = random(3, 6) * gameSpeed; // Adjust speed with multiplier
  }

  update() {
    this.pos.y += this.speed;
  }

  show() {
    push();
    translate(this.pos.x, this.pos.y);
    rotate(this.rotation);
    drawFlower(0, 0, this.size, this.type);
    pop();
  }
}


function drawFlower(x,y,size,type){
  switch(type){
    case 0:
      drawDaisy(x,y,size);
      break;
    case 1:
      drawTulip(x,y,size);
      break;
    case 2:
      drawRose(x,y,size);
      break;
    case 3:
      drawSunflower(x,y,size);
      break;
    case 4:
      drawLily(x,y,size);
      break;
    case 5:
      drawMarigold(x,y,size);
      break;
    default:
      drawDaisy(x,y,size);
      break;
  }
}

function drawDaisy(x,y,size){
  let petalCount=9;
  let petalLength=size;
  let petalWidth=size/3;

  stroke(0);
  fill('#D9E4E6');
  push();
  translate(x,y);
  for(let i=0;i<petalCount;i++){
    rotate(TWO_PI/petalCount);
    ellipse(0,-size/2,petalWidth,petalLength);
  }
  pop();

  fill('#F2F2F2');
  noStroke();
  ellipse(x,y,size/2);
}

function drawTulip(x,y,size){
  let petalCount=6;
  let petalWidth=size/2;

  stroke(0);
  fill('#AEB7FE');
  push();
  translate(x,y);
  for(let i=0;i<petalCount;i++){
    rotate(TWO_PI/petalCount);
    ellipse(0,-size/2,petalWidth,size);
  }
  pop();

  fill('#EDEAE6');
  noStroke();
  ellipse(x,y,size/3);
}

function drawRose(x,y,size){
  let petalCount=10;
  let petalWidth=size/3;

  stroke(0);
  fill('#D87373');
  push();
  translate(x,y);
  for(let i=0;i<petalCount;i++){
    rotate(TWO_PI/petalCount);
    ellipse(0,-size/2,petalWidth,size/1.5);
  }
  pop();

  fill('#F5E6E8');
  noStroke();
  ellipse(x,y,size/4);
}

function drawSunflower(x,y,size){
  let petalCount=20;
  let petalLength=size*1.5;
  let petalWidth=size/2;

  stroke(0);
  fill('#FACA49');
  push();
  translate(x,y);
  for(let i=0;i<petalCount;i++){
    rotate(TWO_PI/petalCount);
    ellipse(0,-size/2,petalWidth,petalLength);
  }
  pop();

  fill('#6E4B1B');
  noStroke();
  ellipse(x,y,size);
}

function drawLily(x,y,size){
  let petalCount=6;
  let petalWidth=size/2;

  stroke(0);
  fill('#998D30');
  push();
  translate(x,y);
  for(let i=0;i<petalCount;i++){
    rotate(TWO_PI/petalCount);
    ellipse(0,-size/2,petalWidth,size);
  }
  pop();

  fill('#FBE7E7');
  noStroke();
  ellipse(x,y,size/4);
}

function drawMarigold(x,y,size){
  let petalCount=12;
  let petalLength=size;
  let petalWidth=size/2;

  stroke(0);
  fill('#F4A263');
  push();
  translate(x,y);
  for(let i=0;i<petalCount;i++){
    rotate(TWO_PI/petalCount);
    ellipse(0,-size/2,petalWidth,petalLength);
  }
  pop();

  fill('#FFC107');
  noStroke();
  ellipse(x,y,size/3);
}

function drawBasket() {
  fill("#F6DC89"); // Pale straw yellow for the basket body
  rectMode(CENTER);
  rect(basketX, basketY-10, 60, 20, 5); // Main basket shape
  
  // Adding lines to create a woven effect
  for (let i = -30; i <= 30; i += 6) {
    stroke(139, 69, 19); // Darker brown for the weave lines
    line(basketX + i, basketY - 20, basketX + i, basketY); // Vertical lines
  }
  
  for (let j = -10; j <= 0; j += 5) {
    stroke(160, 82, 45); // Lighter brown for a highlight effect
    line(basketX - 30, basketY + j - 10, basketX + 30, basketY + j - 10); // Horizontal lines
  }
  
  noStroke(); // Resetting stroke to default
}
function handleSerialEvent(event) {
  console.log("Handling Serial Event:", event);

  // Parse sensor data
  let keyValues = event.trim().split(',');
  let data = {};
  keyValues.forEach((kv) => {
    let [key, value] = kv.split(':');
    if (key && value !== undefined) data[key.trim()] = value.trim();
  });

  console.log("Parsed Data:", data);

  // Check for the physical button press
  if (data.button === "pressed") {
    if (page === "start") {
      console.log("Physical button pressed. Transitioning to gameStart page.");
      setPage("gameStart"); // Transition to 'Get Ready' page

      // Add a 15-second delay before transitioning to the 'game' page
      setTimeout(() => {
        setPage("game");
        setupGameElements();
        setNextWord();
        startTimer();
        console.log("Word game started after 15-second delay.");
      }, 15000);
      return;
    }

    if (page === "secondGameIntro") {
      console.log("Physical button pressed. Transitioning to basketGame page.");
      setPage("basketGame");
      setupBasketGameElements();
      startTimer();
      return;
    }
  }


  // Handle Sensor Data (Ultrasonics) for Game 1
  if (page === "game" && !gameOver && !wordTouched) {
    let distanceYellow = parseFloat(data.ultra1);
    let distanceWhite = parseFloat(data.ultra2);

    if (!isNaN(distanceYellow) && !isNaN(distanceWhite)) {
      console.log(`Sensor Readings - Yellow: ${distanceYellow}cm, White: ${distanceWhite}cm`);
      let touchThreshold = 5; // Proximity threshold

      // Check if the correct sensor is touched
      if (wordColor === "yellow" && distanceYellow < touchThreshold) {
        handleCorrectTouch();
      } else if (wordColor === "white" && distanceWhite < touchThreshold) {
        handleCorrectTouch();
      } else if (
        (wordColor === "yellow" && distanceWhite < touchThreshold) ||
        (wordColor === "white" && distanceYellow < touchThreshold)
      ) {
        handleIncorrectTouch();
      }
    } else {
      console.log("Invalid sensor data received.");
    }
  }

  // Handle Potentiometer for Basket Game (Game 2)
  if (page === "basketGame" && !gameOver) {
    let potValue = parseFloat(data.pot);
    console.log("Potentiometer Reading:", potValue);

    if (!isNaN(potValue)) {
      basketX = map(potValue, 0, 1023, 50, width - 50);
      basketX = constrain(basketX, 50, width - 50);
    } else {
      console.log("Invalid potentiometer value:", potValue);
    }
  }
}

// Helper for correct touch
function handleCorrectTouch() {
  score++;
  wordTouched = true;
  feedbackMessage = "Correct!";
  feedbackColor = color(0, 200, 0);

  // Clear the current word and load the next one
  wordDisplay = "";
  clearTimeout(wordTimeoutID); // Cancel any pending word transitions
  setTimeout(() => {
    feedbackMessage = ""; // Clear feedback after 500ms
    setNextWord(); // Transition to the next word
  }, 500); // Allow brief feedback display
}

// Helper for incorrect touch
function handleIncorrectTouch() {
  feedbackMessage = "Incorrect!";
  feedbackColor = color(200, 0, 0);
  wordTouched = true;

  // Clear feedback and reset touch state after a short delay
  setTimeout(() => {
    feedbackMessage = "";
    wordTouched = false; // Allow interaction again
  }, 500);
}



function readSerial(data){
  if(data.trim().length>0){
    latestData=data.trim();
    handleSerialEvent(latestData);
  }
}

Code Explanation: 

As I had two sensors, a potentiometer, and a button along with two levels, I had to manage the page state very carefully. I structured the game into multiple pages (start, gameStart, game, and basketGame), and the setPage() function ensures smooth transitions between these stages. For example, when the button is pressed, the game transitions from start to gameStart, with a delay before entering gameplay. I kept the words, which appear randomly in either yellow or white, together in one array. To draw the flowers, I generated several varieties modeled on the styles of flowers found in the Sundarban region.
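
The page flow can be modeled as a small state machine. Below is a simplified standalone sketch of that idea; the page names mirror my code, but the transition table itself is an illustration, not the actual project code:

```javascript
// Simplified sketch of the game's page flow. Page names match the sketch
// (start, gameStart, game, secondGameIntro, basketGame); the transition
// table is an illustrative stand-in for the real setPage() logic.
const transitions = {
  start: "gameStart",            // button press shows the 'Get Ready' page
  gameStart: "game",             // after the delay, gameplay begins
  game: "secondGameIntro",       // finishing game 1 introduces game 2
  secondGameIntro: "basketGame", // button press starts the basket game
};

let page = "start";

function setPage(next) {
  // Only allow transitions defined in the table above
  if (transitions[page] === next) {
    page = next;
    return true;
  }
  return false;
}
```

Restricting transitions this way keeps a stray button press or sensor reading from jumping the player into the wrong stage.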

Two sensors are dedicated to detecting hand proximity for selecting yellow and white words. My code receives distance values as ultra1 (yellow) and ultra2 (white). If a hand comes within the set threshold (5 cm in the code above), the game registers the corresponding selection and updates the score. I also log the readings to the console to monitor how the sensors are behaving.

Feedback messages like “Correct!” or “Incorrect!” are displayed dynamically based on player interactions. These are managed using helper functions like handleCorrectTouch() and handleIncorrectTouch(). The code also includes a timer and score tracker to keep the gameplay engaging and competitive.
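
The timer itself is not shown above; as a hypothetical sketch of how a startTimer()-style countdown could be structured (the duration and function names are my assumptions, not the project's actual code):

```javascript
// Hypothetical standalone version of a countdown timer. The real sketch
// would draw the remaining time on the p5 canvas each frame using millis();
// here the logic is reduced to pure functions for clarity.
const GAME_DURATION_MS = 30000; // assumed round length

function remainingSeconds(startMillis, nowMillis, durationMs = GAME_DURATION_MS) {
  // Time left, clamped at zero, rounded up to whole seconds for display
  const left = Math.max(0, durationMs - (nowMillis - startMillis));
  return Math.ceil(left / 1000);
}

function isGameOver(startMillis, nowMillis, durationMs = GAME_DURATION_MS) {
  return nowMillis - startMillis >= durationMs;
}
```

Keeping the timer as a pure function of the clock makes it easy to render every frame without juggling setInterval callbacks.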

The readSerial() and handleSerialEvent() functions handle all incoming data from the Arduino. I used the P5.web-serial.js file given by the Professor.

Aspects that I’m proud of: 

Even though the project does not showcase the story exactly the way I wanted to tell it, it was a satisfying play experience for the audience. The flower game in particular was tough to finish because of the randomness of flower speed and generation; it was hard even for me to win my own game. Professor Aya Riad told me that collecting the flowers with the potentiometer while the time runs down was very satisfying to play. Kids enjoyed this one because they liked the visuals, and since it was also part of my class assignments, I am glad I was able to refine my code and give it a new look.

For the word-touching game, I am proud that the concept was unique in the show and had a psychological side: the player needs to pay full attention both to the color of the word and to their own hand movement. Even though I had tons of problems with sensor calibration, I was able to pull it off nicely.

I also got responses from players who read the whole mythology, listened to the song, and asked me about it. That people were genuinely interested in the story behind the game was the biggest win for me.

Future Implementations:

I plan to implement all the levels and make it a full-fledged game that everyone can play while having fun learning about the culture and stories of this region. I also want to replace the ultrasonic distance sensors with other sensors, as they are hard to maintain when two of them are kept close together.

Week 13: Infinity Trials and Errors

Backstory: Giving Up, Trying Again and Moving Forward

I decided to build on my midterm project since it would be less time-consuming, but unfortunately it did not go according to plan. I made a 3D flying game (well, what was supposed to be a 3D flying game). I tried it without sensors first to see whether the graphics worked well, and showed it to my roommate for a user opinion; she said it might not be the best choice for my final game. I still spent some more time on it, but no luck!

Moving Forward: 

After digging deeper into the feasibility of finishing a polished project and into the limitations of the sensors, I decided to change my project. As I am a fan of making something impactful (a tendency I also criticise in myself, because making something fun and relaxing can be an amazing accomplishment in its own right), I chose the Sundarban mangrove forest and the mythology of the Bengal region as the basis of my game.

Context of the myth:

In the Sundarbans myth, Bonbibi, the guardian goddess of the forest, defends the local woodcutters and honey gatherers from the malevolent tiger god, Dakshin Rai. The story highlights a young boy, Dukhey, who becomes a victim of a pact between his greedy employer and Dakshin Rai, aiming to sacrifice him to the tiger god. Bonbibi intervenes, rescuing Dukhey and ensuring his safe return, playing her role as a protector and mediator between the natural world and human interests.

Game flow:

Game Flow On Paper

 

 

 

 

 

 

 

User Testing:

At this point, only the initial stage of the game was done. The design was finished, and the flex sensor was working correctly to advance to the first stage of the game. Graphics:

 

  Start Scene
Game Transition Scene

For the testing, people played only the very beginning, and they liked it for the idea and the graphics, which gave me some affirmation that yes, I could move forward with it.

I built the second part of the game earlier, while my flex sensors were working. This is the link to the second part, Bee. The sensors were able to connect, but calibrating two flex sensors, one for moving the bees and another for collecting honey, did not run smoothly. Still, as it was at least somewhat working with the keyboard, I kept it as my second part.

 

The third part of the game, where the player needs to run away from the tiger, was the crucial one. I placed two sonar sensors side by side, and the player simulates running steps: when the left sensor detects an obstacle, the right one does not, because the right leg is up; when the right sensor detects an obstacle, the left one does not, because now the left leg is up and the right leg is on the ground.
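
That alternating logic can be sketched as a small step counter. The threshold and names below are illustrative, not taken from the actual project code:

```javascript
// Sketch of the alternating-step logic: a "step" is counted each time the
// pair of sonar readings flips from (left blocked, right clear) to
// (left clear, right blocked), or vice versa. Threshold is an assumption.
const STEP_THRESHOLD_CM = 10;

function footDown(distanceCm) {
  return distanceCm < STEP_THRESHOLD_CM; // sensor sees the planted foot
}

function makeStepCounter() {
  let lastSide = null; // "left" or "right"
  let steps = 0;
  return function update(leftCm, rightCm) {
    const left = footDown(leftCm);
    const right = footDown(rightCm);
    // Exactly one foot should be down while running; only count a step
    // when the down foot switches sides.
    if (left && !right && lastSide !== "left") {
      lastSide = "left";
      steps++;
    } else if (right && !left && lastSide !== "right") {
      lastSide = "right";
      steps++;
    }
    return steps;
  };
}
```

Requiring the sides to alternate means a hand hovering over one sensor does not register as continuous running.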

I created a dummy version first to implement the logic and some basic visuals; I liked this one because my logic worked properly with the keyboard. Link to see the initial one, and the second draft. I was not satisfied with either of them.

Sensor connection on Arduino:

The Arduino Connections

Unfortunately, I did not take any video, but when my roommate tried to play the game, she was very confused by the sensors, as they were not working properly.

Only the storyline of the experience was satisfying for her, and also for my fellow classmates who saw my work on Wednesday.

I needed to explain what I was trying to achieve with my game. Moreover, as the sensor calibration was very troublesome and the movements were very poor, I decided to make it easier.

Having different sensors for different stages made it tough for users to understand what was happening and why they needed to use different sensors for different games. It was challenging all the way from circuit building to the final round of user testing.

I added clear instructions on how to play, what the mythology is about, and what the end goals are. Moving on, I again had to drop the tiger-and-player chase along with the bee, but I kept my plan with the boat and two sensors.

To reduce risk, I followed simple mechanisms and rules to build my final game. One thing I was sure of after user testing: whoever heard or saw the story liked it. The only remaining problem was how to design the gameplay.

 

 

Week 12: Final Project Draft

Concept:

The proposed project creates an interactive hand-gesture-based drawing system that enables users to draw on a digital canvas using hand movements. By combining P5.js, ML5.js Handpose, and Arduino, the project bridges physical gestures with digital creativity. The system uses hand tracking to allow natural interactions and integrates real-time physical feedback through LEDs and a vibration motor, offering an immersive experience.

The primary goal is to deliver a tactile and visual interface where users feel a sense of creation and engagement as their gestures directly influence the drawing and physical environment. The interplay between the digital canvas and physical feedback fosters an innovative and intuitive drawing system.

 

Arduino Program

The Arduino system responds to commands received from P5.js, controlling physical feedback components:

  • NeoPixel LEDs display colors that reflect the brush dynamics, such as the current color selected for drawing. Faster hand movements can increase brightness or trigger lighting effects.
  • Vibration motors provide tactile feedback during specific actions, such as clearing the canvas or activating special effects.

The Arduino continuously listens to P5.js via serial communication, mapping the incoming data (e.g., RGB values, movement speed) to appropriate hardware actions.

P5.js Program

The P5.js program uses ML5.js Handpose to track the user's hand and fingers. The tracked position, primarily the palm center or fingertip, is mapped to the digital canvas for drawing.

As the user moves their hand, the P5.js canvas renders brush strokes in real-time. Brush properties such as size and color dynamically adjust based on gestures or key inputs. For instance:

  • Moving the palm creates brush strokes at the detected location.
  • Pressing specific keys changes brush size (+/-) or color (C).
  • Gestures like a quick swipe could trigger special visual effects.

The program communicates with Arduino to send brush-related data such as the selected color and movement intensity. This ensures the physical environment (e.g., LED lighting) mirrors the user’s actions on the canvas.
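
As a hedged sketch, that brush data could be serialized as one comma-separated line per frame. The exact field layout (r, g, b, speed) is my assumption for illustration, not the project's actual protocol; the Arduino side would parse the same fields in order:

```javascript
// Illustrative encoder for the p5 -> Arduino brush message.
// Field layout "r,g,b,speed\n" is an assumed format for this sketch.
function encodeBrushMessage(r, g, b, speed) {
  // Clamp each field to the range the hardware expects before sending
  const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, Math.round(v)));
  return [
    clamp(r, 0, 255),     // NeoPixel red channel
    clamp(g, 0, 255),     // NeoPixel green channel
    clamp(b, 0, 255),     // NeoPixel blue channel
    clamp(speed, 0, 100), // movement intensity as a percentage
  ].join(",") + "\n";
}
```

A newline-terminated CSV line keeps the Arduino parser simple: `Serial.parseInt()` can read the fields in order and stop at the line break.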

Interaction Flow

  1. The system starts with a webcam feed processed by ML5.js Handpose.
  2. P5.js tracks hand movements and maps positions to the canvas.
  3. Users draw by moving their hand, with brush strokes appearing at the tracked position.
  4. Real-time brush properties, such as color and size, are sent to Arduino.
  5. Arduino reflects these changes through LEDs and tactile feedback:
    • The LEDs light up with the same brush color.
    • Vibration occurs during specific gestures or interactions, providing physical confirmation of actions.
  6. The user experiences synchronized digital and physical interactions, offering a sense of control and creativity.

Goals and Outcomes

The project aims to deliver an interactive tool that provides:

  1. A dynamic and intuitive drawing experience using hand gestures.
  2. Real-time physical feedback through LEDs and vibration motors.
  3. A visually and physically engaging system that bridges the digital and physical worlds.

Current Code:

Final Project: Initial Idea

Inspiration and Concept

“The Glass Box” draws inspiration from some of the most renowned interactive installations that seamlessly blend art, technology, and human emotion. Works like Random International’s “Rain Room” and Rafael Lozano-Hemmer’s “Pulse” have redefined how art responds to and engages with the presence of its audience. These pieces demonstrate how technology can turn human interactions into immersive, deeply personal experiences. For instance, “Rain Room” creates a space where participants walk through a field of falling rain that halts as they move, making their presence an integral part of the art. Similarly, “Pulse” transforms visitors’ biometric data, like heartbeats, into mesmerizing light and sound displays, leaving an impression of their presence within the installation.

In this spirit, “The Glass Box” is conceived as an ethereal artifact—a living memory keeper that reacts to touch, gestures, sound, and even emotions. It is designed to transform fleeting human moments into tangible, evolving displays of light, motion, and sound. Inspired further by works like “Submergence” by Squidsoup, which uses suspended LEDs to create immersive, interactive environments, and TeamLab’s Borderless Museum, where visuals and projections shift dynamically in response to viewers, “The Glass Box” similarly blurs the line between viewer and art. It invites users to actively shape its form and behavior, making them co-creators of a dynamic, ever-changing narrative.

The central theme of “The Glass Box” is the idea that human presence, though transient, leaves a lasting impact. Each interaction—whether through a gesture, a clap, or an expression—is stored as a “memory” within the box. These memories, visualized as layers of light, sound, and movement, replay and evolve over time, creating a collaborative story of all the people who have interacted with it. For example, a joyful wave might create expanding spirals of light, while a gentle touch might ripple across the sculpture with a soft glow. When idle, the box “breathes” gently, mimicking life and inviting further interaction.

 

Key Features

  1. Dynamic Light and Motion Response:
    • The Glass Box uses real-time light and motion to respond to user gestures, touch, sound, and emotions. Each interaction triggers a unique combination of glowing patterns, pulsating lights, and kinetic movements of the artifact inside the box.
    • The lights and motion evolve based on user input, creating a sense of personalization and engagement.
  2. Emotion-Driven Feedback:
    • By analyzing the user’s facial expression using emotion recognition (via ml5.js), the box dynamically adjusts its response. For example:
      • A smile produces radiant, expanding spirals of warm colors.
      • A neutral expression triggers soft, ambient hues with gentle movements.
      • A sad face initiates calming blue waves and slow motion.
  3. Memory Creation and Replay:
    • Each interaction leaves a “memory” stored within the box. These memories are visualized as layered patterns of light, motion, and sound.
    • Users can replay these memories by performing specific gestures or touching the box in certain areas, immersing them in a past interaction.
  4. Interactive Gestural Control:
    • Users perform gestures (like waving, pointing, or swiping) to manipulate the box’s behavior. The ml5.js Handpose library detects these gestures and translates them into corresponding light and motion actions.
    • For example, a waving gesture might create rippling light effects, while a swipe can “clear” the display or shift to a new pattern.
  5. Multi-Sensory Interactivity:
    • The box reacts to touch via capacitive sensors, sound via a microphone module, and visual gestures through webcam-based detection. This multi-modal interaction creates an engaging, immersive experience for users.
  6. Dynamic Visual Narratives:
    • By combining input data from touch, gestures, and emotions, the box generates unique, evolving visual patterns. These patterns are displayed as 3D light canvases inside the box, blending aesthetics and interactivity.
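
As an illustrative sketch of the emotion-driven feedback described above (the labels and response fields are assumptions; ml5.js would supply the actual emotion classification):

```javascript
// Illustrative mapping from a detected emotion label to the box's response,
// following the behavior listed above. Field names are assumptions.
function emotionResponse(emotion) {
  switch (emotion) {
    case "happy":
      return { pattern: "expanding spirals", palette: "warm", speed: "fast" };
    case "sad":
      return { pattern: "calming waves", palette: "blue", speed: "slow" };
    default: // neutral or unrecognized expression
      return { pattern: "ambient hues", palette: "soft", speed: "gentle" };
  }
}
```

Centralizing the mapping in one function makes it easy to tune each response without touching the detection or rendering code.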

 

Components

  1. Arduino Uno
  2. Servo Motors
  3. Capacitive Touch Sensors
  4. Microphone Module
  5. RGB LED Strips (WS2812)
  6. Webcam
  7. 3D-Printed Structure
  8. Glass Box (Frosted or Transparent)
  9. Power Supply (5V for LEDs)
  10. p5.js (Software)
  11. ml5.js Library (Gesture and Emotion Detection)

Image: Dall-E

Reading Reflection: Design Meets Disability

Reflecting on the insights from the “Design Meets Disability” document, it’s clear that not enough designs are created with disability in mind. The beginning of the book, which references San Francisco’s disability-friendly environment, reminds me of an interview between Judith Butler and Sunaura Taylor discussing how accessible San Francisco is compared to New York. This backdrop sets the stage for the book’s intriguing point about glasses. Glasses have significantly changed our views on vision impairment; they’re no longer seen as a taboo or an expensive burden. Thanks to their design evolution, people with vision impairments are not commonly tagged as disabled.

During a guest lecture at NYUAD, Professor Goffredo Puccetti, a graphic designer with visual impairments, shed light on the importance of inclusive design. His own experiences and professional expertise underscored how subtle design elements can vastly improve accessibility. He pointed out specific shortcomings at NYUAD, such as some door designs that fail to be truly disability-friendly. This gap between the institution’s inclusive intentions and their actual implementations has heightened my awareness of the practical challenges in achieving genuine accessibility.

Moreover, noise-canceling headphones have emerged as an assistive technology beneficial for individuals like my friend Shem, who has autism. She uses these headphones daily to work without distraction, showing how design can aid in overcoming some challenges posed by disabilities. However, mainstream designs often inadvertently promote inaccessibility, like buildings with stairs but no ramps, presenting significant barriers to the disabled community.

Even as NYUAD champions inclusion and accessibility, the actual campus design tells a different story. Those with permanent visual impairments struggle to access Braille signage without assistance, and the dining hall doors pose challenges for wheelchair users. This disparity prompts critical questions: What steps can institutions like NYUAD take to bridge the gap between their inclusive ideals and their physical implementations? How can designers, both current and future, better anticipate and address the diverse needs of their audience?

Understanding that there is no “one-size-fits-all” solution in design for disability, it becomes clear that more thoughtful methodologies and steps are needed. Designs should not only meet minimum standards of accessibility but should also strive to enhance autonomy and integration, ensuring that everyone can navigate spaces independently and with dignity.

In Class Activity: Zavier and Taskin


# Introduction

Hi there! 👋

These are a few in-class activities Zavier and I did this week (and had to post), revolving around serial communication between p5 and Arduino.

# Exercise 1: Arduino Affecting p5

 

Task:

“Make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on arduino is controlled by p5.”

 

Code:
Arduino:
 
void setup() {
	Serial.begin(9600);
	pinMode(A0, INPUT);
}

void loop() {
	Serial.println(analogRead(A0));
	delay(10);
}
 
p5:
 
let xPos = 0;

function setup() {
	createCanvas(600, 600);
	noFill();
}

function draw() {
	background(32, 64);
	
	stroke("white")

	ellipse(map(xPos, 0, 1023, 0, width), height/2, 100, 100);
	
	// Turn the screen red to make it very clear that we aren't connected to the Arduino
	if (!serialActive)
		background(128, 0, 0);
}

function keyPressed() {
	if (key == " ")
		setUpSerial(); // Start the serial connection
}

function readSerial(data) {
	if (data != null) // Ensure there's actually data
		xPos = int(data);
}			
 

 

# Exercise 2: p5 Affecting Arduino

 

Task:

“Make something that controls the LED brightness from p5.”

 

Code:
Arduino:
 
void setup() {
	Serial.begin(9600);
	pinMode(3, OUTPUT);
}

void loop() {
	analogWrite(3, Serial.parseInt());
	Serial.println(); // Reply with an empty line so p5's readSerial fires again
	delay(10);
}
 
p5:
 
let xPos = 0;
let LEDBrightness = 0; // 0 - 255

function setup() {
	createCanvas(600, 600);
}

function draw() {
	if (keyIsDown(UP_ARROW) && LEDBrightness < 255)
		LEDBrightness += 1;
	else if (keyIsDown(DOWN_ARROW) && LEDBrightness > 0)
		LEDBrightness -= 1;
	
	// Just a visual indicator of the brightness level on p5
	background(LEDBrightness);
	fill(LEDBrightness < 128 ? 'white' : 'black')
	text(LEDBrightness, 25, 25);
	
	// Turn the screen red to make it very clear that we aren't connected to the Arduino
	if (!serialActive)
		background(128, 0, 0);
}

function keyPressed() {
	if (key == " ")
		setUpSerial(); // Start the serial connection
}

function readSerial(data) {
	writeSerial(LEDBrightness);
}
 

 

# Exercise 3: Arduino and p5 Affecting Each Other

 

Demo:

 

Task:

“Take the gravity wind example and make it so every time the ball bounces one led lights up and then turns off, and you can control the wind from one analog sensor.”

 

Code:
Arduino:
 
const int LED_PIN = 3;
const int POT_PIN = A0;

void setup() {
	Serial.begin(9600);
	pinMode(LED_PIN, OUTPUT);
	pinMode(POT_PIN, INPUT);

	// Start the handshake
	while (Serial.available() <= 0) {
		Serial.println("0"); // Send a starting message
		delay(300); // Wait ~1/3 second
	}
}

void loop() {
	while (Serial.available()) {
		int LEDState = Serial.parseInt(); // 0 or 1
		
		if (Serial.read() == '\n') {
			digitalWrite(LED_PIN, LEDState);
			Serial.println(analogRead(POT_PIN));
		}
	}
}
 
p5:
 
let position, velocity, acceleration, gravity, wind; // vectors
let drag = 0.99, mass = 50, hasBounced = false;

function setup() {
	createCanvas(640, 360);
	noFill();
	position = createVector(width/2, 0);
	velocity = createVector(0,0);
	acceleration = createVector(0,0);
	gravity = createVector(0, 0.5*mass);
	wind = createVector(0,0);
	
	textAlign(CENTER);
	textSize(18);
}

function draw() {
	background(255);
	applyForce(wind);
	applyForce(gravity);
	velocity.add(acceleration);
	velocity.mult(drag);
	position.add(velocity);
	acceleration.mult(0);
	fill(hasBounced ? 'green' : 'white')
	ellipse(position.x,position.y,mass,mass);
	if (position.y > height-mass/2) {
		velocity.y *= -0.9;  // A little dampening when hitting the bottom
		position.y = height-mass/2;
		
		if (!hasBounced && abs(velocity.y) > 1) {
			hasBounced = true;
			setTimeout(() => hasBounced = false, 100); // Set hasBounced to false after 0.1 s
		}
	}  
	
	if (!serialActive) {
		background(8);
		fill('white');
		text("Press c to connect to the Arduino", width/2, height/2)
	}
}

function applyForce(force){
	// Newton's 2nd law: F = M * A
	// or A = F / M
	let f = p5.Vector.div(force, mass);
	acceleration.add(f);
}

function keyPressed(){
	if (key == ' '){
		mass = random(15,80);
		position.set(width/2, -mass);
		velocity.mult(0);
		gravity.y = 0.5*mass;
		wind.mult(0);
		
	} else if (key == 'c') {
		setUpSerial(); // Start the serial connection
	}
}

function readSerial(data) {
	if (data != null) { // Ensure there's actually data
		wind.x = map(int(data), 0, 1023, -2, 2);
		writeSerial((hasBounced ? 1 : 0) + '\n');
	}
}
 

 

Circuit:

 

 

Assignment Week 10: Interactive Musical Instrument

Inspiration

The WaveSynth project is inspired by the theremin, one of the first electronic instruments, invented in 1920 by Russian physicist Léon Theremin. Known for its eerie, vocal-like sound and its unique, touchless control, the theremin uses two antennas to detect the position of the player’s hands: one antenna controls pitch, and the other controls volume. By moving their hands through electromagnetic fields, players can create smooth, flowing sounds without touching the instrument. This expressive control has influenced generations of musicians and has become iconic in sci-fi, horror, and experimental music.

Concept

The WaveSynth is a gesture-controlled musical instrument that turns hand movements and environmental factors into dynamic sound. Designed to be both intuitive and expressive, the WaveSynth combines multiple sensors—ultrasonic, temperature, and a potentiometer—to create a cohesive interface.

The ultrasonic sensor detects hand distance, adjusting either pitch or volume based on the player’s proximity. The potentiometer serves as a mode selector, allowing the user to switch between pitch and volume control, as well as access different sound effects like vibrato, pulse, and temperature modulation. The temperature sensor adds an additional layer of subtlety, with ambient temperature shifts introducing slight pitch modulations, making the instrument responsive to its surroundings.

List of the hardware components used in the WaveSynth project:

  • Arduino Uno
  • HC-SR04 Ultrasonic Sonar Sensor (for gesture-based distance measurement)
  • TMP36GZ Temperature Sensor (for ambient temperature-based modulation)
  • 10k Ohm Potentiometer (for mode and effect selection)
  • Piezo Speaker (for sound output)
  • Connecting Wires (for connections between components and the Arduino)
  • Breadboard (for prototyping and circuit connections)
  • 310 Ohm Resistor (for LED circuit)

Schematic Diagram:

 

Code:

// Pin definitions
const int potPin = A0;            // Analog pin for potentiometer
const int tempPin = A1;           // Analog pin for TMP36GZ temperature sensor
const int trigPin = 3;            // Digital pin for sonar trigger
const int echoPin = 4;            // Digital pin for sonar echo
const int speakerPin = 9;         // Digital pin for speaker
const int ledPin = 5;             // Digital pin for LED (PWM-enabled)

// Variables
int effectType = 0;               // Tracks which effect is active (0: none, 1: vibrato, 2: pulse, 3: temperature modulation)

void setup() {
  pinMode(speakerPin, OUTPUT);      // Speaker as output
  pinMode(trigPin, OUTPUT);         // Sonar trigger as output
  pinMode(echoPin, INPUT);          // Sonar echo as input
  pinMode(ledPin, OUTPUT);          // LED as output
  Serial.begin(9600);               // For debugging output
}

// Function to read distance from the sonar sensor
long readDistance() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Measure the pulse duration on the echo pin
  long duration = pulseIn(echoPin, HIGH);
  
  // Calculate distance in centimeters
  long distance = duration * 0.034 / 2;
  return distance;
}

// Function to read temperature from the TMP36GZ
float readTemperature() {
  int tempReading = analogRead(tempPin);             // Read analog value from TMP36
  float voltage = tempReading * (5.0 / 1023.0);      // Convert reading to voltage (0-5V)
  float temperatureC = (voltage - 0.5) * 100.0;      // Convert voltage to temperature in Celsius
  return temperatureC;
}

void loop() {
  // Potentiometer to control mode and effect
  int potValue = analogRead(potPin);                  // Read potentiometer (0-1023)
  bool pitchMode = potValue < 512;                    // Below midpoint is pitch mode, above is volume mode
  
  // Determine the effect based on the potentiometer value ranges
  if (potValue < 256) {
    effectType = 0;                                   // No effect
  } else if (potValue < 512) {
    effectType = 1;                                   // Vibrato
  } else if (potValue < 768) {
    effectType = 2;                                   // Pulse
  } else {
    effectType = 3;                                   // Temperature modulation
  }

  // Read sonar distance and map to a lower pitch range for soothing tones
  long distance = readDistance();                     // Distance in cm
  int baseToneValue = pitchMode ? map(distance, 5, 50, 150, 600) : 440;  // Map distance to pitch if in Pitch Mode
  
  // Control LED brightness based on distance
  int ledBrightness = map(distance, 5, 50, 255, 0);   // Closer is brighter (5 cm = max brightness)
  ledBrightness = constrain(ledBrightness, 0, 255);   // Constrain within 0-255
  analogWrite(ledPin, ledBrightness);                 // Set LED brightness
  
  // Read temperature and map it to a gentle pitch effect
  float temperature = readTemperature();
  int tempEffect = map(temperature, 20, 35, 20, 80);  // Map temperature to subtle pitch modulation
  
  // Debug output to Serial Monitor
  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.print(" cm, LED Brightness: ");
  Serial.print(ledBrightness);
  Serial.print(", Pot Value: ");
  Serial.print(potValue);
  Serial.print(", Effect Type: ");
  Serial.print(effectType);
  Serial.print(", Temperature: ");
  Serial.print(temperature);
  Serial.println(" C");

  // Play sound based on the selected effect type
  switch (effectType) {
    case 0: // No effect
      tone(speakerPin, baseToneValue); // Basic tone based on distance
      break;
    
    case 1: // Smooth Vibrato
      for (int i = 0; i < 20; i++) {
        int vibratoTone = baseToneValue + (sin(i * 0.3) * 10); // Soft vibrato effect with lower amplitude
        tone(speakerPin, vibratoTone, 50); // Short tone bursts for vibrato
        delay(20); // Slightly slower delay for soothing vibrato effect
      }
      break;
      
    case 2: // Gentle Pulse
      tone(speakerPin, baseToneValue);      // Play base tone
      delay(100);                           // Pulse duration
      noTone(speakerPin);                   // Turn off sound briefly to create pulse
      delay(100);                           // Wait before next pulse
      break;
      
    case 3: // Temperature Modulation
      int tempModulatedTone = baseToneValue + tempEffect;  // Adjust pitch slightly based on temperature
      tone(speakerPin, tempModulatedTone); // Continuous tone with slight modulation
      delay(200); // Keep tone smooth
      break;
  }
  
  delay(50); // Small delay for stability
}

Media:

Working Process:

 

  1. Initial Setup and Calibration:
    1. When powered on, the Arduino initializes all sensors and components, including the ultrasonic sensor, temperature sensor, potentiometer, and speaker.
    2. The potentiometer’s position is read to determine the initial mode (Pitch or Volume) and effect (Vibrato, Pulse, Temperature Modulation, or None). The instrument is ready to interpret the player’s gestures and environmental inputs to start producing sound.
  2. Gesture Detection and Distance Measurement:
    1. The player positions their hand near the ultrasonic sensor and moves it to change sound properties.
    2. The ultrasonic sensor measures the distance between the player’s hand and the sensor by sending out an ultrasonic pulse and timing how long it takes for the pulse to bounce back.
    3. The distance value is calculated and then mapped to control either pitch or volume based on the selected mode:
      1. Pitch Mode: The distance between the sensor and the player’s hand changes the pitch of the sound. With the current mapping, closer hand positions produce lower pitches, while farther positions result in higher pitches.
      2. Volume Mode: In this mode, the distance controls the volume of the sound. Closer distances yield louder sounds, and farther distances make the sound quieter.
  3. Sound Modification through Effects:
    1. The potentiometer serves as a selector for various sound effects that add dynamic layers to the base tone. Depending on the potentiometer’s position, the following effects are applied:
      1. No Effect (Basic Tone): The sound responds directly to the pitch or volume based on the hand distance with no additional modulation.
      2. Vibrato Effect: The instrument adds a wave-like oscillation to the pitch, producing a gentle, undulating sound. This effect is applied continuously, allowing the sound to vary smoothly.
      3. Pulse Effect: The sound output is pulsed, creating a rhythmic on-and-off pattern. This effect provides a percussive quality, ideal for rhythmic play.
      4. Temperature Modulation: Ambient temperature subtly adjusts the pitch, creating an atmospheric and evolving sound that changes with the surrounding environment. This effect responds more slowly, allowing the sound to naturally vary over time.
  4. Environmental Adaptation with Temperature Modulation:
    1. When Temperature Modulation is selected, the temperature sensor reads the ambient temperature. The Arduino then uses this temperature reading to modulate the pitch subtly.
    2. For example, warmer temperatures gradually increase the pitch, while cooler temperatures lower it. This effect is gradual and blends naturally with the other sound properties, adding a unique, ambient quality to the instrument’s sound.
  5. Real-Time Sound Output:
    1. The piezo speaker produces sound based on the interpreted data, transforming distance measurements, temperature readings, and selected effects into real-time audio.
    2. The speaker continuously updates its output to reflect the current settings and environmental conditions, providing an immediate response to hand movements and mode changes.
    3. As the player moves their hand closer or farther from the ultrasonic sensor, the sound changes instantly in pitch or volume. Additionally, adjustments to the potentiometer instantly modify the effect applied to the sound.
  6. Interactive Feedback Loop:
    1. The player continuously interacts with the WaveSynth by adjusting their hand position, changing the potentiometer setting, and experiencing the evolving sound.
    2. This interactive feedback loop allows the player to dynamically control and modify the instrument’s output, creating an immersive musical experience that feels responsive and alive.

 

Future Improvement and Challenges

One of the primary challenges encountered was calibrating the sensors to respond smoothly and accurately to the user’s hand movements. Fine-tuning the pitch range and ensuring that the effects—such as vibrato and pulse—blended naturally with the sound output took several iterations to achieve a pleasing result.

The temperature sensor was also tough to work with on the Arduino board, and getting stable readings from it took extra effort.

Looking ahead, integrating digital sound synthesis or MIDI compatibility would enable users to connect the WaveSynth to other musical devices or software, greatly expanding its versatility as a tool for music creation.

Another possible enhancement could be the inclusion of LEDs or other visual feedback elements to indicate mode selection and provide dynamic light effects that correspond to the sound output. This would enhance the visual aspect of the instrument, making it even more engaging for live performances.

Reading Responses: Brief Rant

As a reader and technology enthusiast, I find Bret Victor’s “A Brief Rant on the Future of Interaction Design” to be a thought-provoking critique of current trends in human-computer interaction. Victor’s argument against “Pictures Under Glass” technology and his call for more tactile, three-dimensional interfaces resonate with my own experiences and frustrations with touchscreen devices.

Victor’s vivid descriptions of how we use our hands to manipulate objects in the real world highlight the limitations of current touchscreen interfaces. I’ve often felt that something was missing when using my smartphone or tablet, and Victor’s examples of reading a book or drinking from a glass perfectly capture that sense of disconnection. The richness of tactile feedback we get from physical objects is indeed absent from our flat, glassy screens.

However, I believe Victor’s critique, while insightful, doesn’t fully acknowledge the benefits of touchscreen simplicity and accessibility. In my experience, touchscreens have made technology more approachable for a wider range of users, including children and the elderly. The ease and intuitiveness of swiping and tapping have democratized access to digital tools in ways that more complex interfaces might not.

That said, I agree with Victor’s call for more ambitious visions in interaction design. His example of Alan Kay envisioning the iPad decades before its creation is inspiring and reminds us of the power of long-term, visionary thinking.

As someone who uses technology daily, I’m excited by the possibility of interfaces that better utilize our hands’ capabilities and even our entire bodies. Victor’s argument extends beyond just hands to encompass our whole bodies, noting that we have “300 joints” and “600 muscles.”

This resonates with my own experience of how we naturally use our whole bodies when interacting with the physical world. I’ve often felt constrained by the limited range of motion required to use current devices, and the idea of more holistic, full-body interfaces is intriguing.

While I appreciate Victor’s vision, I also recognize the practical challenges of implementing more tactile and three-dimensional interfaces: issues of cost, durability, and scalability would need to be addressed. Additionally, I believe the future of interaction design will likely involve a combination of approaches, including enhanced haptic feedback, hybrid interfaces that combine touchscreens with physical controls, and multimodal interaction incorporating touch, voice, and gesture.

Assignment: GripSense

Concept:

GripSense is designed to empower recovery and build strength through simple, intuitive grip exercises. Inspired by the resilience required in rehabilitation, it is aimed at individuals recovering from hand injuries, surgeries, or conditions like arthritis that limit hand mobility and grip strength. The idea is to create a device that not only measures improvement over time but also motivates users to engage in strength exercises by giving real-time feedback and adapting to different levels of force.

Work Procedure:

A squishy ball embedded with a force sensor is connected to 5V and analog pin A0 on the Arduino through a 1 kΩ resistor to ground. This forms a voltage divider, enabling the Arduino to detect varying pressure levels. For feedback, two LEDs are connected: a Light Touch LED on pin 8 with a 310Ω resistor for gentle squeezes, and a Strong Force LED on pin 10 with a 310Ω resistor for firmer grips. Each LED lights up based on the detected force level, providing immediate, intuitive feedback on grip strength.

The Arduino reads these values and provides real-time feedback through LEDs:

  • Light Touch LED (Blue): Lights up for a gentle squeeze, confirming the sensor is responding.
  • Strong Force LED (Yellow): Lights up for a firmer, stronger grip.

Code:

const int forceSensorPin = A0;  // Pin connected to the force sensor
const int lightTouchLED = 8;    // Pin for the first LED (light touch)
const int strongForceLED = 10;  // Pin for the second LED (strong force)

int forceValue = 0;             // Variable to store the sensor reading
int lightTouchThreshold = 60;   // Threshold for light touch
int strongForceThreshold = 300; // Threshold for strong force

void setup() {
  Serial.begin(9600);               // Initialize serial communication at 9600 bps
  pinMode(lightTouchLED, OUTPUT);   // Set LED pins as outputs
  pinMode(strongForceLED, OUTPUT);
}

void loop() {
  // Read the analog value from the force sensor
  forceValue = analogRead(forceSensorPin);

  // Print the value to the Serial Monitor
  Serial.print("Force Sensor Value: ");
  Serial.println(forceValue);

  // Check for strong force first to ensure it takes priority over light touch
  if (forceValue >= strongForceThreshold) {
    digitalWrite(lightTouchLED, LOW);    // Turn off the first LED
    digitalWrite(strongForceLED, HIGH);  // Turn on the second LED
  }
  // Check for light touch
  else if (forceValue >= lightTouchThreshold && forceValue < strongForceThreshold) {
    digitalWrite(lightTouchLED, HIGH);   // Turn on the first LED
    digitalWrite(strongForceLED, LOW);   // Ensure the second LED is off
  }
  // No touch detected
  else {
    digitalWrite(lightTouchLED, LOW);    // Turn off both LEDs
    digitalWrite(strongForceLED, LOW);
  }

  delay(100);  // Small delay for stability
}

Schematic Diagram:

Schematic Diagram
Figure: Schematic Diagram of GripSense

Media:

 

 

Reflection and Future Improvements:

In the future, I plan to enhance GripSense with several key upgrades:

  1. Secure Sensor Integration: I’ll embed the force sensor inside the squish ball or design a custom 3D-printed casing to hold both the ball and sensor. This will make the setup more professional, durable, and comfortable for users.
  2. Add OLED Display for Instant Feedback: I want to incorporate an OLED screen that will show live grip strength data, allowing users to see their strength level and progress instantly with each squeeze.
  3. Implement Multi-Level LED Indicators: To make feedback more visual, I’ll add a series of LEDs that light up at different levels of grip strength. For example, green will represent light grip, yellow for medium, and red for strong, giving users an intuitive, color-coded response.
  4. Data Logging for Progress Tracking: I aim to add an SD card module to log grip strength data over time, stored in a CSV format. This feature will enable users to track their improvement and analyze their progress with tangible data.
  5. Enable Bluetooth or Wi-Fi Connectivity: I plan to incorporate wireless connectivity so that users can view their grip data on a smartphone app. This will allow them to track progress visually with graphs, set goals, and receive real-time feedback, making the recovery journey more engaging and motivating.

These upgrades will help make GripSense a comprehensive, interactive tool that supports and inspires users through their rehabilitation process.

Reading Reflection: Week 09

Reading Tigoe’s blog post, “Physical Computing’s Greatest Hits (and Misses),” I feel both inspired and challenged by the themes he describes. Physical computing, as he explains, is filled with opportunities for creativity and reinvention. Some themes—like theremin-like instruments or LED displays—show up in physical computing classes over and over again, but Tigoe reminds us that even if something’s been done a hundred times before, there’s always room to put your own spin on it. This resonates with me, especially when I catch myself thinking, “Is this idea too overdone?” It’s encouraging to realize that originality can come from how we interpret and reimagine these classic projects.

Then there’s the question, How can I make my work intuitive and open-ended at the same time? It’s tricky to strike the balance between guiding users and letting them explore freely. For example, in my “soccer bot” project, I wanted the bot to recognize and interact with a soccer ball by “kicking” it. But the challenge wasn’t just in the technical aspects of recognizing the ball and calibrating the bot’s stability; it was about making the interaction feel intuitive and natural, like something anyone could understand right away. I realized that if the bot’s movements felt intentional and even a little lifelike, users could engage with it without needing instructions, adding an unexpected layer of playfulness to the project.

Another question Tigoe’s post brings up is, Can technology actually communicate emotions or sensations? This is something he addresses when he talks about projects like “remote hugs” and “meditation helpers.” I agree with him—replicating emotions or calm states is complex, maybe even impossible in some cases. But I wonder if there are ways technology could enhance or evoke those states instead of replicating them. Maybe it’s not about simulating emotion but creating something that encourages users to bring their own feelings into the experience. This could be a reminder that emotional connections in interactive art come from the users, not just from the tech or design itself.

Second Reading:

The idea that interactive art should leave room for the audience to interpret, experience, and even shape it is powerful—and surprisingly challenging. As artists, we’re often conditioned to pour ourselves into our work as a complete expression of what we want to say. But interactive art pushes us to let go of that control, to see ourselves less as “sole authors” and more as facilitators of an experience that isn’t ours alone.

One thing that really resonates with me is Tigoe’s assertion that our role, especially in interactive art, is not to dictate meaning or guide the viewer’s every move, but to create a space where people can explore. I think this goes against the grain of traditional art, where the artist’s interpretation and intent are often front and center. For me, that’s what’s exciting about interactive art: it’s unpredictable. You’re creating an experience without fully knowing where it will go, and you have to be okay with that ambiguity. There’s something freeing about stepping back and letting the audience complete the work in their way, with their own reactions, ideas, and personal connections.

In my own projects, I’ve noticed that the most memorable moments come not from following my own scripts but from watching how others interpret and transform what I’ve made. I remember one piece where I built a simple interactive environment with lights and motion sensors to respond to presence and movement. I’d imagined that people would move slowly through the space, savoring the changing light patterns. Instead, they ran, laughed, and turned it into an almost game-like experience. It wasn’t what I envisioned, but it was better in many ways—more alive, more spontaneous. This unpredictability is what Tigoe captures when he says to “listen” to the reactions and responses. Observing others engage with your work like this gives you insights you couldn’t have planned for, and it makes the artwork feel truly collaborative.