Final Project Documentation – Healthy Catch

Concept:

For my final project, I created an interactive healthy-choice game inspired by foods and everyday habits familiar in Kazakh culture. It features a girl character positioned at the center of the canvas. The core mechanic involves catching and avoiding falling items that represent healthy and unhealthy choices. Healthy foods such as apples, dates, milk, and water increase the player’s score, while unhealthy items like burgers and Coca-Cola reduce it by 10 points. Different point values are assigned to healthy foods (apples and milk give +20 points, while dates and water give +10) to encourage players to visually recognize and prioritize healthier options rather than relying solely on quick reactions.
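To make the scoring rules concrete, here is a minimal p5.js sketch of how such a lookup table could work; the table shape, the item names, the applyScore() helper, and the global score variable are illustrative assumptions, not the exact code from my sketch:

// Hypothetical score table; the values match the rules described above
const SCORE_TABLE = {
  apple: 20,
  milk: 20,
  date: 10,
  water: 10,
  burger: -10,
  cola: -10,
};

// Assumed helper called when an item is caught; score is a global
function applyScore(itemType) {
  score += SCORE_TABLE[itemType]; // healthy items add points, unhealthy subtract
}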

The game is structured into two alternating phases to maintain engagement. In Phase 1, apples, dates, and burgers appear (the food phase), while Phase 2 introduces milk, water, and Coca-Cola (the liquid phase). Player movement is controlled through real apples connected to Arduino using capacitive touch sensors, with Arduino sending control signals to p5.js. Touching the green apple moves the character left, while touching the red apple moves the character right, translating real-world interaction into on-screen movement. When the timer ends, p5.js communicates back to Arduino to trigger an LED. Kazakh national music plays during gameplay, reinforcing cultural context and creating an immersive experience that combines physical computing, digital interaction, and culturally grounded storytelling. On top of that, to hide the wiring and make the project look clean, I built a wooden box that holds the Arduino. The box has the game name engraved on top, apple designs on the sides, and traditional Kazakh ornaments engraved along the bottom and in the corners (please see the pictures below).

Project Image – 1

Project Image – 2
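Returning to the two-phase structure described above, a minimal sketch of a timer-based phase switch could look like this; PHASE_LENGTH, currentPhase, and the millis()-based timing are illustrative assumptions rather than the exact mechanism used in my sketch:

// Hypothetical phase timer; names and duration are assumptions
const PHASE_LENGTH = 15000; // assumed 15 seconds per phase
let currentPhase = 1;       // 1 = food phase, 2 = liquid phase
let phaseStart = 0;

function updatePhase() {
  // called once per frame from draw()
  if (millis() - phaseStart > PHASE_LENGTH) {
    currentPhase = (currentPhase === 1) ? 2 : 1; // alternate phases
    phaseStart = millis();
  }
}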

User Testing Video:

I am especially happy with my choice of controllers—the apples. While working on the project in the IM Lab, many people were curious about how the game worked and approached me with questions or asked to try it themselves. This led to informal user testing, where I was able to observe how people interacted with the game in real time. From this feedback, I learned how to make the project clearer and more accessible, and I added improved and more descriptive instructions so that first-time players could understand the controls more easily. Many users also mentioned that they enjoyed touching the real apples while playing. Some even pointed out the fresh apple smell, which made the experience feel more engaging and sensory. Please see the first user experience video:

User Testing Video

Interaction design:

The interaction design of the game relies heavily on embodied physical input through real apples, which successfully bridges the digital and physical worlds. This approach encourages players to engage tactilely with the interface, making the act of moving the character more memorable and immersive than conventional keyboard or touchscreen controls. Observations during testing revealed that the novelty of physically touching an apple, paired with its natural scent, enhances user engagement and creates an intuitive connection between the player’s actions and in-game consequences. This sensory dimension supports learning, as players are more likely to remember which items are healthy or unhealthy when the interaction is multisensory.

Arduino Part:

For the Arduino part I used the following wiring:

Schematic:

Code Snippet:

// Determine if sensors are being touched
bool rightActive = rightValue > (rightBase + touchOffset);
bool leftActive  = leftValue  > (leftBase  + touchOffset);

// Send RIGHT command only when the right sensor is newly touched
if (rightActive && !rightTouched) {
  Serial.println("RIGHT");
}

// Send LEFT command only when the left sensor is newly touched
if (leftActive && !leftTouched) {
  Serial.println("LEFT");
}

// Detect release: touch was active before but now inactive
bool rightReleased = rightTouched && !rightActive;
bool leftReleased  = leftTouched  && !leftActive;

// Send STOP only when both sensors are released
if ((rightReleased || leftReleased) && !(rightActive || leftActive)) {
  Serial.println("STOP");
}

// Update current touch state
rightTouched = rightActive;
leftTouched  = leftActive;

I really like the part of the Arduino code that checks when the apples are touched and sends commands. It only sends a command when a touch actually starts or ends, which prevents too many repeated messages and keeps the game running smoothly. The code tracks the transitions between three states: touch started, touch ended, and no touch, making the sensors easy to understand and reliable. By remembering whether each apple was already being touched, the Arduino responds only to intentional actions, so the character in the game moves accurately. This design makes the game feel responsive and natural when players use the apples. To build the capacitive touch sensors, I used the CapacitiveSensor library (Paul Badger’s CapacitiveSensor) and a 1 MΩ resistor, which allows the Arduino to detect even gentle touches on the apples. I also soldered my LEDs so they could be mounted inside the wooden box. This setup is effective for creating sensitive, stable input for the game.

Full Arduino Code on GitHub

P5 Code:

Sketch Link

function handleFallingObjects() {
    // Generate a new object approximately once every second (60 frames)
    if (frameCount % 60 === 0) {
        generateFallingObject();
    }

    // Iterate backwards to safely remove objects during the loop
    for (let i = fallingObjects.length - 1; i >= 0; i--) {
        let obj = fallingObjects[i];
        obj.y += objectSpeed; // Move object down
        drawObject(obj); // Draw object

        // Collision check 
        let d = dist(obj.x, obj.y, charX, charY);
        if (d < (obj.size / 2) + (CATCHER_RADIUS / 2)) {
            handleCatch(obj, i); // Process score and remove item
            continue; // Skip to next item since this one was removed
        }
        // Check if object has fallen off the bottom of the screen
        if (obj.y > height + 50) {
            fallingObjects.splice(i, 1); // Remove missed object
        }
    }
}

I like this part because it keeps the gameplay dynamic and challenging while staying organized. Objects appear randomly, move smoothly, and are removed cleanly when they are caught or missed, which prevents the array from growing unnecessarily. It also checks for collisions in a simple and effective way, making the game feel responsive and fun.

Communication between Arduino and p5.js:

The communication between Arduino and p5.js works by sending and receiving serial data. Arduino detects touches on the capacitive sensors and sends commands like LEFT, RIGHT, or STOP. The p5.js sketch listens for these messages and changes the character’s movement on the screen, creating a seamless interaction between the physical sensors and the digital game. Communication from p5.js to Arduino happens when the game is over: p5.js sends a signal, and the LED lights up.

// Serial Data Processing
function readSerial(data) {
    let msg = data.trim(); // Clean up incoming string from Arduino

    // Set movement direction based on serial commands
    if (msg === "LEFT") {
      currentDirection = -1; // Move left
    }
    else if (msg === "RIGHT"){
       currentDirection = 1; // Move right
    }
    else if (msg === "STOP") {
      currentDirection = 0; // Stop movement
    }
}
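For the opposite direction, p5.js only needs to send a single end-of-round message. Below is a minimal sketch of that send, assuming the course serial template’s serialActive flag and writeSerial() helper, plus hypothetical gameOver and timeLeft variables; the exact message string the Arduino listens for may differ:

// Hypothetical end-of-round check, called from draw()
function checkGameOver() {
  if (timeLeft <= 0 && !gameOver) {
    gameOver = true;
    if (serialActive) {
      writeSerial("GAME_OVER\n"); // Arduino reads this and turns on the LED
    }
  }
}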

Challenges and what I am most proud of:

This challenge is handled in the handleFallingObjects and generateFallingObject functions. Since this was my first time creating a game in p5.js where objects fall from the top, it was hard to manage at first. When I started, I noticed that too many objects were appearing at the same time or in the same spot, which made it hard for the player to catch anything. To manage how often objects appear, I used the game’s frame count as a timer. The program runs many times per second, and I checked the frame count to create a new object only when it reached a certain number, specifically every 60 frames. Since the game usually runs at 60 frames per second, this means a new object appears roughly once per second. This method made the falling items appear at a steady, predictable pace, giving the player enough time to react and catch them. I also set a random horizontal position for each object within the valid range using random(minX, maxX). What I am proud of is that after these adjustments, the game became much smoother and more playable. Each object now appears evenly across the screen and at a steady pace, making it easier for the player to react. This shows how careful timing and thoughtful use of randomness can improve the gameplay experience.
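Since generateFallingObject() itself is not shown above, here is a minimal sketch of what it could look like under the constraints just described; the phase item lists, the object size, and the field names are assumptions based on how handleFallingObjects() uses the objects, not my exact implementation:

// Hypothetical spawn function; item lists and size are assumptions
function generateFallingObject() {
  let phase1Items = ["apple", "date", "burger"]; // food phase
  let phase2Items = ["milk", "water", "cola"];   // liquid phase
  let items = (currentPhase === 1) ? phase1Items : phase2Items;

  let size = 50;
  let minX = size / 2;         // keep the object fully on screen
  let maxX = width - size / 2;

  fallingObjects.push({
    type: random(items),       // random item for the current phase
    x: random(minX, maxX),     // random horizontal spawn position
    y: -size,                  // start just above the top edge
    size: size,
  });
}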

Link to resources and use of AI:

I used a new library: Paul Badger’s CapacitiveSensor.

ChatGPT was used to generate the images for the start and play pages and to help me find sources about the capacitive touch sensor. To be more specific, when I asked ChatGPT to give me more information and tutorials on capacitive touch sensors, it pointed me to this tutorial, which I looked up to understand the wiring and the whole circuit better.

The Game Over page was made using Canva.

The images of the food, the liquids, and the character were found on the internet, specifically on this website.

Future Improvements:

For future improvements, I could make the game more dynamic and engaging by adding different difficulty levels where objects fall faster or in more complex patterns. I could also include more types of items with different point values or effects, like power-ups or obstacles. Another improvement would be to make the character movement smoother or add animations for catching objects to make it feel more responsive. On the Arduino side, I could refine the touch sensors to be more sensitive and consistent, or even add more sensors to allow new types of interactions. Finally, integrating sound effects for each catch or miss could make the game more immersive and fun.

IM Showcase:

Many people tried out my game, and they really liked the interaction. Please see the photos below.

Week 13 – User Testing

When I gave people a chance to try my game, they were generally able to figure it out easily. From my experience during the midterm project, I learned that providing clear instructions is very important, and I applied that lesson here. At first, players were surprised to see apples used as controllers, but I am happy with this decision because it made the game more engaging and encouraged curiosity. The instructions I provided, explaining that the red apple moves the character to the right and the green apple moves the character to the left, made it easy for them to understand the mapping between the controls and the game actions.

One small point of confusion was that I wrote “touch the apple,” but in practice, players needed to press or squish the apple for the character to move. This is something I can clarify better in the instructions for future users. The part of the game that worked well was moving the character from one side to another, as it felt natural and responsive. Another improvement suggested during user testing was related to catching items: originally, the character caught food near her legs at the very bottom of the screen, but after the suggestion, I moved the catch area so it aligns better with the character’s position. This made the game feel more intuitive and enjoyable.

The part of the experience that required more explanation was how the apples work as physical controllers. While users understood the concept after reading the instructions, I could improve this by adding a visual cue or short tutorial in the game showing how to press the apples to move the character. Overall, clear instructions, intuitive controls, and small adjustments based on user feedback made the game easy to understand and engaging, but highlighting the physical interaction more clearly could improve the first-time experience.

Please see below the user testing video:

User Testing

Week 12 – Final Project Proposal

Concept & Progress

For my final project, I chose to design an interactive healthy-choice game inspired by foods and habits familiar in Kazakh culture. By combining a digital game with physical inputs using Makey Makey, I want to create an experience that is playful, educational, and rooted in cultural references that feel personal and recognizable. The game features a girl character standing at the bottom of the p5.js canvas. The main mechanic of the game revolves around catching and avoiding falling items that drop from the top of the screen. I have already made a design of the playing page; you can see it below.

The falling items are divided into healthy and unhealthy options that commonly appear in everyday life in Kazakhstan. Healthy items such as apples, dates, milk, and water increase the player’s score, while unhealthy items like burgers and Coca-Cola must be avoided. Each healthy item gives a specific number of points: apples give +20, dates +10, milk +20, and water +10. This scoring system encourages players not just to move and react quickly, but also to distinguish between foods visually. The game is structured in two phases, which alternate after a set time. In Phase 1, apples, dates, and burgers fall. In Phase 2, milk, water, and Coca-Cola appear. When the timer runs out, the round ends, and p5.js communicates with Arduino to display GAME OVER on an LCD module attached to the Arduino. Players can restart the game to continue the experience. Physical interaction happens through real apples connected with Makey Makey, which transforms the apples into left and right movement controls. Touching one apple makes the character move left, and touching the other moves her right. This adds a tangible element to the game and ties the controller design back to the cultural theme of the project.

The p5.js program handles all visuals, item generation, collision detection, scoring, phase transitions, and timer logic. It also sends a serial signal to Arduino at the end of the game so that the LCD screen displays “GAME OVER.” Arduino, meanwhile, receives movement inputs from Makey Makey and translates the end-of-game signal into LCD output. I have started working on the basic structure of the p5.js code, including the falling item class, the character movement, the timer, and the two-phase logic. I am also working on the sprite sheet for the girl character so she matches the visual style of the game.

To summarize the inputs and outputs:

Arduino → p5.js

  • Makey Makey “key” inputs from apples (Left / Right movement), interpreted in p5 as arrow keys or assigned keys

p5.js → Arduino

  • Message: “Game Over” sent via serial
  • Arduino displays GAME OVER on the LCD screen

Preliminary concept for Final Project

For my final project, I will design an interactive healthy-choice game using both Arduino and p5.js. The game features a Kazakh girl character positioned at the bottom of the screen while different items fall from the top. The healthy items (apples, dates, milk, and water) are foods and drinks that people commonly consume in Kazakhstan, making the game culturally relevant and grounded in everyday life. These are contrasted with less healthy options such as burgers and Coca-Cola.

The game progresses in two alternating phases. In the first phase, apples, dates, and burgers fall from the sky. The player must collect apples and dates while avoiding burgers. After a set period, the game switches to a second phase, where milk, water, and Coca-Cola fall. In this phase, the player should collect milk and water while avoiding Coca-Cola. Each healthy item increases the score: apples give 20 points, dates 10, milk 20, and water 10. The player starts with three lives and loses one whenever they collide with an unhealthy item. When all lives are lost, the game ends, and p5 displays the final score.
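A minimal sketch of this lives mechanic, with assumed variable names (lives, gameState); note that the final version replaced lives with a score penalty, so this reflects the preliminary design only:

// Hypothetical lives check for the preliminary design
let lives = 3;

function handleUnhealthyHit() {
  lives -= 1;           // lose one life per unhealthy collision
  if (lives <= 0) {
    gameState = "over"; // assumed game-state variable; p5 then shows the final score
  }
}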

Arduino provides the physical interaction through three buttons: one for start/restart and two for moving left and right. Arduino continuously senses button presses and sends serial messages to p5.js, which updates movement, animations, and game states.

Week 11 – Serial Communication

Exercise 1: Make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on Arduino is controlled by p5.

Concept: When a person covers the light sensor, the reduced light makes the ball move horizontally in p5. The player controls the ball simply by changing how much light reaches the sensor.

P5 Code:

let address = 0;

function setup() {
  createCanvas(600, 600);
  noFill();
}

function draw() {
  background("purple");
  stroke("white");

  // Convert the incoming sensor reading (0–1023) into a horizontal screen position
  ellipse(map(address, 0, 1023, 0, width), height / 2, 100, 100);

  if (!serialActive) {
    // Show a connection screen while serial communication hasn’t started yet
    background("rgb(70,9,70)");
    stroke("white");
    textSize(50);
    text("Press Space Bar to select Serial Port", 20, 30, width - 30, 200);
  }
}

function keyPressed() {
  // When the space bar is pressed, begin the setup process for the serial port
  if (key == " ") setUpSerial();
}

function readSerial(data) {
  // If valid data arrives from the Arduino, save it for use in draw()
  if (data != null) {
    address = int(data);
  }
}

Arduino Code:

int sensorPin = A0; // photoresistor (light sensor) input

void setup() {
  Serial.begin(9600);
  pinMode(sensorPin, INPUT);
}

void loop() {
  int sensorValue = analogRead(sensorPin);
  Serial.println(sensorValue);
  delay(5);
}

GitHub Link

Setup and Schematic:


Video Demonstration

Exercise 2: Make something that controls the LED brightness from p5

Concept: When the player touches the trackpad and increases the sketch’s color from black to white, the LED also gets brighter.

// Holds the brightness value we will send to the Arduino
let brightness = 0;

// Stores any data received back from Arduino (not used, but required)
let latestData = "";

function setup() {
  // Create the canvas where visual feedback will appear
  createCanvas(600, 400);
  noStroke();
}

function draw() {
  // Clear the screen each frame with a black background
  background(0);

  // Convert trackpad/mouse X position (0 → width) into brightness (0 → 255)
  brightness = int(map(mouseX, 0, width, 0, 255));

  // Draw a rectangle whose fill intensity matches the brightness value
  fill(brightness);
  rect(0, 0, width, height);

  // If a serial port is active, send the brightness value to the Arduino
  if (serialActive) {
    writeSerial(brightness + "\n"); // "\n" ensures Arduino reads full numbers
  }

  // If serial is NOT open, show instructions to the user
  if (!serialActive) {
    background("purple");
    fill("white");
    textSize(28);
    text("Press SPACE to choose Serial Port", 20, 40);
  }
}

function keyPressed() {
  // Press SPACE to open the Web Serial port selection dialog
  if (key === " ") {
    setUpSerial();
  }
}

// This function is REQUIRED by p5.webserial
// It receives data sent from Arduino (even if unused)
function readSerial(data) {
  if (data) latestData = data;
}

// Sends data to Arduino IF the writer is available
function writeSerial(value) {
  if (writer) {
    writer.write(value);
  }
}

Arduino:

int ledPin = 9;
int brightness = 0;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    brightness = Serial.parseInt();
    brightness = constrain(brightness, 0, 255);
  }

  analogWrite(ledPin, brightness);
}

GitHub link

Setup and Schematic:


Video Demonstration

Exercise 3: Take the gravity wind example and make it so every time the ball bounces one LED lights up and then turns off, and you can control the wind from one analog sensor.

Concept: As the ball bounces, the red LED lights up. Using the potentiometer, the player controls the wind, making the ball move from one side to the other.

P5 Code:

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;
let on = 0;

function setup() {
  createCanvas(640, 360);
  //noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  wind = createVector(0,0);
}

function draw() {
  background(255);
  
  if (!serialActive) {
    text("Click on the Screen to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
  }
  
  applyForce(wind);
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x,position.y,mass,mass);
  if (position.y > height-mass/2) {
      velocity.y *= -0.9;  // A little dampening when hitting the bottom
      position.y = height-mass/2;
  }
  // track the ball's state: 0 when it rests on the ground, 1 while it is in the air
  if (position.y == height - mass / 2) {
    on = 0;
  } else {
    on = 1;
  }
}
function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}
function keyPressed(){
  if (key==' '){
    mass=random(15, 80);
    position.y=-mass;
    velocity.mult(0);
  }
}
function mousePressed() {
    setUpSerial();
}
function readSerial(data) {
  if (data != null) {
    // make sure there is actually a message
    wind.x = int(data); // convert the incoming string to a number
    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = on + "\n";
    writeSerial(sendToArduino);
  }
}

Arduino Code:

const int LED = 9;
const int POT = A0;

void setup() {
  Serial.begin(9600);
  pinMode(LED, OUTPUT);

  // Test the LED
  digitalWrite(LED, HIGH);
  delay(500);
  digitalWrite(LED, LOW);
}

void loop() {
  int p_value = analogRead(POT);           // read from the potentiometer
  int move = map(p_value, 0, 1023, -1, 2); // map the value to -1, 0, and 1
  Serial.println(move);

  if (Serial.available() > 0) {
    // read from p5.js
    int touch = Serial.parseInt();
    // set the LED accordingly
    if (touch == 1) {
      digitalWrite(LED, HIGH);
    } else {
      digitalWrite(LED, LOW);
    }
  }
}

GitHub file

Setup and Schematic:


Video Demonstration

Reflection: Across all three projects, I learned how different sensors can shape the experience through serial communication and p5. Working with light, the trackpad, and the potentiometer showed me how physical input can smoothly translate into visual changes on the screen. In the future, I would improve the responsiveness and make the serial connection more stable so the interactions feel smoother and more reliable.


Week 11 – Reading Reflection

Design Meets Disability:

As I was reading, what stayed with me most was how deeply design choices are entangled with social attitudes, often more than we admit. The author’s critique of “camouflage” design exposed how much stigma gets quietly built into objects that are supposed to help people. It made me reconsider how often “neutrality” in design actually reinforces harmful norms by trying to erase difference instead of valuing it.

I was also struck by the idea that design for disability shouldn’t be separated from regular, mainstream design. The examples of the Eames splint and Aimee Mullins’ prosthetics shifted my understanding of what inclusivity can look like. Instead of treating disability as a constraint to minimize, the reading frames it as a space for experimentation, a site where new forms, aesthetics, and cultural meanings can emerge. That idea felt surprisingly liberating, because it challenges designers to imagine disability not as a deviation, but as part of the full spectrum of human experience.

Also, the discussion of universal design made me question my own assumptions about what “inclusive” even means. Sometimes we equate inclusivity with adding features for everyone, but the reading suggests that true thoughtfulness might come from editing, refining, and listening to individual needs. It left me with a sense that designing responsibly requires both humility and boldness: the humility to recognize one’s blind spots, and the boldness to challenge conventions that limit how people see themselves.

Week 10 – Reading Reflection

A Brief Rant on the Future of Interaction Design:

When I was reading Bret Victor’s “A Brief Rant on the Future of Interaction Design” I thought about how I perceive technology and design. I realized how easily I accept “innovations” like touchscreens as the peak of progress, even though, as Victor argues, they often limit our potential rather than expand it. His critique of “pictures under glass” especially resonated with me, because I use my phone and laptop every day, but I rarely think about how numb those interactions actually are. There is no real feeling, no texture, no sense of connection between my hands and what I’m creating.

I think this reading challenged me to imagine interfaces that feel alive, that respond to our touch and movement in meaningful ways. Victor’s idea that tools should “amplify human capabilities” made me wonder whether I am designing for convenience or for human expression. I started thinking about how interaction could involve more of the body, maybe through gestures, pressure, or sound, so that users could experience technology in a fuller, more emotional way. I also liked Victor’s reminder that “the future is a choice.” It gave me a sense of agency and responsibility as a future designer. Instead of waiting for big tech companies to define how we interact, I can be part of shaping alternatives that are more tactile, intuitive, and human-centered. This reading did not just critique existing designs; it inspired me to dream bigger and to treat design as a way of expanding what people can truly feel and do.

A follow-up article

These responses challenged the way I think about technology and its relationship to the human body. His insistence that our current interfaces are “flat and glassy” made me realize how limited most digital experiences truly are. I started questioning how often I accept these limitations without noticing them. The idea that our tools should adapt to us, not the other way around, feels both radical and necessary. What I found most striking was his defense of our hands and bodies as essential to understanding and creativity. It made me see touch not as something trivial, but as a form of intelligence. The thought that we could lose part of that richness by constantly interacting with lifeless screens feels unsettling.

As someone studying Interactive Media, I see this as a call to design technologies that reconnect people with the physical world. Instead of chasing the newest gadget, I want to think about how digital experiences could feel more alive, how they could move, resist, or respond in ways that make us aware of our own presence. The reflections did not just critique modern design, they opened a space for imagining interaction as something deeply human, sensory, and expressive.

Week 10 – Musical instrument

Group members: Aizhan and Darmen
Concept:

As we began planning how to present our musical instrument and the sound produced by the buzzer while using both digital and analog sensors, we decided to use a button as our digital sensor and a distance-measuring sensor as our analog sensor. The main concept is that the distance sensor detects how far an object or hand is, and based on that distance, it produces different notes and sounds through the buzzer. When the button is pressed, the system pauses for 300 milliseconds (0.3 seconds) and temporarily stops reading distance values, effectively muting the instrument. Pressing the button again reactivates the sensor, allowing the instrument to continue playing notes according to the object’s position. This simple toggle system makes it easy to control when the instrument is active, giving users time to experiment with the sound and movement.

Arduino Setup and Demonstration: 

Schematic Diagram: 


Setup:


Video Demonstration

Coding:

Arduino file on GitHub

if (sensorActive) {
  int distance = getDistance();
  Serial.println(distance);
  if (1 < distance && distance < 5) {
    tone(BUZZER, NOTE_C4);
  } else if (5 < distance && distance < 10) {
    tone(BUZZER, NOTE_D4);
  } else if (10 < distance && distance < 15) {
    tone(BUZZER, NOTE_E4);
  } else if (15 < distance && distance < 20) {
    tone(BUZZER, NOTE_F4);
  } else if (20 < distance && distance < 25) {
    tone(BUZZER, NOTE_G4);
  } else if (25 < distance && distance < 30) {
    tone(BUZZER, NOTE_A4);
  } else if (30 < distance && distance < 35) {
    tone(BUZZER, NOTE_B4);
  } else {
    noTone(BUZZER);
  }
}

float getDistance() {
  float echoTime;           // time it takes for a ping to bounce off an object
  float calculatedDistance; // distance calculated from the echo time

  // send out an ultrasonic pulse that's 10 microseconds long
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  echoTime = pulseIn(ECHO_PIN, HIGH);   // pulseIn measures how long the pulse takes to bounce back
  calculatedDistance = echoTime / 55.2; // convert the echo time to a distance in cm
  return calculatedDistance;
}

We would say the highlight of the code is the part where the distance-measuring sensor interacts with the buzzer to produce different musical notes based on how close or far an object is. We were excited to experiment with the distance-measuring sensor and explore how it could be programmed to produce sound through the buzzer. To get a better understanding of integrating the notes, we referred to Arduino tutorials.

In the code above, the sensor continuously measures the distance of an object and converts the time it takes for the sound wave to bounce back into centimeters. Depending on the measured distance, the buzzer plays different musical notes, creating a simple melody that changes as the object moves closer or farther away.

Reflection:

We enjoyed experimenting with the distance-measuring sensor as it taught us how precise sensor readings can be transformed into meaningful outputs like sound, and combining it with the button control helped us manage the instrument’s activation smoothly. For future improvements, we would like to expand the range of notes to create more complex melodies and add LED lights that change color with the pitch to make the instrument more visually engaging. We could also experiment with different sensors, such as touch or motion sensors, to add more layers of interactivity. Finally, refining the accuracy and response speed of the sensor would make the sound transitions smoother and enhance the overall experience.

Week 9 – Reading Reflection

Physical Computing’s Greatest Hits (and Misses)

Reading Physical Computing’s Greatest Hits (and Misses) really made me reflect on the role of human presence and movement in interactive design. The article explores recurring motifs in physical computing, such as theremin-style sensors, body-as-cursor interfaces, tilty tables, and mechanical pixels, and evaluates which ideas succeed and which fall short. What struck me most was the idea that interaction alone is not meaningful unless it is framed with intention or context. I found this insight particularly relevant because gestures, motion, and bodily engagement only carry meaning when integrated into a space or narrative. The article also emphasizes that even commonly used ideas can be made fresh through variation and creativity.

The discussion of emotionally responsive projects, like “remote hugs”, also inspired me to think about the potential of physical computing to create connection and presence. It made me consider designing experiences where participants’ actions are not only triggers for a response but also carriers of meaning, emotion, or narrative. I found myself imagining interactive installations or performance spaces where movement, gesture, and proximity could communicate emotion or tell a story, giving participants a sense of agency and contribution. Overall, the article reinforced the importance of centering human input and intention over technical complexity. It motivated me to experiment more boldly with interactive media, blending technology, space, and human engagement in ways that feel purposeful, immersive, and emotionally resonant.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

When I was reading Making Interactive Art: Set the Stage, Then Shut Up and Listen, I noticed the perspective of the audience or participant as a co-creator, where the creator’s role is to design opportunities for exploration and discovery. The article encouraged setting the stage, providing affordances, and then stepping back to let participants engage on their own terms. This concept resonated deeply with me because I often feel the need to over-explain or control how people interact with my work, whether in interactive media projects, installations, or themed environments. Learning to trust participants’ curiosity and creativity is both challenging and exciting, and it made me rethink how I approach design: sometimes the most compelling experiences arise when the creator resists guiding every step and instead observes how others explore, interpret, and respond.

I also liked the idea of “listening” to participant interaction. Observing how people engage, adapt, or even misuse an interactive installation can reveal insights the creator never intended, and these discoveries can guide future iterations. This connects to my interests in performance and immersive storytelling because, in both cases, the audience’s reactions shape the experience. It also made me reflect on how I design spaces and experiences outside of class projects, including themed parties or interactive setups, where I can experiment with encouraging participation rather than prescribing behavior. The article inspired me to embrace unpredictability, co-creation, and emergent experiences, reminding me that interaction is not just about technology or novelty; it is about creating a dynamic relationship between the participant, the space, and the narrative. Now, I want to apply this mindset to my projects, designing experiences where participants’ actions carry weight and meaning, and where discovery becomes a central part of engagement.

Week 9 – Analog input & output

Concept:

For my project, I wanted to create a system that could control multiple LEDs in both a digital and an analog fashion, and I drew inspiration from traffic lights. Instead of just two LEDs, I decided to use three LEDs to mimic a simplified traffic light sequence. When the push button is pressed, the system cycles through the lights in order (red → green → yellow), which demonstrates digital control, as each LED turns fully on or off with each button press.

For the analog aspect, I incorporated a potentiometer connected to an analog input pin on the Arduino. By adjusting the potentiometer, users can control the brightness of the LEDs, allowing for smooth, variable light intensity. Together, the push button and potentiometer create an interactive system: pressing the button cycles through the traffic light colors, while turning the potentiometer adjusts their brightness. This combination not only makes the project functional and educational but also connects to the familiar real-world concept of traffic lights.

Arduino Setup and Demonstration: 

Setup

Arduino video

Hand-drawn schematic

Coding:

Arduino File on GitHub

// Pins
const int POTENTIOMETER_PIN = A0; // Analog input pin for potentiometer
const int BUTTON_PIN = 2; // Digital input pin for push button
const int RED = 5;    // PWM output pin for the red LED
const int GREEN = 6;  // PWM output pin for the green LED
const int YELLOW = 9; // PWM output pin for the yellow LED

// Variables
int potentiometerValue = 0; // Potentiometer value
bool buttonState = false; // Button state
int colorIndex = 0; // Index of current color

void setup() {
  pinMode(POTENTIOMETER_PIN, INPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(RED, OUTPUT);
  pinMode(GREEN, OUTPUT);
  pinMode(YELLOW, OUTPUT);
}

void loop() {
  potentiometerValue = analogRead(POTENTIOMETER_PIN);
  
  // Map potentiometer value to LED brightness
  int brightness = map(potentiometerValue, 0, 1023, 0, 255);
  
  // Set the LED color based on the current color index
  switch (colorIndex) {
    case 0:  // Red
      analogWrite(RED, brightness);
      analogWrite(GREEN, 0);
      analogWrite(YELLOW, 0);
      break;
    case 1:  // Green
      analogWrite(RED, 0);
      analogWrite(GREEN, brightness);
      analogWrite(YELLOW, 0);
      break;
    case 2:  // Yellow
      analogWrite(RED, 0);
      analogWrite(GREEN, 0);
      analogWrite(YELLOW, brightness);
      break;
  }

  // Check button state
  if (digitalRead(BUTTON_PIN) == LOW) {
    if (!buttonState) {
      // Toggle color index
      colorIndex = (colorIndex + 1) % 3;
      buttonState = true;
      delay(200);
    }
  } else {
    buttonState = false;
  }
}

For this code, I created a circuit where I can control three LEDs (red, yellow, and green) using a potentiometer and a button. The potentiometer adjusts the brightness of the active LED, while the button lets me switch between the LEDs. Each LED is connected to separate pins on the Arduino (5, 6, and 9), and the potentiometer is connected to the analog input A0. By pressing the button, I can cycle through the colors, and by turning the potentiometer, I can smoothly change how bright each LED glows.

Reflection:

Through this project, I learned how digital and analog inputs can work together to create an interactive system. It was rewarding to see how the push button and potentiometer complemented each other, one controlling the LED sequence and the other adjusting brightness. This helped me better understand how sensors and user inputs can bring a project to life.
For future improvements, I would add proper debouncing to make the button response smoother and more reliable, and use PWM pins consistently for better brightness control. In the future, I could also expand the project by adding automatic timing to simulate real traffic light behavior or incorporating sensors to make it more dynamic and realistic.