Week 2 – Reading Reflection

In the video, Casey Reas starts with the age-old tension between order and chaos. He explains, “chaos is what existed before creation, and order is brought about by god or gods into the world.” For centuries, creation was a divine act of imposing regularity on a chaotic world, and we humans, in turn, sought to display our own godliness through patterns and symmetry.

In contrast, the early 20th-century Dadaists inverted this relationship. Against a “new” world era confined by scientific laws and societal logic (which had arguably led to the chaos of war), they embraced chance as fundamentally human, a way to take apart what they saw as “the reasonable frauds of men.” The focal point of their “chance operations” was to set up artworks in which chance creates beauty out of chaos. Artists like Jean Arp and Marcel Duchamp used chance operations not to create chaos, but to rebel against a rigid order they no longer trusted and to escape the confines of their own preconceptions, creating something truly unexpected.

Yet this embrace of randomness, of what looks unexpected to human eyes, is not a complete surrender to chaos. Rules, much like the physical laws of nature, flow quietly underneath. As Reas’s own generative systems demonstrate, a small amount of “noise” is essential to prevent static homogeneity. More importantly, why do simple, inorganic rules create such a sophisticated spectacle? I explored this dynamic, emergent complexity, the assembly of the crowd, in the course Robota Psyche.

My presentation, “Power of the Mass”, discussed how simple, inorganic rules governing a crowd can produce an incredibly sophisticated and life-like spectacle. The boundary of rules allows for randomness, but it is the assembly of the crowd that breathes life into the system. It raises the question of whether true creativity lies not in meticulous control, but in designing elegant systems that balance intention with unpredictability.

I would like to end my reflection with a Gerhard Richter quote.

“Above all, it’s never a blind chance: it’s a chance that is always planned, but also always surprising. And I need it in order to carry on, in order to eradicate my mistakes, to destroy what I’ve worked out wrong, to introduce something different and disruptive. I’m often astonished to find how much better chance is than I am.”


Retro ASCII Art

Concept: Retro 3D Art

Retro Computer Graphics from Behance

I was inspired by the old-school computer graphics that you would see in movies like The Matrix. Because of this, I knew that I wanted to make some ASCII art in the signature green color that most retro graphics used. After some experimenting, I decided to make an ASCII representation of a Menger Sponge, a fractal geometry that I thought would be very interesting to look at.

Process

I began by creating a sample video that I could turn into ASCII art. To do this, I created a 3D cube in Processing, which is a predecessor of P5.js. I attempted this in P5.js first, but found the saveFrame() function too limiting. I created a simple box using Processing’s 3D renderer and added some lighting to give the sketch dynamic range. This matters because I later used the differences in brightness when converting the video to ASCII, and the greater the dynamic range, the easier the ASCII video is to perceive.

void setup() {
  size(600, 600, P3D);
}

float a = 0;
void draw() {
  background(0);
  noStroke();
  spotLight(10, 80, 240, width/2, height/2, 400, 0, 0, -1, PI/4, 2);
  pointLight(255, 0, 0, width/2, height/2, height/2);
  ambientLight(100, 0, 100);
  fill(255);

  translate(width/2, height/2);
  rotateX(a/2);
  rotateY(a/2);
  rotateZ(a/3);
  box(280);

  a+=PI/100;

  saveFrame("src/box-######.png");
}

I incremented the rotation angle by a fraction of pi because I wanted to be able to count when the cube resets to its original position. This made it easier to create a video that could be looped seamlessly.

Once I had the output frames, I combined them together using Microsoft Movie Maker. The final result was this video:

Next, I wanted to work on converting this footage to ASCII art. I followed Daniel Shiffman’s coding challenge on creating ASCII text images. After experimenting with the video size and character density arrays, the following sketch was the result:

However, I wanted to create something a bit more complex. This is when I remembered an old project I had worked on by following another one of The Coding Train‘s challenges: the Menger Sponge coding challenge. After generating the frames and compiling them into a video, this was the result:

All I had to do then was insert this video into the original code and play around with different parameters until I got the desired result.

Code Highlights

I’m really proud of the code that makes the animation look like it’s being built up slowly out of ASCII characters. I achieved this by filtering out the highlights on the Menger sponge: when I compiled the video, I saw that the lower right corner of the sponge had a bright, blocky highlight on it.

//finding the right character based on the brightness of the pixel
let len = charArray.length; // charArray holds the ASCII density characters
let charIndex;

//playing with the map values creates the building-up effect
charIndex = floor(map(apparentBrightness, 0, 100, len, 0));

When I filtered out the brightest points of the sponge, I essentially removed that corner until it got a bit darker later in the animation, which created the building-up effect.

Reflection

Compared to the first assignment, I had a more solid idea of what I wanted to achieve, so I planned my workflow beforehand, which streamlined the entire creative process. I knew I had to create the source animation first and then convert it to ASCII characters. This made my code more readable and gave me better control of the sketch overall.

However, the building-up animation that I am most proud of depends on the source video: it looks the way it does because the highlights in the source animation are blocky too. If I were to recreate this project, I would work on logic that makes the effect more generalizable. Maybe I could filter out sections of the object based on a distance function instead of brightness levels; that way I could substitute different source videos and still get the same cool effect.



Analog Musical Instrument

const int blueButton = 6;
const int servoPin = 9;
const int songLength = 8;
const int tempo = 115;

const int noteC3 = 130;
const int noteD3 = 146;
const int noteE3 = 164;
const int noteF3 = 174;
const int noteG3 = 196;
const int noteA3 = 220;
const int noteB3 = 246;


const int noteC4 = 261;
const int noteD4 = 293;
const int noteE4 = 329;
const int noteF4 = 349;
const int noteG4 = 392;
const int noteA4 = 440;
const int noteB4 = 493;
const int noteC5 = 523;

const int noteD5 = 587;
const int noteE5 = 659;
const int noteF5 = 698;
const int noteG5 = 784;
const int noteA5 = 880;
const int noteB5 = 987;


//int musicNotes[] = {noteC4, noteD4, noteE4, noteF4, noteG4, noteA4, noteB4, noteC5};
int musicNotes[] = {noteC5, 0, noteE5, 0, noteG5, 0, noteB5, 0};
int musicNotes2[] = {noteC4, 0, noteE4, 0, noteG4, 0, noteB4, 0};
int musicNotes3[] = {noteC3, 0, noteE3, 0, noteG3, 0, noteB3, 0};
int noteDuration[] = {2, 4, 2, 4, 4, 2, 4, 2};

void setup() {
  pinMode(servoPin, OUTPUT);
  pinMode(blueButton, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  lightSensor();
  button();
}

//new tab: button
void button() {
  bool blueButtonState = digitalRead(blueButton);
  if (blueButtonState == HIGH) {
    //not pressed (INPUT_PULLUP reads HIGH at rest): 4th-octave melody
    for (int i = 0; i < songLength; i++) {
      int duration = noteDuration[i] * tempo;
      tone(servoPin, musicNotes2[i], duration);
      delay(duration); //hold for the full length of the note
      delay(15);       //short gap between notes
    }
  } else {
    //pressed: jump up to the 5th-octave melody
    for (int i = 0; i < songLength; i++) {
      int duration = noteDuration[i] * tempo;
      tone(servoPin, musicNotes[i], duration);
      delay(duration);
      delay(15);
    }
  }
}

//new tab: light sensor
void lightSensor() {
  int analogValue = analogRead(A0);
  Serial.print(analogValue);
  if (analogValue < 10) {
    Serial.println(" - Dark");
    //add music note here
  } else if (analogValue < 200) {
    Serial.println(" - Dim");
    //dim light: play the 3rd-octave melody
    for (int i = 0; i < songLength; i++) {
      int duration = noteDuration[i] * tempo;
      tone(servoPin, musicNotes3[i], duration);
      delay(duration);
      delay(15);
    }
  }
}


Documentation:

Idea: Create sound directly from the Arduino, like a musical drone, by changing the notes from the 4th octave (C4) to the 5th (C5) when you click the button, or from either down to the 3rd octave when you dim the light sensor (all by changing the frequency driven through the motor).

Positives:

I like how I could manipulate the motor sound based on its frequency to create a tone and tempo. In addition, I was able to play around with that frequency to create an array of musical notes in different octaves. Once I could adjust the tempo and the time spacing between notes by looping over the array, I was able to integrate that into different parts of the code.

I like that I was able to introduce different noises from the motor by adding in different components that trigger different sounds like the button and sensor.

This was also surprisingly fun compared to other assignments, because I learned that Arduino can be used for more than just LED circuits: you can incorporate and manipulate other media, like sound.

Negatives:

I don’t think it’s particularly musical; however, it does follow the rules of music in terms of octaves and notes.


Candy colored spiral visual effect

This has gone through many iterations of trial and error to get the coolest effect.

I’m using spiraled ellipses as a main element for my midterm, so I have been experimenting with them.

Code for creating the spiral:

float angle;
float x;
float y;

void setup() {
  size(800, 800);
  noFill();
}

void draw() {
  fill(255, 200, 200); //pale rose: change this to recolor the candy
  ellipse(width/2, height/2, 600, 600);

  int hundred = 100;
  //rotate(angle);

  for (int i = 0; i < 500; i += 100) {
    strokeWeight(20);
    stroke(0); //change this to recolor the spiral
    noFill();
    //two offset half-arcs per ring connect into one continuous spiral
    arc(width/2, height/2, hundred + i, hundred + i, radians(180), radians(360));
    arc(width/2 - 25, height/2, hundred*1.5 + i, hundred*1.5 + i, radians(0), radians(180));
  }
  //angle = angle + 0.1;
  //save("mySpiral.jpg");
}

I exported the output of the code above as an image to use in the sketch below.

Code for the animation:

float angle;
PImage img;

void setup() {
  size(900, 900);
  img = loadImage("mySpiral.png");
}

void draw() {
  background(255);
  imageMode(CENTER);
  for (int i = 0; i < width; i++) {
    //translate() and rotate() accumulate across iterations,
    //smearing the spiral diagonally across the canvas
    translate(width/2 + i, height/2 + i);
    rotate(angle);
    image(img, 0, 0, 300, 300);
  }
  angle = angle + 1;
}

 

Next step:

I would like each candy spiral to go through several colored versions of itself:

 

  for (int a = 0; a < 1; a++) {
    save(random(20) + "spiral.jpg"); //random prefix keeps each version from overwriting the last
  }

This code saves each randomly colored version as a new file. I plan on using the saved images as part of a sprite sheet to add to the animation above.


Psychedelic Geometry :)

float angle;
float grid = 400;
float i = 0;

void setup() {
  size(1080, 900, P3D);
}

void draw() {
  for (int x = 0; x <= width; x += grid) {
    for (int y = 0; y <= height; y += grid) {
      translate(mouseX, mouseY); //cumulative translate pulls the grid toward the cursor
      rotateX(radians(angle));
      rotateY(radians(angle));
      i = 0;
      while (i < width) {
        i = i + 20;
        beginShape(); //hexagon outline
        vertex(i + 100, 100);
        vertex(i + 175, 50);
        vertex(i + 250, 100);
        vertex(i + 250, 200);
        vertex(i + 175, 250);
        vertex(i + 100, 200);
        vertex(i + 100, 100);
        vertex(i + 100, 200);
        endShape();
      }
    }
  }
  angle += 1; //one degree per frame
}

For this assignment,

I went through A LOT of experimentation,

at first, I wanted to recreate this pattern:

I almost got there:

Then, I didn’t really know what to do with it in terms of animation….

I discovered the built-in P3D renderer and started different experiments, but I didn’t know how to integrate more effects while using the while loop, so I experimented further and created the animation at the top of the post.

The inspiration for this post was to experiment as much as possible, since that has been helping me understand the mechanics of the code, and to save interesting blocks of code on GitHub so I can integrate them into a different art piece later.

Final Project Documentation

Portal Clash is a two-player spaceship battle game that merges the physical and digital worlds. One player plays on a physical 16×16 NeoPixel LED grid using a custom hardware controller, and the second player plays on a digital p5.js canvas using the keyboard. The core idea is the “portal” mechanic: if the physical player shoots a bullet off the edge of their LED screen, it instantly teleports onto the digital screen to attack the other player, and vice versa. It’s a battle across dimensions.

The project relies on a heavy communication loop between the hardware and the browser. The p5.js sketch acts as the “brain” of the game, calculating all the physics, scoring, and portal logic for both worlds. The Arduino acts as a specialized display driver and input device.

Interaction Design
For the physical player, I built a controller with 5 push buttons: four for movement (Up, Down, Left, Right) and one for Firing. The digital player uses the computer keyboard (WASD for movement and ‘V’ for fire). The feedback is immediate—if you get hit, your ship explodes into particles on your respective screen.

Hardware & Circuit
I used four 8×8 NeoPixel matrices tiled together to create a 16×16 grid. This was a bit of a pitfall at first: powering 256 LEDs is heavy. I tried different wirings, but eventually I figured out a parallel connection setup where I split the power. I actually used a second Arduino solely as a 5V power source to feed two of the screens while the main Arduino handled the data and powered the other two.

Arduino Code
The code on the Arduino is optimized to avoid lag. It listens for pixel data from p5 to light up the grid. At the same time, it reads the 5 buttons and sends their state back to p5. I had to implement a “state change” logic so it only sends data when I actually press or release a button, which kept the game smooth.

#include <Adafruit_NeoPixel.h>

#define PIN_MATRIX A0 
#define NUMPIXELS  256

// WIRING: Pin -> Button -> Diagonal Leg -> GND
#define PIN_FIRE  2
#define PIN_UP    3
#define PIN_DOWN  4
#define PIN_LEFT  5
#define PIN_RIGHT 6

Adafruit_NeoPixel matrix(NUMPIXELS, PIN_MATRIX, NEO_GRB + NEO_KHZ800);

int lastU=0, lastD=0, lastL=0, lastR=0, lastF=0;
unsigned long lastHeartbeat = 0;

void setup() {
  Serial.begin(115200);
  matrix.begin();
  matrix.setBrightness(20); 
  matrix.show();
  
  pinMode(PIN_FIRE, INPUT_PULLUP);
  pinMode(PIN_UP,   INPUT_PULLUP);
  pinMode(PIN_DOWN, INPUT_PULLUP);
  pinMode(PIN_LEFT, INPUT_PULLUP);
  pinMode(PIN_RIGHT,INPUT_PULLUP);
}

void loop() {
  // 1. RECEIVE VIDEO DATA
  while (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'C') matrix.clear();
    else if (cmd == 'S') matrix.show();
    else if (cmd == 'P') {
      int x = Serial.parseInt();
      int y = Serial.parseInt();
      int r = Serial.parseInt();
      int g = Serial.parseInt();
      int b = Serial.parseInt();
      int idx = getPixelIndex(x, y);
      if (idx >= 0 && idx < NUMPIXELS) matrix.setPixelColor(idx, matrix.Color(r, g, b));
    }
  }

  // 2. SEND CONTROLLER DATA
  int u = !digitalRead(PIN_UP);
  int d = !digitalRead(PIN_DOWN);
  int l = !digitalRead(PIN_LEFT);
  int r = !digitalRead(PIN_RIGHT);
  int f = !digitalRead(PIN_FIRE);

  bool stateChanged = (u != lastU || d != lastD || l != lastL || r != lastR || f != lastF);
  
  if (stateChanged || (millis() - lastHeartbeat > 50)) {
    Serial.print("I:");
    Serial.print(u); Serial.print(",");
    Serial.print(d); Serial.print(",");
    Serial.print(l); Serial.print(",");
    Serial.print(r); Serial.print(",");
    Serial.println(f);
    lastU = u; lastD = d; lastL = l; lastR = r; lastF = f;
    lastHeartbeat = millis();
  }
  delay(2); 
}

int getPixelIndex(int x, int y) {
  if (x < 0 || x >= 16 || y < 0 || y >= 16) return -1;
  int screenIndex = 0;
  int localX = x; int localY = y;
  if (x < 8 && y < 8) { screenIndex = 0; }
  else if (x >= 8 && y < 8) { screenIndex = 1; localX -= 8; }
  else if (x < 8 && y >= 8) { screenIndex = 2; localY -= 8; }
  else { screenIndex = 3; localX -= 8; localY -= 8; }
  return (screenIndex * 64) + (localY * 8) + localX;
}

 

p5.js Code
This is where all the logic happens. The sketch manages two “SpaceShip” objects. It tracks which “World” a bullet is in. If a bullet crosses the boundary coordinate, the code swaps its world variable, causing it to stop rendering on the canvas and start rendering on the LED matrix (via Serial).
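As a rough sketch of that hand-off (the bullet object, world field, and serial wrapper here are my own placeholders, not the project’s actual code), crossing the shared edge just swaps one variable and changes where the bullet gets drawn:

const GRID = 16; // the physical LED matrix is 16x16
const CELL = 20; // canvas pixels per cell when mirroring for debugging

function updateBullet(b) {
  b.x += b.vx;
  // Crossing the shared edge swaps which world the bullet lives in.
  if (b.world === "physical" && b.x >= GRID) { b.world = "digital"; b.x = 0; }
  else if (b.world === "digital" && b.x < 0) { b.world = "physical"; b.x = GRID - 1; }
}

function renderBullet(b, serial) {
  if (b.world === "digital") {
    circle(b.x * CELL, b.y * CELL, 6); // draw on the p5 canvas
  } else {
    serial.write("P" + b.x + " " + b.y + " 255 255 0\n"); // the Arduino's 'P' pixel command
  }
}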

During development, I used a debugging trick where I mirrored the NeoPixel view onto the p5 canvas. This helped me figure out if the pixels were mapping correctly before I even looked at the LEDs.

Communication
I used the p5.webserial library. The challenge was timing: initially, there was a delay between pressing a button and the ship moving. I realized the serial buffer was getting clogged with old data, so I made p5 read all available data every frame and only use the most recent packet. Now it feels instant. I knew from the beginning that the speed of pixels traversing the NeoPixel screen relative to p5 might be a big enough challenge to make the idea infeasible, but I didn’t expect implementation tricks on the p5 side to make it this smooth.
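A minimal sketch of that fix, assuming a serial wrapper that exposes available() and readLine() (the exact p5.webserial calls may differ, and applyInputs is a hypothetical handler):

let latest = null;

function readController(port) {
  // Drain everything queued this frame...
  while (port.available() > 0) {
    latest = port.readLine(); // ...but keep only the newest packet
  }
  if (latest && latest.startsWith("I:")) {
    const [u, d, l, r, f] = latest.slice(2).trim().split(",");
    applyInputs(u, d, l, r, f); // the "I:u,d,l,r,f" format matches the Arduino sketch above
  }
}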


I am most proud of the idea and the gameplay itself. Seeing the bullet disappear from the physical LED screen and immediately pop up on the laptop screen feels really satisfying. It turned out exactly how I imagined it, and the competitive aspect makes people want to keep playing.

AI Section
I utilized Generative AI (ChatGPT) as a technical assistant to speed up the development process. The core game concept, the hardware design, and the logic flow were my own ideas. I used AI mainly to help me debug syntax errors in the Serial communication and to suggest optimizations for the lag I was experiencing. For example, when I struggled with the buffer bloat, the AI suggested clearing the buffer loop, which solved the issue. I also used it to help write the “Class” structure for the Spaceships to keep the code clean. The writing and documentation were done by me.

Future Improvements
To improve the experience, I would build a more permanent enclosure for the controller so the buttons are easier to hold. I also want to add a clear “Start Screen” with instructions, as user testing showed that people sometimes needed a moment to figure out the controls.

I also want to elevate the gameplay by implementing an advanced idea I had in mind: randomizing the sending and receiving edges of fire every 10 seconds. That way, players get surprised when the bullets attacking them start to portal in from an unexpected direction, and they also need to figure out which edge will send their own fire to the other world.

User Testing

In the user testing, most of the users figured out the idea without me needing to explain it; however, there were some pitfalls where users got a bit confused:

    • When the two players started playing and firing at the same time, they sometimes missed the effects and failed to notice that their fire portals to the other screen.
    • The hardware setup I had during testing was unfinished, so some users couldn’t tell which buttons corresponded to which direction.
    • Button coloring: some users recommended making the Fire button a different color so it reads as a different action than movement.
    • Some users asked about the keyboard controls, even though they suspected it would be either the arrow keys or WASD. The ‘V’ fire key was also unclear to everyone but gamers.

What worked really well was the communication between p5 and the NeoPixel screen. Once users got the hang of the game, they really enjoyed it, especially the animation of getting hit. They also liked the color separation between the players, yellow and blue, including the colors of the shots coming out of each. Some were impressed by the gameplay and how smoothly the pixels switch between the two screens.

To fix the earlier issues, I would add a clear instructions page at game startup that clarifies the controls on both sides and explains the scoring system. I would also present the core idea of the game there to get users excited to try it out.

Week 11 – Serial Communication

Concept:

The idea behind this project was to create a bi-directional feedback loop between the physical world and a virtual simulation. Instead of treating Arduino and p5.js as separate tools, I wanted them to behave like two parts of the same system, constantly communicating with each other.
The interaction is centered around cause and consequence. A simple physical action, turning a potentiometer, does not directly place or move objects on screen. Instead, it influences a virtual force, wind, that acts on two simulated balls. When those virtual objects collide, the system responds physically by changing the brightness of a real LED connected to the Arduino.

Method & Materials:

This project combines one analog sensor, a virtual physics simulation, and a physical output using serial communication between Arduino and p5.js.

Hardware:
  • Arduino Uno
  • 10k potentiometer (analog sensor)
  • LED
  • 220–330 Ω resistor
  • Breadboard and jumper wires
Software:
  • Arduino IDE
  • p5.js with Web Serial

The potentiometer is wired as a voltage divider between 5V and GND, with the middle pin connected to analog pin A0. The LED is connected to digital pin 9 through a resistor and grounded on the other side.
The Arduino continuously reads the potentiometer value (0–1023) and sends it to p5.js over serial. p5.js interprets this value as a wind force that affects two balls in a physics simulation. When the balls collide, p5.js sends a command back to the Arduino, which responds by increasing the LED’s brightness.

Process:

I approached this project by breaking it into three conceptual stages rather than thinking of it as separate exercises.

Link to Video Demonstration

1. Physical Input → Virtual Force

The potentiometer provides a continuous analog input. The Arduino reads this input as a raw number and sends it to p5.js without assigning it any meaning. In p5.js, this value is mapped to a horizontal wind force rather than a position. This distinction is important: instead of directly placing the balls, the potentiometer influences how they move over time. This makes the interaction feel physical and dynamic, closer to real-world motion than to simple cursor control.
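For instance, the mapping can be a single line, centered on zero so the knob’s midpoint means still air (a sketch with assumed numbers, not the exact values I used):

// potValue arrives over serial as 0-1023; mid-knob maps to zero wind
const wind = map(potValue, 0, 1023, -0.2, 0.2);
ball.vx += wind; // a force accumulates into velocity instead of setting position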

2. The Virtual Event (Decision-Making in Software)

The physics simulation exists entirely inside p5.js. Gravity, wind, velocity, and drag are calculated every frame. The software also monitors the relationship between the two balls. When they touch, p5.js detects a collision event. The Arduino has no awareness of this virtual world; it does not know what balls, gravity, or collisions are. All interpretation and decision-making happen in software.

3. Virtual Event → Physical Consequence

When a collision occurs, p5.js sends a short command back to the Arduino. Under normal conditions, p5.js continuously tells the Arduino to keep the LED dim, so there is always a subtle sign that the system is running. When a collision is detected, p5.js sends a specific command that causes the Arduino to flash the LED at full brightness for a brief moment. This turns an invisible, virtual event into a tangible physical response.

Schematic:

The schematic shows the potentiometer connected as a voltage divider to analog pin A0 and the LED connected to digital pin 9 through a resistor. The Arduino communicates with the computer over USB using serial communication, allowing p5.js to both receive sensor data and send control commands back to the Arduino.

Code:

The part of the code I am most proud of is the event-based communication triggered by the collision. A single character sent from p5.js is enough to cause a visible physical reaction, showing how minimal signals can carry meaningful information when the system is designed carefully.
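A hedged sketch of that moment (the character and names are placeholders; the actual code is linked below):

function checkCollision(a, b) {
  if (dist(a.x, a.y, b.x, b.y) < a.r + b.r) {
    writer.write(new TextEncoder().encode("C")); // one byte tells the Arduino to flash
  }
}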

Link to Code [Contains Arduino code in file arduino.txt]

Result:

The final system behaves as a closed feedback loop. Turning the potentiometer changes the wind, which alters how the two balls move on screen. When the balls collide, the LED connected to the Arduino flashes brightly, translating a virtual interaction into a physical signal.

In the demonstration video, it is clear that the system is responsive in both directions: physical input affects the simulation, and events inside the simulation produce immediate physical feedback. The interaction feels cohesive rather than fragmented across hardware and software.

Reflection:

This project helped me understand interaction design as a system rather than a collection of isolated components. The Arduino and p5.js each have clearly defined roles: hardware acts as the body, sensing and responding physically, while software acts as the mind, handling physics, logic, and decisions.

Week 10 – Sound, Sensor, Mapping

Concept:

The idea behind my instrument was to create a simple, tactile musical device that translates deliberate physical control into sound. I wanted to explore a grounded interaction where turning a knob directly shapes pitch and pressing a button activates the sound. The instrument encourages slow, intentional exploration: rotating the potentiometer continuously changes the note, while the button acts as a gate that turns the sound on and off.

Link to Video Demonstration

Method & Materials:

  • Analog Sensor: 10k potentiometer used to control pitch
  • Digital Switch: Tactile button used as an on/off trigger
  • Output: Piezo buzzer to produce sound
  • The potentiometer was connected between 5V and GND, with its middle pin wired to analog pin A0. The button was connected to digital pin 2 and GND, using the Arduino’s internal pullup resistor. The piezo buzzer was connected to digital pin 9 and GND.

As the potentiometer is rotated, the Arduino reads a continuous range of values from 0 to 1023. These values are mapped to a frequency range that controls the pitch of the sound produced by the piezo buzzer.

Process:

The potentiometer provides a smooth range of values, while the button only has two states, pressed or not pressed. I experimented with reading both inputs simultaneously and learned how to use the map() function to translate raw sensor data into meaningful sound frequencies.

I also explored how using the internal pullup resistor simplifies wiring, reducing the number of external components needed. Testing different frequency ranges helped me find values that were audible without being too harsh.

Schematic:

(schematic drawing img)

The schematic shows the potentiometer wired as a voltage divider to analog pin A0, the button connected to digital pin 2 using INPUT_PULLUP, and the piezo buzzer connected to pin 9 as the sound output.

Code:

This project uses input from two sensors, a potentiometer and a button, to generate sound through a piezo buzzer. The potentiometer continuously controls pitch, while the button determines whether sound is produced. When the button is pressed, the Arduino reads the potentiometer value and maps it to a frequency range, producing a tone. When the button is released, the sound stops.

The part of the code I am most proud of is the line that maps the potentiometer’s analog values into a usable frequency range.
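A minimal reconstruction of that logic, with pins taken from the wiring described above and the frequency range assumed:

const int POT_PIN = A0;
const int BUTTON_PIN = 2;
const int BUZZER_PIN = 9;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  int potValue = analogRead(POT_PIN);                // 0-1023
  int frequency = map(potValue, 0, 1023, 120, 1500); // audible without being harsh
  if (digitalRead(BUTTON_PIN) == LOW) {              // pressed (pull-up wiring)
    tone(BUZZER_PIN, frequency);
  } else {
    noTone(BUZZER_PIN);
  }
}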

Result:

The final prototype behaves like a simple knob-based synthesizer. Turning the potentiometer smoothly changes the pitch, while pressing the button activates the sound. The interaction feels direct and intentional, allowing the user to clearly hear the relationship between physical input and sound output.

In the demonstration video, the pitch responds immediately to changes in the knob position, showing how basic electronic components can be combined to form a functional musical interface.

Reflection:

This project helped me understand how sound, sensors, and code come together in interactive systems. Working with both analog and digital inputs clarified how different types of control shape user experience. Even with very few components, the instrument feels expressive and responsive. This exercise showed me how computational logic can be translated into sensory feedback, and how small design decisions, like mapping and thresholds, strongly influence interaction. It serves as a foundation for thinking about more complex computational instruments in the future.

Week 14 – FINAL PROJECT

A Walk Through Time: Final Project Documentation

Concept

The idea behind A Walk Through Time is to let the viewer control the flow of time with simple hand gestures. When the viewer waves their hand on one side, time moves forward. When they wave on the other side, time reverses. When no one interacts, time pauses. The system changes both the physical world and the digital world at the same time.

The physical clock hand moves using a stepper motor. A growing plant moves up and down using a DC motor and a telescoping cylinder system. On the screen, a surreal p5.js world shows time moving with colors, waves, particles, and a glowing abstract clock. Everything stays in sync and reacts at the same moment. The goal was to create one experience where movement, gesture, and time feel connected.

Project Interaction 

Interaction description:

  • The viewer stands in front of the clock and plant
  • Two ultrasonic sensors wait for hand gestures
  • Waving on the right makes the clock tick forward and the plant rise
  • Waving on the left makes the clock tick backward and the plant collapse
  • When the viewer steps away, both the clock and plant pause
  • The p5.js visuals shift to match the state: forward, backward, or paused

How the Implementation Works

The system uses two Arduinos, two motors, two sensors, and a p5.js sketch.

Main Arduino

  • Reads the left and right ultrasonic sensors
  • Decides the time state: FORWARD, BACKWARD, or PAUSED
  • Moves the stepper motor to tick the physical clock
  • Sends the state through serial as a single character (F, B, or P)
  • Sends the same data to the second Arduino

Second Arduino

  • Receives F, B, P
  • Moves the DC motor to pull or release fishing wire
  • This grows or collapses a three-layer telescoping plant

p5.js

  • Reads the same serial data from the main Arduino
  • Updates the surreal background
  • Moves particles, waves, arrows, and an abstract glowing clock
  • Lets the viewer see time flowing

Interaction Design

The interaction is very simple. The viewer uses hand gestures to control time.

Right sensor → Time Forward
Left sensor → Time Backward
Both or none → Pause

All outputs reinforce this state:

    • The physical clock hand moves
    • The plant grows or collapses
    • The digital world changes color and motion

Arduino Code

Below are the two Arduino sketches, starting with the main one:

#include <Stepper.h>

const int stepsPerRevolution = 2048;
const int ticksPerRevolution = 12;
const int stepsPerTick = stepsPerRevolution / ticksPerRevolution;

Stepper clockStepper(stepsPerRevolution, 8, 10, 9, 11);

enum TimeState { PAUSED, FORWARD, BACKWARD };
TimeState timeState = PAUSED;
TimeState lastSentState = PAUSED;

const int TRIG_RIGHT = 4;
const int ECHO_RIGHT = 5;
const int TRIG_LEFT  = 6;
const int ECHO_LEFT  = 7;

const int DETECT_THRESHOLD_CM = 40;

unsigned long lastTickTime = 0;
const unsigned long tickInterval = 1000;

long readDistanceCM(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 20000);
  if (duration == 0) return -1;
  return duration / 29 / 2;
}

void sendStateIfChanged() {
  if (timeState == lastSentState) return;
  lastSentState = timeState;
  char c = 'P';
  if (timeState == FORWARD) c = 'F';
  else if (timeState == BACKWARD) c = 'B';
  Serial.write(c);
}

void setup() {
  clockStepper.setSpeed(10);
  pinMode(TRIG_LEFT, OUTPUT);
  pinMode(ECHO_LEFT, INPUT);
  pinMode(TRIG_RIGHT, OUTPUT);
  pinMode(ECHO_RIGHT, INPUT);
  Serial.begin(9600);
}

void loop() {
  unsigned long now = millis();

  long distLeft  = readDistanceCM(TRIG_LEFT,  ECHO_LEFT);
  long distRight = readDistanceCM(TRIG_RIGHT, ECHO_RIGHT);

  bool leftDetected  = (distLeft  > 0 && distLeft  < DETECT_THRESHOLD_CM);
  bool rightDetected = (distRight > 0 && distRight < DETECT_THRESHOLD_CM);

  if (leftDetected && !rightDetected) timeState = BACKWARD;
  else if (!leftDetected && rightDetected) timeState = FORWARD;
  else timeState = PAUSED;

  if (now - lastTickTime >= tickInterval) {
    lastTickTime += tickInterval;
    if (timeState == FORWARD) clockStepper.step(-stepsPerTick);
    else if (timeState == BACKWARD) clockStepper.step(stepsPerTick);
  }

  sendStateIfChanged();
}

The second sketch runs on the DC motor Arduino, which drives the plant:

const int ENA = 6;
const int IN1 = 5;
const int IN2 = 4;

enum TimeState { PAUSED, FORWARD, BACKWARD };
TimeState state = PAUSED;

byte motorSpeed = 80;

unsigned long lastChangeTime = 0;
const unsigned long maxRunTime = 10000; // 10 seconds

void setup() {
  pinMode(ENA, OUTPUT);
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
  Serial.begin(9600);
  lastChangeTime = millis();
}

void applyMotorState(TimeState s, byte speed) {
  if (s == PAUSED) {
    digitalWrite(IN1, LOW);
    digitalWrite(IN2, LOW);
    analogWrite(ENA, 0);
  } else if (s == FORWARD) {
    digitalWrite(IN1, HIGH);
    digitalWrite(IN2, LOW);
    analogWrite(ENA, speed);
  } else if (s == BACKWARD) {
    digitalWrite(IN1, LOW);
    digitalWrite(IN2, HIGH);
    analogWrite(ENA, speed);
  }
}

void setState(TimeState newState) {
  if (newState != state) {
    state = newState;
    lastChangeTime = millis();
  }
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == 'F') setState(FORWARD);
    else if (c == 'B') setState(BACKWARD);
    else if (c == 'P') setState(PAUSED);
  }

  unsigned long now = millis();

  if (state != PAUSED && (now - lastChangeTime >= maxRunTime)) {
    setState(PAUSED);
  }

  applyMotorState(state, motorSpeed);
}

Circuit Schematic

(Diagram made with https://www.circuit-diagram.org/)

breakdown of schematic:

Main Arduino

  • Ultrasonic Sensor Left
    • TRIG to pin 6
    • ECHO to pin 7
    • VCC to 5V
    • GND to GND
  • Ultrasonic Sensor Right
    • TRIG to pin 4
    • ECHO to pin 5
    • VCC to 5V
    • GND to GND
  • Stepper Motor (with driver)
    • IN1 → pin 8
    • IN2 → pin 9
    • IN3 → pin 10
    • IN4 → pin 11
    • VCC → 5V
    • GND → GND
  • Serial Out
    • TX (pin 1) → RX of second Arduino

Second Arduino (DC Motor Controller)

  • DC Motor Driver
    • IN1 → pin 5
    • IN2 → pin 4
    • ENA (PWM) → pin 6
    • Motor output → DC motor
    • Vmotor → 5V
    • GND → common ground with Arduino
  • Serial In
    • RX (pin 0) → TX of main Arduino

p5.js Code

  • Reads the serial state from Arduino
  • Updates the scene
  • Changes background colors
  • Moves particles and waves
  • Animates the digital clock
  • Shows arrows for direction

 

Communication Between Arduino and p5.js

Arduino → p5.js

Sends one character:

  • 'F' — forward
  • 'B' — backward
  • 'P' — paused

p5.js reads this using the Web Serial API.
When p5.js sees the character, it updates the digital world.
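A hedged sketch of that read loop using the raw Web Serial API (the actual sketch may wrap this differently):

async function listenForStates(port) {
  const reader = port.readable.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    for (const c of decoder.decode(value)) {
      if (c === "F" || c === "B" || c === "P") timeState = c; // drives the visuals
    }
  }
}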

What I Am Proud Of

I am proud of how everything stays in sync.
The telescoping plant mechanism was hard to build, but it works well and gives life to the piece.
The gesture-based control also feels natural, and most users understand the idea at once.

How This Was Made

The clock was laser cut and screwed into a cardboard cylinder. I used an ice cream stick for the hand, which was connected using a skewer to the stepper motor. The boxes for the ultrasonic sensors were laser cut, and I got the design from boxes.py. For the fast forward and rewind icons, I designed them in Illustrator and then laser cut them. I got the idea for the telescoping cylinder from a YouTube short (https://www.youtube.com/shorts/99a4RUlUTm0), and I made a much simpler version that I 3D printed. I used another cardboard cylinder that I cut open to place the plant in and attach the DC motor and wheel at the top. I used acrylic paint, with help from an art major friend, to paint a background scene with the plant and sky.

The p5.js code was written through many tests and changes to connect smoothly with the Arduino using Web Serial. The designs for the scene, the clock visuals, and the interaction layout were made in small steps until everything felt right. The writeup was also done in simple clear language to explain the full system. All media was either created by me, painted by a friend, laser cut from my designs, or made using free online tools.

Areas for Future Improvement

The clock could be painted with a wooden-style finish to look more complete. I also still want to explore the original rotary sensor idea. The plan was to let the user manually rewind the clock by hand, and the system would detect this and move backward. I tested this with gears connecting the rotary sensor to the stepper motor, but the motor was too weak or the gears did not line up. I want to try again with stronger parts.

Finally, while the p5.js visuals look good and support the project, I feel there may be more ways to integrate the digital space with the physical movement. This is something I want to improve in the future.

Week 13 – user testing

User Testing: A Walk Through Time

I have conducted the user testing for the project, and the reactions were very positive. Most users understood the main idea right away. When the clock hand moved forward, they saw it as time moving forward. When it moved backward, they understood that time was reversing. This was a good sign that the core interaction is intuitive and does not need much explanation.

The gesture control also made sense to most people, but a few were unsure at first about which sensor to wave their hand over. To make this clearer, I decided to laser cut simple icons for fast forward and rewind and attach them to the ultrasonic sensors. This small change makes the mapping between gesture and action much more obvious.

One interesting issue that came up during testing was the behavior of the plant mechanism. The DC motor pulls fishing wire that extends a telescoping plant, and it collapses when the motor goes in reverse. Some users kept reversing time for too long, which caused the wire to unwind so far that it started rolling in the opposite direction. This made the plant rise again by mistake. Another related problem was users sending the plant up too high until it almost reached the motor.

To address this, I am adding a failsafe to the DC motor logic. The system will now prevent the motor from spinning in the same direction for too long. This will keep the fishing wire from fully unspooling and will protect the telescoping structure from being pulled too far up. This fix makes the physical system more reliable and safer for open interaction.

Week 12 – Final Proposal

A Walk Through Time

A Walk Through Time is an interactive artwork that combines a physical clock, motion sensors, a DC motor, a stepper motor, and a digital surreal time-scape made in p5.js. The goal is to let the viewer control the flow of time with simple gestures, and watch both the physical world and the digital world respond in sync. When the viewer waves a hand on one side, time moves forward. When the viewer waves a hand on the other side, time moves backward. When no one is interacting, time pauses.

Arduino: Design and Behavior
The hardware side uses:

Two ultrasonic sensors

A stepper motor that drives a physical clock hand

A DC motor that rotates forward or backward depending on the state

A communication link to p5.js through Web Serial

A second Arduino that receives the state and drives the DC motor

Inputs (Arduino)

Left ultrasonic sensor

    • Detects a hand or body close to it
    • If only the left sensor sees something, time moves backward

Right ultrasonic sensor

    • Detects a hand or body on the right side
    • If only the right sensor sees something, time moves forward

Both sensors together or none

    • Time enters a paused state

The Arduino reads both sensors and decides one of three states: FORWARD, BACKWARD, PAUSED

Outputs (Arduino)

Stepper motor movement

    • Moves a physical clock hand
    • A full rotation is broken into 12 “ticks”
    • In FORWARD state the stepper ticks clockwise
    • In BACKWARD state the stepper ticks counterclockwise
    • In PAUSED state it holds still

Serial output sent to p5.js

    • The Arduino sends a single character representing the state:
    • ‘F’ for forward
    • ‘B’ for backward
    • ‘P’ for paused

Serial output to the second Arduino (DC motor controller)

    • The same state characters (F, B, P) are sent out

DC Motor Arduino

The second Arduino receives the state from the first one:

    • ‘F’ → DC motor spins forward
    • ‘B’ → DC motor spins backward
    • ‘P’ → DC motor stops

p5.js: Design and Behavior

The digital part is a surreal, dreamlike time-space. It reacts in real time to the state coming from Arduino. The design uses motion, color shifts, particles, waves, ripples, and a glowing abstract clock.

Inputs (p5.js)

Serial data from the Arduino

    • Reads incoming characters: F, B, or P
    • Updates timeState
    • Applies visual changes based on the state

Keyboard fallback for testing

    • F B P keys switch states if Arduino is not connected

Behavior of the digital scene

The scene changes in several ways depending on the state, reflecting time moving forward, moving backward, or standing still.

 

Week 14 – Final Project Doc – Wavy Bird

Wavy Bird is a physical rendering of the classic Flappy Bird. Instead of tapping a screen or clicking a mouse, the player holds a custom controller and physically waves their hand to keep the bird afloat.

The interaction design is simple by design but required significant tuning. The player holds a controller containing an accelerometer. A downward “Wave” gesture translates to a flap on the screen.

The challenge was calibration. A raw accelerometer reading is noisy, and gravity pulls on the Z-axis differently depending on how the player holds the device. I designed a system where the game auto-calibrates the “rest” position on startup, which lets the player hold the controller comfortably in any orientation; the code then detects relative acceleration changes (deltas) rather than absolute values.

Originally, I had ambitious plans to gamify an Implicit Association Test (IAT), forcing players to tilt the controller left or right to categorize words while flying, to measure their implicit bias. However, during development, I realized the cognitive load was too high and the physical interaction was “muddy”. I pivoted, stripping away the psychological testing to focus entirely on perfecting the core mechanic: the physical sensation of flight.

Project Interaction

P5.js: https://editor.p5js.org/yiyang/sketches/a2cexa377

Technical Implementation

1. Arduino Code

The heart of the controller is an MMA8452Q accelerometer. I optimized the firmware to be lean. Instead of streaming raw data and letting the browser do the heavy lifting calculation, the Arduino processes the physics locally.

The code samples the Z-axis at 100Hz. If the acceleration drops significantly below the calibrated baseline (indicating a rapid downward movement), it registers a “Wave” to “Bump” up the bird. I implemented a debounce timer (200ms) to prevent a single wave from triggering a double-jump, which was a major point of frustration during early testing.
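The real firmware is in the repo below; as a hedged reconstruction of the idea (the SparkFun_MMA8452Q calls are real, but the structure, constants, and baud rate here are my assumptions):

#include <Wire.h>
#include <SparkFun_MMA8452Q.h>

MMA8452Q accel;
float baselineZ = 1.0;        // resting Z in g, overwritten by calibration
unsigned long lastWave = 0;

void calibrate() {
  // Average the resting Z so the controller works in any grip orientation.
  float sum = 0;
  int n = 0;
  while (n < 50) {
    if (accel.available()) { accel.read(); sum += accel.cz; n++; }
    delay(10);
  }
  baselineZ = sum / 50.0;
}

void setup() {
  Serial.begin(9600);
  Wire.begin();
  accel.begin();
  calibrate();
}

void loop() {
  if (accel.available()) {
    accel.read();
    float delta = baselineZ - accel.cz;             // dip below rest = downward flick
    if (delta > 0.5 && millis() - lastWave > 200) { // ~0.5 g threshold, 200 ms debounce
      Serial.println("WAVE");                       // p5.js listens for this line
      lastWave = millis();
    }
  }
  delay(10); // ~100 Hz sampling
}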

GitHub w/ Arduino Code: https://github.com/xuintl/wavy-bird

2. Circuit Schematic

The circuit uses I2C communication. I wired the MMA8452Q breakout board to the Arduino (3.3V power, GND, and A4/A5 for SDA/SCL).

3. p5.js Code

The visual front-end is built in p5.js. I refactored a massive and messy code into OOP-optimized code (bird, pipe, ui, serial).

The game loop handles the state machine (Name Entry → Tutorial → Gameplay → Results). I implemented a responsive viewport so the game scales to fit any window height, while maintaining the correct aspect ratio constrained by the sprites’ dimensions, preventing the graphics from stretching.

I am particularly proud of the game feel. Getting the “Wave” to feel responsive without becoming over-sensitive took a lot of trial and error. I had to tweak the G-force threshold (from 2g down to around a 0.5g deviation) to match the natural strength of a human wrist flick. In addition, the smooth shift in control, along with the addition of a Practice Mode, had to be thought through carefully, and was far less of a straight line to achieve than expected.

4. Communication

Communication is handled via the Web Serial API.

  1. Handshake: The user presses - and the browser connects to the Arduino.
  2. Calibration: The user can press = to send a 0 to the Arduino, triggering a recalibration routine if the device drifts.
  3. Gameplay: When the Arduino detects a gesture, it sends the string "WAVE\n" over the serial port. The p5.js SerialManager class parses this line and immediately triggers the bird.flap() function.

A challenge I encountered was the serial communication. I struggled with baud rate mismatches (switching between 115200 and 9600) and browser compatibility issues where the port wouldn’t close properly, requiring a full page refresh. Refactoring the serial logic into its own class with robust error handling solved this.

Development and Discussion

The entire core logic and gameplay were manually engineered, including the decision to switch from absolute to relative coordinates, the tuning of the physics engine, and the refactoring of the game states. I was able to quickly learn the syntax for the SparkFun_MMA8452Q library and to scaffold the Web Serial API connection thanks to GitHub Copilot. These libraries and APIs are verbose and tricky to start from scratch. AI also helped me debug perplexing errors regarding “Wire.h”, but I had to direct the logic to ensure the game remained fun and fair. The code structure and sprite assets were organized by me to ensure maintainability.

For the future, I want to remove the cable entirely. Integrating a Bluetooth Low Energy (BLE) module would allow for a truly wireless controller. Additionally, I’d like to re-introduce the tilt mechanics I stripped out, perhaps allowing the player to strafe forward and backward to catch coins while dodging pipes.

Final Project Documentation

Concept
My final project is an interactive robot game where players try to guide a small “student robot” toward its destination inside Monster Academy. The robot has to navigate through an obstacle course with different paths and challenges, and the players must control it carefully to avoid stepping on the lose sensor. If they reach the final pressure pad, the second sensor detects it and triggers a win. The idea was to merge a physical robot with a digital game interface so the whole experience feels like playing a simple video game, but happening in real life.

Process Overview

Hardware
For the hardware stage, I used the SparkFun Inventor’s Kit and followed the tutorial for building a basic remote-controlled robot. The guide helped me wire the motor driver, connect both motors, and set up the power system. After getting the robot moving reliably, I added two FSR sensors: one placed under an obstacle point to detect a “lose” event, and another placed at the end of the track to detect “win.” Both sensors were connected as voltage dividers so that the Arduino could measure changes in pressure using analog inputs.

I also built a large obstacle course from scratch. It was 100 by 80 centimeters, made using three big wooden boards and different colored papers to mark areas for obstacles, safe paths, and start/finish points. This part took a lot of testing, measuring, and redesigning because the robot moves differently depending on surface friction and layout.

Software
The software had two separate components: Arduino code and p5.js code. Arduino handles all the physical behavior of the robot, such as driving the motors forward, backward, left, and right through an H-bridge motor driver. It also reads the two sensors and sends signals back to the computer when the robot wins or loses.

The p5.js side handles the player interface. I created multiple screens: Start, Instructions, Control/Connect, Win, and Lose screens. Each screen appears at the right moment depending on the game flow. When the player presses a button or arrow key, p5.js sends a single-character command over the Web Serial API to the Arduino. The interface also includes a timer and localStorage functionality that saves the best time under the key robotRunner_bestTime.

Interaction Design
During user testing, something interesting happened. Even though I placed on-screen buttons, almost everyone naturally pressed the arrow keys on the laptop. Because of that, I decided to add full arrow-key support for moving the robot. This made the whole experience more intuitive and closer to traditional games. I also made the interface responsive so it adjusts when players press F to go full screen. The Start page transitions into the Instructions, and then the Control page appears, which is where the real-time serial communication and gameplay happens.

Circuit Schematic

Arduino Code and How It Works
In the Arduino sketch, I set up motor control using digital pins for direction and PWM pins for speed. Each movement command, such as forward or left, runs the motors for a short pulse of time and then stops them. This makes movement more precise and prevents the robot from overshooting.

The two FSR sensors are read using analogRead on A0 and A1. Each one has a threshold value. When the reading passes that threshold, the Arduino prints either “BUMP” or “WIN” to the serial port. The p5.js program waits for these exact messages. Getting this communication stable was challenging at first because the sensor values changed based on pressure and surface, so I had to adjust thresholds multiple times.

I also improved motor speed values so the robot had a more controlled movement. I had issues where the motors were too weak when powered only through USB, so switching to the battery pack was necessary.

// REMOTE ROBOT — p5.js CONTROL

// Right motor pins
const int AIN1 = 13;
const int AIN2 = 12;
const int PWMA = 11;

// Left motor pins
const int BIN1 = 8;
const int BIN2 = 9;
const int PWMB = 10;

// FSR sensors
const int fsr1Pin = A0;        // obstacle sensor
const int fsr2Pin = A1;        // win sensor 
const int fsrThreshold = 200;  // adjust after testing

// movement durations
const int driveTime = 300;
const int turnTime = 250;

void setup() {
  pinMode(AIN1, OUTPUT);
  pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT);
  pinMode(BIN1, OUTPUT);
  pinMode(BIN2, OUTPUT);
  pinMode(PWMB, OUTPUT);
  
  Serial.begin(9600);
}

void loop() {
  // SENSOR 1 - OBSTACLE -  YOU LOSE
  int fsr1Value = analogRead(fsr1Pin);
  if (fsr1Value > fsrThreshold) {
    stopMotors();
    Serial.println("BUMP DETECTED");     
    delay(300);
    return;
  }

  // SENSOR 2 - GOAL - YOU WIN
  int fsr2Value = analogRead(fsr2Pin);
  if (fsr2Value > fsrThreshold) {
    stopMotors();
    Serial.println("WIN SENSOR HIT");   
    delay(300);
    return;
  }

  // SERIAL COMMANDS 
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    
    if (cmd == 'f') {
      rightMotor(-255);
      leftMotor(255);
      delay(driveTime);
      stopMotors();
    }
    if (cmd == 'b') {
      rightMotor(255);
      leftMotor(-255);
      delay(driveTime);
      stopMotors();
    }
    if (cmd == 'l') {
      rightMotor(255);
      leftMotor(255);
      delay(turnTime);
      stopMotors();
    }
    if (cmd == 'r') {
      rightMotor(-255);
      leftMotor(-255);
      delay(turnTime);
      stopMotors();
    }
  }
}

// MOTOR FUNCTIONS
void stopMotors() {
  rightMotor(0);
  leftMotor(0);
}

void rightMotor(int speed) {
  if (speed > 0) {
    digitalWrite(AIN1, HIGH);
    digitalWrite(AIN2, LOW);
  } else if (speed < 0) {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, HIGH);
  } else {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, LOW);
  }
  analogWrite(PWMA, abs(speed));
}

void leftMotor(int speed) {
  if (speed > 0) {
    digitalWrite(BIN1, HIGH);
    digitalWrite(BIN2, LOW);
  } else if (speed < 0) {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, HIGH);
  } else {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, LOW);
  }
  analogWrite(PWMB, abs(speed));
}

p5.js Code Description
The p5.js script manages the entire visual and interactive flow of the game. I created states for each screen, and the draw function displays the correct one based on the current state. The Start and Instructions pages have simple navigation buttons. The Control page includes the connection button, movement controls, timer, and status labels.

Once the player connects through Web Serial, the browser can send characters directly to the Arduino. I set up an async text reader so p5.js continuously checks for any messages the Arduino sends back. When a “WIN” or “BUMP” message is detected, p5.js switches to the corresponding screen and stops the timer.

The timer is implemented using millis(), and I store the best time in localStorage so the score stays saved even after refreshing the page. This gives the game a small replayability element.
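A hedged sketch of that logic (the localStorage key is the one from this writeup; the surrounding function names are assumed):

let startTime = 0;

function startRun() {
  startTime = millis();
}

function finishRun() {
  const elapsed = (millis() - startTime) / 1000; // seconds
  const best = Number(localStorage.getItem("robotRunner_bestTime")) || Infinity;
  if (elapsed < best) {
    localStorage.setItem("robotRunner_bestTime", elapsed); // survives page refreshes
  }
}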


Communication Between Arduino and p5.js
The communication is simple but effective. p5.js sends commands like “f”, “b”, “l”, and “r”. The Arduino receives them through Serial.read() and triggers the motors. Arduino sends back plain text messages that indicate win or lose events. p5.js reads these messages through a continuous loop using Web Serial’s readable stream. This setup ensures the robot’s physical behavior is always linked to the digital interface.
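A hedged sketch of the p5 send side (writer is an assumed Web Serial writer; the command characters match the Arduino sketch above):

function keyPressed() {
  const commands = { [UP_ARROW]: "f", [DOWN_ARROW]: "b",
                     [LEFT_ARROW]: "l", [RIGHT_ARROW]: "r" };
  const cmd = commands[keyCode];
  if (cmd && writer) writer.write(new TextEncoder().encode(cmd));
}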

How This Was Made
The robot’s basic structure comes from the SparkFun guide, which helped me understand the wiring, motor driver logic, and motor control functions. After building the foundation from that tutorial, I added new features: dual-sensor win/lose detection, the full p5.js interface, the multi-screen system, arrow-key controls, the timer, and the localStorage system.

I wrote most of the Arduino and p5.js code myself, but I used generative AI mainly for debugging when errors happened, organizing long sections of code, and helping solve serial communication issues. All visuals, photos, and the physical design are made by me.

Project Media
initial stages of setting the robot up

This was the initial design of the robot, which was made from carton and covered with paper. However, it stopped working completely and I didn’t know why, so I had to reassemble it. Turns out a 5V jumper wire had come loose and was causing the issue.

I learned my lesson and taped all the wires to the Arduino board so that these “accidents” don’t happen anymore. The end result turned out much better and prettier than before.

 

 

video demonstration

video demonstration 2

What I’m Proud Of
I’m most proud of the amount of design work that went into this project. I spent a lot of time experimenting with layouts, choosing the right materials for the course, and making sure the robot could navigate smoothly. The obstacle course was a huge build and took multiple redesigns to get right. I also used a lot of hot glue during the building phase, to the point where I started recognizing the smell too well.

I’m also proud of how the interface looks and how the robot, sensors, and game flow all connect together. It feels like a small arcade experience.

Future Improvements
In the future, I would like to add a separate physical controller so players don’t have to look at the laptop while playing. I would also include a scoring system with bonus points, a leaderboard that keeps track of previous scores, and maybe multiple levels or different obstacle layouts. These additions would make the game more competitive and engaging.

Final production

Final Piece

Physical Computing / Pieces

My work is all about space, and I wanted the installation to be on the floor. I glued my stepping pieces onto a single piece of wood, with each stepping piece made of cardboard and a hidden FSR beneath each cardboard tile. I painted the wood black, and also painted the cardboard stepping pieces and the boxes that held the button and potentiometer in black to keep a consistent aesthetic. I used the laser cutter in the IM lab and drilled a small hole for cable management.

For the visuals, I added images on the “stones” to enhance the aesthetics. Each image represented the function of its corresponding stepping piece, for example, adding a planet, meteor, or stars. To keep the images stylistically consistent, I used Google Gemini (Nano Banana) for image generation. I laminated the images since people would be stepping on them, and I wanted to make sure they wouldn’t get damaged.

I also had to solder the wires together to extend them and attach them to the potentiometer and button.

Arduino setup and code

The rails came in really handy since I had five resistors for my five FSRs, so I could just connect all the resistors to the ground rail. I also used the 5V rail for my potentiometer. Some of the wires weren’t long enough, so I had to either solder them together or use jumper wires.

I needed to read the values, of course, and for the button I used INPUT_PULLUP because I already had a lot of wiring and wanted to keep the board cleaner.

Code

The main problem was that the FSR would occasionally send a 0 value, causing the whole screen to be spammed with meteors, stars, and planets, which resulted in significant lag, so I needed to change the code to implement proper debouncing, smoothing/averaging, and state confirmation to filter out false triggers. The solution includes a rolling average of three samples to smooth out noise, a 50 ms debounce time to ensure the state is stable before triggering, a minimum press time of 30 ms to confirm a press event, confirmed state tracking to prevent false positives, and hold detection that activates only after the button is verified as pressed.
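A hedged sketch of that filtering for a single FSR (the timing values come from the writeup; the structure and threshold are my assumptions, and the real code lives in arduino.html):

const int FSR_PIN = A0;
const int THRESHOLD = 100; // assumed pressure threshold

int samples[3] = {0, 0, 0};
int sampleIndex = 0;
bool confirmedPressed = false;
unsigned long changeStart = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Rolling average of three samples smooths the random dropouts to 0.
  samples[sampleIndex] = analogRead(FSR_PIN);
  sampleIndex = (sampleIndex + 1) % 3;
  int avg = (samples[0] + samples[1] + samples[2]) / 3;

  bool rawPressed = (avg > THRESHOLD);
  if (rawPressed != confirmedPressed) {
    if (changeStart == 0) changeStart = millis();
    // The new state must hold for 50 ms before it is trusted; the real
    // code also requires a ~30 ms minimum press before firing an event.
    if (millis() - changeStart >= 50) {
      confirmedPressed = rawPressed;
      changeStart = 0;
      if (confirmedPressed) Serial.println("STEP"); // placeholder event name
    }
  } else {
    changeStart = 0;
  }
  delay(5);
}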

I used Claude AI here to help come up with the solution because I had no idea what to do about the incorrect readings.

The code is in arduino.html in the p5.js project.

Difficulties

There were mainly two difficulties. The first was more time-consuming than technically challenging: having several wires for seven different controls made organizing and wiring everything quite difficult. The second issue, mentioned earlier, was that the FSR values would randomly drop to 0 every few seconds even with no pressure applied, so I had to find a way to work around it.

User testing

Final code – https://editor.p5js.org/kk4827/full/ThmIIRv_-

Sounds from Pixabay

Images from Gemini Nano Banana