Final Project Documentation

Portal Clash is a two-player spaceship battle game that merges the physical and digital worlds. One player plays on a physical 16×16 NeoPixel LED grid using a custom hardware controller, and the second player plays on a digital p5.js canvas using the keyboard. The core idea is the “portal” mechanic: if the physical player shoots a bullet off the edge of their LED screen, it instantly teleports onto the digital screen to attack the other player, and vice versa. It’s a battle across dimensions.

The project relies on a heavy communication loop between the hardware and the browser. The p5.js sketch acts as the “brain” of the game, calculating all the physics, scoring, and portal logic for both worlds. The Arduino acts as a specialized display driver and input device.

Interaction Design
For the physical player, I built a controller with 5 push buttons: four for movement (Up, Down, Left, Right) and one for firing. The digital player uses the computer keyboard (WASD for movement and ‘V’ to fire). The feedback is immediate—if you get hit, your ship explodes into particles on your own screen.

Hardware & Circuit
I used four 8×8 NeoPixel matrices tiled together to create a 16×16 grid. This was a bit of a pitfall at first: powering 256 LEDs draws a lot of current. I tried different wirings and eventually settled on a parallel connection setup that splits the power. I actually used a second Arduino solely as a 5V power source to feed two of the screens, while the main Arduino handled the data and powered the other two.

Arduino Code
The code on the Arduino is optimized to avoid lag. It listens for pixel data from p5 to light up the grid. At the same time, it reads the 5 buttons and sends their state back to p5. I had to implement state-change logic so it only sends data when I actually press or release a button, which kept the game smooth.

#include <Adafruit_NeoPixel.h>

#define PIN_MATRIX A0 
#define NUMPIXELS  256

// WIRING: Pin -> Button -> Diagonal Leg -> GND
#define PIN_FIRE  2
#define PIN_UP    3
#define PIN_DOWN  4
#define PIN_LEFT  5
#define PIN_RIGHT 6

Adafruit_NeoPixel matrix(NUMPIXELS, PIN_MATRIX, NEO_GRB + NEO_KHZ800);

int lastU=0, lastD=0, lastL=0, lastR=0, lastF=0;
unsigned long lastHeartbeat = 0;

void setup() {
  Serial.begin(115200);
  matrix.begin();
  matrix.setBrightness(20); 
  matrix.show();
  
  pinMode(PIN_FIRE, INPUT_PULLUP);
  pinMode(PIN_UP,   INPUT_PULLUP);
  pinMode(PIN_DOWN, INPUT_PULLUP);
  pinMode(PIN_LEFT, INPUT_PULLUP);
  pinMode(PIN_RIGHT,INPUT_PULLUP);
}

void loop() {
  // 1. RECEIVE VIDEO DATA
  while (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'C') matrix.clear();
    else if (cmd == 'S') matrix.show();
    else if (cmd == 'P') {
      int x = Serial.parseInt();
      int y = Serial.parseInt();
      int r = Serial.parseInt();
      int g = Serial.parseInt();
      int b = Serial.parseInt();
      int idx = getPixelIndex(x, y);
      if (idx >= 0 && idx < NUMPIXELS) matrix.setPixelColor(idx, matrix.Color(r, g, b));
    }
  }

  // 2. SEND CONTROLLER DATA
  int u = !digitalRead(PIN_UP);
  int d = !digitalRead(PIN_DOWN);
  int l = !digitalRead(PIN_LEFT);
  int r = !digitalRead(PIN_RIGHT);
  int f = !digitalRead(PIN_FIRE);

  bool stateChanged = (u != lastU || d != lastD || l != lastL || r != lastR || f != lastF);
  
  if (stateChanged || (millis() - lastHeartbeat > 50)) {
    Serial.print("I:");
    Serial.print(u); Serial.print(",");
    Serial.print(d); Serial.print(",");
    Serial.print(l); Serial.print(",");
    Serial.print(r); Serial.print(",");
    Serial.println(f);
    lastU = u; lastD = d; lastL = l; lastR = r; lastF = f;
    lastHeartbeat = millis();
  }
  delay(2); 
}

int getPixelIndex(int x, int y) {
  if (x < 0 || x >= 16 || y < 0 || y >= 16) return -1;
  int screenIndex = 0;
  int localX = x; int localY = y;
  if (x < 8 && y < 8) { screenIndex = 0; }
  else if (x >= 8 && y < 8) { screenIndex = 1; localX -= 8; }
  else if (x < 8 && y >= 8) { screenIndex = 2; localY -= 8; }
  else { screenIndex = 3; localX -= 8; localY -= 8; }
  return (screenIndex * 64) + (localY * 8) + localX;
}

 

p5.js Code
This is where all the logic happens. The sketch manages two “SpaceShip” objects. It tracks which “World” a bullet is in. If a bullet crosses the boundary coordinate, the code swaps its world variable, causing it to stop rendering on the canvas and start rendering on the LED matrix (via Serial).
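A minimal sketch of that swap (bullet.world, CELL, and queueLedPixel are placeholder names, not my actual variables):

const CELL = 24; // canvas pixels per grid cell (placeholder scale)

// Each bullet tracks which world it lives in. Crossing the shared edge
// swaps the world and re-enters the bullet from the opposite side.
function updateBulletWorld(bullet) {
  if (bullet.world === "LED" && bullet.x > 15) {
    bullet.world = "CANVAS";
    bullet.x = 0; // re-enter the digital world at its left edge
  } else if (bullet.world === "CANVAS" && bullet.x < 0) {
    bullet.world = "LED";
    bullet.x = 15; // re-enter the physical grid at its right edge
  }
}

function renderBullet(bullet) {
  if (bullet.world === "CANVAS") {
    rect(bullet.x * CELL, bullet.y * CELL, CELL, CELL); // draw on the p5 canvas
  } else {
    queueLedPixel(bullet.x, bullet.y, 255, 200, 0); // goes out over serial (see the streaming sketch below)
  }
}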

During development, I used a debugging trick where I mirrored the NeoPixel view onto the p5 canvas. This helped me figure out if the pixels were mapping correctly before I even looked at the LEDs.
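Roughly, the frame streaming and the mirror might look like this; the 'C'/'P'/'S' strings match what the Arduino parses above, serial.write comes from p5.webserial, and the mirror offsets are placeholders:

const mirrorX = 420, mirrorY = 20; // where the debug mirror sits on the canvas

// Send one full frame: clear, plot each lit pixel, then show.
function sendFrame(pixels) {
  serial.write("C");
  for (const p of pixels) {
    // space-separated ints for the Arduino's Serial.parseInt() calls
    serial.write("P" + p.x + " " + p.y + " " + p.r + " " + p.g + " " + p.b + " ");
  }
  serial.write("S");
}

// Debug mirror: draw the same pixel list on the p5 canvas so mapping
// errors show up without looking at the physical LEDs.
function drawMirror(pixels) {
  for (const p of pixels) {
    fill(p.r, p.g, p.b);
    rect(mirrorX + p.x * 8, mirrorY + p.y * 8, 8, 8);
  }
}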

Communication
I used the p5.webserial library. The challenge was timing: initially there was a delay between pressing a button and the ship moving. I realized the serial buffer was getting clogged with old data, so I made p5 read all available data every frame and use only the most recent packet. Now it feels instant. I knew from the beginning that the speed of pixels traversing the NeoPixel screen relative to p5 might be a big enough challenge to make the idea unfeasible, but I didn't expect implementation tricks on the p5 side to make it this smooth.
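The fix, sketched below (assuming the wrapper exposes available() and readUntil(); the packet format matches the Arduino's "I:u,d,l,r,f" lines, and physicalShip.setInput is a placeholder for my input handler):

// Drain everything that arrived since the last frame and keep only
// the newest complete packet, so stale input never piles up.
function readController() {
  let latest = null;
  while (serial.available() > 0) {
    const line = serial.readUntil("\n");
    if (line.length === 0) break; // no complete line left in the buffer
    if (line.startsWith("I:")) latest = line;
  }
  if (latest !== null) {
    const [u, d, l, r, f] = latest.slice(2).trim().split(",").map(Number);
    physicalShip.setInput(u, d, l, r, f);
  }
}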


I am most proud of the idea and the gameplay itself. Seeing the bullet disappear from the physical LED screen and immediately pop up on the laptop screen feels really satisfying. It turned out exactly how I imagined it, and the competitive aspect makes people want to keep playing.

AI Section
I utilized Generative AI (ChatGPT) as a technical assistant to speed up the development process. The core game concept, the hardware design, and the logic flow were my own ideas. I used AI mainly to help me debug syntax errors in the serial communication and to suggest optimizations for the lag I was experiencing. For example, when I struggled with the buffer bloat, the AI suggested the buffer-draining loop, which solved the issue. I also used it to help write the class structure for the Spaceships to keep the code clean. The writing and documentation were done by me.

Future Improvements
To improve the experience, I would build a more permanent enclosure for the controller so the buttons are easier to hold. I also want to add a clear “Start Screen” with instructions, as user testing showed that people sometimes needed a moment to figure out the controls.

I also want to elevate the gameplay by implementing an advanced idea I had in mind: randomizing the sending and receiving edges of fire every 10 seconds. That way, players get surprised when the bullets attacking them start to portal in from an unexpected direction, and they also need to figure out which edge will send their own fire to the other world.

User Testing

In the user testing, most of the users figured out the idea without me needing to explain; however, there were some points where users were a bit confused:

    • When the two players started playing and firing at the same time, they sometimes missed the effect and didn't realize that their fire portals to the other screen.
    • The hardware setup I had during testing wasn't finished, so some users couldn't tell which buttons corresponded to which direction.
    • Button coloring: some users recommended making the Fire button a different color so they know it performs a different action than movement.
    • Some users asked about the keyboard controls, even though they suspected it would be either the arrow keys or WASD. The ‘V’ fire key also wasn't obvious to anyone but gamers.

What worked really well was the communication between p5 and the NeoPixel screen. Once users got the hang of the game, they enjoyed it a lot, especially the animation of getting hit. They also liked the color separation between the players (yellow and blue), down to the color of the fire coming out of each ship. Some were impressed by the gameplay and how smoothly the pixels switch between the two screens.

To fix these issues, I would add a clear instructions page at game startup that clarifies the controls on both sides and explains the scoring system. I would also present the core idea of the game up front to get users excited to try it out.

Week 11 – Serial Communication

Concept:

The idea behind this project was to create a bi-directional feedback loop between the physical world and a virtual simulation. Instead of treating Arduino and p5.js as separate tools, I wanted them to behave like two parts of the same system, constantly communicating with each other.
The interaction is centered around cause and consequence. A simple physical action, turning a potentiometer, does not directly place or move objects on screen. Instead, it influences a virtual force, wind, that acts on two simulated balls. When those virtual objects collide, the system responds physically by changing the brightness of a real LED connected to the Arduino.

Method & Materials:

This project combines one analog sensor, a virtual physics simulation, and a physical output using serial communication between Arduino and p5.js.

Hardware:
  • Arduino Uno
  • 10k potentiometer (analog sensor)
  • LED
  • 220–330 Ω resistor
  • Breadboard and jumper wires
Software:
  • Arduino IDE
  • p5.js with Web Serial

The potentiometer is wired as a voltage divider between 5V and GND, with the middle pin connected to analog pin A0. The LED is connected to digital pin 9 through a resistor and grounded on the other side.
The Arduino continuously reads the potentiometer value (0–1023) and sends it to p5.js over serial. p5.js interprets this value as a wind force that affects two balls in a physics simulation. When the balls collide, p5.js sends a command back to the Arduino, which responds by increasing the LED’s brightness.

Process:

I approached this project by breaking it into three conceptual stages rather than thinking of it as separate exercises.

Link to Video Demonstration

1. Physical Input → Virtual Force

The potentiometer provides a continuous analog input. The Arduino reads this input as a raw number and sends it to p5.js without assigning it any meaning. In p5.js, this value is mapped to a horizontal wind force rather than a position. This distinction is important: instead of directly placing the balls, the potentiometer influences how they move over time. This makes the interaction feel physical and dynamic, closer to real-world motion than to simple cursor control.

2. The Virtual Event (Decision-Making in Software)

The physics simulation exists entirely inside p5.js. Gravity, wind, velocity, and drag are calculated every frame. The software also monitors the relationship between the two balls. When they touch, p5.js detects a collision event. The Arduino has no awareness of this virtual world; it does not know what balls, gravity, or collisions are. All interpretation and decision-making happen in software.

3. Virtual Event → Physical Consequence

When a collision occurs, p5.js sends a short command back to the Arduino. Under normal conditions, p5.js continuously tells the Arduino to keep the LED dim, so there is always a subtle sign that the system is running. When a collision is detected, p5.js sends a specific command that causes the Arduino to flash the LED at full brightness for a brief moment. This turns an invisible, virtual event into a tangible physical response.
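Put together in p5 terms, the loop is small (a sketch; latestPotValue, the ball objects, GRAVITY, and the '0'/'1' command characters are placeholders for my actual names):

function draw() {
  // 1. physical input -> virtual force: raw 0-1023 becomes horizontal wind
  const wind = map(latestPotValue, 0, 1023, -0.3, 0.3);

  // 2. the virtual event lives entirely in software
  ballA.applyForce(wind, GRAVITY);
  ballB.applyForce(wind, GRAVITY);
  ballA.update();
  ballB.update();
  const colliding = dist(ballA.x, ballA.y, ballB.x, ballB.y) < ballA.r + ballB.r;

  // 3. virtual event -> physical consequence: dim by default, flash on impact
  serial.write(colliding ? "1" : "0");
}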

Schematic:

The schematic shows the potentiometer connected as a voltage divider to analog pin A0 and the LED connected to digital pin 9 through a resistor. The Arduino communicates with the computer over USB using serial communication, allowing p5.js to both receive sensor data and send control commands back to the Arduino.

Code:

The part of the code I am most proud of is the event-based communication triggered by the collision. A single character sent from p5.js is enough to cause a visible physical reaction, showing how minimal signals can carry meaningful information when the system is designed carefully.

Link to Code [Contains Arduino code in file arduino.txt]

Result:

The final system behaves as a closed feedback loop. Turning the potentiometer changes the wind, which alters how the two balls move on screen. When the balls collide, the LED connected to the Arduino flashes brightly, translating a virtual interaction into a physical signal.

In the demonstration video, it is clear that the system is responsive in both directions: physical input affects the simulation, and events inside the simulation produce immediate physical feedback. The interaction feels cohesive rather than fragmented across hardware and software.

Reflection:

This project helped me understand interaction design as a system rather than a collection of isolated components. The Arduino and p5.js each have clearly defined roles: hardware acts as the body, sensing and responding physically, while software acts as the mind, handling physics, logic, and decisions.

Week 10 – Sound, Sensor, Mapping

Concept:

The idea behind my instrument was to create a simple, tactile musical device that translates deliberate physical control into sound. I wanted to explore a grounded interaction where turning a knob directly shapes pitch and pressing a button activates the sound. The instrument encourages slow, intentional exploration: rotating the potentiometer continuously changes the note, while the button acts as a gate that turns the sound on and off.

Link to Video Demonstration

Method & Materials:

  • Analog Sensor: 10k potentiometer used to control pitch
  • Digital Switch: Tactile button used as an on/off trigger
  • Output: Piezo buzzer to produce sound
  • The potentiometer was connected between 5V and GND, with its middle pin wired to analog pin A0. The button was connected to digital pin 2 and GND, using the Arduino’s internal pullup resistor. The piezo buzzer was connected to digital pin 9 and GND.

As the potentiometer is rotated, the Arduino reads a continuous range of values from 0 to 1023. These values are mapped to a frequency range that controls the pitch of the sound produced by the piezo buzzer.
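For reference, the mapping is plain linear interpolation, shown here in p5-style JavaScript (the 200–2000 Hz range is an assumed example, not my exact tuning):

// map(v, inLo, inHi, outLo, outHi) = outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo)
const potValue = 512;                                // example raw reading (0-1023)
const freq = 200 + (potValue / 1023) * (2000 - 200); // ≈ 1100 Hz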

Process:

The potentiometer provides a smooth range of values, while the button only has two states, pressed or not pressed. I experimented with reading both inputs simultaneously and learned how to use the map() function to translate raw sensor data into meaningful sound frequencies.

I also explored how using the internal pullup resistor simplifies wiring, reducing the number of external components needed. Testing different frequency ranges helped me find values that were audible without being too harsh.

Schematic:

(schematic drawing img)

The schematic shows the potentiometer wired as a voltage divider to analog pin A0, the button connected to digital pin 2 using INPUT_PULLUP, and the piezo buzzer connected to pin 9 as the sound output.

Code:

This project uses input from two sensors, a potentiometer and a button, to generate sound through a piezo buzzer. The potentiometer continuously controls pitch, while the button determines whether sound is produced. When the button is pressed, the Arduino reads the potentiometer value and maps it to a frequency range, producing a tone. When the button is released, the sound stops.

The part of the code I am most proud of is the line that maps the potentiometer's analog values into a usable frequency range.

Result:

The final prototype behaves like a simple knob-based synthesizer. Turning the potentiometer smoothly changes the pitch, while pressing the button activates the sound. The interaction feels direct and intentional, allowing the user to clearly hear the relationship between physical input and sound output.

In the demonstration video, the pitch responds immediately to changes in the knob position, showing how basic electronic components can be combined to form a functional musical interface.

Reflection:

This project helped me understand how sound, sensors, and code come together in interactive systems. Working with both analog and digital inputs clarified how different types of control shape user experience. Even with very few components, the instrument feels expressive and responsive. This exercise showed me how computational logic can be translated into sensory feedback, and how small design decisions, like mapping and thresholds, strongly influence interaction. It serves as a foundation for thinking about more complex computational instruments in the future.

Week 14 – Final Project Doc – Wavy Bird

Wavy Bird is a physical rendering of the classic Flappy Bird. Instead of tapping a screen or clicking a mouse, the player holds a custom controller and physically waves their hand to keep the bird afloat.

The interaction design is simple by design but required significant tuning. The player holds a controller containing an accelerometer. A downward “Wave” gesture translates to a flap on the screen.

The challenge was calibration. A raw accelerometer reading is noisy, and gravity loads the Z-axis differently depending on how the player holds the device. I designed a system where the game auto-calibrates the "rest" position on startup; this lets the player hold the controller comfortably in any orientation, because the code detects relative acceleration changes (deltas) rather than absolute values.

Originally, I had ambitious plans to gamify an Implicit Association Test (IAT), forcing players to tilt the controller left or right to categorize words while flying to measure their implicit bias. However, during development, I realized the cognitive load was too high and the physical interaction was “muddy”. I pivoted to strip away the psychological testing and focus entirely on perfecting the core mechanic, i.e., the physical sensation of flight.

Project Interaction

P5.js: https://editor.p5js.org/yiyang/sketches/a2cexa377

Technical Implementation

1. Arduino Code

The heart of the controller is an MMA8452Q accelerometer. I optimized the firmware to be lean: instead of streaming raw data and letting the browser do the heavy lifting, the Arduino processes the physics locally.

The code samples the Z-axis at 100 Hz. If the acceleration drops significantly below the calibrated baseline (indicating a rapid downward movement), it registers a "Wave" that bumps the bird up. I implemented a debounce timer (200 ms) to prevent a single wave from triggering a double-jump, which was a major point of frustration during early testing.

GitHub w/ Arduino Code: https://github.com/xuintl/wavy-bird

2. Circuit Schematic

The circuit uses I2C communication. I wired the MMA8452Q breakout board to the Arduino (3.3V power, GND, and A4/A5 for SDA/SCL).

3. p5.js Code

The visual front-end is built in p5.js. I refactored a massive, messy sketch into OOP-organized modules (bird, pipe, ui, serial).

The game loop handles the state machine (Name Entry → Tutorial → Gameplay → Results). I implemented a responsive viewport so the game scales to fit any window height, while maintaining the correct aspect ratio constrained by the sprites’ dimensions, preventing the graphics from stretching.

I am particularly proud of the game feel during gameplay. Getting the "Wave" to feel responsive without becoming over-sensitive took a lot of trial and error. I had to tweak the G-force threshold (from 2g down to around a 0.5g deviation) to match the natural strength of a human wrist flick. In addition, the smooth control handoff, together with the added Practice Mode, needed careful thought and was far from a straight line to achieve in every condition.

4. Communication

Communication is handled via the Web Serial API.

  1. Handshake: The user presses - and the browser connects to the Arduino.
  2. Calibration: The user can press = to send a 0 to the Arduino, triggering a recalibration routine if the device drifts.
  3. Gameplay: When the Arduino detects a gesture, it sends the string "WAVE\n" over the serial port. The p5.js SerialManager class parses this line and immediately triggers the bird.flap() function.
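In p5.js, both directions stay tiny (a sketch; everything except bird.flap() and the "WAVE" string is simplified):

// Arduino -> p5: one complete line per gesture
function handleSerialLine(line) {
  if (line.trim() === "WAVE") {
    bird.flap(); // one detected wave -> exactly one flap
  }
}

// p5 -> Arduino: '=' asks the board to recalibrate its resting baseline
function keyPressed() {
  if (key === "=") serial.write("0");
}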

A challenge I encountered was the serial communication. I struggled with baud rate mismatches (switching between 115200 and 9600) and browser compatibility issues where the port wouldn’t close properly, requiring a full page refresh. Refactoring the serial logic into its own class with robust error handling solved this.

Development and Discussion

The entire core logic and gameplay were manually engineered, including the decision to switch from absolute to relative coordinates, the tuning of the physics engine, and the refactoring of the game states. I was able to quickly learn the syntax for the SparkFun_MMA8452Q library and to scaffold the Web Serial API connection thanks to GitHub Copilot. These libraries and APIs are verbose and tricky to start from scratch. AI also helped me debug perplexing errors regarding “Wire.h”, but I had to direct the logic to ensure the game remained fun and fair. The code structure and sprite assets were organized by me to ensure maintainability.

For the future, I want to remove the cable entirely. Integrating a Bluetooth Low Energy (BLE) module would allow for a truly wireless controller. Additionally, I’d like to re-introduce the tilt mechanics I stripped out, perhaps allowing the player to strafe forward and backward to catch coins while dodging pipes.

Final Project Documentation

Concept
My final project is an interactive robot game where players try to guide a small “student robot” toward its destination inside Monster Academy. The robot has to navigate through an obstacle course with different paths and challenges, and the players must control it carefully to avoid stepping on the lose sensor. If they reach the final pressure pad, the second sensor detects it and triggers a win. The idea was to merge a physical robot with a digital game interface so the whole experience feels like playing a simple video game, but happening in real life.

Process Overview

Hardware
For the hardware stage, I used the SparkFun Inventor’s Kit and followed the tutorial for building a basic remote-controlled robot. The guide helped me wire the motor driver, connect both motors, and set up the power system. After getting the robot moving reliably, I added two FSR sensors: one placed under an obstacle point to detect a “lose” event, and another placed at the end of the track to detect “win.” Both sensors were connected as voltage dividers so that the Arduino could measure changes in pressure using analog inputs.

I also built a large obstacle course from scratch. It was 100 by 80 centimeters, made using three big wooden boards and different colored papers to mark areas for obstacles, safe paths, and start/finish points. This part took a lot of testing, measuring, and redesigning because the robot moves differently depending on surface friction and layout.

Software
The software had two separate components: Arduino code and p5.js code. Arduino handles all the physical behavior of the robot, such as driving the motors forward, backward, left, and right through an H-bridge motor driver. It also reads the two sensors and sends signals back to the computer when the robot wins or loses.

The p5.js side handles the player interface. I created multiple screens: Start, Instructions, Control/Connect, Win, and Lose screens. Each screen appears at the right moment depending on the game flow. When the player presses a button or arrow key, p5.js sends a single-character command over the Web Serial API to the Arduino. The interface also includes a timer and localStorage functionality that saves the best time under the key robotRunner_bestTime.

Interaction Design
During user testing, something interesting happened. Even though I placed on-screen buttons, almost everyone naturally pressed the arrow keys on the laptop. Because of that, I decided to add full arrow-key support for moving the robot. This made the whole experience more intuitive and closer to traditional games. I also made the interface responsive so it adjusts when players press F to go full screen. The Start page transitions into the Instructions, and then the Control page appears, which is where the real-time serial communication and gameplay happens.
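The arrow-key support boils down to mapping keyCode to the same single-character commands the on-screen buttons send (a sketch; sendCommand is a placeholder that wraps my serial write):

function keyPressed() {
  if (keyCode === UP_ARROW) sendCommand("f");         // forward
  else if (keyCode === DOWN_ARROW) sendCommand("b");  // backward
  else if (keyCode === LEFT_ARROW) sendCommand("l");  // turn left
  else if (keyCode === RIGHT_ARROW) sendCommand("r"); // turn right
  else if (key === "f" || key === "F") fullscreen(!fullscreen()); // F toggles fullscreen
}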

Circuit Schematic

Arduino Code and How It Works
In the Arduino sketch, I set up motor control using digital pins for direction and PWM pins for speed. Each movement command, such as forward or left, runs the motors for a short pulse of time and then stops them. This makes movement more precise and prevents the robot from overshooting.

The two FSR sensors are read using analogRead on A0 and A1. Each one has a threshold value. When a reading passes its threshold, the Arduino prints either "BUMP DETECTED" or "WIN SENSOR HIT" to the serial port, and the p5.js program watches for exactly these messages. Getting this communication stable was challenging at first because the sensor values changed with pressure and surface, so I had to adjust the thresholds multiple times.

I also improved motor speed values so the robot had a more controlled movement. I had issues where the motors were too weak when powered only through USB, so switching to the battery pack was necessary.

// REMOTE ROBOT — p5.js CONTROL

// Right motor pins
const int AIN1 = 13;
const int AIN2 = 12;
const int PWMA = 11;

// Left motor pins
const int BIN1 = 8;
const int BIN2 = 9;
const int PWMB = 10;

// FSR sensors
const int fsr1Pin = A0;        // obstacle sensor
const int fsr2Pin = A1;        // win sensor 
const int fsrThreshold = 200;  // adjust after testing

// movement durations
const int driveTime = 300;
const int turnTime = 250;

void setup() {
  pinMode(AIN1, OUTPUT);
  pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT);
  pinMode(BIN1, OUTPUT);
  pinMode(BIN2, OUTPUT);
  pinMode(PWMB, OUTPUT);
  
  Serial.begin(9600);
}

void loop() {
  // SENSOR 1 - OBSTACLE -  YOU LOSE
  int fsr1Value = analogRead(fsr1Pin);
  if (fsr1Value > fsrThreshold) {
    stopMotors();
    Serial.println("BUMP DETECTED");     
    delay(300);
    return;
  }

  // SENSOR 2 - GOAL - YOU WIN
  int fsr2Value = analogRead(fsr2Pin);
  if (fsr2Value > fsrThreshold) {
    stopMotors();
    Serial.println("WIN SENSOR HIT");   
    delay(300);
    return;
  }

  // SERIAL COMMANDS 
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    
    if (cmd == 'f') {
      rightMotor(-255);
      leftMotor(255);
      delay(driveTime);
      stopMotors();
    }
    if (cmd == 'b') {
      rightMotor(255);
      leftMotor(-255);
      delay(driveTime);
      stopMotors();
    }
    if (cmd == 'l') {
      rightMotor(255);
      leftMotor(255);
      delay(turnTime);
      stopMotors();
    }
    if (cmd == 'r') {
      rightMotor(-255);
      leftMotor(-255);
      delay(turnTime);
      stopMotors();
    }
  }
}

// MOTOR FUNCTIONS
void stopMotors() {
  rightMotor(0);
  leftMotor(0);
}

void rightMotor(int speed) {
  if (speed > 0) {
    digitalWrite(AIN1, HIGH);
    digitalWrite(AIN2, LOW);
  } else if (speed < 0) {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, HIGH);
  } else {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, LOW);
  }
  analogWrite(PWMA, abs(speed));
}

void leftMotor(int speed) {
  if (speed > 0) {
    digitalWrite(BIN1, HIGH);
    digitalWrite(BIN2, LOW);
  } else if (speed < 0) {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, HIGH);
  } else {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, LOW);
  }
  analogWrite(PWMB, abs(speed));
}

p5.js Code Description
The p5.js script manages the entire visual and interactive flow of the game. I created states for each screen, and the draw function displays the correct one based on the current state. The Start and Instructions pages have simple navigation buttons. The Control page includes the connection button, movement controls, timer, and status labels.

Once the player connects through Web Serial, the browser can send characters directly to the Arduino. I set up an async text reader so p5.js continuously checks for any messages the Arduino sends back. When a “WIN” or “BUMP” message is detected, p5.js switches to the corresponding screen and stops the timer.
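Roughly, that reader looks like this with the raw Web Serial API (a sketch; setState is a placeholder for my screen-switching code):

// Continuously read text lines from the Arduino and react to them.
async function readLoop(port) {
  const reader = port.readable
    .pipeThrough(new TextDecoderStream())
    .getReader();
  let buffer = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    let newline;
    while ((newline = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line.includes("WIN")) setState("win");        // "WIN SENSOR HIT"
      else if (line.includes("BUMP")) setState("lose"); // "BUMP DETECTED"
    }
  }
}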

The timer is implemented using millis(), and I store the best time in localStorage so the score stays saved even after refreshing the page. This gives the game a small replayability element.
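The best-time logic is a small compare-and-store (a sketch; startMillis marks when the run began):

// On a win: compute elapsed time and keep the lowest value across reloads.
function saveBestTime() {
  const elapsed = (millis() - startMillis) / 1000; // seconds
  const best = Number(localStorage.getItem("robotRunner_bestTime"));
  if (!best || elapsed < best) {
    localStorage.setItem("robotRunner_bestTime", elapsed.toFixed(2));
  }
}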


Communication Between Arduino and p5.js
The communication is simple but effective. p5.js sends commands like “f”, “b”, “l”, and “r”. The Arduino receives them through Serial.read() and triggers the motors. Arduino sends back plain text messages that indicate win or lose events. p5.js reads these messages through a continuous loop using Web Serial’s readable stream. This setup ensures the robot’s physical behavior is always linked to the digital interface.

How This Was Made
The robot’s basic structure comes from the SparkFun guide, which helped me understand the wiring, motor driver logic, and motor control functions. After building the foundation from that tutorial, I added new features: dual-sensor win/lose detection, the full p5.js interface, the multi-screen system, arrow-key controls, the timer, and the localStorage system.

I wrote most of the Arduino and p5.js code myself, but I used generative AI mainly for debugging when errors happened, organizing long sections of code, and helping solve serial communication issues. All visuals, photos, and the physical design are made by me.

Project Media
initial stages of setting the robot up

This was the initial design of the robot, made from carton and covered with paper. However, it stopped working completely and I didn't know why, so I had to reassemble it. It turned out a 5V jumper wire had come loose and was causing the issue.

 

 

 

I learned my lesson and taped all the wires to the Arduino board so that these "accidents" don't happen anymore. The end result turned out to be much better and prettier than before.

 

 

video demonstration

video demonstration 2

What I’m Proud Of
I’m most proud of the amount of design work that went into this project. I spent a lot of time experimenting with layouts, choosing the right materials for the course, and making sure the robot could navigate smoothly. The obstacle course was a huge build and took multiple redesigns to get right. I also used a lot of hot glue during the building phase, to the point where I started recognizing the smell too well.

I’m also proud of how the interface looks and how the robot, sensors, and game flow all connect together. It feels like a small arcade experience.

Future Improvements
In the future, I would like to add a separate physical controller so players don’t have to look at the laptop while playing. I would also include a scoring system with bonus points, a leaderboard that keeps track of previous scores, and maybe multiple levels or different obstacle layouts. These additions would make the game more competitive and engaging.

Final production

Final Piece

Physical Computing / Pieces

My work is all about space, and I wanted the installation to be on the floor. I glued my stepping pieces onto a single piece of wood, with each stepping piece made of cardboard and a hidden FSR beneath each cardboard tile. I painted the wood black, and also painted the cardboard stepping pieces and the boxes that held the button and potentiometer in black to keep a consistent aesthetic. I used the laser cutter in the IM lab and drilled a small hole for cable management.

For the visuals, I added images on the “stones” to enhance the aesthetics. Each image represented the function of its corresponding stepping piece, for example, adding a planet, meteor, or stars. To keep the images stylistically consistent, I used Google Gemini (Nano Banana) for image generation. I laminated the images since people would be stepping on them, and I wanted to make sure they wouldn’t get damaged.

I also had to solder wires together to extend them and attach them to the potentiometer and button.

Arduino setup and code

The rails came in really handy since I had five resistors for my five FSRs, so I could just connect all the resistors to the ground rail. I also used the 5V rail for my potentiometer. Some of the wires weren’t long enough, so I had to either solder them together or use jumper wires.

I needed to read the values, of course, and for the button I used INPUT_PULLUP because I already had a lot of wiring and wanted to keep the board cleaner.

Code

The main problem was that the FSR would occasionally send a 0 value, causing the whole screen to be spammed with meteors, stars, and planets, which resulted in significant lag, so I needed to change the code to implement proper debouncing, smoothing/averaging, and state confirmation to filter out false triggers. The solution includes:
  • a rolling average of three samples to smooth out noise,
  • a 50 ms debounce time to ensure the state is stable before triggering,
  • a minimum press time of 30 ms to confirm a press event,
  • confirmed state tracking to prevent false positives, and
  • hold detection that activates only after the button is verified as pressed.
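Sketched in p5-style JavaScript for illustration (the real filtering lives with my Arduino code, and these names are placeholders), the core of the idea is:

const THRESHOLD = 10;   // below this, treat the reading as noise
const samples = [];     // last three raw readings
let candidate = false;  // state we're waiting to confirm
let candidateSince = 0; // when the candidate state appeared
let confirmed = false;  // the state the game actually trusts

function filterFsr(raw) {
  // rolling average of three samples smooths out the random 0 dropouts
  samples.push(raw);
  if (samples.length > 3) samples.shift();
  const avg = samples.reduce((a, b) => a + b, 0) / samples.length;

  const pressed = avg > THRESHOLD;
  if (pressed !== candidate) {
    candidate = pressed;        // state flipped: restart the stability clock
    candidateSince = millis();
  } else if (pressed !== confirmed && millis() - candidateSince > 50) {
    confirmed = pressed;        // stable for 50 ms -> confirm the change
  }
  return confirmed;
}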

I used Claude AI here to help come up with the solution because I had no idea what to do about the incorrect readings.

Code is in arduino.html in p5.js

Difficulties

There were mainly two difficulties. The first was more time-consuming than technically challenging: having several wires for seven different controls made organizing and wiring everything quite difficult. The second issue, mentioned earlier, was that the FSR values would randomly drop to 0 every few seconds even with no pressure applied, so I had to find a way to work around that.

User testing

Final code – https://editor.p5js.org/kk4827/full/ThmIIRv_-

Sounds from Pixabay

Images from Gemini Nano Banana

Week 15: Final Project

Click Here to Play on P5!

Concept Development

If you’ve seen my previous blog post and the ideas I had for how to construct this thing, you would notice that the final version of the build is very different from the original plan.

There were essentially two major issues going into this idea:
  • I forgot to consider that the trucks move in more than one axis; they also lift up (I believe this is because I have RKP trucks). This means the original support columns would have held the trucks still and prevented them from tilting in either direction.
  • I have no idea how to build stuff with wood.

It was actually Dustin who recommended that I search for "skateboard trainers" online instead of "skateboard mounts," because "mounts" kept leading me to wall display mounts instead of what I wanted. Searching "skateboard trainers" turned up these rubber things, which I ordered immediately.

Rubber Skate Trainers

Dustin also recommended using large zip ties to secure the board in place instead of building a socket for each wheel. This greatly simplified the process and saved me so much time. I am forever grateful to Dustin.

Testing Prototypes

User Testing: Yongje

While I was in class, I thought it might be a good idea to see how people interacted with the skateboard so far, with just the rubber trainers holding the wheels, so I invited Yongje and Siyona to take a ride on it. When Yongje first stepped on it, he said some handlebars might make him feel safer, but he quickly got comfortable. I paid special attention to where he approached getting onto the board and where he placed his feet. Siyona was initially a little confused about where to place her feet, which is when I got the idea to add green indicators on the board showing where to stand.

It was at this time that Professor Mang strongly advised me against introducing a forward-leaning angle to the board. It would have created a safety nightmare, so I decided to make the whole skateboard sit flat. We also measured out the board's dimensions and the base's potential dimensions based on that.

The next day, I went to the scene shop with Prof. Mang to grab a piece of plywood around 126 cm × 72 cm. We sanded it down, and I taped a sequence of yellow and black tape as caution lines around the perimeter of the board. I also put down temporary paper tape to indicate where the wheels and trucks would go; this would let me drill holes into the base in the right spots later.

After I finished planning out the base, I had my good friend Clara step onto it to see how she would interact with it. This is what I documented for the Week 13 User Testing, so I'll keep it brief: Clara had never stood on a skateboard before, and this was her first time. I started getting an idea of where people who've never skateboarded typically stand.

Then I got a video from further away to see how she naturally adjusted to the board as she got on. I paid special attention to how people bend their knees and adjust their arms when they're leaning on the board.

Construction & Arduino Integration

 

For this project, I needed an accelerometer for its ability to detect tilt. I bought my accelerometers off Amazon instead of renting one from the connect2 system because I wanted to keep them for my project after the showcase.

Accelerometer MPU6050

The accelerometer I bought was the HiLetGo MPU6050 (the blue one), and it worked fantastically despite not being from one of the larger brands.

However, the accelerometer did not come pre-soldered, so I had to solder it myself. Thankfully, Professor Shiloh did a fantastic job of teaching me everything about soldering, and I was able to solder everything really well by myself.

Here's my hand-drawn vector schematic, made in Affinity, for the circuits:

Circuit Schematics

After that day, I spent a lot of time in the scene shop with Tony working on the base of my project. Carrying it by its sides was getting very tiring so I decided to cut a handle with the jigsaw. In my practice cuts I was not cutting very clean circular shapes at the ends of the handle so I decided to use a hole saw for the corners and use the jigsaw to cut the straights instead. The handle may not have been necessary but it was much more convenient to transport the big rectangular base after I made it.

Then was the task of drilling holes in the base for the zip ties to tie down the board to the base. The original idea was to tie it to the trucks but the rubber skateboard trainers actually make for a perfect place for the ziptie to slip through. I measured roughly where the zip ties would have to go through for maximum stability while allowing slight lift for the RKP trucks to tilt, marked it with a pencil, and then drilled it out with a 9.5mm bit. This part was very challenging since it was my first time woodworking but it was a very rewarding experience; I was very glad I challenged myself to make this project.

Construction Done!

Professor Mang recommended that I make a small plank of wood that would house my Arduino and be semi-locked under my board but removable at any time so I decided to make that; I started by grabbing a small plank and sanding it down thoroughly since I was going to be handling it very often. Then I moved onto drilling two holes in opposite corners to each other and marking the holes on the main base with pencil. I then screwed in two small golden screws to hook onto the small modular piece. It turned out pretty great, and it even stays on while held vertically; although I’m not sure if the added weight from the Arduino and breadboard would allow that once those are mounted on.

Final Base Design with Zipties Ready

You might have also noticed that I made a handle for the base; that took a really long time but I love how it turned out.

Software Development

I didn't really have a clear idea of what the game would actually look like for most of the semester; and yes, I've been planning this since day one. I even wrote it down on a doc in my drive.

My Idea on Aug 25th

Sure, I had a few ideas, but nothing that really excited me. I wanted a game that involved only side-to-side movement but would still be engaging and dynamic.

I thought about games like Pong, Subway Surfers, and even the "catching random things falling out of the sky" genre, but none of these sounded fun. Pong would be easy to do, but I'm not sure it would be very fun playing alone against a wall that bounces the ball back; the reason the midterm co-op mode worked was that having a second player added a whole 'nother dimension to the design.

I ended up having this great idea for a bullet-hell inspired 2D spaceship shooter; I didn’t want it to be too stressful so there wouldn’t actually be bullet hell intensity but there would still be obstacles you need to steer away from. So I got to work.

Interface Design

The interface was inspired by many different things. A lot of the UI is inspired by my midterm project; I thought it would be nice to have a coherent menu system between them. It made sense since I also reused my code for screen and button management for this project. I loved how the title screen turned out. I had been designing in portrait mode all semester because portrait feels more casual and informal but I knew this project had to get the full landscape treatment and be placed on a big display.

Main Menu Screen

Health UI

The "Hull Integrity" health bar is inspired by one of my favorite games of all time, Titanfall 2. It works perfectly here as the health for the ship. I think it looks great too, especially with the blinking effect when your ship is "doomed."

Titanfall 2 Hull Health UI

 

So much of this project was inspired by the Titanfall universe and the amazing design from the team behind it. I really hope that my designs make any potential Titanfall fans that play this project happy; it sure made me happy learning to recreate it.

Sprite Design 

All the sprites were custom drawn by me in Affinity using vector tools. I had a lot of fun designing the sprites for the game. I really loved creating separate sprites for the spaceship based on the damage it had taken. I think it adds a lot to the visual communication to the player, especially for them to take more caution.

The parasite alien sprites were inspired by Casey Reas' Eyeo talk on chance operations, all the way back from the Week 2 reading; around the 28:00 mark he creates these little symmetrical, Space Invaders-looking aliens. Below are an image of one of the ones Casey made and one I made that was inspired by it. All of the parasite designs were strongly inspired by examples shown in that Eyeo talk; I thought it'd be a nice callback to what I believe was our first ever reading assignment.

My Parasite Sprite

From Casey Reas’s Eyeo 2012 Talk

 

 

 

 

 

Not a Cookie

The asteroid sprites are the ones I'm least proud of; I really struggled to make unique variations, so some of them look a lot nicer than others. I've always struggled to depict nature, so this stressed me out a lot. I only made 3 of these, so unfortunately you see a lot of repeats.

Sound Design 

The sound design for this game was quite challenging. For one, the game is set in space—there is no sound in space. For two, I wanted to avoid generic Star Wars-like sounds.

I chose a kick drum for the railgun shooting sound because I knew a bass-y sound would be less annoying to listen to on repeat than something high-pitched. I put a big emphasis on subtle sound design in this project.

Speaking of subtle, I made a point of adding things like ambient spaceship humming and a sci-fi bridge sound effect on loop while you're on the menu screen. The bridge sounds combined with the ambient music turned out amazing; you can listen to some pretty immersive sounds while you browse the menus.

I also had the countdown sound effect I used in my midterm project return for this one, used in the exact same way — to count the player down before the game starts.

Near the very end of developing this game, I found some voice line files from Titanfall 2 where the mech, BT, says to the pilot “Warning. Major Hull Damage Detected” and “Warning. Heavy Damage Sustained.” This really added to the immersion of the gameplay and I really hope people will like it.

Misc. Design Choices

I could go on forever about the many little decisions I made in an effort to make this game excellent but I’ll stick with the most practical things I designed.

I placed calibration settings in the “How To Play” menu so players can calibrate while they’re on the same page where they try to understand the controls and objectives.

I made all the menu navigation possible with just a mouse, so even if you don't have a keyboard in your hands, you can click RMB to go back to the main menu. If you do have a keyboard available, you can also press CTRL to go back (ESC would exit fullscreen, and I don't think I can override that in the browser).

Creating Custom Keybinds

I had a lot of fun setting up the serial connection because I split everything up into separate functions I can call and assigned each of them to a keybind. I thought the “Connect to Arduino” buttons I was seeing looked so ugly and felt so unintuitive, so I came up with my own solution.

I needed a key to quickly reset the current angle to zero, so I assigned "C" as my calibrate key. I chose it because "calibrate" starts with a C, which made it an easy reference point to remember.

"X" was used to connect/disconnect the serial connection. It felt SO much more practical than an onscreen button. "X" was chosen purely because it sits next to "C."

And lastly, "B" opens the debug menu. This really made it feel like a real, fully developed video game.
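All three keybinds fit in one handler (a sketch; connectSerial, disconnectSerial, serialConnected, and showDebug are simplified stand-ins for my actual functions):

function keyPressed() {
  if (key === "c" || key === "C") {
    serial.write("0"); // tell the Arduino to zero the current angle
  } else if (key === "x" || key === "X") {
    if (serialConnected) disconnectSerial(); // toggle the port
    else connectSerial();
  } else if (key === "b" || key === "B") {
    showDebug = !showDebug; // toggle the debug overlay
  }
}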

How this was Made (AI Utilization):

I asked ChatGPT 5.1 to help me set up the accelerometer I bought off Amazon. It helped me understand what each pin did and pointed me to the library I needed for my MPU6050. Afterwards, it helped me debug the serial connection code I had copied from the lecture notes and helped me add a p5-to-Arduino transmission that tells the board to zero itself and calibrate the current angle to 0 degrees. ChatGPT 5.1 was a major help in setting up the serial connection.

ChatGPT 5.1 was also used to debug everything when I couldn't figure it out. It recommended a lot of different approaches to programming things like enemy spawning. Through the debugging process I ended up learning really fast, because it answered questions like "why would you fix it that way?" or "are you sure you're right? doesn't this function() work like this?" in a really clear and precise manner.

Final Words:

This project was genuinely one of the most fun projects I’ve worked on in my three years at NYUAD. Through this project alone, I learned so much about woodworking with Tony in the scene shop, soldering with Professor Shiloh, and general project management throughout every step.

I want to give a very special thank you to Tony, Dustin, Professor Shiloh, and Professor Mang for their general guidance throughout the project, whether it was teaching me new skills or training me on new tools. This project wouldn't have been possible without them.

I also want to thank my playtesters:
Gauhar Meiram, Clara Juong, Yongje Jeon, Siyona Goel

User Testing

Updates

After doing more research, I realised I can only have five stepping pieces. This is because the Arduino only has six analog pins, and I need to reserve one for the potentiometer. So I changed the layout to: three pins for the three star regions, one for planets, and one for meteors.

User Testing

During my initial testing, the first problem I encountered was that several sensors didn’t work at all. I had to find new ones, solder them to the wires so they stayed attached, and replace the old ones. This also meant I had to rip the stones off the board, which made me realise I should have tested the sensors thoroughly before sticking them onto the wood.
I used a simple Arduino program that printed the FSR values to make sure each sensor worked properly.

Another thing I realised was that, since I wanted the installation to be activated by foot pressure, I needed a way for users to know what each foot pad represents. So I generated images with Gemini AI, printed them, and mounted them on cardboard. Then I realised I should laminate them, because they were starting to crinkle and tear.

I also thought about how the user would see on the screen what they are pressing, especially if they’re holding it down. I decided it would be ideal to show small circles representing each stone on the screen with a ring animation that fills up when the stone is held, similar to the ring on the Apple Watch when you hold down a button for three seconds.

With the FSR, I noticed that the pressure reading would sometimes fluctuate between 0 and 10 even when a foot was on the sensor, so I set the threshold to 10.
Finally, I placed my stepping stones on pieces of black wood to create a more defined “space” feel.

 

Week 14 Final Project Documentation

  • Include some pictures / video of your project interaction
    • user testing 2 (see this clip for full interaction):
      • https://drive.google.com/file/d/11MYknXjQfZ2JDwCdM1UDdrmKWQnlF0M7/view?usp=sharing (sorry professor, I have no idea why WordPress spent an hour trying to upload this long video file, so I turned to Google Drive)
    • connecting
      • building handshake video clip
    • p5 interface
    • robot headshot
  • Describe your concept

For my final project, I created a physically interactive Tamagotchi: a mini robot creature that the user can pat, touch, spin, and talk to, but instead of being cute and grateful, it responds in unexpectedly pessimistic and slightly hostile ways. The project combines physical interaction through Arduino sensors with a character interface and dialogue system in p5. The concept is inspired by classic Tamagotchi toys, small handheld digital pets that demand constant attention and reward care with affection. In contrast, my project imagines a near-future world in which artificial beings no longer need, or even want, human caretaking. This pet has a personality shaped by human environmental destruction and techno-optimism gone wrong, and it is deeply unimpressed by humans.

  • How does the implementation work?
    • Description of Arduino code and include or link to full Arduino sketch
    • Schematic of your circuit (hand drawn or using tool)
    • Description of p5.js code and embed p5.js sketch in post
        • link includes both p5 and commented arduino: https://editor.p5js.org/joyzheng/sketches/mmYysys_A

The system consists of two main components: a physical controller powered by an arduino and a visual interface running on p5. The physical body uses a potentiometer to detect rotation (spinning) and three Force Sensitive Resistors (FSRs) placed on the left, right, and back to detect touch. An 8×8 neo matrix serves as the robot’s physical face. p5 handles the complex game logic, visual assets, and audio, while the arduino handles raw sensor data acquisition and LED matrix control.

  • Description of interaction design

The interaction is designed to simulate a moody robot. Spinning the potentiometer quickly disorients it, triggering a dizzy or angry state that turns the physical neo matrix into an angry red face. Similarly, squeezing the robot by pressing both the left and right sensors evokes annoyance. In keeping with the creature’s difficult personality, soothing it requires specific back-sensor patting, which is the only way to reset the angry state to calm. When the creature is already calm, playful pokes on individual sensors trigger a single pat reaction, causing the physical face to cycle through various calm colors. Leaving it idle, however, results in aggressive dialogue. After user testing, I added visual cues such as an animated arrow and a wiggling pointer to help guide the user through these interactions. To further immerse the user, the background music dynamically shifts to match the robot’s mood, transitioning from a soothing melody to an intense track whenever the angry state is triggered.

  • Description of communication between Arduino and p5.js

The arduino sketch is responsible for reading the four sensors and driving the 8×8 neo matrix. I established a handshake protocol during setup, ensuring the arduino waits for a valid connection before processing loops. The arduino sends raw sensor strings to the computer, while p5 returns specific logic flags like isAngry and triggerColorChange. I wrote a specific algorithm for the triggerColorChange flag to ensure that when the robot cycles through colors, it never selects the same color twice in a row. The p5 sketch functions as the brain of the operation, managing the state machine that dictates whether the robot is calm or angry. It loads pixel art sprites and implements a typing effect for the dialogue, simulating the text scrolling of a retro RPG to enhance the vintage atmosphere.
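The no-repeat color pick is small but satisfying (a sketch; the color list and the flag wiring in the comment are placeholders for my actual setup):

// Pick a calm face color, never the same one twice in a row.
const CALM_COLORS = ["#7fd4ff", "#9fff9f", "#ffd27f", "#d9a7ff"];
let lastColor = -1;

function nextCalmColor() {
  let idx;
  do {
    idx = Math.floor(Math.random() * CALM_COLORS.length);
  } while (idx === lastColor);
  lastColor = idx;
  return CALM_COLORS[idx];
}

// Sent back to the Arduino alongside flags like isAngry, e.g.:
// serial.write("triggerColorChange:" + nextCalmColor() + "\n");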

  • What are some aspects of the project that you’re particularly proud of?

I am particularly proud of successfully utilizing laser-cut acrylic for the physical enclosure, marking my first time working with this material. Unlike other prototyping materials, the permanent nature of acrylic demanded rigorous measurement and planning, as there was no room for error once a cut was made. This requirement for precision significantly increased the time investment for the physical build compared to previous projects. However, I’ve overcome this learning curve in my fabrication skills, and I now look forward to designing and creating even more complex acrylic models in future iterations of my work.

  • Include a 1-2 paragraph statement “How this was made” that describes what tools you used including how you used generative AI. Explain how the code was written, the design was made, and how the writeup was done. Credit the media sources (e.g. self-created, from a site, or generated).

This project was built on the synthesis of our arduino-p5 serial communication session, so I reviewed Professor Mang’s code mainly to refresh my memory on the mechanics. I also improved my typing effect by referencing this https://editor.p5js.org/xc2736/sketches/1igkPpfX5. The core serial communication structure was adapted from class examples regarding serial potentiometers. I used AI to help plan the robot logic and debug hardware issues. Gemini helped me resolve a conflict between the 16-bit color format of the graphics library and the 32-bit color requirements of the neo matrix. For visuals, I developed assets by using ChatGPT to pixelate my scratchy hand-drawn drafts, while the pointer asset was sourced from Adobe Stock and stretched using Affinity.

  • What are some areas for future improvement?

Looking toward future iterations, I plan to expand the physical feedback by integrating NeoPixel strips into the robot's skeleton so its mood colors radiate through its entire body rather than just the face. I also want to enhance the visual feedback by adding circular gradient cues in p5 that react to specific sensor inputs, and by refining the pixel art sprites with more detailed animation frames.