Week 14 – Final Project Doc – Wavy Bird

Wavy Bird is a physical rendering of the classic Flappy Bird. Instead of tapping a screen or clicking a mouse, the player holds a custom controller and physically waves their hand to keep the bird afloat.

The interaction is deliberately simple but required significant tuning. The player holds a controller containing an accelerometer, and a downward “Wave” gesture translates to a flap on the screen.

The challenge was calibration. Raw accelerometer readings are noisy, and gravity loads the Z-axis differently depending on how the player holds the device. I designed the system so the game auto-calibrates the “rest” position on startup: the player can hold the controller comfortably in any orientation, and the code detects relative acceleration changes (deltas) rather than absolute values.

Originally, I had ambitious plans to gamify an Implicit Association Test (IAT): players would tilt the controller left or right to categorize words while flying, and the game would measure their implicit bias. However, during development, I realized the cognitive load was too high and the physical interaction was “muddy”. I pivoted, stripping away the psychological testing to focus entirely on perfecting the core mechanic: the physical sensation of flight.

Project Interaction

P5.js: https://editor.p5js.org/yiyang/sketches/a2cexa377

Technical Implementation

1. Arduino Code

The heart of the controller is an MMA8452Q accelerometer. I kept the firmware lean: instead of streaming raw data and letting the browser do the heavy lifting, the Arduino processes the physics locally.

The code samples the Z-axis at 100Hz. If the acceleration drops significantly below the calibrated baseline (indicating a rapid downward movement), it registers a “Wave” that bumps the bird up. I implemented a debounce timer (200ms) to prevent a single wave from triggering a double-jump, which was a major point of frustration during early testing.
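A minimal sketch of this detection logic (not the exact firmware) is below. It assumes the v1-style init()/available()/read() interface of the SparkFun_MMA8452Q library and its calculated-g member cz; the constants are placeholders rather than the exact tuned values.

#include <Wire.h>
#include <SparkFun_MMA8452Q.h>

MMA8452Q accel;                          // MMA8452Q over I2C

float baselineZ = 1.0;                   // "rest" reading in g, captured at startup
const float WAVE_THRESHOLD = 0.5;        // drop below baseline (in g) that counts as a wave
const unsigned long DEBOUNCE_MS = 200;   // ignore repeat triggers inside this window
unsigned long lastWave = 0;

void setup() {
  Serial.begin(9600);
  Wire.begin();
  accel.init();                          // default scale and data rate

  // capture a rest baseline (the real firmware averages several samples at startup)
  while (!accel.available()) {}
  accel.read();
  baselineZ = accel.cz;
}

void loop() {
  if (accel.available()) {
    accel.read();
    float deltaZ = baselineZ - accel.cz; // positive when the hand drops quickly

    // downward stroke beyond the threshold, outside the debounce window
    if (deltaZ > WAVE_THRESHOLD && millis() - lastWave > DEBOUNCE_MS) {
      Serial.println("WAVE");            // p5.js reads "WAVE\n" and flaps the bird
      lastWave = millis();
    }
  }
  delay(10);                             // keeps the sampling near 100Hz
}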

GitHub w/ Arduino Code: https://github.com/xuintl/wavy-bird

2. Circuit Schematic

The circuit uses I2C communication. I wired the MMA8452Q breakout board to the Arduino (3.3V power, GND, and A4/A5 for SDA/SCL).

3. p5.js Code

The visual front-end is built in p5.js. I refactored a massive, messy sketch into object-oriented modules (bird, pipe, ui, serial).

The game loop handles the state machine (Name Entry → Tutorial → Gameplay → Results). I implemented a responsive viewport so the game scales to fit any window height, while maintaining the correct aspect ratio constrained by the sprites’ dimensions, preventing the graphics from stretching.

I am particularly proud of the game feel. Getting the “Wave” to feel responsive without becoming over-sensitive took a lot of trial and error. I had to tweak the G-force threshold (from 2g down to around a 0.5g deviation) to match the natural strength of a human wrist flick. In addition, a smooth control handoff, along with the added Practice Mode, needed careful thought and was far from a straight line to get right in every condition.

4. Communication

Communication is handled via the Web Serial API.

  1. Handshake: The user presses the - key and the browser opens a serial connection to the Arduino.
  2. Calibration: The user can press = to send a 0 to the Arduino, triggering a recalibration routine if the device drifts.
  3. Gameplay: When the Arduino detects a gesture, it sends the string "WAVE\n" over the serial port. The p5.js SerialManager class parses this line and immediately triggers bird.flap() (a sketch of the Arduino side follows this list).
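A hedged sketch of the Arduino half of this protocol is below; detectWave() and calibrateRestPosition() are stand-ins for the detection and calibration routines described earlier and are stubbed out here.

// Arduino half of the protocol: '0' in -> recalibrate; gesture out -> "WAVE\n"

bool detectWave() {
  return false;                    // placeholder for the threshold/debounce check
}

void calibrateRestPosition() {
  // placeholder for the startup baseline routine
}

void setup() {
  Serial.begin(9600);
  // accelerometer setup and the initial calibration would go here
}

void loop() {
  // Calibration: p5.js sends a '0' when the user presses =
  if (Serial.available() > 0) {
    if (Serial.read() == '0') {
      calibrateRestPosition();
    }
  }

  // Gameplay: report each gesture as a newline-terminated string
  if (detectWave()) {
    Serial.println("WAVE");        // SerialManager splits on '\n' and calls bird.flap()
  }
}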

A challenge I encountered was the serial communication. I struggled with baud rate mismatches (switching between 115200 and 9600) and browser compatibility issues where the port wouldn’t close properly, requiring a full page refresh. Refactoring the serial logic into its own class with robust error handling solved this.

Development and Discussion

The entire core logic and gameplay were manually engineered, including the decision to switch from absolute to relative acceleration values, the tuning of the physics engine, and the refactoring of the game states. Thanks to GitHub Copilot, I was able to quickly learn the syntax of the SparkFun_MMA8452Q library and scaffold the Web Serial API connection; these libraries and APIs are verbose and tricky to set up from scratch. AI also helped me debug perplexing errors involving “Wire.h”, but I had to direct the logic to ensure the game remained fun and fair. The code structure and sprite assets were organized by me to ensure maintainability.

For the future, I want to remove the cable entirely. Integrating a Bluetooth Low Energy (BLE) module would allow for a truly wireless controller. Additionally, I’d like to re-introduce the tilt mechanics I stripped out, perhaps allowing the player to strafe forward and backward to catch coins while dodging pipes.

Week 13 – Prototyping and User Testing

I had a grand, academic ambition when I started my final project. My midterm, Pitchy Bird, had successfully used vocal pitch to control a character. For the final, I wanted to escalate the complexity to gamify the Implicit Association Test (IAT). The concept was Flappy IAT, a game where players navigated by physically “bumping” a controller while simultaneously tilting it left or right to categorize words (e.g., “Good” vs. “Bad”) displayed on pipes.

It sounded brilliant on paper and became a lesson in practice… a lesson in how an ambitious outlook (conceptual AND technical) can suffocate the user experience.

User Testing

My game’s initial prompt for the user was: “Bump to fly, Tilt Left for Green/Good, Tilt Right for Red/Bad.”

There was immediate confusion when I described it to people, and even more when I later handed them a prototype without instructions. The cognitive load of keeping the bird afloat WHILE processing semantic text on moving pipes was too high. This became clear immediately during user testing, even with a keyboard-simulated prototype.

Even when users understood the control logic, “Bump” didn’t play well physically. Users weren’t sure whether they should tap the device or shake it. The motion that actually worked was a “Wave”: a sudden vertical movement, regardless of the angle. I decided to change the internal and external naming from BUMP to WAVE, which made the game sound nicer and the action more intuitive.

Technical Challenges

Behind the scenes, the technical implementation was really a “battlefield”. I was using an Arduino with an MMA8452Q accelerometer sending data to p5.js via Web Serial.

Calibration Nightmare

In the first iteration, I assumed the controller would calibrate itself and have some error tolerance. It did neither. Hardcoded thresholds (e.g., expecting 1.0g on the Z-axis at rest) failed the moment a user held the device at a comfortable, natural angle.

I had to write a manual calibration routine calibrateRestPosition() that runs at startup. It takes 20 samples to establish a “zero” baseline for the user’s specific grip, allowing the game to detect relative changes rather than absolute ones.
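A minimal sketch of such a routine, again assuming the v1-style SparkFun_MMA8452Q interface (available()/read() and the calculated-g member cz):

#include <Wire.h>
#include <SparkFun_MMA8452Q.h>

MMA8452Q accel;
float baselineZ = 1.0;                  // "zero" reference in g for the user's grip

void calibrateRestPosition() {
  float sum = 0;
  for (int i = 0; i < 20; i++) {        // 20 samples to smooth out hand tremor
    while (!accel.available()) {}       // wait for a fresh reading
    accel.read();
    sum += accel.cz;                    // Z-axis acceleration in g
    delay(10);
  }
  baselineZ = sum / 20.0;               // later gestures are measured against this baseline
}

void setup() {
  Wire.begin();
  accel.init();
  calibrateRestPosition();              // establish the grip-specific baseline at startup
}

void loop() {
  // gesture detection compares new readings against baselineZ rather than absolute values
}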

Sensitivity Tuning

Initially, the bird was sluggish. Users were waving frantically with no response, or triggering two responses as they waved up AND down. The threshold was set to 1.0f (requiring 1g of force above gravity) and was bidirectional. I lowered it to 0.5f to make the “Wave” detection feel snappy and responsive, and made it one-directional, so only the downward stroke “bumps” the bird up.

Killing the IAT

By that point I had built elaborate logic for stimuli arrays, category pairs (like “Female Doctor”), and complex CSV logging of reaction times. But when I looked at the codebase and the user experience, the game wasn’t fun. It was stressful and buggy.

The IAT logic was causing errors in the drawing loop, and the “Game Over” screen was breaking because the state management between the IAT trials and the physics engine was out of sync. Users were dying because of unfamiliarity and a buggy interface.

I commented out the entire IAT part.

  • I removed the word overlays and the tilt logic.
  • I optimized the Arduino code to stop listening for X/Y tilt axes entirely to save processing power.
  • I renamed the project from Flappy IAT to Wavy Bird.

Refining Core Experience

Once I removed the “noise” of the psychological test, I could focus on making the actual game better.

  • The Tutorial: I noticed users could die instantly upon starting. I introduced a “Tutorial Mode” where collisions don’t kill you but reset the bird position.
  • The “Game Over” Loop: A persistent bug caused the screen to go blank upon death instead of showing the score. I fixed the state transition so that handleCollision correctly triggers the Game Over screen, allowing for a restart without refreshing the page.

Conclusion

The project I ended up with is smaller than the one I proposed, but I would call that a worthwhile trade-off, particularly while managing three final projects this semester. By cutting the IAT feature, I could focus, turning the game from a broken research tool into a polished physical game.

If one has to explain the controls for five minutes before someone can play, the design is too complex. Sometimes, we just need to wave.

Week 12 – Final Project Proposal

My midterm, Pitchy Bird, was a voice-controlled version of the classic Flappy Bird where the user’s vocal pitch determined the bird’s flight height. People leaned into microphones, experimenting with their vocal range to navigate obstacles. It was a fun experience. But for my final project, I want to take what I learned and push it into physical computing and psychological testing.

I’m currently actively prototyping Flappy IAT, an endless runner controlled by an Arduino-based accelerometer that gamifies the Implicit Association Test (IAT).

The Concept

Instead of voice, players use a custom handheld controller (powered by an MMA8452Q accelerometer or a joystick). The mechanics require high-speed multitasking to bypass conscious cognitive filtering:

  1. Navigation (“Bump”): Players must physically “bump” the controller upward to flap the bird’s wings and maintain altitude against gravity.
  2. Categorization (“Tilt”): Every pipe obstacle displays a text stimulus (e.g., words like “Happy,” “Sad,” or category pairs like “Female Doctor”). To pass through the pipe safely, the player must tilt the controller Left (Green/Good) or Right (Red/Bad) to correctly categorize the word.

If it works, this game will be an active research tool, not just a test of survival. By forcing players to make categorization decisions under the pressure of keeping the bird afloat, the system exposes unconscious biases. The game tracks specific metrics via serial communication to p5.js:

  • Reaction Time (RT): The milliseconds between seeing the word and tilting the device.
  • Response Intensity: Measuring the angular velocity and degree of the tilt to detect hesitation or certainty.
  • Cognitive Load: Increasing difficulty across three levels to force players into a “flow state” where implicit biases are harder to suppress.

Technical Implementation

The project utilizes a bidirectional data flow. The Arduino handles raw gesture detection (calibrating “rest” position to detect relative Bumps and Tilts), while p5.js manages the game state, visual feedback (green checks for correct tilts, red flashes for errors), and data logging. At the end of a session, the game exports a CSV file detailing the player’s performance and implicit bias metrics.

Week 11 – Thoughts on Final Project

For my final project, I want to build something that makes the invisible visible. Specifically, the split-second biases we don’t even know we have.

My Idea

I’m designing an interactive system that captures implicit reaction bias through a combination of visual stimuli and physical sensors. There will be rapid-fire choices that measure not just if you respond, but how you respond. Do you press harder on certain choices? Do you hesitate before others? Our body knows things our conscious mind doesn’t want to admit.

How It Works

The p5.js component will display paired visual stimuli (e.g., shapes, colors, maybe even symbolic representations) that require quick binary decisions. Meanwhile, Arduino sensors capture the physical story: pressure sensors under each button measure response intensity, and an accelerometer tracks micro-movements and hesitations. I’m also considering adding a heart rate sensor to catch physiological stress during decision-making moments.

Why It Works

The power comes from the immediate feedback. As you go through trials, p5 visualizes your patterns in real-time, in color-coded displays showing where you reacted fastest, where you pressed hardest, where you hesitated. It’s confrontational in the best way: holding up a mirror to unconscious patterns we’d rather not see.

My Hesitation

This may be too psychological and not “art”, so people may question the interactivity of the “game”, or may not even recognize it as a game.

Week 11 – Group Exercises

Group Members: Joy, Yiyang

(Arduino code is included as comments at the bottom of the p5 sketches)

1. Make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on arduino is controlled by p5

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/63Yg60k8D
Video Demonstration: IMG_9638

2. Make something that controls the LED brightness from p5

p5 & Arduino Code: https://editor.p5js.org/yiyang/sketches/dtftbIzaK

3. take the gravity wind example and make it so every time the ball bounces one led lights up and then turns off, and you can control the wind from one analog sensor

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/v77Sd41K4
Video Demonstration: IMG_9640

Week 11 – Reading Response

Design Meets Disability treats constraint as more than a deficit. When a flow fails under limited vision or one‑hand use, the failure is a measurement of hidden demand, e.g., precision, memory, timing. Fixing those loads tends to make the product faster and safer for everyone.

The shift from accommodation to inclusive design is a key shift in the design workflow. An add‑on ramp or a screen reader retrofit treats accessibility as a patch. Building large hit targets, consistent structure, multimodal signals, and undo by default treats accessibility as a core requirement. The second approach reduces support costs and broadens use. Captions prove the point, because they serve deaf users first, but help viewers in noisy spaces, language learners, and anyone scanning a video on mute. Curb cuts started for wheelchairs, then made cities usable for strollers, carts, and bikes. The spillover is not an accident; it is the direct result of “designing for constraints” that generalize. As Charles Eames believed, “design depends largely on constraints.” And I agree.

Inputs, perception, and cognition form a clear framework. Most products assume hands with fine motor control, perfect vision and hearing, and uninterrupted attention. Alternatives (e.g., switch controls, eye‑tracking, high contrast, scalable text, haptics, chunked steps) lower friction and error rates across contexts. Voice control demonstrates its utility along this path: essential for limited mobility, valuable hands‑free while cooking or driving. Predictive text began as assistive technology; now it is an everyday efficiency feature, especially in coding, where it saves a humorous but significant amount of effort. That’s why Cursor dominates.

I do have questions. Are there cases where inclusive design adds complexity that hurts clarity? For instance, voice, haptic, and visual signals can compete. The answer may be progressive disclosure: default to one strong signal, then let users layer or swap modes. But that raises another concern. How do we balance among options, and must we spend resources to develop all of them? How do teams budget for inclusive testing without turning it into a checkbox? We must tie success to task completion under constraint, not to the presence of features. If two flows pass under one‑hand use, low contrast, and divided attention (or under a narrower set of constraints for a more specific use case), they are ready.

The actionable stance: Write constraints into requirements. Set minimum hit area sizes, enforce semantic structure, and require undo and confirmation for destructive actions. Prefer multimodal capability, but choose a clear default per context. Measure success as completion with minimal precision under time pressure.

Week 10 — Reading Response

Bret Victor argues that hands do two things, feel and manipulate, and that most screen-first products ignore both. On the counter I judge texture, resistance, and weight; I adjust heat by feel; I correct errors through immediate tactile feedback. On the screen I scroll and tap with one finger; I convert rich physical cues into flat sequences of steps; accuracy falls and attention shifts from food to interface.

Fitness tracking shows a similar pattern. A watch counts reps and time, yet it cannot teach grip pressure, bar path, stance, or breath. Effective coaching speaks through the body, the right cue is a change in force or timing, not another chart. A better tool would offer variable resistance and haptic prompts, small vibrations for tempo, pressure feedback for grip, and state you can feel without looking.

Even productivity tools can illustrate the loss in “transaction”. Physical sticky notes on a whiteboard build spatial memory; clusters are recalled by location and reach; the body encodes the arrangement. Dragging cards on a screen removes proprioception; scanning columns replaces simple recall by place. Tangible controllers and deformable surfaces could restore some of that embodied structure: information would be carried in texture and force, not only pixels.

To improve this, I propose we treat touch as information, not just input. Design for affordances that speak through force, texture, and spatial arrangement. If a tool mediates physical tasks or spatial understanding, add haptic and tangible feedback before adding new visual layers.

Week 10 — Electronic Drum

Concept

For our interactive media sound project, my partner, Joy Zheng, and I decided to create a simple yet expressive instrument with a few sensors and a buzzer on Arduino Uno. We wanted to build something that was intuitive to play and produced a unique, percussive sound. The result is this force-sensitive drum. Tapping different pads creates different notes, and a toggle switch shifts the entire instrument into a higher-pitched mode.

Our initial idea was inspired by the force sensors used in class to control sound. We thought, what if we could use multiple sensors to combine frequencies and create rhythms? We brainstormed a few possibilities. Could we assign different chords to each sensor, where pressing harder makes a certain chord more prominent? Or could the sensors act as modifiers for a continuous track?

We settled on a more direct approach for a playable instrument. We decided to have three Force Sensitive Resistors (FSRs) that would each trigger a distinct note, like pads on a drum machine. To meet the project requirements and add another layer of interactivity, we incorporated a digital two-way switch. Flipping this switch would transpose the notes of all three pads to a higher octave, giving the player two different sound palettes to work with.

Arduino Build

The build was straightforward, centered around an Arduino Uno and a breadboard.

Components Used:

  • 1x Arduino Uno

  • 1x Breadboard

  • 3x Force Sensitive Resistors (FSRs), our analog sensors

  • 1x Two-way toggle switch, our digital sensor

  • 1x Piezo Buzzer

  • Resistors (for the FSRs and switch)

  • Jumper wires and Alligator clips

Each of the three FSRs was connected to a separate analog input pin on the Arduino. This allows the Arduino to read a range of values based on how much pressure is applied. The toggle switch was connected to a digital pin to give us a simple ON/OFF (or in our case, Mode 1/Mode 2) reading. Finally, the piezo buzzer was connected to a digital pin capable of PWM (Pulse Width Modulation) to produce the tones.

The Arduino code continuously checks the state of our mode switch and reads the pressure on each of the three force sensors. If a sensor is pressed hard enough to cross a defined hitThreshold, it calls a function to play a corresponding sound.

To make it sound more like a drum, we wrote this for loop to create a pitch decay effect:

// drum pitch decay effect
  for (int f = baseFreq + 40; f > baseFreq; f -= 5) {
    tone(buzzer, f);
    delay(10);
  }
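Putting the pieces together, here is a hedged sketch of the main loop that drives this effect: read the mode switch, read the three FSRs, and trigger the decay sweep when a pad crosses hitThreshold. The pins, frequencies, and threshold value below are placeholders, not our exact numbers.

const int fsrPins[3] = {A0, A1, A2};            // three force-sensitive pads
const int switchPin = 2;                        // two-way mode switch (digital)
const int buzzer = 9;                           // piezo buzzer pin

const int hitThreshold = 300;                   // analog reading needed to count as a hit
const int baseFreqsLow[3]  = {196, 262, 330};   // Mode 1 notes (placeholder values)
const int baseFreqsHigh[3] = {392, 523, 659};   // Mode 2: transposed up an octave

void playDrumHit(int baseFreq) {
  // drum pitch decay effect, as in the loop shown above
  for (int f = baseFreq + 40; f > baseFreq; f -= 5) {
    tone(buzzer, f);
    delay(10);
  }
  noTone(buzzer);                               // stop the tone after the sweep
}

void setup() {
  pinMode(switchPin, INPUT);
}

void loop() {
  bool highMode = (digitalRead(switchPin) == HIGH);   // flipping the switch transposes up

  for (int i = 0; i < 3; i++) {
    if (analogRead(fsrPins[i]) > hitThreshold) {
      playDrumHit(highMode ? baseFreqsHigh[i] : baseFreqsLow[i]);
    }
  }
  delay(20);                                    // short delay keeps the feel responsive
}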

Challenges and Improvement

Our instrument evolved over several versions. We started with a basic concept (v0.1), then refined it by adjusting the frequency gaps between the sensors for a more distinct and musical sound (v1.0a). Finally, we tweaked the delay to give it a more responsive, percussive, drum-like feel (v1.0b).

Our biggest physical challenge was the alligator clips. They were handy for prototyping, but their exposed metal heads made it very easy to accidentally create a short circuit if they touched. We learned to be meticulous about checking that the rubber insulators covered the clips properly before powering on the Arduino.

On the software side, getting the sound right was an iterative process. First, we spent time exploring the pitch gaps. Initially, the pitches were too close together and didn’t sound very musical. By trial and error, we adjusted the base frequencies to create a more noticeable and pleasant musical gap between the pads. Second, the rhythm and feel in hand needed to match those of a “drum machine”. We played with the delay() value in the main loop; a shorter delay made the instrument feel much more responsive and rhythmic.

If we were to continue this project, we could add more sensors for a full octave, or perhaps use the analog pressure value to control the volume (amplitude) of the note in addition to triggering it. It would also be interesting to experiment with different waveforms or sound profiles beyond the simple tones.

Week 9 – Two-Degree Safety

My Concept

I decided to build a “two-degree safety guardrail.” The logic is based on two separate actions:

  1. Idle State: A red LED is ON (digital HIGH).
  2. “Armed” State: A button is pressed. This turns the red LED OFF (digital LOW).
  3. “Active” State: While the button is held, a FSR (force-sensing resistor) is pressed, which controls the brightness of a green LED (my analog output).

Challenge

My challenge was getting the red LED to be ON by default and turn OFF only when the button was pressed. I tried to build a hardware-only pull-up or bypass circuit for this but struggled to get it working reliably on the breadboard.

So, I shifted that logic into Arduino code adapted from a template in the IDE.

// constants won't change. They're used here to set pin numbers:
const int buttonPin = 2;  // the number of the pushbutton pin
const int ledPin = 13;    // the number of the LED pin

// variables will change:
int buttonState = 0;  // variable for reading the pushbutton status

void setup() {
  // initialize the LED pin as an output:
  pinMode(ledPin, OUTPUT);
  // initialize the pushbutton pin as an input:
  pinMode(buttonPin, INPUT);
}

void loop() {
  // read the state of the pushbutton value:
  buttonState = digitalRead(buttonPin);

  // the red LED should stay ON by default and turn OFF while the button is held
  // (with this wiring, the button reads LOW while pressed):
  if (buttonState == LOW) {
    // button pressed: turn the red LED off
    digitalWrite(ledPin, LOW);
  } else {
    // button not pressed: keep the red LED on
    digitalWrite(ledPin, HIGH);
  }
}

The demo was shot with the Arduino implementation.
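For reference, here is a hedged sketch of the complete two-degree logic in a single program, assuming the FSR sits on A0 and the green LED on a PWM pin; the pin choices are placeholders.

const int buttonPin = 2;        // arming button
const int fsrPin = A0;          // force-sensing resistor (analog input)
const int redPin = 13;          // idle indicator (red LED)
const int greenPin = 9;         // PWM pin for the green LED

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
}

void loop() {
  bool armed = (digitalRead(buttonPin) == LOW);   // pressed reads LOW with a pull-up

  // idle: red LED on; armed: red LED off
  digitalWrite(redPin, armed ? LOW : HIGH);

  if (armed) {
    // active: FSR pressure (0-1023) maps to green LED brightness (0-255)
    int pressure = analogRead(fsrPin);
    analogWrite(greenPin, map(pressure, 0, 1023, 0, 255));
  } else {
    analogWrite(greenPin, 0);                     // green LED off unless armed
  }
}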

Schematic

However, I later figured out the pull-up logic and was able to implement a hardware-only solution. This schematic reflects the updated circuit.

Week 9 – Reading Response

Interactive art should be designed as a conversation (or workshop seminar) instead of a speech. The clearest takeaway from the two pieces is that affordances and arrangements matter more than artist statements. If the system communicates its possibilities through handles, hints, and constraints, the audience can complete the work through action. When artists over-script, they collapse the range of possible meanings and behaviors into one narrow path.

Physical computing’s “greatest hits” list is useful because it exposes where interaction often stalls. Video mirrors and mechanical pixels are beautiful, but they rarely push beyond “move, see response.” Gloves, floor pads, and utility controllers introduce a structured gesture vocabulary that people already know, which shortens the learning curve. Across categories: where gesture has meaning, interaction retains depth; where mappings are shallow, novelty fades quickly.

These pieces prompt two design questions. First, what minimal cues will help a participant discover the interaction without text? Second, what state changes will keep the system from settling into a single loop? In practice, that means compositional choices, such as discrete modes, cumulative effects, and recoverable errors.

For attention sensing, presence is not engagement. Designers should look for signals that correlate with intent, then treat them probabilistically. Use the audience’s behavior as feedback to adjust affordances, not narratives. If the work does not evolve under interaction, you likely built a display, not a performance.