Week 12 – Final Project Proposal

The Grove

Overview

The Grove is a cozy resource management and crafting game built in P5.js for the midterm. The player moves between five locations — a Map, River, Forest, Pottery Studio, and Greenhouse — collecting resources, making pottery, and growing plants. For the final, the project is being expanded into a physical table installation. All mouse and keyboard input is replaced with custom props and sensors, so every in-game action has a corresponding physical one.

What’s Already Built

The full P5.js game is complete from the midterm. This includes:

    • All five scenes with backgrounds, navigation, and per-scene background music
    • A full inventory and resource system (clay, soil, water, seeds, pots)
    • The complete pottery workflow: place clay, shape on wheel, fire in furnace, collect
    • A plant growth system with three growth stages, timers, and harvest logic
    • Sprite sheet animations for plants, pots, and the watering can
    • Sound effects for every interaction
    • A custom cursor, backpack inventory overlay, main menu, pause menu, and instructions screen
Physical Components
Joystick — Navigation

The joystick is the main input device and replaces the mouse entirely. It handles all navigation and selection across every scene.

    • Left/Right on the Map cycles through locations (Studio, Greenhouse, Forest, River). Button enters.
    • In scenes, Up moves focus to the upper HUD (Back to Map / Menu). Down moves to the lower HUD (inventory). Left/Right cycles between options in the current zone. Button confirms.
    • In the Studio, Left/Right switches focus between the pottery wheel and furnace.
    • In the Menu, Up/Down cycles options and Button selects.
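
A minimal sketch of how this focus model could be represented in p5 (the scenes structure, zone names, and item lists are assumptions, not final code):
let focus = { zone: "lower", index: 0 };

function onJoystick(dir) {
  // Each scene defines zones, and each zone holds an ordered list of
  // focusable items, e.g. { upper: ["back", "menu"], lower: ["slot0", "slot1"] }
  let zones = scenes[currentLayer].zones;
  let items = zones[focus.zone];

  if (dir === "up")    focus = { zone: "upper", index: 0 };
  if (dir === "down")  focus = { zone: "lower", index: 0 };
  if (dir === "left")  focus.index = (focus.index + items.length - 1) % items.length;
  if (dir === "right") focus.index = (focus.index + 1) % items.length;
}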
Ultrasonic Proximity Sensor — Pottery Wheel

An HC-SR04 sensor is mounted face-up at the Studio zone. To shape a pot, the player holds both hands above it with palms facing down, mimicking cupping clay on a wheel. The closer the hands, the faster the pot shapes. Pulling hands away pauses progress. This replaces the original click-and-hold mechanic.
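
On the p5 side, the distance-to-speed mapping might look roughly like this (the 5–30 cm active band, sensorState.proximity, and shapingProgress are placeholder names and values pending calibration):
function updateShaping() {
  let d = sensorState.proximity;          // smoothed HC-SR04 distance in cm
  if (d > 5 && d < 30) {                  // hands held above the sensor
    // Closer hands shape faster: 30 cm maps to slow, 5 cm to fast
    let speed = map(d, 30, 5, 0.2, 1.0);
    shapingProgress += speed * (deltaTime / 1000);
  }
  // Hands out of range: progress simply pauses, matching the design above
}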

Potentiometer — Furnace

A rotary dial at the Studio zone controls the furnace. Turning it up starts firing. The player watches the pot sprite on screen and turns the dial down at the right moment to retrieve the pot. Leaving it too long burns or destroys it.

Digging Mechanic — Forest

Five fixed plot positions in the forest scene each map to a physical point on the table. On every spawn and respawn (after a 30-second cooldown), each plot is randomly assigned clay or soil, with its current type shown by the sprite on screen.

The current plan is to use a conductive shovel prop with an aluminum foil tip and five corresponding foil contact points on a flat board (four corners and a center), each wired to its own Arduino pin. Touching the shovel to a point completes the circuit and triggers that plot. Contact must be held for 200ms to avoid false triggers.
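
A minimal Arduino sketch of that hold check for a single contact point (the pin number and variable names are placeholders):
// One shovel contact wired to a digital pin with INPUT_PULLUP;
// the shovel tip connects the point to GND, so contact reads LOW.
const int digPin = 4;                      // placeholder pin
unsigned long contactStart = 0;
bool digTriggered = false;

void setup() {
  pinMode(digPin, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(digPin) == LOW) {        // shovel is touching the foil
    if (contactStart == 0) contactStart = millis();
    // Only register the dig after 200ms of continuous contact
    if (!digTriggered && millis() - contactStart >= 200) {
      digTriggered = true;                 // latched until the shovel lifts
    }
  } else {
    contactStart = 0;                      // contact broken: reset the timer
    digTriggered = false;
  }
}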

An alternative being considered is a keypad, where each key corresponds to one of the five plot positions. This would be simpler to wire and more reliable, but less immersive than the shovel prop given the physical metaphor of the rest of the installation.

Water Sensor — Greenhouse

A moisture sensor sits inside a small cup with drainage holes at the bottom. After placing a seed, the player physically pours water into the cup to water the plant in-game. A high analog threshold distinguishes a real pour from residual moisture, and a short debounce timer prevents repeated triggers while the cup drains.
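
On the Arduino side, that gate might look like this (the threshold and suppression window are placeholders to be set during calibration):
const int waterPin = A3;                   // placeholder analog pin
const int WATER_THRESHOLD = 600;           // placeholder; set during calibration
unsigned long lastPour = 0;
int waterEvent = 0;                        // what gets reported in the serial line

void checkWater() {
  int raw = analogRead(waterPin);
  // Fire only on a strong reading, then suppress re-triggers for ~3.5s
  // while the cup drains, so one pour counts once
  if (raw > WATER_THRESHOLD && millis() - lastPour > 3500) {
    waterEvent = 1;
    lastPour = millis();
  } else {
    waterEvent = 0;
  }
}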

Arduino

The Arduino reads all sensors and sends a single comma-separated line over serial each loop:

joyX, joyY, joyBtn, proximity, potValue, waterValue, dig0, dig1, dig2, dig3, dig4

All game logic stays in P5. The Arduino only handles reading and transmitting. Debounce and sampling logic is handled on the Arduino side: 150ms for the joystick, 200ms hold for shovel contacts, a 3–4 second suppression after a water trigger, and a 5-sample rolling average for the proximity sensor.
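
A sketch of what one loop's transmission could look like (pin assignments and the two helper functions are assumptions, not the final wiring):
// One CSV line per loop; all interpretation happens in p5.
int digState[5];                                  // 0/1 per shovel contact, set by the hold logic

void loop() {
  int joyX   = analogRead(A0);
  int joyY   = analogRead(A1);
  int joyBtn = (digitalRead(2) == LOW) ? 1 : 0;   // INPUT_PULLUP: pressed reads LOW

  int proximity  = readProximitySmoothed();       // placeholder: 5-sample rolling average
  int potValue   = analogRead(A2);
  int waterValue = readWaterGated();              // placeholder: thresholded + suppressed pour

  Serial.print(joyX);      Serial.print(',');
  Serial.print(joyY);      Serial.print(',');
  Serial.print(joyBtn);    Serial.print(',');
  Serial.print(proximity); Serial.print(',');
  Serial.print(potValue);  Serial.print(',');
  Serial.print(waterValue);
  for (int i = 0; i < 5; i++) {
    Serial.print(',');
    Serial.print(digState[i]);
  }
  Serial.println();                               // newline terminates the packet
}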

P5 Changes

The existing code is being modified, not rewritten. The main changes are:

    • input.js: The Web Serial API replaces all mouse and keyboard handlers. Incoming serial data is parsed into a global sensorState object that is read every frame (a parsing sketch follows this list).
    • globals.js: New sensor state variables added alongside existing ones.
    • screens.js: A focus system is layered into each scene. The joystick moves focus between defined zones per scene, and the button triggers whatever is currently focused. A pulsating glow and highlight are drawn on the focused element.
    • classes.js: ResourcePlot updated to re-randomise its type on each respawn.
    • sketch.js: Minimal changes.
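
A minimal parsing sketch for input.js, reusing the p5.webserial patterns from the Week 11 exercises (the field order follows the protocol above; everything else is illustrative):
let sensorState = { joyX: 512, joyY: 512, joyBtn: 0, proximity: 0,
                    potValue: 0, waterValue: 0, dig: [0, 0, 0, 0, 0] };

// Called once per frame from draw().
function readSerial() {
  let line = port.readUntil("\n");
  if (line.length === 0) return;          // no complete line yet

  let parts = trim(line).split(",");
  if (parts.length !== 11) return;        // ignore partial or corrupted lines

  sensorState.joyX       = int(parts[0]);
  sensorState.joyY       = int(parts[1]);
  sensorState.joyBtn     = int(parts[2]);
  sensorState.proximity  = int(parts[3]);
  sensorState.potValue   = int(parts[4]);
  sensorState.waterValue = int(parts[5]);
  for (let i = 0; i < 5; i++) sensorState.dig[i] = int(parts[6 + i]);
}
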
Table Layout

Each prop sits in a zone on the table that corresponds to its location in the game. The joystick is at the center. The proximity sensor and potentiometer are grouped at the Studio zone. The digging board and shovel are at the Forest zone. The watering cup is at the Greenhouse zone. The idea is that as the player moves between locations in the game, they also shift their focus and hands to a different part of the table.

Progress
    • Done: Full P5.js game from midterm
    • Done: Physical interaction design for all five components
    • Done: Joystick navigation model defined for all scenes
    • In progress: Arduino wiring and serial protocol
    • Pending: Web Serial integration in P5
    • Pending: Focus and highlight system in P5
    • Pending: Sprite changes in P5
    • Pending: Sensor threshold calibration
    • Pending: Physical prop construction and table setup

Week 11 – Serial Communication

Repository

Repository

Exercise 1

1. Overview

In this exercise, we explored serial communication between Arduino and p5.js. The main objective was to use a single sensor on Arduino and translate its input into visual movement in p5 — specifically controlling an ellipse moving horizontally across the screen. A potentiometer connected to pin A1 served as the analog input, and no data was sent back from p5 to Arduino, making this a one-directional communication exercise.

2. Concept

The core idea was to establish the simplest possible link between a physical input and a digital visual. By turning a potentiometer, the user directly moves an ellipse across a canvas in the browser in real time. This introduced us to the fundamental pipeline of physical interaction: sensor reads a value, Arduino maps and sends it, p5 receives and translates it into something visible on screen. The simplicity of the setup made it easy to trace the data flow end-to-end and understand what was happening at each stage.

3. Process and Methods
    • We began by following the example demonstrated in class and gradually adapted it to better understand the data flow. On the Arduino side, the potentiometer on pin A1 returns a raw analog value between 0 and 1023. We mapped this to a smaller range of 0–255 using map() before sending it through Serial.println(). This made the value easier to work with on the p5 side without losing meaningful range.
    • For p5.js, we used the p5.webserial library following the structure introduced in class. The serial connection is opened manually through a button, which triggers the browser’s port picker dialog. Inside draw(), port.readUntil("\n") reads each incoming line, trim() strips the newline character, and int() converts the string into a usable number. That number is then mapped to the canvas width using map(), which drives the ellipse’s horizontal position. The ellipse stays fixed on the vertical centre of the canvas at all times.
4. Technical Details
    • The Arduino maps the raw potentiometer value before sending it to reduce the range to something the p5 side can easily stretch across the canvas:
// ── READ SENSOR ──
// analogRead returns 0–1023 based on the voltage at A1.
// At 0V (GND side) → 0. At 5V → 1023.
int potentiometer = analogRead(A1);

// ── MAP TO SMALLER RANGE ──
// Compress 0–1023 to 0–255 so p5 can easily map it onto the canvas width.
int mappedPotValue = map(potentiometer, 0, 1023, 0, 255);

// ── SEND TO p5 ──
// Serial.println() sends the number as a string followed by a newline character '\n'.
// p5 uses that newline to know where one value ends and the next begins.
Serial.println(mappedPotValue);
    • On the p5 side, the incoming string is cleaned and converted before being mapped to a screen position:
let str = port.readUntil("\n");

if (str.length > 0) {
  // trim() removes the trailing '\n' (and any spaces).
  // int() converts the cleaned string into a number.
  let val = int(trim(str));

  // ── MAP VALUE TO CANVAS ──
  // Arduino sends 0–255. We stretch that range across
  // the full canvas width so the ellipse covers the
  // entire screen as the pot is turned.
  x = map(val, 0, 255, 0, width);
}
    • This two-step mapping — first on Arduino from 0–1023 to 0–255, then in p5 from 0–255 to the canvas width — demonstrates how data can be progressively transformed as it passes between systems.
5. Reflection

This exercise gave us a concrete understanding of how physical input can directly influence digital visuals in real time. The most valuable part was seeing the full pipeline in action: turning the potentiometer caused an immediate, visible response on screen, which made the abstract idea of serial communication feel tangible. It also highlighted the importance of consistent data formatting — the newline character that Serial.println() appends is what makes port.readUntil("\n") work reliably on the p5 side. If we were to continue developing this, we would explore using different sensors such as an FSR or an ultrasonic distance sensor, add smoothing to reduce noise, and expand the visuals to control more than one element.

Exercise 2

1. Overview

In this exercise, we reversed the direction of communication — from p5.js to Arduino. The objective was to control the brightness of a physical LED using mouse movement in the browser. p5 continuously sends a brightness value based on the mouse’s horizontal position, and Arduino uses that value to dim or brighten the LED through PWM. No data is sent back from Arduino to p5.

2. Concept

Where Exercise 1 used hardware to control software, this exercise flipped that relationship. The browser became the controller and the LED became the output. Moving the mouse across the screen is a familiar, intuitive gesture, and seeing that gesture reflected immediately in a physical light made the connection between the two systems feel direct and satisfying. This exercise also introduced us to the handshake pattern, which is necessary when Arduino needs to wait for p5 to be ready before the communication loop can begin.

3. Process and Methods
    • We kept the overall structure close to what was demonstrated in class and simplified the communication to a single value per message. On the p5 side, mouseX is mapped to a range of 0–255 using map() and constrained with constrain() to prevent out-of-range values. This number is sent to Arduino as a string followed by a newline character using port.write().
    • On the Arduino side, the sketch begins with a handshake loop that repeatedly sends "0" and blinks the built-in LED until p5 responds. Once connected, Serial.parseInt() reads the incoming integer from the serial buffer. After confirming the message ends with '\n', analogWrite() applies the value to the LED on pin 5. Because analogWrite() requires a PWM-capable pin, the LED must be connected to one of the pins marked with a tilde (~) on the Arduino board — in our case, pin 5.
4. Technical Details
    • The p5 sketch maps mouse position to brightness and sends it continuously while the port is open:
// ── CALCULATE BRIGHTNESS ──
// Map the mouse's horizontal position across the canvas
// to a brightness value in the range 0–255.
// This matches the range that analogWrite() accepts on Arduino.
let brightness = int(map(mouseX, 0, width, 0, 255));

// ── CONSTRAIN ──
// Clamp the value so it never exceeds 0–255,
// even if the mouse moves outside the canvas.
brightness = constrain(brightness, 0, 255);
// ── SEND TO ARDUINO ──
// Only write if the port is open (i.e., user has connected).
// We append '\n' so Arduino's Serial.read() can detect
// the end of the message after parseInt() runs.
if (port.opened()) {
  port.write(brightness + "\n");
}
    • Arduino reads the value and applies it to the LED:
// ── READ BRIGHTNESS ──
// parseInt() reads digits from the buffer and returns them as an integer. 
// It stops at any non-digit character (including the newline).
int brightness = Serial.parseInt();

// ── CONFIRM COMPLETE MESSAGE ──
// After parseInt(), the next character should be '\n'.
// This confirms we received a full message and not a partial one, preventing corrupted values.
if (Serial.read() == '\n') {

  // ── SET LED BRIGHTNESS ──
  // analogWrite(pin, 0–255) uses PWM to control brightness.
  // 0 = fully off, 255 = fully on, values in between = dimmed.
  analogWrite(ledPin, brightness);
}
    • The '\n' check after parseInt() is important — it confirms that a complete message was received before acting on the value, which prevents the LED from responding to corrupted or partial data.
5. Reflection

This exercise made the communication feel more interactive than Exercise 1, because the browser was no longer just a display — it was actively sending instructions to hardware. The main issue we encountered was that the LED did not dim smoothly at first. After checking the wiring, we found the LED was connected to a non-PWM pin, which meant analogWrite() had no effect. Moving it to pin 5 resolved this immediately. This was a useful reminder that the physical wiring must match the assumptions in the code. If we were to continue, we would replace mouse control with a more interesting interaction such as key presses or dragging, add multiple LEDs with independent brightness values, and eventually expand into bi-directional communication.

Exercise 3

1. Overview

This exercise brought together everything from the previous two by implementing full bi-directional communication between Arduino and p5.js. A joystick connected to Arduino controls the wind force in a physics simulation, while p5 sends a signal back to Arduino to light up an LED every time the ball bounces. Data flows in both directions simultaneously.

2. Concept

The gravity and wind sketch provided a compelling context for bi-directional communication because it has two clearly distinct interactions that naturally map to each side: the joystick controls something in the simulation, and something in the simulation triggers a response on the hardware. Rather than the user controlling a parameter directly, the LED reacts to an event — a bounce — which made the physical and digital feel genuinely connected rather than just linked. Replacing keyboard input with a physical joystick also made the experience more immersive, since it gave the user a tangible way to influence the simulation.

3. Process and Methods
    • We started from the gravity and wind example provided in class and kept the simulation structure mostly unchanged. The main modifications were replacing keyboard-based wind control with joystick input, and adding bounce detection that communicates back to Arduino.
    • On the Arduino side, the joystick’s horizontal axis is read from pin A0 using analogRead() and sent to p5 via Serial.println() after every incoming message. On the p5 side, the value is read with port.readUntil("\n"), trimmed, converted to an integer, and mapped from 0–1023 to a wind force range of -1 to 1. This value is applied to the wind vector each frame, so tilting the joystick left or right pushes the ball accordingly.
    • For the return signal, we added bounce detection inside draw(). Each frame, a variable bounced is initialised to 0. If the ball hits the floor with a downward velocity greater than 2, it is counted as a real bounce and bounced is set to 1. This value is sent back to Arduino using port.write(). Arduino reads it with Serial.parseInt() and calls digitalWrite() to turn the LED on or off.
4. Technical Details
    • The bounce detection uses a velocity threshold to distinguish real impacts from the small residual movements that occur when the ball settles on the ground:
  // ── BOUNCE DETECTION ──
  let bounced = 0;                    // assume no bounce this frame
  let floorY = height - mass / 2;    // y where ball touches the floor

  if (position.y > floorY) {
    position.y = floorY;   // prevent ball from going below floor

    // ── VELOCITY THRESHOLD ──
    // Only count as a real bounce if the ball hits with enough
    // downward speed (> 2). This filters out the tiny movements when the
    // ball is nearly at rest, which would otherwise cause the LED to
    // flicker continuously.
    if (velocity.y > 2) {
      velocity.y *= -0.9;   // reverse and reduce (energy loss on impact)
      bounced = 1;           // signal a real bounce to Arduino
    } else {
      velocity.y = 0;        // ball has come to rest — stop it completely
    }
  }

  // ── SEND BOUNCE STATE TO ARDUINO ──
  // Sends 1 if a real bounce happened this frame, 0 otherwise.
  // Arduino uses this to briefly light the LED on each bounce.
  // '\n' is appended so Arduino can confirm end of message.
  if (port.opened()) {
    port.write(bounced + "\n");
  }
    • On the Arduino side, the received state is applied directly to the LED:
// ── READ LED STATE FROM p5 ──
// p5 sends '1' when a bounce is detected, '0' otherwise.
// parseInt() extracts the integer from the incoming string.
int ledState = Serial.parseInt();

// ── CONFIRM COMPLETE MESSAGE ──
// Check for the newline that p5 appends with port.write().
// This prevents acting on partial or corrupted values.
if (Serial.read() == '\n') {

  // ── CONTROL LED ──
  // ledState is 1 (HIGH) when p5 detects a bounce → LED on.
  // ledState is 0 (LOW) otherwise → LED off.
  digitalWrite(ledPin, ledState);
}
    • The joystick read and send happens inside the same while (Serial.available()) block, directly after the LED state is processed. This keeps the two directions of communication tightly coupled within each communication cycle.
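That structure might look roughly like this (ledPin is a placeholder; the joystick wiring matches the description above):
const int ledPin = 13;                 // placeholder pin

void loop() {
  while (Serial.available()) {
    // 1) Receive: the bounce state sent by p5
    int ledState = Serial.parseInt();

    if (Serial.read() == '\n') {
      digitalWrite(ledPin, ledState);

      // 2) Respond: send the joystick X-axis back right away, so every
      // incoming message produces exactly one outgoing reading
      int joyX = analogRead(A0);
      Serial.println(joyX);
    }
  }
}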
5. Reflection

This was the most technically demanding exercise of the three, and also the most rewarding. The biggest challenge was the LED flickering continuously when the ball came to rest on the ground. Even though the ball appeared stationary, the physics simulation kept generating tiny downward movements due to gravity, which the system kept interpreting as bounces. Introducing the velocity threshold velocity.y > 2 solved this cleanly and also taught us something broader: physics simulations produce continuous, noisy output, and meaningful events often need an explicit threshold or condition to be extracted from that noise. If we were to continue, we would add a dead zone to the joystick centre to reduce drift, use both joystick axes for more complex motion, and add more LEDs to represent different simulation events.

6. Resources

Week 11 — Reading Response

What struck me most about Pullin’s argument is the way he reframes concealment — not as a neutral design default, but as a value judgment quietly embedded in every flesh-toned hearing aid and skin-matching prosthetic. I’d always assumed that “discreet” design for disability was simply considerate, a way of respecting the user’s desire to blend in. Pullin destabilizes that assumption by asking who decided blending in was the goal in the first place. The eyeglasses example is effective precisely because it’s already resolved: nobody today apologizes for visible frames or tinted lenses, and the question of why hearing aids can’t occupy the same cultural space is genuinely difficult to answer without exposing some discomfort about how disability is perceived. What this raises for me is whether the concealment instinct in design is a response to user needs or a projection of the designer’s own unease — and whether those two things are even distinguishable in practice. If a designer has never lived with a hearing aid, their intuition about “what users want” is shaped by imagining the stigma from the outside, which may be a very different thing from what someone actually navigating that stigma would choose.
The tension I keep returning to is the one between designing for expression and designing for choice — and I’m not sure Pullin fully resolves it. His most compelling cases, like Aimee Mullins’ carved wooden legs, work because they belong to a specific person with a specific relationship to visibility and performance. But what interests me is the middle ground he gestures at: the user who neither wants to pass as non-disabled nor be conscripted into a narrative of disability-as-spectacle they didn’t ask for. That unresolved space feels like the honest core of the book, and it connects for me to broader questions in interaction design about whether personalization is a solution or a way of deferring a harder design decision. Giving users options is good, but the options still frame what’s possible — and right now, as Pullin shows, the frame has been set almost entirely by clinical priorities and the discomfort of people who aren’t disabled. That framing shapes perception in ways I hadn’t fully considered before reading this.

Week 10 — Musical Instrument

Mariam B and Jenny

1. Repository

Repository

2. Overview

This project is an interactive electronic musical instrument. An ultrasonic distance sensor acts as a pitch controller — the player moves their hand closer or further away to select notes from a pentatonic scale. A potentiometer controls the tempo of the beat. A toggle switch changes between two musical modes: a steady melodic mode where every beat plays the note selected by the player’s hand, and a drum-style alternating mode where deep and light tones trade off rhythmically. All sound is produced through a piezo buzzer.

3. Concept

The concept was inspired by the Theremin — a classical electronic instrument played without physical contact, where the performer’s hand position in the air controls the pitch of the sound. Rather than pressing discrete keys to trigger notes, the player sculpts sound by moving their hand through the air above the ultrasonic sensor. This creates a continuous, expressive range of input that feels intuitive and physical at the same time. The toggle switch gives the instrument two distinct personalities — one melodic and one percussive — while the potentiometer lets the player set the energy of the performance by controlling how fast the beat moves.

    • Note on AI assistance: The wiring configuration for this project was determined with the help of Google Gemini. The circuit concept was described to Gemini and it provided wiring instructions for the components.
4. Process and Methods
    • The ultrasonic distance sensor fires a 10-microsecond pulse and measures how long the echo takes to return. A helper function microsecondsToCentimeters() converts that time into a distance in centimeters. The distance is then mapped to an index in the scale[] array using map(), so each position of the hand corresponds to a specific note.
    • The scale[] array uses note definitions from a pitches.h file and is limited to a two-octave pentatonic scale — C, D, E, G, A across octaves 4 and 5. This removes dissonant half-steps so the instrument always sounds in key regardless of where the hand is placed.
    • The potentiometer is read using analogRead and mapped to a time interval between 150 and 800 milliseconds, which controls how frequently the beat fires.
    • A non-blocking timer using millis() handles beat timing instead of delay(). This keeps the Arduino continuously reading the sensors between beats so the instrument feels responsive and live. A beatStep counter increments on every beat and is used by the alternating mode to separate even and odd beats; a minimal sketch of this pattern follows the list.
    • The toggle switch selects between the two musical modes and uses INPUT_PULLUP so no external resistor is needed.
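
A minimal sketch of that non-blocking pattern (potPin and playBeat() are placeholders; the 150–800 ms range matches the mapping described above):
const int potPin = A0;                 // placeholder pin
unsigned long lastBeat = 0;
int beatStep = 0;

void loop() {
  // Potentiometer sets the gap between beats (150-800 ms)
  int beatInterval = map(analogRead(potPin), 0, 1023, 150, 800);

  // Fire a beat only when enough time has passed; never pause the loop
  if (millis() - lastBeat >= beatInterval) {
    lastBeat = millis();
    beatStep++;                        // the alternating mode checks beatStep % 2
    playBeat();                        // placeholder for the mode-dependent tone() logic
  }
  // ...distance sensor and switch reads continue here on every pass...
}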
5. Technical Details
    • The pitches.h file defines note names as frequency constants, allowing the scale array to be written in readable musical notation rather than raw numbers:
// ── MUSICAL SCALE ──
// A two-octave pentatonic scale (C, D, E, G, A across octaves 4 and 5)
int scale[] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_G4, NOTE_A4,
               NOTE_C5, NOTE_D5, NOTE_E5, NOTE_G5, NOTE_A5};
int numNotes = 10;
    • unsigned long is used for the timer variables because the millis() counter grows very quickly and would overflow a standard int (maximum 32,767) after only about 32 seconds:
unsigned long startMillis = 0;
unsigned long currentMillis = millis();
    • The distance sensor trigger sequence follows the standard HC-SR04 pattern — a brief LOW to clear the pin, a 10-microsecond HIGH pulse to fire the sensor, then LOW again to listen:
// ── ULTRASONIC SENSOR TRIGGER SEQUENCE ──
// Pull the trigger pin LOW briefly to ensure a clean starting state
digitalWrite(distPin, LOW);
delayMicroseconds(2);
// Send a 10-microsecond HIGH pulse to fire the ultrasonic burst
digitalWrite(distPin, HIGH);
delayMicroseconds(10);
// Pull LOW again — the sensor is now listening for the echo
digitalWrite(distPin, LOW);

// Measure how long the echo takes to return, in microseconds.
// The 10000 timeout prevents the program from freezing if no object is close enough to reflect the pulse — it returns 0 instead.
duration = pulseIn(echoPin, HIGH, 10000);
    • The distance is mapped starting from 2cm rather than 0 because the HC-SR04 has a physical blind spot below approximately 2 centimetres where readings are unreliable. The upper limit of 50cm keeps the performance zone compact and ergonomic:
int noteIndex = map(cm, 2, 50, 0, numNotes - 1);
    • The alternating drum mode uses the modulo operator on beatStep to separate even and odd beats, assigning a fixed deep tone of 100Hz to even beats as a kick drum and the hand-controlled note to odd beats as a snare:
if (beatStep % 2 == 0) {
  tone(buzzerPin, 100, 50);
} else {
  tone(buzzerPin, currentNote, 50);
}
6. Reflection

This project pushed us to think about musical interaction as much as electronics. The biggest technical challenge was replacing delay() with the millis() timer — understanding why the Arduino had to keep running between beats, rather than pausing, took some time but made the instrument feel genuinely alive and responsive in a way that a delay-based version couldn’t. The decision to use a pentatonic scale rather than mapping the full chromatic range was the single biggest improvement to how the instrument sounds; removing the dissonant notes made even accidental hand movements musical. The Theremin inspiration also shaped how the wiring was approached — describing the concept to Gemini and using its wiring instructions as a reference helped bridge the gap between the interaction idea and the physical circuit. If we were to extend this, we could add a second switch to select between different scales, a visual indicator showing the active mode, and potentially an LCD display to show the current tempo and note being played.

7. Resources

Week 10 — Reading Response

Victor’s “Pictures Under Glass” critique is most compelling not as a takedown of touchscreens specifically, but as a diagnosis of a failure of imagination — the industry confused accessible with good, and then mistook good for visionary. The tool definition he anchors everything to (amplifying human capabilities, not just addressing needs) is doing a lot of work, and it mostly earns it. The sandwich analogy is a little theatrical, but the underlying point lands: we have genuinely extraordinary hands, and sliding a finger on flat glass uses almost none of what makes them extraordinary. The shoelace example is sharper — close your eyes and you can still do it, which is exactly what good tool design should exploit, not eliminate.

What’s interesting is that Victor isn’t really against touchscreens. He’s against treating them as a destination rather than a waypoint. The Kodak analogy in the responses piece clarifies this well: color film wasn’t inevitable, it required people deciding that black-and-white was missing something and doing the research. His concern is that without the rant — without the explicit articulation that something is missing — nobody funds the research. That’s a reasonable fear, and it gives the piece a purpose beyond the polemic.

The responses page is where the argument gets more nuanced and also more interesting. The voice section is the strongest part: his distinction between oracle tasks (ask and receive) and understanding tasks (explore and manipulate) cuts to something real about how knowledge works. You can’t skim a possibility space with your voice. You can’t point at a region of a graph with a command. That’s not a limitation of NLP, it’s a limitation of the modality for that kind of cognition. The explorable explanations he gestures at are a better argument for his position than anything in the rant itself.

The weakest section is the brain interfaces response, which slides from a reasonable point about body-computer mismatch into a somewhat alarmist vision of immobile humans that feels grafted on. And the “my child uses the iPad” rebuttal — channeling all interaction through a single finger is like restricting all literature to Dr. Seuss — is rhetorically satisfying but probably too cute. Accessibility and expressive richness aren’t actually as opposed as the analogy implies; the question is whether you optimize one at the permanent expense of the other. Victor would say yes, that’s exactly what we’re doing. That’s the argument worth sitting with, and it’s one the responses page never quite resolves — which, to his credit, he seems to know.

Week 9 – Reading Response

Tigoe’s “Greatest Hits” taxonomy is useful precisely because it reframes recurring project themes not as signs of creative exhaustion, but as flexible starting points. The most interesting tension in the piece is between projects that offer structured interaction — drum gloves, floor pads, tilty controllers — and those that are essentially aesthetic spectacles you wave at, like video mirrors and mechanical pixels. Tigoe is honest about this distinction without being dismissive, and that balance feels right. The example that stuck with me most is the Digital Wheel Art project, designed for a patient with limited mobility — it reframes a “body-as-cursor” trope into something genuinely purposeful, which suggests that what makes a recurring theme worth revisiting is often the specificity of the context, not novelty for its own sake. The piece raises a question though: if these themes keep recurring, is that because they represent something deeply natural about how humans want to interact with physical systems, or just because they’re the easiest things to build with the available tools?
“Set the Stage, Then Shut Up and Listen” makes a compelling case that interactive art is fundamentally a conversation, and that over-explaining your work is a way of refusing to have that conversation. The director analogy is apt — you can set conditions for an authentic response, but you can’t prescribe the response itself. That said, Tigoe’s argument assumes a fairly confident, well-resourced designer. In practice, people often over-explain their work not out of arrogance but out of anxiety that the interaction won’t be legible at all — a real concern, especially for newer designers. There’s a tension between “shut up and let the audience interpret” and “design clearly enough that people know how to engage.” The “Greatest Hits” piece is actually a useful companion here: it shows that certain gestures — tapping, tilting, waving — are already culturally legible, which makes it easier to step back. The question both readings leave open is how much context is enough, and how you know when you’ve crossed from helpful framing into over-interpretation.

Week 9 — Input Output

1. Repository

Repository

2. Overview

This project is an interactive color-matching game. A random RGB color is displayed on a target LED. The player uses a potentiometer to dial in their best guess across three channels — red, green, then blue — cycling between them with a button. A second RGB LED shows the guess in real time. When the player is satisfied, they press a lock button to submit their answer. The game scores the result based on how closely the guess matches the target across all three channels combined.

3. Concept

The idea came from thinking about how humans perceive color — we rarely see RGB values, we just see a color and react to it intuitively. The game puts that intuition to the test. By isolating each channel and forcing the player to set them one at a time, it makes visible a process that normally happens instantly in our brains. The constraint of the potentiometer (a single knob controlling one value at a time) creates a deliberate, almost meditative interaction. It’s also just a fun game to play, matching colors without needing to understand what’s happening electronically.

4. Process and Methods
    • The potentiometer is read every loop using analogRead, which returns a value from 0 to 1023. This is mapped to 0–255 using map() to match the range that analogWrite expects for PWM output.
    • A state machine (currentState) tracks which channel is active. Only the active channel updates when the pot is turned — the other two hold their last value. This lets the player build up the color incrementally without losing previous adjustments; a sketch of this state machine follows the list.
    • The Next button increments the state from Red to Green to Blue to Finished. If the player keeps pressing beyond Finished, the state wraps back to Red so they can refine any channel without restarting the round.
    • The Lock button only accepts input when currentState >= 3, preventing the player from accidentally locking in an incomplete guess.
    • randomSeed(analogRead(A5)) seeds the random number generator using electrical noise from an unconnected pin, ensuring a different target color every round.
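
A sketch of the channel-cycling state machine described above (constants and helper names are illustrative, not the actual code):
// States 0-2 select the active channel; 3 means "Finished".
const int RED = 0, GREEN = 1, BLUE = 2, FINISHED = 3;
int currentState = RED;
int guessR = 0, guessG = 0, guessB = 0;

void handleNextButton() {
  currentState++;
  if (currentState > FINISHED) currentState = RED;   // wrap so any channel can be refined
}

void updateActiveChannel(int potValue) {             // potValue already mapped to 0-255
  if      (currentState == RED)   guessR = potValue;
  else if (currentState == GREEN) guessG = potValue;
  else if (currentState == BLUE)  guessB = potValue;
  // In FINISHED, pot movement is ignored until Next wraps back to RED
}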
5. Technical Details
    • Both buttons use INPUT_PULLUP mode — the pin is held HIGH internally and reads LOW when pressed. No external resistors are needed.
// Pin reads HIGH normally, LOW when pressed
pinMode(nextBtn, INPUT_PULLUP);
pinMode(lockBtn, INPUT_PULLUP);
    • The guess LED uses analogWrite (PWM) on pins 9, 10, 11 — this is the analog output requirement. The target LED uses the same approach on pins 5, 6, 7.
// analogWrite sends a PWM signal (0 = off, 255 = full brightness)
analogWrite(redG,   guessR);
analogWrite(greenG, guessG);
analogWrite(blueG,  guessB);
    • Scoring uses the sum of absolute differences across all three channels rather than a per-channel check. This gives a single number (0–765) representing total color distance, which maps naturally to a three-tier result.
// Total difference: sum of how far off each channel is (max possible = 765)
int diff = abs(targetR - guessR) + abs(targetG - guessG) + abs(targetB - guessB);
    • A simple delay(250) after each button press acts as debounce, preventing a single physical press from registering multiple times.
6. Reflection

This project was the most game-like thing I’ve built with the Arduino so far and it pushed me to think about user experience as much as electronics. The biggest design challenge was the state machine — figuring out how to let the player cycle back through channels to refine their guess without losing their previous values took a few iterations. I also underestimated how important the real-time feedback from the guess LED would be; watching the color change as you turn the pot makes the whole interaction feel alive in a way that a number on a serial monitor never could. If I were to extend this, I would add a scoring history across multiple rounds and a visual indicator on the LED itself to show which channel is currently active (perhaps pulsing the active color) rather than relying entirely on the Serial Monitor for guidance.

7. Resources

Week 8 — Reading Response

Norman’s central argument (that positive affect broadens thinking and makes people more tolerant of minor design flaws, while negative affect narrows it) is intuitive and reasonably well-supported by the psychological research he cites. The distinction between depth-first and breadth-first thinking is genuinely useful, and it reframes stress not as uniformly bad but as a cognitive mode that suits certain situations. That said, Norman is not entirely without bias. The essay reads partly as a self-correction, an attempt to walk back the “usability over beauty” reputation he’d earned from The Design of Everyday Things, and his anecdotal framing around his teapot collection, while charming, does a lot of heavy lifting for what is supposed to be a scientific argument. More importantly, he treats aesthetic pleasure as relatively universal without interrogating whose standards of beauty are being applied — a significant gap when design operates across cultures and contexts.

McMillan’s portrait of Hamilton is compelling and the Apollo 8 anecdote is particularly striking: Hamilton identifies a realistic failure scenario, NASA dismisses it as impossible, and then it happens on a live mission anyway. It’s a clean illustration of institutional overconfidence and the cost of ignoring people closest to the problem. However, the piece leans hagiographic — Hamilton is framed almost as a lone genius, while the broader collaborative effort, including the Raytheon seamstresses who physically hardwired the memory and engineers like Don Eyles, gets quietly sidelined. The bias is an understandable corrective given how long Hamilton’s contributions went unrecognized, but it’s worth naming. What the two readings share is an underlying tension about whose judgment gets trusted within institutions, and both suggest that the people with the clearest view are often the ones being overruled.

Read together, both texts circle the same problem: institutions tend to trust hierarchy over proximity. Norman’s supervisors dismissed color displays as scientifically valueless; NASA dismissed Hamilton’s error-checking as unnecessary. In both cases, the person closest to the work had the clearest picture. Norman resolves this through neuroscience — affect shapes judgment in ways we don’t always consciously recognize. McMillan resolves it through biography — one person’s persistence against institutional resistance changed the course of history. The question both readings leave open is how to build systems, whether design processes or space programs, that actually listen to the people who see the problems first.

Week 8 — Unusual Switch

1. Repository

Repository

2. Overview

For this assignment, I built a hands-free switch that uses the human body as a conductor. The project uses two aluminum foil patches — one taped to the floor, one to the sole of a shoe — to open and close a circuit through the act of jumping. An Arduino reads the state of the switch using digitalRead and communicates over USB serial to a Python script running on a laptop, which plays and stops a song based on the jumping rhythm. The music only plays while jumps are happening in close sequence, stopping if the user pauses for more than 1.5 seconds.

3. Concept

My goal was to turn the whole body into an instrument of interaction, removing the hands entirely and replacing them with something more physical and committed. The initial inspiration came from an unexpected source: my baby nephew, who kept interrupting my work sessions. Rather than fighting the disruption, I decided to include him in the project. Babies and toddlers are defined by their movement — bouncing, jumping, stomping — so building a switch that is activated by jumping felt like a natural way to design around him and his boundless enthusiasm for dancing. The idea became about handing control to someone who couldn’t operate a traditional switch at all.
The music-as-reward mechanic adds another layer that felt personally fitting: the music only survives as long as the jumping continues, stopping if the user pauses for more than 1.5 seconds. For a restless toddler this creates an instinctive feedback loop between movement and sound. Stop jumping and the music dies. Keep going and it lives.

4. Process and Methods
    • I began with the physical circuit, using the INPUT_PULLUP mode on Pin 2 to avoid needing an external resistor. This keeps the pin held HIGH when the circuit is open, and pulls it LOW when the foil patches make contact through the body.
    • To filter out noise and false triggers — particularly the pin floating when only one foil patch was touched — I implemented a two-stage debounce. The code waits 50ms after detecting a change, then re-reads the pin to confirm the state change was real before acting on it.
    • On the software side, I wrote a Python script that listens on the serial port for LANDED and JUMP messages. Rather than toggling the music on a single event, I used a timeout system: every landing resets a timer, and the music stops only when that timer expires without a new landing.
    • The serial communication bridge between the Arduino and the laptop uses the pyserial library, and audio playback uses Mac’s built-in afplay command via Python’s subprocess module, keeping dependencies minimal.
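
A sketch of those playback helpers (the filename is a placeholder):
import subprocess

music_process = None
playing = False

def start_music():
    global music_process, playing
    if not playing:
        # afplay is macOS's built-in command-line audio player; Popen keeps
        # it running in the background so the serial loop is never blocked
        music_process = subprocess.Popen(["afplay", "song.mp3"])
        playing = True

def stop_music():
    global music_process, playing
    if playing and music_process is not None:
        music_process.terminate()   # cut playback immediately
    playing = False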
5. Technical Details
    • The core of the switch detection relies on INPUT_PULLUP, which activates an internal ~50kΩ resistor inside the Arduino chip. This pulls Pin 2 to 5V (HIGH) by default. When the foil patches make contact, they create a conductive path through the human body to GND, pulling the pin LOW.
pinMode(jumpPin, INPUT_PULLUP);
    • The debounce reads the pin twice, separated by a 50ms delay, to confirm that a state change is genuine and not electrical noise or foil flutter on contact:
// Only act if the state has changed since the last loop
if (currentSwitchState != lastSwitchState) {

  // Wait 50ms before reading again (debounce) — without it, one landing could trigger dozens of false readings.
  delay(50);
  currentSwitchState = digitalRead(jumpPin);

  // Re-check that the state still differs after the debounce delay.
  // If the reading went back to the original state, it was just noise — ignore it.
  if (currentSwitchState != lastSwitchState) {

    if (currentSwitchState == LOW) {
      // Foil patches are touching — the circuit is complete — user has landed
      digitalWrite(ledPin, HIGH);       // turn LED on as visual confirmation
      Serial.println("LANDED");        // notify the Python script to start music
    } else {
      // Foil patches are separated — the circuit is open — user is in the air
      digitalWrite(ledPin, LOW);        // turn LED off
      Serial.println("JUMP");          // notify the Python script
    }

    // Save the current state so we can detect the next change
    lastSwitchState = currentSwitchState;
  }
}
    • The Python timeout logic uses time.time() to track the moment of the last landing. The music continues as long as landings arrive within the threshold window:
    if line == "LANDED":
        # Foil patches made contact — reset the timer and start music
        last_landing = time.time()
        start_music()

# If music is playing but no landing has come in within TIMEOUT seconds, stop
if playing and (time.time() - last_landing > TIMEOUT):
    stop_music()

 

6. Reflection

This project pushed me to think about interaction in a much more physical way than usual. The main technical challenge was dealing with pin floating — when only one foil patch was touched, the unconnected pin would pick up ambient electrical noise from the environment and the body, triggering false readings. Understanding why INPUT_PULLUP solves this (by giving the pin a definite default state rather than leaving it undefined) was the key insight. The two-stage debounce was also a lesson in how unreliable physical contacts can be at the moment of touch — foil crinkles and makes intermittent contact before settling, and reading the pin twice filtered that out cleanly. Physically, the strain relief on the wires (taping them to the leg with slack at the ankle) turned out to be just as important as the electronics, since a pulled wire mid-jump would break the circuit unpredictably.

One thing I should have considered more carefully from the start was my nephew’s impatience. The whole project was inspired by him, but toddlers don’t wait; if the music doesn’t start immediately or cuts out too quickly between jumps, the experience breaks down fast. In hindsight, I would have tuned the TIMEOUT value with him in mind from the beginning, giving a little more grace time between landings to account for the unpredictable rhythm of a small child jumping rather than calibrating it around an adult’s pace.

7. Resources

Midterm Project – The Grove

1. Sketch and Code

2. Concept

I wanted to make something that felt more physical than most browser games. The idea was simple: instead of clicking a button and having a resource appear, you actually go and get it. You walk to the river to fill a bucket. You dig in the forest and carry the clay back. You hold your mouse on a pottery wheel until the shape changes. The whole game is built around making you move between spaces and handle things directly, rather than managing numbers in a menu.

The game has five locations — a world map, a river, a forest, a pottery studio, and a greenhouse — each with its own interaction logic and its own music track. You start with five seeds and no other resources, and the loop is: collect clay and water, make a pot in the studio, bring it to the greenhouse with soil and a seed, and wait for the plant to grow. The cursor changes depending on where you are and what you’re carrying, so you can always tell what you’re holding without opening an inventory screen. The visual style came from wanting it to feel lo-fi and cozy, loosely inspired by games like Stardew Valley but much smaller in scope.

The world map – each region is a hand-mapped pixel boundary
3. How it Works

The entire game runs on two parallel state variables stacked on top of each other. gameState controls the meta-level — which screen the player is on (title, instructions, gameplay, or pause). currentLayer controls the world-level — which physical location the player is standing in. Every frame, the draw() loop reads both and routes rendering and input accordingly. This separation means that pausing the game, for instance, simply renders the pause menu on top of an already-drawn scene without tearing anything down. A new layer can be added to the game without touching any existing screen logic.
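
Reduced to its skeleton, the routing looks something like this (the exact state strings and screen-drawing function names are illustrative):
function draw() {
  updateGlobalPlants();                 // world simulation runs no matter what

  if (gameState === "TITLE")        { drawTitleScreen();  return; }
  if (gameState === "INSTRUCTIONS") { drawInstructions(); return; }

  drawLayer(currentLayer);              // MAP, RIVER, FOREST, STUDIO, or GREENHOUSE

  if (gameState === "PAUSED") drawPauseMenu();   // overlaid; the scene stays intact
}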

Navigation between scenes is handled by a rectClick() helper that checks whether the mouse landed inside a manually defined pixel rectangle. The world map coordinates were discovered by logging mouseX and mouseY to the console while clicking over the background image — a reliable form of coordinate mapping. Two rectangles per scene allow irregular regions of the map to be approximated without any polygon math.

/*
 * Detects which map region was clicked and navigates to that layer.
 * Regions are defined as bounding rectangles over the map background art.
 */
function checkMapClick() {
    if (rectClick(0, 190, 260, 470) || rectClick(240, 330, 380, 430)) {
        currentLayer = "STUDIO";
    } else if (rectClick(240, 200, 500, 260) || rectClick(300, 260, 510, 360)) {
        currentLayer = "GREENHOUSE";
    } else if (rectClick(260, 110, 780, 200) || rectClick(520, 200, 780, 290)) {
        currentLayer = "FOREST";
    } else if (rectClick(525, 365, 840, 450) || rectClick(790, 215, 1025, 450)) {
        currentLayer = "RIVER";
    }
}

Plants must keep living regardless of which scene the player is viewing. They are stored in a global activePlants array and updated on every frame via updateGlobalPlants(), called unconditionally at the top of draw(). This means a seedling keeps aging while the player is away collecting water at the river. Growth is tracked using millis() rather than frameCount, making it completely frame-rate independent.

// Called every frame; promotes the stage when enough time has passed. 
update() {
    let age = millis() - this.birthTime;
    if (age > this.growthDuration && this.stage < 2) {
        this.stage++;
        this.birthTime = millis();          // Reset timer for the next stage

        // Play the "fully grown" sound once
        if (this.stage === 2 && !this.hasPlayedGrowthSfx) {
            sfxGrowing.play();
            this.hasPlayedGrowthSfx = true;
        }
    }
}
4. Technical Decisions
The Pottery Wheel — Hold-to-Craft

The most deliberate design decision in the project was rejecting an instant “Make Pot” button in favor of a hold-to-craft interaction. The pottery wheel tracks how long the player’s mouse has been in contact with it and advances a shapingFrame counter every five seconds, visually pulling the clay through four distinct silhouettes. During contact, a looping wheel sound plays and the pot sprite is mirrored horizontally on alternating frames to suggest rotation. Release the mouse and the sound cuts immediately — the wheel stops the moment you lift your hand. The entire sequence takes fifteen seconds of sustained attention, which is long enough to feel like real effort and short enough not to become tedious.

// ── Pottery Wheel ──
if (wheelState !== 'EMPTY' && !isDraggingFromWheel) {
    let isTouching = mouseIsPressed && dist(mouseX, mouseY, wheelX, wheelY) < 70;

    if (wheelState === 'SHAPING') {
        if (isTouching) {
            // Keep wheel sound looping while the player holds the wheel
            if (!sfxWheel.isPlaying()) sfxWheel.loop();

            // Advance the pot shape frame every 5 seconds of contact
            if (millis() - shapingTimer > 5000) {
                shapingFrame = min(shapingFrame + 1, 3);
                shapingTimer = millis();
            }
        } else {
            sfxWheel.stop(); // Stop sound when mouse is lifted
        }

        // Once fully shaped, transition to draggable state
        if (shapingFrame === 3) {
            wheelState = 'READY_TO_DRAG';
            sfxWheel.stop();
        }
    }

    // Draw the pot on the wheel, mirroring every 10 frames to suggest spinning
    push();
    imageMode(CENTER);
    translate(wheelX, wheelY);
    if (wheelState === 'SHAPING' && isTouching && frameCount % 20 < 10) scale(-1, 1);
    drawPotFrame(0, 0, shapingFrame, 200, 200);
    pop();
}
The Furnace — Time as Stakes

Once a shaped pot is dragged into the furnace, a four-phase timer begins. The pot fires for ten seconds, then there is a five-second window to retrieve a perfect pot, then a five-second grace period where the pot is visibly burnt but still removable (though broken), before it crumbles to ash entirely. This makes the act of pot-making carry real risk: leave the studio to collect other resources and you may return to nothing. The time-management tension it creates between the furnace and the wider world loop was a late addition to the design, but it became one of the most important decisions in the whole game — it’s what makes the studio feel dangerous rather than merely mechanical.

// ── Furnace ──
if (furnaceState !== 'EMPTY' && !isDraggingFromFurnace) {
    let elapsed = (millis() - furnaceStartTime) / 1000; // Seconds since firing started

    if (elapsed < 10) {
        furnacePotFrame = 3;
        furnaceState = 'FIRING';
        if (!sfxFurnace.isPlaying()) sfxFurnace.loop();
    } else if (elapsed < 15) {
        furnacePotFrame = 4;
        furnaceState = 'READY_TO_DRAG'; // Pot is done — player can pick it up
        sfxFurnace.stop();
    } else if (elapsed < 20) {
        furnacePotFrame = 5;
        furnaceState = 'BURNT'; // Left too long — pot is cracked
    } else {
        furnacePotFrame = 6;
        furnaceState = 'ASH';  // Completely destroyed
        sfxFurnace.stop();
    }

    imageMode(CENTER);
    drawPotFrame(205, 237, furnacePotFrame, 70, 70);
}
The Cursor as a Physical Inventory

Rather than displaying abstract resource counts in a HUD panel, physical resources are communicated directly through the cursor. In the forest, the shovel sprite changes to show clay or soil clinging to the blade the moment something is dug up. At the river, the bucket visually fills. Resources are deposited by carrying them to the backpack icon in the corner — the act of storing something is the same gesture as moving it there.

Bucket cursor fills visually after clicking the river surface
Cursor becomes a clay-caked shovel after digging a deposit
5. Challenges
Double-Firing Buttons

The most persistent bug in the project was button clicks firing twice from a single physical interaction. p5.js triggers both mousePressed and mouseClicked in sequence for the same click event, and because several buttons triggered state changes or inventory mutations, the same action would execute twice — opening and immediately closing the inventory, or incrementing a counter twice in one tap. The fix was a lastMenuClickTime debounce guard: every button action stamps the current timestamp, and any input arriving within 250 milliseconds of that stamp is silently discarded. Setting mouseIsPressed = false inside the button handler also “eats” the event before any downstream listener can see it.

// Fire the action on click, preventing double-firing with a debounce timestamp
if (hover && mouseIsPressed) {
    sfxButton.play();
    lastMenuClickTime = millis();
    mouseIsPressed = false; // Consume the press so nothing else reacts to it
    action();
}
The Cursor Bleeding Over UI Buttons

A subtler issue emerged from the custom cursor system: the shovel and bucket sprites would remain active when hovering over the “Return to Map” and “Menu” buttons in the forest and river scenes. This made the buttons feel broken — the system’s hand cursor never appeared, and the sprite image obscured the button labels. The fix required duplicating the button bounding-box logic inside drawCustomCursor() and explicitly reverting to cursor(ARROW) whenever the mouse entered a UI button’s region. It’s not the most elegant solution, since the same coordinates appear in two places, but it is simple, clear, and reliable.
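
In sketch form (the button rectangles are illustrative stand-ins for the duplicated coordinates):
function drawCustomCursor() {
  // Same bounding boxes as the scene's own button hit-tests, duplicated here
  let overBackBtn = mouseX > 20  && mouseX < 140  && mouseY > 20 && mouseY < 60;
  let overMenuBtn = mouseX > 900 && mouseX < 1004 && mouseY > 20 && mouseY < 60;

  if (overBackBtn || overMenuBtn) {
    cursor(ARROW);                      // hand control back to the system cursor
    return;                             // and skip drawing the tool sprite
  }

  noCursor();
  image(toolSprite, mouseX, mouseY);    // toolSprite: the current shovel/bucket image
}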

6. Areas for Improvement

The most obvious missing layer of feedback is what happens when a planting action fails. If the player clicks a greenhouse slot without the right resources, nothing happens. A brief wobble on the backpack icon or a soft error tone would communicate the missing ingredient without interrupting the lo-fi calm. The furnace has the same problem: because there is no visible countdown, the “BURNT” outcome surprises players on a first run through the studio. A subtle color shift on the furnace door as elapsed time crosses into the danger zone would be enough to telegraph urgency without resorting to a numerical timer on screen.

Structurally, the game currently has no win condition or narrative arc beyond the resource loop itself. A concrete goal — growing five plants to full harvest, for instance — would give the loop a sense of closure and make the opening seeds feel like the start of something rather than an arbitrary starting point. Beyond that, the pottery wheel’s hold-to-craft timer could become adaptive: longer contact for a more durable pot, shorter contact for a fragile one that breaks after a single use. That single change would introduce meaningful trade-offs to what is currently a single fixed path through the studio, without adding any new systems.

On the technical side, every scene coordinate in the codebase is a hard-coded pixel value sniffed by hand from a 1024×576 canvas. If the canvas size ever changes, every boundary needs to be remapped manually. Normalizing all coordinates to proportions of width and height and then multiplying at render time would make every scene scale to any canvas size automatically — a straightforward refactor that would future-proof the entire coordinate system.
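
A sketch of that refactor (the helper names are illustrative):
// Express coordinates as fractions of the canvas and resolve at render time.
function px(nx) { return nx * width;  }   // normalized x -> pixels
function py(ny) { return ny * height; }   // normalized y -> pixels

// The first studio rectangle from checkMapClick(), rewritten proportionally
// (each value divided by the 1024x576 design size):
if (rectClick(px(0), py(0.33), px(0.254), py(0.816))) {
    currentLayer = "STUDIO";
}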

7. Resources
Inspiration

Some of the interaction design was also influenced by this p5.js sketch, which was linked as an example and which I came across while exploring what direct, hands-on interaction could look like inside a browser canvas.

Libraries
    • p5.js (v1.11.11) — core rendering, input handling, and sprite-sheet animation via image().
    • p5.Sound — loadSound(), loop(), setVolume(), and isPlaying() for all BGM cross-fades and per-action sound effects.
Visual Assets
    • All backgrounds, sprites, and sprite sheets were generated using Google Gemini and subsequently edited by hand — cropped, trimmed to transparency, and sliced into equal-width frames for use with p5’s source-rectangle API.
Audio
    • All audio is managed through p5.Sound. BGM transitions are handled by manageBGM(), which compares a targetBGM reference against currentBGM each frame and only swaps when the target has changed — preventing the track from restarting on every draw call.
    • Background Music — each location in the game has its own assigned instrumental track, chosen to match the mood of that space:
      • Main Menu — "Supernatural" by NewJeans
      • Instructions — "ASAP" by NewJeans
      • Map — "The Chase" by Hearts2Hearts
      • River — "Butterflies" by Hearts2Hearts
      • Forest — "Ditto" by NewJeans
      • Studio — "Right Now" by NewJeans
      • Greenhouse — "OMG" by NewJeans
      • Pause Menu — "Midnight Fiction" by ILLIT
    • Sound Effects — all SFX (wheel spinning, bucket fill, shovel dig, furnace fire, etc.) were sourced from Pixabay and other royalty-free libraries.