Week 12 – Final Project Documentation

For my final project, I’m creating a physically interactive memory-sequence game centered around the metaphor of “recovering a fragmented memory.” The game uses four large LED pushbuttons wired to an Arduino, each with a built-in LED that flashes as part of a color sequence. The player must watch and memorize the flashing sequence and then repeat it correctly by pressing the matching buttons in order. With each successfully completed level, a blurry or pixelated image on the screen becomes clearer, symbolizing memory restoration. If the user gets the sequence wrong, the image distorts or glitches, as if the memory is slipping away. Only after completing all levels does the fully restored image appear.

The Arduino handles all sensing and feedback related to physical input: it detects button presses using INPUT_PULLUP, flashes the LEDs during each round (based on input from P5), and sends messages to P5 whenever the player presses a button. Each button press is communicated over serial with a simple string like “BUTTON:0”, “BUTTON:1”, etc. P5 receives these signals, checks them against the correct sequence, and determines whether to progress the game, update the image clarity, or apply a glitch effect. On the flip side, P5 sends commands to Arduino to flash specific LEDs by sending numbers (0-3) over serial that correspond to the button LEDs.
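
A minimal sketch of this Arduino side, using the serial protocol described above. The pin numbers are placeholders for whatever I end up wiring (buttons on pins 2–5 with INPUT_PULLUP, their built-in LEDs on pins 8–11):

```cpp
// Hypothetical pin mapping: buttons on 2-5 (INPUT_PULLUP, pressed = LOW),
// each button's built-in LED on 8-11.
const int BUTTON_PINS[4] = {2, 3, 4, 5};
const int LED_PINS[4]    = {8, 9, 10, 11};
int lastState[4] = {HIGH, HIGH, HIGH, HIGH};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) {
    pinMode(BUTTON_PINS[i], INPUT_PULLUP);
    pinMode(LED_PINS[i], OUTPUT);
  }
}

void loop() {
  // Report each new press to p5 as "BUTTON:0" ... "BUTTON:3".
  for (int i = 0; i < 4; i++) {
    int state = digitalRead(BUTTON_PINS[i]);
    if (state == LOW && lastState[i] == HIGH) {
      Serial.print("BUTTON:");
      Serial.println(i);
      delay(20);  // crude debounce
    }
    lastState[i] = state;
  }

  // When p5 sends a digit 0-3, flash the matching button LED for one
  // step of the sequence playback.
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c >= '0' && c <= '3') {
      int led = c - '0';
      digitalWrite(LED_PINS[led], HIGH);
      delay(300);
      digitalWrite(LED_PINS[led], LOW);
    }
  }
}
```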

On the P5 side, the sketch manages all game logic, sequence generation, visual feedback, and memory visualization. It starts with a low-resolution or blurred image and gradually resolves the image as the user completes levels. The sketch also gives instructions to the user and visual cues about success or failure. This layered system allows for a compelling interaction that blends precise physical input with expressive visual output.

I’ve successfully soldered wires to one of the large LED pushbuttons and tested it using the Arduino with the internal pull-up setup. The button press registers correctly, and the built-in LED lights up when triggered from code. This confirms that the wiring and logic are working as intended.

Next, I’ll repeat the soldering and wiring process for the remaining three buttons, ensuring each is connected to a unique input and output pin. I’ve also laser-cut the top panel of the box, which has four holes precisely sized to mount the pushbuttons. This will keep the layout organized and user-friendly for gameplay. Once all buttons are mounted and connected, I’ll move on to integrating all four into the Arduino code and begin syncing with the visual side in p5.js.

Laser Cutting Video:


Week 12 – Final Project Proposal

My midterm, Pitchy Bird, was a voice-controlled version of the classic Flappy Bird where the user’s vocal pitch determined the bird’s flight height. People leaned into microphones, experimenting with their vocal range to navigate obstacles. It was a fun experience. But for my final project, I want to take what I learned and push it into physical computing and psychological testing.

I’m currently prototyping Flappy IAT, an endless runner controlled by an Arduino-based accelerometer that gamifies the Implicit Association Test (IAT).

The Concept

Instead of voice, players use a custom handheld controller (powered by an MMA8452Q accelerometer or a joystick). The mechanics require high-speed multitasking to bypass conscious cognitive filtering:

  1. Navigation (“Bump”): Players must physically “bump” the controller upward to flap the bird’s wings and maintain altitude against gravity.
  2. Categorization (“Tilt”): Every pipe obstacle displays a text stimulus (e.g., words like “Happy,” “Sad,” or category pairs like “Female Doctor”). To pass through the pipe safely, the player must tilt the controller Left (Green/Good) or Right (Red/Bad) to correctly categorize the word.

If it works, this game will be as much an active research tool as a survival game. By forcing players to make categorization decisions under the pressure of keeping the bird afloat, the system exposes unconscious biases. The game tracks specific metrics via serial communication to p5.js:

  • Reaction Time (RT): The milliseconds between seeing the word and tilting the device.
  • Response Intensity: Measuring the angular velocity and degree of the tilt to detect hesitation or certainty.
  • Cognitive Load: Increasing difficulty across three levels to force players into a “flow state” where implicit biases are harder to suppress.

Technical Implementation

The project utilizes a bidirectional data flow. The Arduino handles raw gesture detection (calibrating “rest” position to detect relative Bumps and Tilts), while p5.js manages the game state, visual feedback (green checks for correct tilts, red flashes for errors), and data logging. At the end of a session, the game exports a CSV file detailing the player’s performance and implicit bias metrics.
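
A rough sketch of the Arduino half of that flow, assuming SparkFun’s MMA8452Q Arduino library, an assumed mounting orientation, and thresholds that would need tuning on the real controller:

```cpp
#include <Wire.h>
#include <SparkFun_MMA8452Q.h>  // assumes SparkFun's MMA8452Q Arduino library

MMA8452Q accel;
float restZ = 0;                     // calibrated "rest" vertical reading, in g

const float BUMP_THRESHOLD = 1.5;    // extra g above rest that counts as a flap (needs tuning)
const float TILT_THRESHOLD = 0.35;   // sideways g that counts as a deliberate tilt (needs tuning)

void setup() {
  Serial.begin(9600);
  Wire.begin();
  accel.begin();

  // Calibrate the rest position: average ~50 vertical readings while the
  // controller sits still.
  int n = 0;
  while (n < 50) {
    if (accel.available()) {
      restZ += accel.getCalculatedZ();
      n++;
    }
  }
  restZ /= 50.0;
}

void loop() {
  if (!accel.available()) return;

  float x = accel.getCalculatedX();  // tilt axis (depends on how the board is mounted)
  float z = accel.getCalculatedZ();  // vertical axis

  if (z - restZ > BUMP_THRESHOLD) {
    Serial.println("BUMP");          // p5 flaps the bird
  } else if (x < -TILT_THRESHOLD) {
    Serial.println("TILT:LEFT");     // Green/Good category
  } else if (x > TILT_THRESHOLD) {
    Serial.println("TILT:RIGHT");    // Red/Bad category
  }
  delay(20);                         // ~50 Hz; p5 timestamps arrivals for reaction time
}
```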

Week 12 – Final Project Proposal

Finalized concept

For my final project, I’m building something that sits somewhere between a desk object, a quiet companion, and a tiny archivist of the room it lives in. The idea came from a simple thought: most days pass without leaving much behind, and by the time I get to the end of the night, the whole thing feels like a blur. I’m not someone who journals daily, but I like the idea of having some kind of trace of the day even if it’s abstract, incomplete, or not linguistic at all.

So the artefact is basically a small desk object that listens to the atmosphere of the space throughout the day, and later turns those ambient shifts into a soft, formless visual cloud in p5.js. It’s not interested in what the user did, only in how the day felt. It just records the vibe of a day, no eavesdropping or surveillance (and definitely no productivity tracking). I want the final thing to feel almost poetic, like the object is quietly paying attention on the side while I’m working, and at the end of the day it shows me its version of the memory.

What the Arduino will do

The Arduino will handle all the sensor work during the day. I’m using:

  • a photoresistor to capture light changes,
  • an ultrasonic sensor to sense presence/absence near the desk,
  • a piezo to detect general sound/vibration spikes.

Arduino will collect these readings over time and send them to p5.js through serial. I’m keeping the Arduino’s job simple: sense → store → transmit.
I’ll also have a small physical trigger (most likely a button or dial) that the user presses at the end of the day to “reveal” the visual memory.
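
A minimal version of that sense → store → transmit loop might look like the sketch below. All pins and the sampling interval are assumptions to be tuned (photoresistor divider on A0, piezo on A1, HC-SR04 trig/echo on 9/10, reveal button on pin 2):

```cpp
const int LDR_PIN    = A0;
const int PIEZO_PIN  = A1;
const int TRIG_PIN   = 9;
const int ECHO_PIN   = 10;
const int BUTTON_PIN = 2;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

long readDistanceCm() {
  // Standard HC-SR04 ping: 10 us trigger pulse, then time the echo.
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout -> 0 (no reading)
  return duration / 58;                            // microseconds to centimeters
}

void loop() {
  // One sample every 2 s: light, presence, sound, plus the reveal flag.
  int light     = analogRead(LDR_PIN);
  long distance = readDistanceCm();
  int sound     = analogRead(PIEZO_PIN);
  int reveal    = (digitalRead(BUTTON_PIN) == LOW) ? 1 : 0;

  // Comma-separated line that p5 can split() on.
  Serial.print(light);    Serial.print(",");
  Serial.print(distance); Serial.print(",");
  Serial.print(sound);    Serial.print(",");
  Serial.println(reveal);

  delay(2000);
}
```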

What p5.js will do

p5.js will take the day’s data and transform it into an atmospheric, slow-moving cloud. I’m aiming for visuals that sit in between abstract art and environmental “weather.” Light translates to color gradients, presence to density, and sound to softness or sharpness of the shape. I’m also considering a very light ml5.js layer just to classify general movement energy, so the cloud feels a bit more alive.
(Communication is mostly Arduino to p5, but later I might also send a message back to Arduino so the object can react in a small way when the memory is generated.)

Early progress

So far, I’ve been sketching a few versions of the cloud visualization to figure out what feels “alive” without being overwhelming. Physically, I’ll keep the build minimal – something that looks more like a desk artefact than a tech box.
I’m hoping the final result feels calm, personal, and a little bit poetic, not a gadget trying to do too much, but an object that simply notices what the day was like and gives it a shape.

Week 12 – Documentation on Final Project

The final project is a Mini DJ Booth, a flat cardboard controller styled like a DJ console with painted “records” and four colored buttons on top. The cardboard box houses the Arduino and wiring, while the laptop screen becomes the virtual stage using p5.js. Each physical button represents a different beat or sound layer (for example: kick, snare, bass, drums). When a user presses combinations of buttons, they can live-mix these layers while watching synchronized visuals, mimicking the feel of a compact DJ setup.

Generated with ChatGPT

I haven’t started the actual coding yet, but right now I’m more focused on the physical build, especially because this will be my first time soldering and I want to make sure I do it cleanly and safely. During our lab tour, we saw the LED push buttons, and they’re honestly perfect for this project: they’re big, colorful, satisfying to press, and will make the DJ booth feel much more realistic and professional. My plan is to use scrap cardboard from the back of the lab to construct the box itself, painting it to look like a real DJ setup with turntables and controls. Once the physical build is stable, I’ll move on to connecting the buttons to the Arduino and then writing the p5.js code to bring the whole system to life.

On the Arduino side, the program will continuously read the state of the four input buttons using `digitalRead()`. Each button is wired with a pull-down (or pull-up) resistor so the Arduino can reliably detect presses. In the main loop, the Arduino will encode the button states as a simple serial message and send it to the computer via USB using `Serial.println()`. The Arduino will not generate audio; its main role is to sense button presses quickly and transmit clean, structured data to p5.js so there is minimal latency between touch and feedback.
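
Since I haven’t started coding, this is only a rough sketch of what that loop could look like, assuming the internal pull-up variant with the four buttons on pins 2–5 (pressed = LOW):

```cpp
const int BUTTON_PINS[4] = {2, 3, 4, 5};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) {
    pinMode(BUTTON_PINS[i], INPUT_PULLUP);
  }
}

void loop() {
  // Send all four states as one line, e.g. "1,0,0,1", every 20 ms.
  for (int i = 0; i < 4; i++) {
    Serial.print(digitalRead(BUTTON_PINS[i]) == LOW ? 1 : 0);
    if (i < 3) Serial.print(",");
  }
  Serial.println();
  delay(20);  // fast enough to feel instant, slow enough not to flood serial
}
```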

Here is a schematic example that I could potentially adapt for my final project:

Source: https://www.the-diy-life.com/multiple-push-buttons-on-one-arduino-input/

In the p5.js program, the sketch will receive the serial data, log the four button values, and use them to control both sound playback and visuals. For each button, p5 will toggle the corresponding beat loop on (or optionally stop the sound, depending on the final interaction choice). I will load four different audio tracks into the p5 sketch and generate distinct color schemes and animations for each active button, for example, different color blocks, pulsing circles over each painted “record,” or waveform-style bars that move with the beat. At this stage, communication is mostly one-way (Arduino → p5), but there is room to extend it later (e.g., p5 sending messages back to the Arduino to drive each button’s LED). I just want to note that I haven’t started working on the p5 visuals yet in terms of code because I’m still planning the overall mood and ambiance I want to set.

I found royalty-free beats I could download for the audio:

Week 11~: Production (Arduino + P5)

Production #1: Potentiometer Moves Ellipse

Link to P5 Sketch #1

Production #2: P5 Controls LEDs

Link to P5 Sketch #2

I modified it to also take “A” and “D” keys as input, just like the mouse clicks.

Final Project Proposal

I was looking for a room in Baraha to get some work done. However, to check some of the rooms there, I had to manually open the doors to see if they were occupied or not. I found the whole process inefficient, since I had to apologize whenever a room turned out to be occupied. The experience was embarrassing, and I was sure there were better ways to prevent it from happening.

So I decided to build my final project based on this idea: a detection machine that lets people know if the room is occupied or not. I found further inspiration from the parking lights installed in some of the malls in Abu Dhabi. There are lights on top of a parking space, and those lights display colors depending on the availability of the space.

At first, I thought using ultrasonic sensors would work. The idea was to have a sensor installed on the door to check if someone walks across the doorway: if they do, activate the light. However, ultrasonic sensors spread in an arc, so they may cause more errors than I expect. Next, I considered using PIR sensors that detect motion and heat. This is a good approach, but since PIR detects only movement, there are issues when people inside do not move: if they sit still and work on their projects, the sensor would assume the room is empty. This defeats the whole purpose of the project.

So for my sensors, I decided to use IR beam-break sensors. A beam-break sensor watches a single line and triggers when something crosses it, so I thought it would be perfect for this project. If I have two of those sensors, I would also be able to determine whether a person is entering or exiting the room, allowing me to turn the LED on or off. It would be better to have the IR beam sensor installed at the door and a PIR motion sensor installed inside the room for a two-way detection system with fewer errors, but cost-wise, I found that it was not worth having both. I want to buy the sensors so that I can keep the project for my own use without having to return the parts later.
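
A simplified sketch of the two-beam direction logic. The wiring is assumed (outer beam on pin 2 facing the hall, inner beam on pin 3 facing the room, both active LOW; red LED on 8, green LED on 9), and a real version would need more careful handling of people lingering in the doorway:

```cpp
const int OUTER_PIN = 2;   // beam on the hallway side
const int INNER_PIN = 3;   // beam on the room side
const int RED_PIN   = 8;
const int GREEN_PIN = 9;

int occupancy = 0;

// Wait up to 3 s for a beam to break; false means the person turned back.
bool waitForBeam(int pin) {
  unsigned long t0 = millis();
  while (millis() - t0 < 3000) {
    if (digitalRead(pin) == LOW) return true;
  }
  return false;
}

void setup() {
  pinMode(OUTER_PIN, INPUT_PULLUP);  // beam-break receivers pull LOW when blocked
  pinMode(INNER_PIN, INPUT_PULLUP);
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
}

void loop() {
  // Direction = which beam breaks first.
  if (digitalRead(OUTER_PIN) == LOW) {          // outer first: someone entering
    if (waitForBeam(INNER_PIN)) occupancy++;
    delay(500);                                  // let them clear both beams
  } else if (digitalRead(INNER_PIN) == LOW) {   // inner first: someone exiting
    if (waitForBeam(OUTER_PIN) && occupancy > 0) occupancy--;
    delay(500);
  }

  digitalWrite(RED_PIN,   occupancy > 0 ? HIGH : LOW);  // occupied
  digitalWrite(GREEN_PIN, occupancy > 0 ? LOW  : HIGH); // available
}
```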

For the display, I will either order an LED panel or just use a circular LED display to show the status, following the universal convention that a red light means the room is occupied and a green light means it is available. I could get more creative and use LED panels to show art installations, which would make it look more like an interactive media project.

Week 11 – Thoughts on Final Project

For my final project, I want to build something that makes the invisible visible: specifically, the split-second biases we don’t even know we have.

My Idea

I’m designing an interactive system that captures implicit reaction bias through a combination of visual stimuli and physical sensors. There will be rapid-fire choices that measure not just if you respond, but how you respond. Do you press harder on certain choices? Do you hesitate before others? Our body knows things our conscious mind doesn’t want to admit.

How It Works

The p5.js component will display paired visual stimuli (e.g., shapes, colors, maybe even symbolic representations) that require quick binary decisions. Meanwhile, Arduino sensors capture the physical story: pressure sensors under each button measure response intensity, and an accelerometer tracks micro-movements and hesitations. I’m also considering adding a heart rate sensor to catch physiological stress during decision-making moments.
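
A rough sketch of the pressure-sensing side (the accelerometer and heart rate sensor are left out here). The wiring is assumed: one FSR under each answer button, read on A0 and A1 through voltage dividers, with a press threshold that would need calibration:

```cpp
const int LEFT_FSR  = A0;
const int RIGHT_FSR = A1;
const int PRESS_THRESHOLD = 100;  // analogRead floor that counts as a press (needs calibration)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int left  = analogRead(LEFT_FSR);
  int right = analogRead(RIGHT_FSR);

  // Report which side, how hard (response intensity), and when (ms since
  // boot); p5 pairs the timestamp with the stimulus onset to get RT.
  if (left > PRESS_THRESHOLD || right > PRESS_THRESHOLD) {
    Serial.print(left > right ? "LEFT," : "RIGHT,");
    Serial.print(max(left, right));
    Serial.print(",");
    Serial.println(millis());
    delay(150);  // skip the rest of the same press
  }
}
```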

Why It Works

The power comes from the immediate feedback. As you go through trials, p5 visualizes your patterns in real-time, in color-coded displays showing where you reacted fastest, where you pressed hardest, where you hesitated. It’s confrontational in the best way: holding up a mirror to unconscious patterns we’d rather not see.

My Hesitation

This may be too psychological and not “art”, so people may question the interactivity of the “game”, or may not even recognize it as a game.

Week 11 – Group Exercises

Group Members: Joy, Yiyang

(Arduino code is included as comments at the bottom of each p5 sketch)

1. Make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on Arduino is controlled by p5

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/63Yg60k8D
Video Demonstration: IMG_9638

2. Make something that controls the LED brightness from p5

p5 & Arduino Code: https://editor.p5js.org/yiyang/sketches/dtftbIzaK

3. Take the gravity wind example and make it so that every time the ball bounces, one LED lights up and then turns off, and you can control the wind from one analog sensor

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/v77Sd41K4
Video Demonstration: IMG_9640
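
For reference, the Arduino side of exercise 3 boils down to something like the sketch below. This is a hedged reconstruction, not the submitted code (which is commented in the linked sketch): it assumes a potentiometer on A0, an LED on pin 9, and ‘B’ as the bounce marker sent from p5.

```cpp
const int POT_PIN = A0;
const int LED_PIN = 9;

void setup() {
  Serial.begin(9600);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  Serial.println(analogRead(POT_PIN));  // wind strength for p5 (0-1023)

  if (Serial.available() > 0) {
    if (Serial.read() == 'B') {         // assumed bounce marker from p5
      digitalWrite(LED_PIN, HIGH);
      delay(100);
      digitalWrite(LED_PIN, LOW);
    }
  }
  delay(30);
}
```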

Week 11: Final Project Idea

For my final project, I plan to build an interactive installation called “Discover Nepal: Interactive Cultural Storyboard.” The goal is to create a small tabletop experience that teaches people about Nepali culture, history, geography, and fun facts in a playful, visual way. The project will use both Arduino and p5.js and will function like a tiny museum exhibit about Nepal.

Physically, I will create a flat board or “map” of Nepal with five labeled zones: Mountains, Festivals, Music & Dance, Food, and History & Fun Facts. Each zone will be connected to an Arduino input using either pushbuttons or simple touch pads made from aluminum foil. I will also add a single knob (potentiometer) that lets the user scroll through multiple facts within a category. When the user touches or presses one of the zones, the Arduino will detect which category was selected and read the knob position, then send this information as serial data to a p5.js sketch running on my laptop.
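
A minimal sketch of that Arduino side, under assumed wiring (five zone inputs on pins 2–6 with INPUT_PULLUP, knob on A0):

```cpp
// Zones: Mountains, Festivals, Music & Dance, Food, History & Fun Facts.
const int ZONE_PINS[5] = {2, 3, 4, 5, 6};

int activeZone = -1, lastSentZone = -1, lastKnob = -1;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 5; i++) {
    pinMode(ZONE_PINS[i], INPUT_PULLUP);
  }
}

void loop() {
  // Remember the most recently pressed zone.
  for (int i = 0; i < 5; i++) {
    if (digitalRead(ZONE_PINS[i]) == LOW) {
      activeZone = i;
    }
  }

  // Send "zone,knob" whenever the zone changes or the knob moves enough,
  // so p5 can both switch categories and scroll through facts.
  int knob = analogRead(A0);  // 0-1023; p5 maps this to a fact index
  if (activeZone >= 0 &&
      (activeZone != lastSentZone || abs(knob - lastKnob) > 8)) {
    Serial.print(activeZone);
    Serial.print(",");
    Serial.println(knob);
    lastSentZone = activeZone;
    lastKnob = knob;
  }
  delay(50);
}
```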

In the p5 sketch, I will draw a stylized interface representing Nepal, with icons or scenes for each category. Based on the messages from the Arduino, p5 will highlight the active category, display a short piece of text (such as a fun fact about the Himalayas, a description of Dashain or Tihar, information about Nepali instruments or foods like momo and dal bhat, or a historical event), and play a short sound that matches the theme (wind in the mountains, festival bells, a madal rhythm, etc.). Turning the physical knob will move through different facts inside that category, so the user can explore more than one piece of information in each area.

This project satisfies the course requirements because it is a physically interactive system that relies on a multimedia computer for processing and display. The Arduino “listens” to the user’s physical actions on the board (pressing zones and turning the knob), and the p5.js program “thinks” by interpreting those inputs and deciding what cultural content to show. The system “speaks” back to the user through animated graphics, text, and sound. My intention is to make the interaction feel like a fun, hands-on way to discover Nepal’s culture and history, rather than just reading a static page of information.

Final Idea

I want to create an immersive experience with space. I want users to come to an empty canvas and use pressure sensors to add planets, meteors, and stars. Once a minimum of five stars has been placed, a constellation is created, and it can grow as the user adds more stars. These objects will be floating around or soaring by (meteors), so it feels like you’re in space. I want to have a projector project the P5 screen so users can really feel the grandeur of it.

The Interaction

I will have 8 stepping stones on the floor that do different things when the user steps on them:

Star Makers (Stones 1, 4, 6):

  • Quick tap: Adds one small white star to that stone’s constellation
  • Each stone creates its own independent star collection
  • When 5 stars are placed, they automatically connect with glowing lines to form a constellation
  • Can expand up to 9 stars, making the constellation more intricate and complex
  • Three unique constellations can exist simultaneously in different regions of the sky

Planet Makers (Stones 2, 5, 7):

  • Hold for 3+ seconds: Materializes a planet in space
  • Random planet type appears: gas giants, ringed planets, rocky worlds, ice planets, or mysterious alien worlds
  • Planets drift randomly through space, floating past your constellations
  • Creates a living, breathing solar system

Meteor Makers (Stones 3, 8):

  • Quick tap: Shoots one meteor across the sky in a random direction
  • Hold for 3+ seconds: Triggers a meteor shower with 5 meteors streaking across space (see the tap-vs-hold sketch after these lists)
  • Adds dynamic movement and excitement to the scene

Atmosphere Control (Potentiometer dial):

  • Physical dial near the stepping area
  • Controls both the visual intensity and audio volume
  • Low: Clear, minimal space with silence
  • High: Rich nebula clouds, cosmic dust, and immersive ambient soundscape
  • Gives users creative control over the mood of their universe
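
Here is the tap-vs-hold sketch referenced above, for a single stone. It assumes the pressure pad reads as a digital switch on pin 2 (pressed = LOW); the 3-second threshold matches the interaction described in the lists:

```cpp
const int STONE_PIN = 2;
const unsigned long HOLD_MS = 3000;  // 3+ seconds counts as a hold

void setup() {
  Serial.begin(9600);
  pinMode(STONE_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(STONE_PIN) == LOW) {
    unsigned long pressStart = millis();

    // Watch for release, tracking how long the stone stays pressed.
    while (digitalRead(STONE_PIN) == LOW) {
      if (millis() - pressStart >= HOLD_MS) {
        Serial.println("STONE2:HOLD");             // e.g. meteor shower
        while (digitalRead(STONE_PIN) == LOW) {}   // wait out the rest of the hold
        return;                                    // loop() is called again immediately
      }
    }
    Serial.println("STONE2:TAP");                  // e.g. single meteor
    delay(50);                                     // debounce
  }
}
```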

The Experience

Users approach a darkened floor projection showing empty space. As they explore the stepping stones, they discover they can build their own universe, star by star, constellation by constellation. The moment when 5 stars connect into a glowing constellation creates a magical sense of achievement. Planets drift through the creations, meteors streak across the sky, and the atmosphere can be dialed from stark and minimal to rich and dramatic.