Week 12 – Documentation on Final Project

The final project is a Mini DJ Booth, a flat cardboard controller styled like a DJ console with painted “records” and four colored buttons on top. The cardboard box houses the Arduino and wiring, while the laptop screen becomes the virtual stage using p5.js. Each physical button represents a different beat or sound layer (for example: kick, snare, bass, hi-hat). When a user presses combinations of buttons, they can live-mix these layers while watching synchronized visuals, mimicking the feel of a compact DJ setup.

Concept image generated with ChatGPT

I haven’t started the actual coding yet, but right now I’m more focused on the physical build, especially because this will be my first time soldering and I want to make sure I do it cleanly and safely. During our lab tour, we saw the LED push buttons, and they’re honestly perfect for this project: they’re big, colorful, and satisfying to press, and they will make the DJ booth feel much more realistic and professional. My plan is to use scrap cardboard from the back of the lab to construct the box itself, painting it to look like a real DJ setup with turntables and controls. Once the physical build is stable, I’ll move on to connecting the buttons to the Arduino and then writing the p5.js code to bring the whole system to life.

On the Arduino side, the program will continuously read the state of the four input buttons using `digitalRead()`. Each button is wired with a pull-down (or pull-up) resistor so the Arduino can reliably detect presses. In the main loop, the Arduino will encode the button states as a simple serial message and send it to the computer over USB using `Serial.println()`. The Arduino will not generate audio; its main role is to sense button presses quickly and transmit clean, structured data to p5.js so there is minimal latency between touch and feedback.
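As a sketch of what the p5 side of this protocol could look like, here is a minimal parser, assuming the Arduino sends one comma-separated line of four 0/1 values per loop (e.g. "1,0,1,0"); the line format and function name are my own assumptions, not final code:

```javascript
// Hypothetical parser for the serial line format described above.
// Assumes the Arduino sends one comma-separated line per loop, e.g. "1,0,1,0".
function parseButtonLine(line) {
  const parts = line.trim().split(",");
  if (parts.length !== 4) return null; // ignore malformed or partial lines
  const states = parts.map(Number);
  // Each value must be 0 (released) or 1 (pressed).
  if (states.some((s) => s !== 0 && s !== 1)) return null;
  return states;
}

console.log(parseButtonLine("0,0,1,0")); // [0, 0, 1, 0] -> third button pressed
console.log(parseButtonLine("garbage")); // null -> safely ignored
```

Rejecting malformed lines matters here because serial reads can deliver partial lines right after the port opens.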

Here is a schematic example that I could potentially adapt for my final project:

Source: https://www.the-diy-life.com/multiple-push-buttons-on-one-arduino-input/

In the p5.js program, the sketch will receive the serial data, parse the four button values, and use them to control both sound playback and visuals. For each button, p5 will toggle the corresponding beat loop on and off (or optionally stop the sound, depending on the final interaction choice). In the p5 sketch I will load four different audio tracks and generate distinct color schemes and animations for each active button: for example, different color blocks, pulsing circles over each painted “record,” or waveform-style bars that move with the beat. At this stage, communication is mostly one-way (Arduino → p5), but there is room to extend it later (e.g., p5 sending messages back to the Arduino to drive each button’s LED). I should note that I haven’t started working on the p5 visuals yet because I’m still planning the overall mood and ambiance I want to set.
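The toggle behavior described above could be sketched as edge detection on the button states: a layer flips between looping and silent only at the moment of a new press, not while the button is held. This is a minimal sketch in plain JavaScript (names are placeholders; the real sketch would call p5.sound's loop()/stop() where the comment indicates):

```javascript
// Hypothetical toggle logic for the four sound layers, assuming each serial
// update arrives as an array of four 0/1 button states.
// A press (0 -> 1 edge) flips that layer between looping and silent.
function makeLayerToggler(numLayers = 4) {
  let prev = new Array(numLayers).fill(0);
  const playing = new Array(numLayers).fill(false);
  return function update(states) {
    for (let i = 0; i < numLayers; i++) {
      if (states[i] === 1 && prev[i] === 0) {
        playing[i] = !playing[i]; // in the real sketch: sound[i].loop() / .stop()
      }
    }
    prev = states.slice();
    return playing.slice();
  };
}

const update = makeLayerToggler();
console.log(update([1, 0, 0, 0])); // [ true, false, false, false ]
console.log(update([1, 0, 0, 0])); // held down: no change
console.log(update([0, 0, 0, 0])); // released: still looping
```

Without the edge check, a held button would flip its layer on every frame, so the beat would flicker instead of toggling cleanly.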

I found royalty-free beats I could download for the audio:

Week 11 – Production (Arduino + P5)

Production #1: Potentiometer Moves Ellipse

Link to P5 Sketch #1

Production #2: P5 Controls LEDs

Link to P5 Sketch #2

I modified it to also take the “A” and “D” keys as input, just like the mouse clicks.

Final Project Proposal

I was looking for a room in Baraha to get some work done. However, to check some of the rooms there, I had to manually open the doors to see whether they were occupied. I found the whole process inefficient, since I had to apologize whenever a room turned out to be occupied. The experience was embarrassing, and I was sure there were better ways to prevent it.

So I decided to build my final project based on this idea: a detection machine that lets people know if the room is occupied or not. I found further inspiration from the parking lights installed in some of the malls in Abu Dhabi. There are lights on top of a parking space, and those lights display colors depending on the availability of the space.

At first, I thought using ultrasonic sensors would work. The idea was to have a sensor installed on the door to check if someone walks across the doorway: if they do, activate the light. However, ultrasonic sensors spread in an arc, so they may cause more errors than I expect. Next, I considered using PIR sensors that detect motion and heat. This is a good approach, but since PIR detects only movement, there are issues when people inside do not move: if they sit still and work on their projects, the sensor would assume the room is empty. This defeats the whole purpose of the project.

So for my sensors, I decided to use IR beam-break sensors. It is a single-line sensor that triggers when something gets in the way, so I thought it would be perfect for this project. If I have two of those sensors, I would also be able to determine whether a person enters or exits the room, allowing me to turn the LED on or off. It would be better to have the IR beam sensor installed at the door and a PIR motion sensor installed inside the room for a two-way detection system to ensure fewer errors, but cost-wise, I found that it was not worth having two sensors. I want to buy the sensors so that I can keep the project for my own use without having to return it later.
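Under the assumption of two beam-break sensors at the doorway (an outer one on the corridor side and an inner one on the room side), the entry/exit logic could be sketched as a small occupancy counter; the event names and structure here are hypothetical, not a decided design:

```javascript
// Hypothetical two-beam direction logic (names are mine, not from a library).
// Someone entering breaks the outer beam first, then the inner one;
// exiting is the reverse order.
function makeOccupancyCounter() {
  let count = 0;
  let lastBroken = null; // which beam was broken most recently in this passage
  return {
    beamBroken(beam) {
      if (beam === "inner" && lastBroken === "outer") count++; // entry
      else if (beam === "outer" && lastBroken === "inner") count = Math.max(0, count - 1); // exit
      lastBroken = beam;
    },
    passageComplete() { lastBroken = null; }, // both beams clear again
    occupied() { return count > 0; },
    count() { return count; },
  };
}

const room = makeOccupancyCounter();
room.beamBroken("outer"); room.beamBroken("inner"); room.passageComplete(); // one person enters
console.log(room.occupied()); // true -> LED red
room.beamBroken("inner"); room.beamBroken("outer"); room.passageComplete(); // they leave
console.log(room.occupied()); // false -> LED green
```

Counting people rather than storing a single on/off flag means the light stays red until everyone has left, which is the behavior the project needs.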

For the display, I will either order an LED panel or just use a circular LED display to show the status, following the universal convention that a red light means the room is occupied and a green light means it is available. I could get more creative and use LED panels to show art installations, which would make it look more like an interactive media project.

Week 11 – Group Exercises

Group Members: Joy, Yiyang

(Arduino code is commented at the bottom of the p5 links)

1. Make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on arduino is controlled by p5

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/63Yg60k8D
Video Demonstration: IMG_9638

2. Make something that controls the LED brightness from p5

p5 & Arduino Code: https://editor.p5js.org/yiyang/sketches/dtftbIzaK

3. take the gravity wind example and make it so every time the ball bounces one led lights up and then turns off, and you can control the wind from one analog sensor

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/v77Sd41K4
Video Demonstration: IMG_9640

Week 11 – Final Project Idea

For my final project, I plan to build an interactive installation called “Discover Nepal: Interactive Cultural Storyboard.” The goal is to create a small tabletop experience that teaches people about Nepali culture, history, geography, and fun facts in a playful, visual way. The project will use both Arduino and p5.js and will function like a tiny museum exhibit about Nepal.

Physically, I will create a flat board or “map” of Nepal with five labeled zones: Mountains, Festivals, Music & Dance, Food, and History & Fun Facts. Each zone will be connected to an Arduino input using either pushbuttons or simple touch pads made from aluminum foil. I will also add a single knob (potentiometer) that lets the user scroll through multiple facts within a category. When the user touches or presses one of the zones, the Arduino will detect which category was selected and read the knob position, then send this information as serial data to a p5.js sketch running on my laptop.
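One way the knob could scroll through facts is a simple scaling of the raw potentiometer reading (0–1023) onto a fact index; this is a sketch under my own assumptions about the ranges, not final code:

```javascript
// Hypothetical mapping from the knob reading to a fact index, assuming the
// Arduino sends the selected zone plus the raw potentiometer value (0-1023).
function factIndex(knobValue, numFacts) {
  // Scale 0..1023 onto 0..numFacts-1; clamp to guard against noisy readings.
  const i = Math.floor((knobValue / 1024) * numFacts);
  return Math.min(Math.max(i, 0), numFacts - 1);
}

console.log(factIndex(0, 5));    // 0 (knob fully left -> first fact)
console.log(factIndex(512, 5));  // 2 (mid knob -> middle fact)
console.log(factIndex(1023, 5)); // 4 (knob fully right -> last fact)
```

Clamping matters because analog readings can jitter at the extremes, and an out-of-range index would make p5 display nothing.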

In the p5 sketch, I will draw a stylized interface representing Nepal, with icons or scenes for each category. Based on the messages from the Arduino, p5 will highlight the active category, display a short piece of text (such as a fun fact about the Himalayas, a description of Dashain or Tihar, information about Nepali instruments or foods like momo and dal bhat, or a historical event), and play a short sound that matches the theme (wind in the mountains, festival bells, a madal rhythm, etc.). Turning the physical knob will move through different facts inside that category, so the user can explore more than one piece of information in each area.

This project satisfies the course requirements because it is a physically interactive system that relies on a multimedia computer for processing and display. The Arduino “listens” to the user’s physical actions on the board (pressing zones and turning the knob), and the p5.js program “thinks” by interpreting those inputs and deciding what cultural content to show. The system “speaks” back to the user through animated graphics, text, and sound. My intention is to make the interaction feel like a fun, hands-on way to discover Nepal’s culture and history, rather than just reading a static page of information.

Final Idea

I want to create an immersive experience with space. Users come to an empty canvas and use pressure sensors to add planets, meteors, and stars. Once a minimum of five stars has been placed, a constellation is created, and it can grow as the user adds more stars. These objects will be floating around or soaring by (meteors), so it feels like you’re in space. I want to use a projector to project the p5 screen so users can really feel the grandeur of it.

The Interaction

I will have 8 stepping stones on the floor that do different things when the user steps on them:

Star Makers (Stones 1, 4, 6):

  • Quick tap: Adds one small white star to that stone’s constellation
  • Each stone creates its own independent star collection
  • When 5 stars are placed, they automatically connect with glowing lines to form a constellation
  • Can expand up to 9 stars, making the constellation more intricate and complex
  • Three unique constellations can exist simultaneously in different regions of the sky

Planet Makers (Stones 2, 5, 7):

  • Hold for 3+ seconds: Materializes a planet in space
  • Random planet type appears: gas giants, ringed planets, rocky worlds, ice planets, or mysterious alien worlds
  • Planets drift randomly through space, floating past your constellations
  • Creates a living, breathing solar system

Meteor Makers (Stones 3, 8):

  • Quick tap: Shoots one meteor across the sky in a random direction
  • Hold for 3+ seconds: Triggers a meteor shower with 5 meteors streaking across space
  • Adds dynamic movement and excitement to the scene

Atmosphere Control (Potentiometer dial):

  • Physical dial near the stepping area
  • Controls both the visual intensity and audio volume
  • Low: Clear, minimal space with silence
  • High: Rich nebula clouds, cosmic dust, and immersive ambient soundscape
  • Gives users creative control over the mood of their universe
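The tap-versus-hold distinction used by the stones above could be sketched as a timing check on press and release events; the 3-second threshold comes from the description, everything else (names, units) is a placeholder:

```javascript
// Hypothetical tap-vs-hold classifier for a stepping stone, assuming we
// record timestamps (in ms) when the stone's sensor is pressed and released.
const HOLD_THRESHOLD_MS = 3000; // "hold for 3+ seconds"

function classifyStep(pressedAt, releasedAt) {
  const duration = releasedAt - pressedAt;
  return duration >= HOLD_THRESHOLD_MS ? "hold" : "tap";
}

console.log(classifyStep(1000, 1250)); // "tap"  -> add one star / one meteor
console.log(classifyStep(1000, 4500)); // "hold" -> planet / meteor shower
```

In p5, the timestamps could come from millis() when the pressure reading crosses and then falls back below a threshold.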

The Experience

Users approach a darkened floor projection showing empty space. As they explore the stepping stones, they discover they can build their own universe, star by star, constellation by constellation. The moment when 5 stars connect into a glowing constellation creates a magical sense of achievement. Planets drift through the creations, meteors streak across the sky, and the atmosphere can be dialed from stark and minimal to rich and dramatic.

Week 11 – Final Project Prompt

My final project is an interactive installation titled “The Snail’s Journey to School.” The idea is inspired by the opening scenes of Monsters University, where a small character makes their way toward school. In my version, the user helps a robot snail travel through a physical obstacle course until it reaches its destination.

Arduino will control the physical movement of the robot snail using motors and buttons for directional input. It may also include an optional sensor, such as a photoresistor or distance sensor, to detect obstacles or the finish line. The snail robot and the obstacle course will be fabricated by me, either through 3D printing or hand-built materials.

p5.js will act as the narrative and feedback layer. It will display an introduction, instructions, and story elements that respond to the user’s physical interactions. As the user presses buttons to move the snail, Arduino will send messages to p5.js, which will update the visuals, play small animations or sounds, and react to progress in real time. When the snail reaches the “school” area, p5 will display the final scene.
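A minimal sketch of how p5 might interpret the Arduino's messages, assuming a simple "type:value" line format (the message names here are my assumptions, not a decided protocol):

```javascript
// Hypothetical handler for messages the Arduino might send as the snail
// moves; "progress" and "finish" are placeholder message types.
function handleSnailMessage(line, state) {
  const [type, value] = line.trim().split(":");
  if (type === "progress") {
    state.progress = Number(value); // e.g. "progress:40" -> 40% of the course
  } else if (type === "finish") {
    state.scene = "school"; // p5 switches to the final scene
  }
  return state;
}

let state = { progress: 0, scene: "course" };
state = handleSnailMessage("progress:40", state);
state = handleSnailMessage("finish:1", state);
console.log(state.progress, state.scene); // 40 "school"
```

Keeping the protocol to short typed lines makes it easy to add message kinds later (obstacle hit, sensor readings) without reworking the p5 side.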

The interaction loop centers on listening (reading the user’s button presses), thinking (moving the snail and sending corresponding data), and speaking (p5.js reacting immediately with feedback).

Week 11 – Reading Response

In Design Meets Disability, Graham Pullin challenges the way society frames disability by questioning why assistive devices are often treated as purely functional rather than expressive. He argues that design for disability should not be limited to medical necessity; it should also include aesthetics, identity, and personal preference. What stood out to me is how Pullin highlights the quiet power imbalance in design: mainstream objects like glasses or smartphones have endless variations and styles, while many assistive tools remain clinical and uniform. This difference reveals how disability is still seen as something to “fix” instead of a natural part of human diversity.

Pullin pushes the reader to consider that assistive devices could be opportunities for creativity rather than reminders of limitation. For example, he discusses the possibility of hearing aids becoming fashionable accessories instead of devices people feel pressured to hide. His argument reframes disability not as a deficit but as a design space full of potential innovation.

Overall, the reading invites designers to rethink their assumptions. Instead of designing for disabled people, Pullin encourages designing with them, treating disability as a source of insight and richness. The book ultimately suggests that inclusive design is not just ethical, it also expands the possibilities of design itself.

Lily pad

For this project, I wanted to recreate a peaceful scene of a frog sitting on a lily pad in a pond. Since I hadn’t worked much with sensors yet, I thought this would be the opportunity to incorporate one. I decided to use a distance sensor to control the ripples in the water: the closer your hand gets to the sensor, the more frequently the frog hops and creates ripples.

The Arduino code simply measures how far away your hand is from the sensor and sends that data to the p5.js sketch. It measures distance by sending an ultrasonic pulse and timing how long it takes to bounce back, then converts that time into centimeters using the speed of sound (0.034 cm/microsecond), divided by 2 since the sound travels to the object and back.

The p5.js code then uses that distance data to dictate how often the ripples should occur. The p5 code also hosts the visuals/art element of the project.
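The distance-to-ripple mapping could be sketched as a linear interpolation, similar to p5's map(); the range values here are my assumptions, not taken from the actual sketch:

```javascript
// Hypothetical mapping from sensor distance (cm) to ripple interval (ms):
// the closer the hand, the shorter the interval between frog hops.
function rippleInterval(distanceCm) {
  const minD = 5, maxD = 50;     // assumed useful sensing range
  const fast = 300, slow = 2000; // assumed ms between ripples
  const d = Math.min(Math.max(distanceCm, minD), maxD);
  // Linear interpolation: minD -> fast, maxD -> slow (like p5's map()).
  return fast + ((d - minD) / (maxD - minD)) * (slow - fast);
}

console.log(rippleInterval(5));  // 300  (hand very close: rapid ripples)
console.log(rippleInterval(50)); // 2000 (hand far: slow, calm pond)
```

Clamping the reading first keeps stray ultrasonic spikes (which can report hundreds of centimeters) from freezing the pond entirely.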

Next time, I think it would be fun to use a pressure sensor and have users actually jump up and down at different strengths and show that on p5.js.

// ARDUINO CODE
const int trigPin = 9;
const int echoPin = 10;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // measure distance
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2;
  
  // send distance to p5.js
  Serial.println(distance);
  
  delay(50);
}

IMG_2539

Week 11 – Reading Response

Design Meets Disability treats constraint as more than a deficit. When a flow fails under limited vision or one‑hand use, the failure is a measurement of hidden demand (e.g., precision, memory, timing). Fixing those loads tends to make the product faster and safer for everyone.

The shift from accommodation to inclusive design is a key shift in the design workflow. An add‑on ramp or a screen reader retrofit treats accessibility as a patch. Building large hit targets, consistent structure, multimodal signals, and undo by default treats accessibility as a core requirement. The second approach reduces support costs and broadens use. Captions prove the point, because they serve deaf users first but also help viewers in noisy spaces, language learners, and anyone scanning a video on mute. Curb cuts started for wheelchairs, then made cities usable for strollers, carts, and bikes. The spillover is not an accident; it is the direct result of “designing for constraints” that generalize. As Charles Eames believed, “design depends largely on constraints.” And I agree.

Inputs, perception, and cognition form a clear framework. Products assume hands with fine motor control, perfect vision and hearing, and uninterrupted attention. Alternatives (e.g., switch controls, eye‑tracking, high contrast, scalable text, haptics, chunked steps) lower friction and error rates across contexts. Voice control demonstrates this range: essential for limited mobility, valuable hands‑free while cooking or driving. Predictive text began as assistive technology; now it is an everyday efficiency feature, especially in coding, which is part of why tools like Cursor dominate.

I do have questions. Are there cases where inclusive design adds complexity that hurts clarity? For instance, voice, haptic, and visual signals can compete. The answer may be progressive disclosure: default to one strong signal, then let users layer or swap modes. But that raises another concern. How do we balance among options, and must we spend resources to develop all of them? How do teams budget for inclusive testing without turning it into a checkbox? We must tie success to task completion under constraint, not to the presence of features. If the flows pass under one‑hand use, low contrast, and divided attention (or under a narrower set of constraints for a more specific use case), they are ready.

The actionable stance: Write constraints into requirements. Set minimum hit area sizes, enforce semantic structure, and require undo and confirmation for destructive actions. Prefer multimodal capability, but choose a clear default per context. Measure success as completion with minimal precision under time pressure.