Final Project: Sorbetes Hero

Concept

My concept is a rhythm game called Sorbetes Hero, heavily influenced by the game “Guitar Hero.” In the game, you are a Sorbetes (Filipino-style ice cream) vendor trying to earn as much money as possible. There are four lanes, each a different ice cream flavor, and you have to catch the falling “ice cream” (circles) to earn points; the longer you play, the more bonuses/multipliers you get. There’s an easy and a hard mode, selected with the photoresistor/LDR sensor: easy mode is regular gameplay with one ice cream falling at a time, while hard mode has multiple ice creams falling at once. The game runs on a 90-second timer, the ice cream falls faster as time goes on, the player gets three strikes for missing, and the end screen shows how much money they earned.

Project Demo | Arduino Code | p5.js Code | Schematics

What I’m proud of

I’m quite proud of the photoresistor/LDR sensor interaction. I was debating what type of interaction to build around it and landed on having it be the “mode indicator.” I spent quite a while thinking about how to add it to the physical controller in a way the player could interact with naturally, without needing to shine a flashlight on it or cover it with my hand. So, I used the mini umbrellas that usually come with drinks. Umbrellas are used quite often in the Philippines because of both the heat and the rain, especially by street vendors, so this felt like a clever and culturally relevant way to integrate the sensor.

Reflections & Future Improvement

Some things I wanted to include were music and sound effects, but I ran into issues uploading them to p5.js; maybe next time I’ll try using the piezo buzzer for that. I also think I could’ve added more obstacles and challenges in the actual game, something like long presses or the “bombs” in Fruit Ninja. As for the physical Arduino build, I like how it turned out, but I’d like to try arcade buttons for a bigger surface and a more satisfying feel. Nevertheless, I’m really proud of this project because I feel like I applied everything I learned throughout the semester. Though the semester was unfortunately cut short, I still feel like I was able to learn as much as I could.

Week 13: User Testing for Final Project

I had a slight change of plans for my final project, but I’ll still be using the same components (buttons and sound sensor), except for the ultrasonic distance sensor, which I’ve swapped for the photoresistor. Initially, I wanted to do a Philippine-style Jeepney game, but I found the graphics would take too long to make, the game mechanics would be more complicated, and the Jeepney physical controller would be too time-consuming. My new idea is a Philippine-style ice cream (Sorbetes) game: a rhythm game similar to Guitar Hero. In p5.js, there will be four lanes in different colors, which in the game are the different ice cream flavors. The four lanes map to four buttons on the Arduino, and you press a button when a circle of that color is falling down its lane. The player earns money (Pesos) for every catch and gets three lives for missed ones.

USER INTERACTION VIDEO

I made my mom play the game, and it was pretty clear to her since the mechanics are simple and well known. The game clearly signals when you catch or miss an ice cream, though I still want to add sound effects to strengthen that feedback. As for the Arduino, although she said it was registering her moves well, she found the wires a bit overwhelming, and the buttons were too small and sometimes disconnected from the breadboard.

The interaction between p5.js and the Arduino worked really well: the button presses registered right on time and were reflected in p5.js. The game is a fairly simple rhythm game and pretty easy to follow. However, at the time of filming, I didn’t have the photoresistor or sound sensor to test yet; I’ll be adding more mechanics to make the game more challenging using those two sensors. The controller is also a huge part of my project, and I’m working on designing it and have ordered the things I’ll need, thanks to the stipend given. I got female-to-male jumper wires, bigger buttons, and some decorative pieces for the controller.


Week 12: Final Project Proposal

Concept

My concept is a top-down arcade driving game where you play as a jeepney driver navigating a chaotic Manila street, racing to hit your daily peso quota before fuel runs out. The controller is a hand-built cardboard jeepney with three embedded sensors.

What the programs do

The Arduino continuously reads all three sensors and sends them to p5.js as a single serial string every loop: ultrasonic distance (hand hover over the headlights = brake), sound level (shout into the exhaust = honk), and two buttons (pick up / drop off). It also listens for signals back from p5.js to trigger the speaker for horn and engine sounds.

p5.js receives the serial data each frame and runs the entire game: scrolling road, traffic spawning, passenger logic, collision detection, fare multiplier, fuel gauge, and HUD. All game logic lives in p5.js; the Arduino is purely the input/output layer.
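To make the “single serial string” idea concrete, here is a minimal sketch of what that packing and unpacking could look like. This is illustrative only: the field order and the function names (`buildPacket`, `parsePacket`) are my assumptions, and the real p5.js side would split the line in JavaScript; the logic shown in C++ is the same.

```cpp
#include <cstdio>
#include <cstring>

// Hypothetical packet format: "distance,soundLevel,button1,button2"
// built once per Arduino loop() and sent over Serial.println().
void buildPacket(char *out, size_t len, int dist, int sound, int btn1, int btn2) {
  snprintf(out, len, "%d,%d,%d,%d", dist, sound, btn1, btn2);
}

// The receiving side splits the same line on commas; returns false if
// the line doesn't contain exactly four integers.
bool parsePacket(const char *in, int *dist, int *sound, int *btn1, int *btn2) {
  return sscanf(in, "%d,%d,%d,%d", dist, sound, btn1, btn2) == 4;
}
```

Keeping the whole state in one line per loop means the p5.js side never sees a half-updated set of sensor values.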

ROUGH SKETCH/IDEA

Challenging part and progress so far

Because the sound sensor isn’t something we’ve discussed in class, I wanted to play around and test it. I also wanted to test having two buttons do different things, so I wired two different-colored buttons to light up their respective LEDs. So far, I’ve tested the sound sensor (it triggers an LED on loud input) and both buttons (each lights a separate LED). All three are confirmed working individually.

SOUND TEST | BUTTONS TEST

Week 12: Reading Response

The reading is essentially asking why design for disability has historically been so focused on making things disappear rather than making them good. The author uses the Eames leg splint as his opening example, something designed for injured navy personnel that also happens to be genuinely beautiful, and I think that object does a lot of work in setting up his argument. If a leg splint can be that considered and elegant, why are hearing aids still being molded in pink plastic and engineered to hide? The glasses section is the one I kept thinking about after I finished reading. I found it interesting how spectacles were once classified as medical appliances and their wearers as patients, and that the goal was explicitly for them to not be styled. And now, they’ve become a sort of fashion accessory. I think about my own experience being short and how much of my early relationship with my appearance was about minimizing rather than owning it, avoiding certain things because I did not want to draw attention to something I felt self-conscious about. I recognize now that instinct is exactly what the author is critiquing. Trying to make something invisible does not make it go away, it just communicates that you think it should be hidden, and I think that says more than the thing itself ever would. I kept thinking that somewhere along the way, the priority shifted from helping someone hear to helping them hide the fact that they need help hearing, and those are very different design briefs with very different outcomes for the person wearing it.

Week 11: Musical Instrument

Concept

My concept is heavily inspired by the show Squid Game. I wanted to make a musical instrument (well, in this case, more of a music box/player) where light controls the mood of the music, so I used one of the songs from a game in the show. I used a photoresistor/LDR: when it’s dark, the tempo slows down to create that “creepy vibe,” and when it’s bright, the song plays at its normal tempo. I also used a red LED when the creepy version plays and a green LED when the normal one plays; these colors are actually used in the show in another one of the games.

Full Code | Video Demo | Schematics

Code that I’m proud of

ldrValue = analogRead(LDR_PIN);                    // raw light reading, 0–1023
isDark   = (ldrValue < DARK_THRESHOLD);            // threshold of 420 found by testing
tempoMult = isDark ? CREEPY_TEMPO : NORMAL_TEMPO;  // 2.5x slower in the dark

// stretch each note's length by the multiplier before playing it
int duration = (int)((1000.0 / durations[note]) * tempoMult);

tone(BUZZER_PIN, melody[note], duration);

I’m proud of this because I had to figure out how to make the LDR play a role in the song without it just turning something on and off. The first thing I had to consider was that the LDR doesn’t hand me LOW or HIGH; it gives me a raw number between 0 and 1023, and I had to find my own threshold by covering the sensor with my hand and watching the Serial Monitor until I landed on 420. For the actual song, I used a tempo multiplier because I didn’t want the notes themselves to change, just the feel of them: multiplying each duration by 2.5 in the dark makes every note hang longer, so the whole melody sounds heavier and more unsettling without changing a single note.
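The threshold-and-multiplier logic above can be isolated as a small pure function. This is a sketch, not my exact sketch code: `NORMAL_TEMPO = 1.0` is an assumption (the dark multiplier of 2.5 and threshold of 420 come from my testing), and `durationDivider` stands in for the values stored in `durations[]` (4 = quarter note, 8 = eighth note).

```cpp
// Hedged sketch of the tempo logic: dark readings stretch every note by 2.5x.
const int   DARK_THRESHOLD = 420;   // found by watching the Serial Monitor
const float NORMAL_TEMPO   = 1.0;   // assumed baseline multiplier
const float CREEPY_TEMPO   = 2.5;   // the stretch I used in the dark

int noteDuration(int ldrValue, int durationDivider) {
  bool  isDark    = (ldrValue < DARK_THRESHOLD);
  float tempoMult = isDark ? CREEPY_TEMPO : NORMAL_TEMPO;
  // durations[] stores dividers, so 1000/divider gives the note length in ms
  return (int)((1000.0 / durationDivider) * tempoMult);
}
```

So a quarter note that lasts 250 ms in the light hangs for 625 ms in the dark, which is where the “heavier” feel comes from.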

How this was made

I started with the slide switch wired to pin 2 using INPUT_PULLUP, which meant I didn’t need an external resistor. When the switch is open, the buzzer stops and both LEDs go off straight away. For the LDR I had to build a voltage divider using a 10kΩ resistor, because the Arduino can only read voltage, not resistance, directly. One leg of the LDR goes to 5V, the other connects to A0 and one leg of the 10kΩ resistor, and the other resistor leg goes to GND. In bright light the LDR’s resistance drops and A0 reads high; in darkness the resistance rises and A0 reads low. I landed on 420 as my threshold after some testing with the Serial Monitor (which I temporarily added and later removed). The buzzer plays the melody using tone(); I used code from GitHub for the melody (referenced below), and the duration of each note gets multiplied by 2.5 in dark conditions to make it creepy. The green LED lights up in bright conditions and the red LED lights up when it goes dark. I also made sure to re-check the LDR and switch inside the melody loop on every single note, so the instrument reacts to light changes mid-song rather than waiting for the whole melody to finish before checking again.
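The divider behavior described above can be checked numerically. This sketch assumes example LDR resistances (a real LDR’s range depends on the part); the wiring matches what I described: LDR from 5V to A0, 10kΩ from A0 to GND.

```cpp
// Voltage-divider arithmetic for the LDR wiring above.
const double R_FIXED = 10000.0;  // the 10kΩ resistor from A0 to GND

int adcReading(double rLdr) {
  // A0 sits at the junction: Vout = 5V * R_fixed / (R_ldr + R_fixed)
  double vOut = 5.0 * R_FIXED / (rLdr + R_FIXED);
  // analogRead() maps 0–5V to 0–1023 (rounded here for clarity)
  return (int)(vOut / 5.0 * 1023.0 + 0.5);
}
```

With an assumed 1kΩ in bright light A0 reads around 930, and with 100kΩ in darkness around 93, which is why a single threshold like 420 cleanly separates the two modes.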

Reflection & Future Improvements

Honestly, this assignment gave me more trouble than I expected, but in a good way. I went in thinking the LDR would be straightforward, since it’s something I’d learned in the analog sensor lesson, and it kind of was at the hardware level, but making it do something musically interesting took more thought. The Squid Game inspiration kept me motivated too, because I had a clear vision of what I wanted it to feel like, which made the troubleshooting feel worth it rather than frustrating. If I kept going, I’d want to include more songs from the different games and have more “modes.” I’d also want to try changing the actual pitch in the dark, dropping notes lower to make it sound even more sinister, which feels very on brand for the Squid Game theme. But for what it is, I’m genuinely happy with how the concept came through.

References

https://docs.arduino.cc/language-reference/en/functions/advanced-io/tone/

https://docs.arduino.cc/built-in-examples/digital/toneMelody/

For the melody: https://github.com/hibit-dev/buzzer/blob/master/src/movies/squid_game/squid_game.ino

Week 11: Reading Response

A Brief Rant on the Future of Interactive Design + Follow-up

The first article is essentially arguing that the dominant vision of future technology, which is everything being a flat glassy touchscreen you slide your finger across, is not actually visionary at all. It is just a timid extension of what already exists, and what already exists ignores almost everything that makes human hands remarkable. His point is that our hands do two things extraordinarily well, they feel things and they manipulate things, and touchscreens strip both of those capabilities away in exchange for a visual interface that he calls Pictures Under Glass. I reflected on his example of making a sandwich. He asks you to pay attention to how many tiny adjustments your fingers make without you even thinking about it, switching grips, sensing weight, feeling texture, and then he asks whether we are really going to accept a future interface that is less expressive than that. That question reminded me of the time I tried learning drums through one of those tablet apps, and the difference between that and sitting in front of a real kit is almost laughable. On a real drum the stick bounces back after you hit it, and that rebound produces important information. Your wrist reads it and adjusts the next stroke automatically, and I could feel even as a beginner that my hands were supposed to be learning something from that response. On the app there is nothing. You tap a flat surface, it makes a sound, and that is the entire relationship. I was learning the pattern but I was not learning to actually play, and from what I can understand, that distinction is what the author is getting at.

I actually found his response to the pushback more interesting than the original rant, particularly the part where someone asked about voice interfaces and he said he has a hard time imagining a painter telling his canvas what to do. That, again, reminded me of the drums. There is no way to describe with a voice, or replicate on a screen, the feeling of a snare cracking back against your stick, or the way a cymbal responds differently depending on where and how hard you hit it. That knowledge is supposed to live in your hands, built up over time, and I genuinely felt the absence of it every time I went back to the app and realized my fingers were learning nothing they could transfer to a real instrument. It felt like I was practicing the appearance of playing drums without any of the physical intelligence that actually makes someone a drummer.

Week 10: Digital (Slide Switch) and Analog (Sound Sensor)

Concept

I wanted to make a lighting system where sound controls the brightness. For my digital sensor, I used a slide switch to turn the LEDs on and off. For my analog sensor, I used a sound sensor: every time it detects a sound, the LEDs get dimmer. I’m quite interested in sound-activated lights, and even though a sound sensor didn’t come with the kit, I still wanted to give it a try, so I got a KY-037 from Amazon.

Full Code | Video Demo | Schematics

Code that I’m proud of

int soundValue = analogRead(soundPin);
int change = abs(soundValue - 512);

if (change > threshold) {
  brightness = brightness - 50; // drop brightness by 50 each sound
  Serial.println("Lowering brightness.."); // debug to make sure sound is going through and brightness lowering
  delay(200);
}

I’m proud of this part because I had to actually understand what the sound sensor was giving me. I thought it would just tell me loud or quiet (HIGH/LOW), but it outputs a number between 0 and 1023, and in silence it sits around 512 (about half of that range). I also added the Serial.println myself because I had no idea if claps were even registering, so I wanted to confirm in the Serial Monitor that it was working before trusting the LEDs.

How this was made

I started with the slide switch wired to pin 2 with a 10kΩ pull-down resistor to GND; the resistor keeps the pin from floating and giving random readings when the switch is open. When it reads LOW, both LEDs turn off and brightness resets to 255 (maximum brightness) so it starts fresh every time. I had to learn how the sound sensor worked and wire it myself. It outputs a continuous number between 0 and 1023 representing the volume, sitting around 512 (the mid-range value) in silence; every loop, the code reads that value, subtracts 512, and takes the absolute value to get the amplitude. If that crosses my threshold of 70, the brightness drops by 50. I had a problem with sensitivity at first, as it kept triggering on background noise or missing claps, and I found out that the module has a small dial you adjust with a screwdriver. I also added a Serial.println debug line so I could confirm in the Serial Monitor that claps were actually registering before trusting the LEDs. The two LEDs on pins 9 and 10 each have a 330Ω resistor to GND and receive the brightness value through analogWrite using PWM.
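The dimming step described above can be written as one small pure function. This is a sketch of the logic, not the exact sketch code: the clamp to 0 at the end is my addition (so analogWrite never receives a negative value), and the function name is illustrative.

```cpp
// Hedged sketch of the loop's dimming logic: amplitude is the distance from
// the 512 silence midpoint; crossing the threshold knocks 50 off the PWM value.
const int THRESHOLD = 70;

int nextBrightness(int brightness, int soundValue) {
  int change = soundValue - 512;
  if (change < 0) change = -change;          // absolute amplitude
  if (change > THRESHOLD) brightness -= 50;  // a loud-enough sound dims the LEDs
  return brightness < 0 ? 0 : brightness;    // clamp so analogWrite stays valid
}
```

Starting from 255, five claps walk the LEDs down to 5, and a sixth bottoms out at 0 instead of wrapping around.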

Reflection & Future Improvements

This was quite a challenging assignment because I insisted on using a sound sensor. Even though the sensor was new to me and not something we covered in class, I was able to apply a lot of the analog concepts we already learned (things like analogRead, analogWrite, and PWM), which translated over and made it easier to grasp. I went back to the class notes and a few tutorials online (referenced below) to piece it together. If I were to keep going, I’d add more brightness steps so the dimming feels smoother, and I’d revisit an earlier idea where the two LEDs go in opposite directions, one dimming while the other brightens, driven by the same live sound input. Nevertheless, I’m happy with how it turned out.

References

https://github.com/liffiton/Arduino-Cheat-Sheet

https://docs.arduino.cc/language-reference/

Arduino Sound Sensor: Control an LED with Sound

Week 10: Reading Response

Physical Computing’s Greatest Hits (and misses)

This article goes through a recurring list of physical computing project themes that show up in classes every year, and I found it quite fascinating how the author encourages students to pursue repeated ideas. That resonated with me because I sometimes catch myself thinking of a project idea, only to search whether it has been done and end up not going through with it, with the mindset that someone has already done it, and better than I would have. I think that has stunted growth and exploration I probably could have learned a lot from. This also reminded me of how, in traditional art, everyone paints a still life or draws a figure at some point. Nobody tells you not to draw a bowl of fruit because it has been done before. These things become learning stepping stones in your work, and I think that is just as valuable.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

This article felt like a continuation of that idea. The main argument likens interactive art to the beginning of a conversation, not a finished statement like a painting or a sculpture; the moment you over-explain what something means or how someone is supposed to feel about it, you have ended that conversation before it started. Coming from a background where I spent some time exploring traditional art, where the work is usually a fixed object that speaks for itself, I found this shift in thinking genuinely difficult to wrap my head around at first. A painting hangs on a wall and you bring yourself to it. Interactive work is different because the piece is actually incomplete until someone engages with it, and what they do becomes part of what the work is. I resonated with his comparison to a theater director working with actors: you can give an actor props and suggest intentions, but you cannot tell them how to feel; they have to find it themselves. I think that is a really honest way of describing what good interactive design should do: you are building the conditions for something to happen. And I think that is harder than it sounds, because there is a natural instinct, when you make something, to want people to get it the way you intended. I feel that every time I finish a project and immediately want to stand next to it and explain it to whoever walks by. Reading this made me realize that impulse, as understandable as it is, is actually working against the experience I am trying to create.

Week 9: Reading Response

Emotion and Design: Attractive Things Work Better

The reading starts off with the author’s personal teapot collection and notes their unique qualities: one is functionally absurd, one is ugly yet charming, and one is elegantly engineered for the stages of tea brewing. He uses these very objects to back up his claim that usability and beauty need not be in conflict; in fact, things that feel good to use and look at actually perform better in our minds because of the emotional state they put us in, and a beautiful product can help a user work through minor problems that an ugly (but functional) counterpart might not. I agree with his point that prioritizing usability alone can lead to designs that work but feel sterile, and it reminds me of the function-over-aesthetics mindset reinforced in architecture, where function almost overshadows how spaces feel to the people using them. However, while his argument fits everyday products well, I don’t think it maps cleanly onto architecture, which operates under far greater constraints like structure, material, and safety, where poor functional decisions have serious consequences. It is, from what I can see, a context in which aesthetics can’t come first: an architectural structure can be beautiful, but it ultimately has to serve its purpose of being a safe and functional space for the user.

Her Code Got Humans on the Moon

The article follows mathematician Margaret Hamilton, who took a programming job at MIT as something temporary while her husband finished law school and ended up accidentally building the foundation of software engineering while helping land humans on the moon. At one point, her daughter crashed the simulator by triggering a program that no astronaut was ever supposed to activate mid-flight; although Hamilton flagged it as a real issue and wanted to add safeguard code to prevent it, NASA pushed back and claimed astronauts were too well trained for that. Months later, it happened. I think that kind of overconfidence in human perfection is something a lot of institutions fall into, and it actually reminded me of the Titanic. The ship was considered so structurally sound that the people in charge genuinely called it “unsinkable,” and that certainty is what made them careless about the lifeboats, the speed, and the warnings. I think both cases show the same thing: when you convince yourself something will never happen, you stop preparing for it, and that is exactly when it does. I truly respected Hamilton for staying prepared; her team was ready to fix the problem when things did go wrong.

Week 7: Midterm Project

Concept

Ella’s Panaderya is an interactive experience set inside a Filipino bakery, also called a panaderya. The piece invites users to explore the space, click on pastries to learn about them, control the music playing on the radio, and interact with a pot of champurado. My goal was to create something that felt warm and nostalgic to Filipinos while informative for those who aren’t.

Final Sketch!

How This Was Made

The sketch was built in p5.js using a game state system to manage which scene is displayed at any given time. Each state has its own draw function that renders the appropriate graphics. The bakery scene uses a photo background with clickable regions mapped onto the pastries and interactive objects in the image. To find the correct coordinates for each pastry, I added a temporary debug line inside drawBakery() that displayed the live mouseX and mouseY values on the canvas as the mouse moved. This made it easier to pinpoint the top-left and bottom-right corners of each item and calculate the width and height of each clickable zone. I removed the debug line once all the coordinates were set.

Temporary debug line:

fill(255, 0, 0);
textSize(10);
text(mouseX + ", " + mouseY, mouseX, mouseY);

Each pastry is a class that stores its position, dimensions, name, and description. When you click a pastry, a popup displays that information. The champurado pot has its own layer with a top-down bowl drawn using shapes, a stirring animation, and a milk interaction that lightens the color of the champurado. The radio buttons are mapped to coordinates on the bakery image (same as the pastries) and control three songs using the sound functions learned in class.
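The clickable-zone check behind those pastry classes is a simple rectangle hit test. The real sketch is p5.js, so this C++ version is just an illustration of the same logic; the struct and field names here are hypothetical, not taken from my code.

```cpp
// Rectangle hit test for a clickable zone found with the debug-line method:
// top-left corner (x, y) plus the width and height measured from the photo.
struct Pastry {
  int x, y, w, h;

  // true when the mouse position falls inside the zone (edges inclusive)
  bool contains(int mouseX, int mouseY) const {
    return mouseX >= x && mouseX <= x + w &&
           mouseY >= y && mouseY <= y + h;
  }
};
```

Each click is tested against every pastry’s zone, and the first match opens that pastry’s popup.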

Reflection & Future Improvements

This project came together under a time crunch, so there are a few things I would have liked to develop further. The biggest one is the fan: the original plan was to make it interactive, cycling through speed settings with a sprite-based spinning animation to show the blades actually turning. Given more time, that would have added another layer of life to the bakery scene. I also would have liked to refine the popup sizing and positioning across all pastries, and possibly add a zoomed-in image for each one rather than just text. Overall though, I think the piece succeeded in creating a small, explorable slice of a Filipino bakery that feels interactive and grounded in real cultural context.