Month: October 2025
Midterm – Pitchy Bird
Following my initial progress on a backend-backed idea, I ran into challenges managing the complexities of API usage and language-model output, so I switched to another idea I briefly mentioned at the start of my last documentation.
Core Concept
For my midterm project, I wanted to explore how a fundamental change in user input could completely transform a familiar experience. Flappy Bird, known for its simple tap-based mechanics, is the game I chose to re-imagine with voice control. Instead of tapping a button, the player controls the bird's height by changing the pitch of their voice: singing a high note makes the bird fly high, and a low note brings it down.
The goal was to create something both intuitive and novel. Using voice as a controller is a personal and expressive form of interaction, and I hoped it would turn the game from a test of reflexes into a more playful, and honestly sillier, challenge.
How It Works (and What I’m Proud Of)
The project uses p5.js for all the visuals and game logic, combined with the ml5.js library to handle pitch detection. When the game starts, the browser's microphone listens for my voice. The ml5.js pitchDetection model (surprisingly lightweight) analyzes the audio stream in real time and outputs a frequency value in Hertz. My code then maps that frequency to a vertical position on the game canvas: a higher frequency means a lower Y-coordinate, sending the bird soaring upwards.
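The mapping itself only takes a couple of lines. Here's a minimal sketch of the idea, assuming helper variables like birdY, minPitch, and maxPitch (the names and ranges are illustrative, not my exact code):
// Minimal sketch: poll the ml5 pitch model and map frequency to bird height.
function listenForPitch() {
  pitch.getPitch((err, frequency) => {
    if (frequency) {
      // higher frequency -> smaller Y coordinate, so the bird rises
      birdY = map(frequency, minPitch, maxPitch, height - 50, 50, true);
    }
    listenForPitch(); // keep polling the model each time a result comes back
  });
}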
Click here to access the game, as the embed is not functional.
I’m particularly proud of two key decisions I made that really improved the game feel.
First was the dynamic calibration for both noise and pitch. Before the game starts, it asks you to be quiet for a moment to measure the ambient background noise, which is used to set a volume threshold so the game doesn't react to the hum of a fan or distant chatter. Then it has you sing your lowest and highest comfortable notes. This personalizes the control scheme for every player, adapting to their unique vocal range, which I think is an important design choice for a voice-controlled game. I conceptualized this calibration idea myself and used AI to explore ways to implement it, then coded up the components in the game.
setTimeout(() => {
  // Set noise threshold (average level + a buffer)
  noiseThreshold = mic.getLevel() * 1.5 + 0.01;
  console.log("Noise threshold set to: " + noiseThreshold);
  gameState = 'calibratePitch';
  isCalibratingLow = true;
  // Capture lowest pitch after a short delay
  setTimeout(() => {
    minPitch = smoothedFreq > 50 ? smoothedFreq : 100;
    console.log("Min pitch set to: " + minPitch);
    isCalibratingLow = false;
    isCalibratingHigh = true;
    // Capture highest pitch after another delay
    setTimeout(() => {
      maxPitch = smoothedFreq > minPitch ? smoothedFreq : minPitch + 400;
      console.log("Max pitch set to: " + maxPitch);
      isCalibratingHigh = false;
      // Ensure model is loaded before starting game loop with getPitch
      if (pitch) {
        gameState = 'playing';
      } else {
        console.log("Pitch model not ready, waiting...");
        // Add a fallback or wait mechanism if needed
      }
    }, 3000);
  }, 3000);
}, 2000); // 2 seconds for noise calibration
Another technical decision I'm happy with was implementing a smoothing algorithm for the pitch input. Early on, the bird was incredibly jittery because the pitch detection is so sensitive. To fix this, I stored the last five frequency readings in an array and used their average to position the bird. This filtered out the noise and made the bird's movement feel much more fluid and intentional. Additionally, instead of making the bird fall like a rock when you stop singing, I gave it a gentle downward drift. This "breath break" mechanism hopefully gives the game a lighter, airier feel.
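A rough sketch of that smoothing, assuming a global recentFreqs array (illustrative, not my exact code):
// Keep a rolling window of the last five readings and return their average.
let recentFreqs = [];

function smoothFrequency(newFreq) {
  recentFreqs.push(newFreq);
  if (recentFreqs.length > 5) {
    recentFreqs.shift(); // drop the oldest reading
  }
  let sum = recentFreqs.reduce((a, b) => a + b, 0);
  return sum / recentFreqs.length; // this average positions the bird
}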
Challenges and Future
My biggest technical obstacle was a recurring bug where the game would crash on replay. It took a lot of console-logging and head-scratching, but it ultimately turned out that stopping and restarting the microphone doesn't work the way I'd thought. The audio stream becomes invalid after the microphone stops, and I couldn't reuse it. The solution was to completely discard the old microphone object and create a brand new one every time a new game starts.
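In code, the fix looks roughly like this (a sketch assuming a resetGame() helper and a modelLoaded callback; the model path is a placeholder):
function resetGame() {
  if (mic) {
    mic.stop(); // the old stream can't be reused once it has been stopped
  }
  mic = new p5.AudioIn(); // discard the old object and build a fresh one
  mic.start(() => {
    // re-attach the pitch model to the brand-new audio stream
    pitch = ml5.pitchDetection('model/', getAudioContext(), mic.stream, modelLoaded);
  });
}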
In addition, there are definitely areas I’d love to improve. The calibration process, while functional, is still based on setTimeout, which can be sort of rigid. A more interactive approach, where the player clicks to confirm their high and low notes, would be an alternative user experience I could test and compare with. Additionally, the game currently only responds to pitch. It might be fascinating to incorporate volume as another control dimension—perhaps making the bird dash forward or shrink to fit through tight gaps if you sing louder.
A more ambitious improvement would be to design the game in a way that encourages the player to sing without thinking about it. Right now, the player is very aware that they are just "controlling" the bird. But what if the game's pipe gaps prompted them to perform a simple melody? The pipes could be timed to appear at moments that correspond to the melody's high and low notes. This might subtly prompt the player to hum along with the music, and in doing so, they would be controlling the bird without even thinking.
Week 6: Pong² (Midterm Project)
Overall Concept:
This is Pong². It's called Pong² because it isn't just a spiritual successor to the first ever video game; it's a dimension beyond. Regular Pong moves you along a single axis, so it only has one dimension; this game lets you move left and right AND back and forth, creating a very amusing experience with much more depth than the original.
Pong² features two modes. Let's start with Versus. Versus is a continuation of what I worked on all the way back in Week 3. In Versus, you and another player play a 1v1 duel of up to 9 rounds, first to 5 wins. It has the same win conditions as classic Pong but with a lot more strategic depth: moving your paddle toward the incoming ball will "smash" or "spike" it into your opponent's side, while moving in the same direction as the ball as you receive it will help you "control" or "trap" it.
Pong²’s second mode is its best: co-op. In Co-Op mode, you and another player work together to guard the bottom goal against multiple balls that speed up with every return. You have 3 lives and each ball that slips past you takes a life from your shared life total. You have to coordinate with your teammate to make sure you keep a watch on every ball and not just respond as fast as you can to each one, because that’s only going to take you so far (my bet is on 50 seconds).
Feedback: Color Scheme Consistency
At our Monday check-in, Professor Ang told me to use consistent colors for highlighting the gameplay instructions; when I highlight with different colors, I risk confusing the player. I also went on to keep the game mode colors consistent with the menu instead of having the menu remain exclusively white and purple.
CO-OP Mode Design
I originally planned to make a player vs AI mode but realized that I really didn’t know how to make it respond similarly to how a real player would move the paddle. I had received feedback from Professor Ang to make the concept more interesting than just the usual 1v1 pong, and that’s when it hit me: what if I made 2 Player survival pong?
I had a lot of fun designing the co-op mode on paper, but I made my code a huge mess by duplicating my Versus mode JavaScript file instead of working from scratch. I THOUGHT that would make the process faster since I would be working from my previous framework, but I ended up having to modify so many things that it only convoluted the code.
I originally had the game start with just one ball, and you wouldn't get the second ball until the 15th hit on the top side; however, I realized I wanted players to naturally get the idea to coordinate with each other. For instance, player one might tell his teammate "I'll take the red ball, you take the blue one" to strategize. So I decided to make the 2nd top bounce spawn the second ball and the 15th hit spawn the third ball, which presents the challenge to players at a good pace.
I was very proud of this for loop I built to manage the many colorful balls that you have to defend against. Between this week and last week, I also had to add a canHitPaddle() method and timer to make sure you wouldn't accidentally double-hit the same ball as you moved in the same direction as it (a sketch of that check follows the loop below).
for (let b of coopBalls) { // runs thru all the balls stored in the array
  b.display();
  b.move();
  b.scoreDetection();
  if (coopBallCollidePaddle(b, player1paddle1, "Top") && b.canHitPaddle()) {
    paddleBounceNeutralSFX.play();
    b.lastHitTime = millis(); // records hit time used by canHitPaddle()
    b.ballYspeed *= -1;
    b.yPos = player1paddle1.yPos - player1paddle1.paddleHeight / 2 - b.radius - 5;
    if (keyIsDown(83)) { // 'S' key
      paddleBounceControlSFX.play();
      b.ballYspeed *= COOPbackControlFactor;
    } else {
      paddleBounceNeutralSFX.play();
    }
  }
  if (coopBallCollidePaddle(b, player2paddle1, "Top") && b.canHitPaddle()) {
    paddleBounceNeutralSFX.play();
    b.lastHitTime = millis(); // records hit time used by canHitPaddle()
    b.ballYspeed *= -1;
    b.yPos = player2paddle1.yPos - player2paddle1.paddleHeight / 2 - b.radius - 5; // the 5 makes a micro-adjustment to prevent funky collisions
    if (keyIsDown(DOWN_ARROW) || keyIsDown(75)) { // '↓' or 'K' key
      paddleBounceControlSFX.play();
      b.ballYspeed *= COOPbackControlFactor;
    } else {
      paddleBounceNeutralSFX.play();
    }
  }
} // closes for b loop
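For reference, the cooldown check itself can be as small as this (a sketch of a method inside the ball class; the 300 ms window is an illustrative value, not necessarily the one I used):
canHitPaddle() {
  // only allow another paddle hit once a short cooldown has passed since the last one
  return millis() - this.lastHitTime > 300;
}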
I think the co-op mode overall turned out very well. It was surprisingly challenging but also a good amount of fun for the friends that I asked to test it.
VERSUS Mode Improvements
I had the idea to add directional input with forward and backward velocity. Say you're on the bottom side: if you receive the ball while holding the down arrow, you slow it down because you pressed in the same direction the ball was heading, but if you receive it with the up arrow, you add momentum to it.
I had so much fun implementing this layer of complexity to the game. A main criticism I received at first was that the extra dimension of movement for the paddles didn’t offer a significant difference to the usual pong gameplay loop.
This was my response to that: adding much-needed strategic depth to the game. You weren't just competing over who was better at defense anymore; you were seeing who had the better attack strategy.
In fact I originally didn’t plan to add this forward and backward velocity effect to the co-op mode but I loved how it felt so much that I knew I had to add it in some way. This posed a balancing challenge as well; if we were to keep the backControlFactor from the versus mode then it would prove far too easy.
Sound Design
I originally wanted to implement the sounds into my versus mode but then I realized building the infrastructure for all the menus first would make my life a lot easier. So sound design ended up as the last thing on the to-do list.
I usually got sound effects from silly button websites when I needed to edit videos, so I never knew where to get properly licensed SFX for indie games. I eventually found a site called Pixabay that had a good selection of sound effects.
Most of the menu button SFX and game over SFX were pretty straightforward, but I really wanted the SFX for the gameplay loop itself to sound very satisfying. My favorite ones were definitely the ball control SFX and the countdown SFX. For the ball control SFX, I was inspired by the sound design in the Japanese anime Blue Lock, which has some incredible sound effects, like the bass-y effect when one of the characters controls the ball midair. I found a super nice-sounding bass sample, and that's how the idea of using acoustic instrument sounds stuck. The other two ended up being a snare and a kick drum sound.
Difficulties Endured
The Great Screen Changing Mystery
There was this incredibly confusing bug that I never truly figured out, but the gist was that the game would randomly switch screens, specifically to the Versus game mode, SOMETIMES when I clicked into Co-Op.
I dug and dug through the code endlessly until I saw that one of my screen-related if-statements had an else branch that defaulted the screen to Versus. It was there for earlier tests, but this seriously still doesn't make sense to me: there was never a call to the function that switches to Versus, and nothing should have left the screen state invalid so that the fallback could take over.
This was a frustrating gap in my knowledge that I never fully understood, but I did fix it.
Hit “K” to Let the Ball Through (and look stupid)
For some reason, the bottom paddle kept letting the ball straight through whenever I blocked it while holding "K" or the down arrow. This took hours of looking for the problem, going "I'll deal with that later," and then running into it again and wondering what the heck was causing it.
It was very specifically the bottom paddle and it HAD to be related to backward movement input detection. Then I realized that it was ONLY the bottom paddle because I had set the ball speed to minSpeed, which is always positive. So all I had to do was to convert it to a negative value equivalent to minSpeed.
if (keyIsDown(UP_ARROW) || keyIsDown(73)) { // '↑' or 'I' key
  paddleBounceSmashSFX.play();
  ballforVS.ballYspeed *= fwdVelocityFactor;
} else if (keyIsDown(DOWN_ARROW) || keyIsDown(75)) { // '↓' or 'K' key
  paddleBounceControlSFX.play();
  ballforVS.ballXspeed = minSpeed;
  ballforVS.ballYspeed = -minSpeed; // negative so the ball bounces back up off the bottom paddle
} else {
  paddleBounceNeutralSFX.play();
}
Cursor Changing
When I was developing the buttons, I realized that they didn't really feel clickable, so I wanted to add some code to change the cursor to the hand whenever it hovered over a button. This proved surprisingly complicated for what seems like an easy task. The most efficient way I found was to append all the buttons for each menu to an array and check it in a for loop every frame, then close things off with a simple if-statement.
function updateCursorFor(buttonArray) {
  let hovering = false;
  for (let btn of buttonArray) {
    if (btn.isHovered(mouseX, mouseY)) {
      hovering = true;
    }
  }
  if (hovering) {
    cursor(HAND);
  } else {
    cursor(ARROW);
  }
}
Conclusion & Areas for Improvement
As for areas of improvement, I would probably make the ball-paddle collision logic more stable and consistent. It usually behaves predictably and as intended, but it sometimes collides on unintended surfaces, which exposes flaws in the way I coded it.
However, despite all this, I am really proud of my project, both from a UI design standpoint and for the replayability of the gameplay loop. Once my friends are done with their midterms, I'll have a much better opportunity to spot flaws in the design.
Week 7 — Final Game (Midterm Project)
https://editor.p5js.org/hajar.alk.2007/sketches/
Concept Recap
Now that I've finished my game, I just want to restate my concept briefly (I explained it in detail in my Week 6 submission). The original idea started as a candy game, but I decided to make it more traditional and personal by turning it into a Gahwa Rush theme. The inspiration came from my grandpa and my little brother: in our culture, gahwa is always seen as fine for kids while soft drinks aren't. I found that really funny, so I made the goal of the game to catch the gahwa and avoid the Pepsi. It's a simple idea, but it means a lot to me because it connects to my family and culture, and I'm really happy with how it turned out.
Week 7 Progress
This week, I was mainly working on the final details to polish the game. Most of it was already done; as of last week, the whole structure and idea were already there. What I focused on this week was adding sounds to make the game feel more alive and interactive. I also let my friend play it, and I noticed that it was really hard for her, even though for me it felt easy. That's when I realized I had been increasing the difficulty too much without noticing: I kept testing and improving my own gameplay, so I was getting better while others weren't. To fix that, I decided to lower the difficulty to make the game more beginner-friendly and enjoyable for everyone.
I also found a small bug where the player would lose even when the character didn’t actually touch the Pepsi. To fix this, I adjusted the collision detection by making smaller rectangle frames around the Pepsi so that it only counts as a collision when it really touches the character.
Sources/Credit
https://youtu.be/enLvg0VTsAo?si=mPNyWkxCoWeOn3CG
https://youtu.be/bMYQbU01u04?si=KDpfq1w9eC_Bifax
https://youtu.be/Z57hx4ey5RY?si=ruAPhn2WmEeyHKXG
https://youtu.be/MuOcdYjF2F4?si=Z160JD3BE2VQnpvr
Although these YouTube links aren't the same concept as my game and are actually very different, I used them to help me with the technical parts of creating my game. Since I'm still a beginner at coding, some things, especially the math equations, were really hard for me to figure out on my own. These videos really helped me understand how to make objects fall, how to make them appear randomly, and how to control different elements in the game. Even though their games had completely different themes, they included similar components that I could learn from, and I applied those ideas to make my own game work.
Code Highlight
// player hitbox, scaled down so it's smaller than the sprite
const pw = p.w * PLAYER_HIT_SCALE.w;
const ph = p.h * PLAYER_HIT_SCALE.h;
// per-item hitbox scale, with a default if the item kind isn't listed
const s = ITEM_HIT_SCALE[it.kind] || { w: 0.5, h: 0.7 };
const iw = it.w * s.w, ih = it.h * s.h;
// axis-aligned overlap test between the two scaled rectangles (compares center distance on each axis)
return (Math.abs(p.x - it.x) * 2 < (pw + iw)) &&
       (Math.abs(p.y - it.y) * 2 < (ph + ih));
When I was fixing the collision bug, it took me many tries to get it right. Even though I found YouTube videos about collision detection, I still couldn’t really figure it out at first because my game’s sprites and hitboxes didn’t match properly. The player would lose even when the Pepsi was far away. I kept testing and adjusting the numbers, but it wasn’t working. Then I found one YouTube video that explained hitboxes in a really simple way, and it finally made sense to me. That video helped me understand how to scale the hitboxes separately for each object, so I created smaller hitboxes around the Pepsi and the gahwa, and after that, the collisions finally worked perfectly. https://youtu.be/HK_oG_ev8FQ?si=BqtCL3WpHv3UpPQ0
End Of Project Reflection
Overall, I really enjoyed this project; I genuinely, genuinely did. I loved developing my idea and adding more to it every single week. It's crazy to see how much it changed from week 5, when I first thought of it, to how it looks now. It's such an inspiring project for me because I got to be creative and technical at the same time. I also really enjoyed sharing it with my friends and family; everyone wanted to try it and play.
Even though it was a heavy project that took hours of work and a lot of effort, the result was completely worth it. I felt so accomplished when I finally finished it. It took a lot of trial and error, but that's honestly what helped me learn the most. This project pushed me to apply everything I learned in class: not just follow instructions, but actually take risks, test ideas, and build something real. It also made me go beyond what we learned and look for new solutions from other sources, like YouTube tutorials.
In the end, it was a very in-depth and challenging project, but I truly enjoyed every step of it, not just the outcome. I loved the process of testing, debugging, and improving. It was fun, creative, and one of the most rewarding projects I've ever done.
There were definitely moments when I found bugs that I just couldn't fix, and it felt so overwhelming. It was really frustrating because I would go through the code again and again and still couldn't figure out what was wrong. But I started using the debugging techniques we learned in class, and that really helped me calm down and approach the problem more logically instead of panicking. There were also days when I spent hours trying to fix one thing, and after a while, my brain would just stop functioning; I couldn't think straight anymore. But whenever I took a break and came back later, it was like my mind was refreshed: I could suddenly see the problem so much more clearly and finally debug it.
At some points, I honestly wanted to delete the whole code and just make a simpler game because I was so frustrated. But I’m really glad I didn’t. Finishing it made me feel so accomplished, and it really boosted my confidence in coding. I kept going even when I wanted to give up, and I pushed myself to find answers and look for external resources when I got stuck. That persistence made me realize that even if something feels impossible at first, I can figure it out if I stay patient and keep trying.
This project definitely required a lot of patience, and I think that's a skill I'm really starting to develop. I realized that when you're coding, you're not always going to get it right the first time, and that's completely okay. There's nothing wrong with making mistakes or needing multiple tries. Especially when you're creating a game or something complex, patience is everything. You have to be willing to try again and again, test small changes, and stay calm even when things don't work right away. This project really taught me that. It helped me understand that patience isn't just something nice to have; it's one of the most important skills in programming.
Midterm Project – Balloon Pop
Game link:
https://editor.p5js.org/mm13942/full/lU_ibrAn2
Concept
My project is a fun and lighthearted balloon survival game where the player controls a balloon flying through the sky. The goal is simple — avoid the falling bombs, collect hearts for extra points, and stay alive as long as possible. The balloon can only move left and right, while the background scrolls continuously to give the feeling of rising into the clouds. The moment the balloon hits a bomb or the timer bar runs out, it pops and it's game over.
Inspiration
I was inspired by classic arcade-style games where the goal is to survive for as long as possible while dodging obstacles. I wanted to make something cheerful and colorful but still challenging. The idea of having a balloon as the main character felt fitting because it’s fragile yet expressive, and visually, it works well with soft, moving clouds and bright skies.
Production and Main Features
Initially, I was planning to do the obstacles on the sides that appear randomly:
But then I realized it wouldn't be a great decision, considering the size of a laptop screen, so I decided to stick with my other idea of objects falling randomly from the sky. After creating the draft of the game, I wrote down everything that needed to be added to complete it.
Key Features:
- Object-Oriented Design – Classes for Balloon, Bomb, Heart, and Button
- Pages:
- Main page with right-aligned buttons
- Instructions page
- Weather choice page (Rainy, Sunny, Snowy)
- Skin choice page (Pink, Yellow, Orange balloons)
- Game over page with left-aligned content
- Gameplay Mechanics:
- Bombs spawn with increasing difficulty (max 7)
- Hearts spawn every 8 points (+5 score bonus)
- Hearts spawn away from bombs to avoid interference
- Proper collision detection with circular hitboxes (no false collisions)
- Infinitely scrolling backgrounds based on weather
- Score tracking with high score display
- Controls:
- Arrow keys or A/D to move
- C key for fullscreen toggle
- Mouse clicks for all buttons
- Audio Support – Sound functions
- Highlighted Buttons – Selected weather/skin buttons get highlighted
- Back Buttons – On every sub-page to return to main menu
Code I’m Proud Of
Part of the code that I'm really proud of might not seem too hard, but it was definitely the most time-consuming and took a lot of trial and error: removing the white background from the uploaded pictures. When it first worked, I thought everything was good, but it turned out p5 was re-processing the same image 60 times per second, and when I played for more than 10 seconds the sketch kept shutting down. I had to do a lot of debugging to understand what the actual problem was, and I was finally able to make it work without lagging by processing each image only once, which really made me happy.
// process each uploaded sprite once, so the work isn't redone every frame
processedBombImg = removeWhiteBackground(bombImg);
processedHeartImg = removeWhiteBackground(heartImg);
processedPinkBalloonImg = removeWhiteBackground(pinkBalloonImg);
processedYellowBalloonImg = removeWhiteBackground(yellowBalloonImg);
processedOrangeBalloonImg = removeWhiteBackground(orangeBalloonImg);
// Initialize player balloon
player = new Balloon(width/2, height - 80 * scaleFactor);
// Create all button objects with proper positions
setupButtons();
// Set the pixel font for all text
textFont(pixelFont);
}
// remove White Background
function removeWhiteBackground(img) {
// Create a graphics buffer to manipulate pixels
let pg = createGraphics(img.width, img.height);
pg.image(img, 0, 0);
pg.loadPixels();
// Loop through all pixels and make white ones transparent
for (let i = 0; i < pg.pixels.length; i += 4) {
let r = pg.pixels[i]; // Red channel
let g = pg.pixels[i + 1]; // Green channel
let b = pg.pixels[i + 2]; // Blue channel
// If pixel is mostly white (R, G, B all > 200), make it transparent
if (r > 200 && g > 200 && b > 200) {
pg.pixels[i + 3] = 0; // Set alpha to 0 (transparent)
}
}
pg.updatePixels();
return pg; // Return the processed image
}
I also really liked the energy bar idea. I was struggling to come up with ideas, and my friend Nigina gave me some feedback on my game and suggested adding this feature, which prevents players from skipping the hearts.
function drawEnergyBar() {
// Position in top-right corner
let barX = width - energyBarWidth * scaleFactor - 20 * scaleFactor;
let barY = 20 * scaleFactor;
let barW = energyBarWidth * scaleFactor;
let barH = energyBarHeight * scaleFactor;
// Draw outer frame
stroke(255); // White border
strokeWeight(3 * scaleFactor);
noFill();
rect(barX, barY, barW, barH);
// Calculate fill width based on current energy percentage
let fillWidth = (energyBar / 100) * barW;
// Determine fill color based on energy level
let barColor;
if (energyBar > 60) {
barColor = color(0, 255, 0); // Green when high
} else if (energyBar > 30) {
barColor = color(255, 255, 0); // Yellow when medium
} else {
barColor = color(255, 0, 0); // Red when low
}
// Draw filled portion of energy bar
noStroke();
fill(barColor);
rect(barX, barY, fillWidth, barH);
// Draw "ENERGY" label above bar
fill(255);
textAlign(CENTER, BOTTOM);
textSize(16 * scaleFactor);
text("ENERGY", barX + barW / 2, barY - 5 * scaleFactor);
}
Design
Visually, I wanted it to feel airy and positive, so I used soft pastel colors, smooth cloud movement, and rounded buttons. Each page has its own layout — right-aligned buttons on the main page and left-aligned elements on the Game Over screen — to make navigation easy.
Challenges
The hardest part of my code was definitely managing how all the game elements work together (the bombs, hearts, clouds, timer bar, and different pages). Getting everything to appear, move, and disappear smoothly without glitches took a lot of trial and error. Sometimes bombs appeared too frequently or hearts overlapped with them. I fixed this by randomizing positions with distance checks (a sketch of that check follows the snippet below).
if (bombs.length < maxBombs && frameCount % 50 === 0) {
  bombs.push(new Bomb(random(width), -20));
}
if (score % 10 === 0 && !heartExists) {
  hearts.push(new Heart(random(width), -20));
  heartExists = true;
}
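The distance check itself isn't shown in that snippet; a hedged sketch of how a heart can be kept away from bombs might look like this (the function name and the 80-pixel gap are illustrative):
function spawnHeartAwayFromBombs(minGap = 80) {
  let x = random(width);
  // retry a few random x positions until one is far enough from every bomb
  for (let tries = 0; tries < 20; tries++) {
    if (bombs.every(b => dist(x, -20, b.x, b.y) > minGap)) break;
    x = random(width);
  }
  hearts.push(new Heart(x, -20));
}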
The collision detection between the balloon and falling bombs was tricky too, since I had to make sure it felt fair and accurate using circular hitboxes. Another challenging part was balancing the gameplay, making bombs fall fast enough to be fun but not impossible, while also keeping the hearts from overlapping with them. On top of that, managing all the page transitions (main menu, instructions, weather, skins, game over) and keeping the selected settings consistent made the logic even more complex. Overall, the hardest part was making everything work together in a way that felt natural and didn’t break the flow of the game.
Future Improvements
In the future, I’d like to make the game feel more complete by adding real background music and more sound effects for popping, collecting hearts, and clicking buttons. Another improvement would be to make the difficulty change as the score increases, for example, bombs could fall faster or spawn more frequently the longer you survive. I’m also thinking of adding new power-ups like shields or magnets to make the gameplay more interesting. On the design side, animated buttons and smoother page transitions could make the menus feel more polished. Eventually, I’d love to include a high score system to track progress and make players more competitive.
Midterm Project: OOO – EEE
I made a game before that was controlled with keyboard inputs, so this time I wanted to create a game that used a different kind of input.
As I was scrolling through YouTube Shorts to find inspiration for my project, I came across a simple game playable with the user's pitch. In the video I watched, the character moved up and forward depending on the pitch level. With this in mind, I tried making simple programs that took voice input.
First, I built a program that detected a specific pitch, in this case "C". If the user sings the pitch, the block moves upwards, and if the user maintains the same level for a certain amount of time, the block permanently moves upwards. I made this because my initial strategy was to make an adventure game where the character travels a 2D map and makes certain interactions that trigger things, such as lifting a boulder with a certain note. This little exercise allowed me to get familiar with sound input and how I can utilize it in the future.
For my midterm, I decided to create a simple game that uses paddles that move left and right. The goal of the game is to catch the falling objects with these moving paddles. The hardest part of the game was obviously moving the paddles depending on the user's pitch. At first, the paddles were so sensitive that their movement was all over the place with even a slight sound input. Adjusting that and making the movement smooth was the key to my game.
While I was testing the movement, I realized that I was making sounds that resembled a monkey: "OOO" for the low pitches and "EEE" for the high-pitched noises. So I came up with a clever idea to make the game monkey-themed, with the falling objects being bananas and the paddles being monkey hands. It made me laugh thinking that users would have to imitate monkeys in order to play the game. I also added a little feature at the end that replays the sounds the players made while playing, so that they can feel a bit humiliated. I thought this was a great way to bring some humor into the game. I also had to test this multiple times while making the game, so I got to experience it beforehand.
My game is divided into 4 major stages: start screen, instructions, gameplay, and game over screen. As explained in class, I utilized the different stages so that resetting was easier.
The start screen is the title screen. It has 3 buttons: instructions, play, and a fullscreen button. Clicking the buttons makes a clicking sound. That is the only sound feature I have, since my gameplay is hugely affected by sound: any background music or sound effects would affect how the game is played, so I kept it to a minimum. Also, making the game fullscreen affects the gameplay, so I had the fullscreen feature fill everywhere else with black.
Before playing, users can click the instructions page to find out the controls and calibrate their pitch range. I deliberately tell them to use OOOs and EEEs for the pitches so that they sound like monkeys. These pitch ranges are adjustable with the up and down arrows and are stored in local storage so that the settings remain even after resetting the game. I also show a live paddle so that users can see how their voice will move the hands.
Once they hit play, the core loop is simple: bananas spawn at the top and fall; the goal is to catch them with the monkey "hands" (the paddle) at the bottom. I map the detected pitch to an x-position using the calibrated min/max from the instructions: I clamp the raw frequency into that window, map it to the screen's left/right bounds (so the hands never leave the canvas), then smooth it. To keep control stable, I added a small noise gate (ignore very quiet input), a frequency deadzone (ignore tiny wiggles), linear smoothing with lerp, and a max step cap so sudden jumps don't overshoot. The result feels responsive without the jitter I had early on. The player scores when a banana touches the hands and loses a life on a miss; three misses end the round.
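Condensed into a p5-style sketch, that pipeline looks roughly like this (the thresholds and variable names are illustrative, not my exact numbers):
function updateHandX(freq, level) {
  if (level < noiseGate || !freq) return;       // noise gate: ignore very quiet input
  if (abs(freq - lastFreq) < deadzone) return;  // deadzone: ignore tiny wiggles
  lastFreq = freq;
  let clamped = constrain(freq, minPitch, maxPitch);                            // calibrated window
  let targetX = map(clamped, minPitch, maxPitch, handW / 2, width - handW / 2); // stay on canvas
  let smoothedX = lerp(handX, targetX, 0.2);                                    // linear smoothing
  handX += constrain(smoothedX - handX, -maxStep, maxStep);                     // max step cap
}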
When the run ends, the game over screen appears with the background art, a big line like "you got x bananas!", and two buttons: "play again" and "did you sound like a monkey?". During gameplay I record from the same mic that powers pitch detection; on game over I stop recording and let the player play/stop that clip. It's a tiny feature, but it adds a fun (and slightly embarrassing) payoff that matches the monkey concept.
I'm especially proud of how I handled pitch jumps. Early on, tiny jitters made the hands twitchy, but big interval jumps still felt sluggish. I fixed this by combining a few tricks: a small deadzone to ignore micro-wiggles, smoothing with lerp for steady motion, and a speed boost that scales with the size of the pitch change. When the detected frequency jumps a lot in one frame (like an "ooo" to a sharp "eee"), I temporarily raise the max movement per frame, then let it settle back down. That way, small fluctuations don't move the paddle, normal singing is smooth, and deliberate leaps produce a satisfying snap across the screen without overshooting. Getting this balance right made the controls feel musical.
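A sketch of that jump-aware cap, assuming the per-frame limit lives in a stepCap variable (the names and numbers are illustrative):
// Big pitch jumps temporarily raise the per-frame movement limit, then it decays back.
function updateStepCap(freqJump) {
  if (freqJump > jumpThreshold) {
    // deliberate leaps (an "ooo" to a sharp "eee") get a short-lived speed boost
    stepCap = baseStep * map(freqJump, jumpThreshold, maxJump, 2, 5, true);
  }
  stepCap = lerp(stepCap, baseStep, 0.1); // relax back toward the normal cap every frame
}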
For future improvements on the game itself, I want to smooth the frustration without losing the funny chaos. Bananas don’t stack, but several can arrive in different lanes at the same moment, and with smoothing plus a max step on the hands, some patterns are effectively unreachable. I kept a bit of that because the panic is part of the joke, but I’d like the spawner to reason about landing time instead of just spawn time, spacing arrivals so that at least one of the simultaneous drops is realistically catchable. I can still sprinkle in deliberate “double-arrival” moments as set pieces, but the baseline should feel fair.
Midterm Project Documentation: All Day Breakfast
Sketch(f for fullscreen): https://editor.p5js.org/joyzheng/full/tb0uwj2nP
Overall Concept
As a visiting student at NYUAD, I found the made-to-order dining system, particularly at the All Day Breakfast counter, very confusing. Unlike the pre-made options I was used to, the text-only menus made it difficult to visualize my order. I always lost track of what and how many items I had ordered when there were no pictures (some are Arabic dishes I don't know), and I often found myself pulling out a calculator to see if my selections added up to a full meal plan.
These frictions made me want to digitalize the experience into an interactive game that gamifies the ordering process. The core goal is to provide a more intuitive and visual way for players to assemble a meal, manage inventory, understand the costs, and manage their spending. By turning the process into a game with clear steps and rewards (badges), the project transforms a problem I discovered in my own life into an engaging and replayable experience.
How It Works
The game guides the player through a six-scene narrative that mirrors the real-life process and menu of getting food at the D2 dining hall A LA BRASA All Day Breakfast Counter.

UI Prototype:
UE:
Scene 1:
Start Screen: The player is presented with the All Day Breakfast counter and prompted to “Ready to Order?”. Clicking the triangle button begins the game. The badge board is also displayed here, showing the player’s progress.
Scene 2:
Choose Food: The player is shown a grill with all available food items. They must first click to pick up a pair of tongs, which then attaches to their mouse. They can then click on food items to pick them up and click on the plate to add them to their meal. The total cost is updated in real-time.
Scene 3:
Scan Items: The player takes their plate to the cashier. They must pick up the scanner tool and move it over each food item on the plate. As each item is scanned, a beep sound plays, and the item is added to a virtual receipt.
Scene 4:
Payment: The cashier opens, revealing a coin tray. The player must pay the total amount shown on the receipt by clicking on coins from a palette and dropping them into the tray.
Scene 5:
Eat: The player sits down to eat. They must pick up a fork and use it to pick up food from their plate and bring it to the character(NYUAD Girl)’s mouth to “eat” it, which plays a sound and makes the food disappear.
Scene 6:
End Screen & Badges: After the meal, the game checks if the player’s actions have met the conditions for any new badges. If so, a special animation plays. The player is then given the option to “Dine in AGAIN!”, which resets the game and starts a new session.
Technical Decisions & Game Design I’m Proud of
I am proud of completing a fully functional and well-designed game within the project timeline, especially after iterating on the initial idea. A key technical challenge was to build the entire game to be fully responsive. The core of the responsive design is a set of helper functions (updateLayoutDimensions, scaleRectangle, scaleValue) that calculate scaling factors based on the current window size versus the original 700×500 design grid. This allows every element to reposition and resize dynamically, ensuring the game is playable on any screen.
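The helpers themselves are simple; here is a sketch of what scaleValue and scaleRectangle can look like against the 700×500 design grid (the real functions may differ in detail):
const DESIGN_W = 700;
const DESIGN_H = 500;

// scale a single length from design units to the current canvas
function scaleValue(v) {
  return v * min(canvasWidth / DESIGN_W, canvasHeight / DESIGN_H);
}

// reposition and resize a rectangle defined in design coordinates
function scaleRectangle(r) {
  return {
    x: r.x * (canvasWidth / DESIGN_W),
    y: r.y * (canvasHeight / DESIGN_H),
    w: r.w * (canvasWidth / DESIGN_W),
    h: r.h * (canvasHeight / DESIGN_H)
  };
}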
It was also helpful to discuss the game with Professor Mang to improve its interactivity and replayability. We came up with the ideas of a stock management system and humorous badge rewards that every NYUAD student who has been to this dining hall can relate to (e.g., never being able to spend a whole meal plan; why is 1 meal plan 33.6? Is that 0.1 for a service fee?). I designed the inventory to match how it usually is at the counter; for instance, there are always only a few avocado toasts, and I have never managed to get the tofu omelet to this day. Overall, the game is also meditative and, in some sense, educational: it reminds people to feed themselves well in the dining hall even when they are rushing between classes and encourages them to eat a balanced meal with enough fiber every day.
// =======================================
// SCENE 2: CHOOSE FOOD
// this function calculates the responsive positions for all food items in scene 2
function buildScene2FoodGrid() {
// clears the array of food objects to ensure a fresh start each time the grid is rebuilt (e.g., window resize)
scene2FoodObjects = [];
// constants that define the original pixel dimensions of the background art and the specific rectangular area within it where the food is displayed
const sourceImageSize = { w: 1536, h: 1024 };
const sourceFoodArea = { x: 124, y: 138, w: 1284, h: 584 };
// responsive calculation
// current on-screen position and size of the food area
// by finding the scaling ratio between the current canvas and the original background image
// so the grid always perfectly overlays the correct part of the background art
const foodGridRect = {
x: sourceFoodArea.x * (canvasWidth / sourceImageSize.w),
y: sourceFoodArea.y * (canvasHeight / sourceImageSize.h),
w: sourceFoodArea.w * (canvasWidth / sourceImageSize.w),
h: sourceFoodArea.h * (canvasHeight / sourceImageSize.h)
};
// the calculated grid area is then divided into cells (8 columns by 2 rows) to position each food item
const columns = 8;
const rows = 2;
const cellWidth = foodGridRect.w / columns;
const cellHeight = foodGridRect.h / rows;
// the size of each food item is based on the smaller dimension (width or height) of a grid cell
// this prevents the food images from looking stretched
// scaled by 70% to add padding
const itemSize = min(cellWidth, cellHeight) * 0.7;
// this loop iterates through every food item defined
for (let i = 0; i < ALL_FOOD_ITEMS.length; i++) {
// math.floor() and % convert the 1d loop index (i) into a 2d (row, col) grid coordinate
let row = Math.floor(i / columns);
let col = i % columns;
// calculates the final top left (x, y) coordinate for each food item
// starts at the grid's origin
// adds the offset for the column/row
// adds a centering offset
let itemX = foodGridRect.x + col * cellWidth + (cellWidth - itemSize) / 2;
let itemY = foodGridRect.y + row * cellHeight + (cellHeight - itemSize) / 2;
// a new food object is created with its calculated position and size
// added to the array to be drawn
scene2FoodObjects.push(new FoodItem(ALL_FOOD_ITEMS[i], itemX, itemY, itemSize));
}
}
The most complex piece of code, and the one I’m most proud of, is the logic in the buildScene2FoodGrid() function. Unlike other elements that scale relative to the canvas, this grid must scale relative to the background image itself to ensure the food items are perfectly aligned with the artwork.
This logic calculates a scaling ratio based on how the background image has been stretched to fit the screen, and then applies that same ratio to the coordinates of the food grid. It’s a powerful piece of code that makes the experience feel seamless.
Challenges & Improvements
The development process was a valuable experience in relearning game development. I was surprised by the number of free asset resources and tutorials for game development online. I was also inspired by the Coffee Shop Experience example of how to use p5.js to manage a game and toggle between scenes.
One of the most surprisingly time-consuming challenges was a simple debugging session that lasted hours, only to discover I had misspelled "diarrhea" as "diarreah" or "diareah" in different locations. This taught me the importance of meticulous checking and of creating simple debugging tools to isolate issues early.
I also got the opportunity to explore AI-created assets through this project. For this volume of assets, AI generation was probably the best option for finishing on time. However, I still spent at least half of the development time going back and forth hoping to "draw a good card" with the generated images. To be honest, Nano Banana wasn't worth the hype for image creation. For game asset development, ChatGPT might be the best choice after trying a few different apps like Midjourney and Canva: it is lightweight and supports transparent PNG backgrounds, so the images can be used directly without manually removing the background.
For the future, I have several ideas for improvement:
- Expand to Other Counters: I would like to implement a similar ordering system for the D1 dining hall, which also has a confusing menu.
- UI Enhancements: I plan to add a toggle to hide or show the badge board, giving the player more control over their screen space.
- More Badges: Adding more creative badges would further increase the incentive for players to try different food combinations and spending strategies.
- Scene Refinement: Some scenes are quite dense with assets. In a future version, I might split complex steps into more scenes to make the layout feel cleaner and less cluttered.
- Real Implementation: After the midterm, I will demo this to the dining hall manager to see if they want to adopt this ordering system, or at least use a more intuitive and interactive menu to run the dining hall more efficiently.
Midterm Project: Twenty Seconds
View Twenty Seconds here:
https://editor.p5js.org/siyonagoel/full/VcTSl8x7V
My Concept:
Twenty Seconds is an immersive minigame experience that aims to make its users uncomfortable. Every part of the project has been developed with the intention that its use should ultimately make someone feel uneasy, and most of this logic is based on things that make me uncomfortable.
This also means that some parts of the project may not cause the same level of discomfort for everyone, and that's okay, but I've tried my best to use things that make most of the population uncomfortable. The project essentially features 8 rooms, each of which has a 20-second timer running, and the user has to either complete some sort of challenge or sit through a bad experience (like uncomfortable sounds) for 20 seconds. They cannot move to the next room until they complete the challenge in their current room, and they complete the experience only after going through all the rooms.
There are some deliberate design choices within this project. To start with, I made sure that there is a very minimal use of color throughout the project. Hence, the only colors you will see are white, black, and red.
Initially, I was thinking of only white and black, but after I realised one more color is a necessity, I added red as I find it one of the most uncomfortable colors due to its association with violence. Also, there is no general background music that plays throughout the project, although there are some specific sounds for a few rooms and the pop-up messages. What can be more uncomfortable than silence, when people can actually hear their own thoughts? The font I used—Reenie Beanie—was the best mix I could find between a readable font and human handwriting, something that looks like it was scrawled on a blackboard with chalk.
For my midterm project, I wanted to do something that is a unique mix of both a game and interactive art, and I believe Twenty Seconds captures this quite nicely.
Technical and Game Design:
The project treats each room as a self-contained mini challenge while keeping a single central state (the hallway leads to a door, which leads to a room, which leads back to the hallway). I am proud of the clear division in my code between different sections, such as resource loading, room initialization, and rendering. For example, preload() gathers all the images and sounds that I use, each initRoomX() sets up the state, and draw() delegates to the current room. Because of this structure, I could easily extend the code every time I wanted to add a new room, and it made debugging predictable. Here's an example:
function initRoom2() {
roomItems = [];
// Define positions and images for the 4 items
let positions = [
{ x: 50, y: 130, w: 230, h: 192, img: bedImg, name: "bed" },
{ x: 320, y: 130, w: 230, h: 204, img: labubuImg, name: "labubu" },
{ x: 600, y: 130, w: 162, h: 263, img: toiletImg, name: "toilet" },
{ x: 810, y: 150, w: 220, h: 157, img: sofaImg, name: "sofa" }
];
roomItems = positions;
startTimer();
}
function drawRoom2() {
background(0);
// Instructions
fill("white");
textSize(30);
textAlign(CENTER, TOP);
text("Which of these definitely does not belong in any of our homes?", width / 2, 10);
drawTimer();
// Draw all room items
for (let item of roomItems) {
image(item.img, item.x, item.y, item.w, item.h);
// show the item's name when hovering over it
if (isMouseOverItem(item)) {
if (item.name === "bed") {
fill("white");
textSize(30);
text("Spiky bed", 190, 350);
} else if (item.name === "labubu") {
fill("white");
textSize(30);
text("A labubu", 480, 350);
} else if (item.name === "toilet") {
fill("white");
textSize(30);
text("Snake toilet", 755, 400);
} else if (item.name === "sofa") {
fill("white");
textSize(30);
text("Centipede sofa", 995, 350);
}
}
// failure condition
checkTimerExpired("You're trapped until you find the right answer >:)");
}
}
So, every time I had to implement a new room, I would just add its initRoomX() and drawRoomX() functions to the existing code, along with the required functionality and pop-up logic in mousePressed(). Since elements like the pop-ups and the timer are used repeatedly across all the rooms, I structured them as easily reusable functions that I can call in one line without having to paste the same 4-5 lines of code into every room. A rough sketch of those timer helpers follows.
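Here is roughly what those shared helpers can look like, assuming a timerStart variable and a shared showPopup() routine (both names are illustrative, not necessarily the originals):
function startTimer() {
  timerStart = millis(); // each room restarts its own 20-second window
}

function drawTimer() {
  let remaining = max(0, 20 - floor((millis() - timerStart) / 1000));
  fill("red");
  textSize(24);
  textAlign(RIGHT, TOP);
  text(remaining + "s", width - 20, 10);
}

function checkTimerExpired(message) {
  if (millis() - timerStart > 20000) {
    showPopup(message); // shared pop-up used as the failure condition in every room
  }
}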
On the technical side, there are a couple of algorithms I'm proud of in some of the games. The first is the one I used for room 1, the room filled with clocks. I used a circle packing algorithm, learnt from here, to generate the placement of the clocks without them overlapping.
// circle packing algorithm for generating non-overlapping clocks
while (attempts > 0) {
  // random position and size of circles
  let r = random(minR, maxR);
  let x = random(r, width - r);
  let y = random(r + topBuffer, height - r);
  // Check if position overlaps with existing clocks
  let valid = true;
  for (let c of clocks) {
    let d = dist(x, y, c.x, c.y);
    if (d < r + c.r) {
      valid = false;
      break;
    }
  }
  // generate a clock if the position is valid
  if (valid) {
    clocks.push(new Clock(x, y, r));
  }
  attempts--; // count down so the loop always terminates
}
For the warping of the clock's hands when the clock "melts", I created a function called drawMeltingHand() in the Clock class that uses subtle Bezier deformation for the cool effect. Before this I had no idea that Bezier curves existed, and I found out that there is a p5.js function for them when I was searching online for ways to draw curved lines smoothly.
drawMeltingHand(x, y, length, angle, melt, weight) {
push();
stroke("red");
strokeWeight(weight);
// Midpoint of the hand
let midLength = length * 0.5;
let x1 = cos(angle) * midLength;
let y1 = sin(angle) * midLength;
// straight first half part of the hand
line(0, 0, x1, y1);
// curved tip that bends downwards
let x2 = cos(angle) * length;
let y2 = sin(angle) * length + melt * 0.5;
// bezier(x1, y1, x2, y2, x3, y3, x4, y4)
bezier(x1, y1, x1, y1 + melt * 0.3, x2, y2 - melt * 0.2, x2, y2);
pop();
}
Another interaction design choice I'm proud of is the reversed cursor for whack-a-mole. I thought it would be complicated to implement, but the math actually turned out to be very simple: if I subtract from the center of the canvas the offset between the user's real cursor and the center, I get the corresponding coordinate for the virtual reversed cursor.
// this calculates the reversed cursor position
// the virtual mouse moves opposite to the user's actual mouse
let centerX = width / 2;
let centerY = height / 2;
virtualMouseX = centerX - (mouseX - centerX);
virtualMouseY = centerY - (mouseY - centerY);
I also really like the implementation of the eyes in room no. 6. I learnt about using the atan2() function for this purpose from here. It's probably one of my favorite rooms, because the code wasn't too complicated and the resulting effect was still very cool.
if (this.isSpecial) {
// Static pupil
pupilX = this.x + this.staticPupilX; // keep it centered
pupilY = this.y + this.staticPupilY;
} else {
// Following pupil
// tracks the mouse
// atan2() finds the angle formed by a point, origin, and positive x-axis
// calculate the angle between the eye center and the mouse position
let angle = atan2(mouseY - this.y, mouseX - this.x);
// don't want pupil leaving the eyeball
// Set a maximum distance that the pupil can move from the eye center
// 15% of the eye’s width or height (whichever is smaller)
let distance = min(this.w * 0.15, this.h * 0.15);
// calculate new pupil position
pupilX = this.x + cos(angle) * distance;
pupilY = this.y + sin(angle) * distance;
}
Problems I ran into:
- I made the hallway doors on Canva, so the distance and angle between the doors were based on their relative orientation to each other on the Canva canvas. What I didn't realize was that I would need the exact same relative distances between the doors on my p5.js canvas so that the perspective lines align, and because of this the hallway pathway ended up much broader than I had planned. The only way to fix this would have been to remake all the doors with this in mind, but since that wasn't time-feasible, I left the hallway as is.
- Another problem I ran into was with drawing Dali’s clock. While I achieved the effect that I wanted with the clock hands, I cannot say the same for the circular frame of the clock. I wanted the bottom half of the clock to stretch downwards in a wavy shape so it would resemble Dali’s clocks, but I could not figure out how to achieve that effect. I tried asking large language models like ChatGPT and Claude to help with this but their attempts failed quite horrendously. Finally, I settled for the next best thing and just made the bottom part of the clock stretch downwards in a straight fashion. I did this using the following code:
if (angle > PI / 4 && angle < (3 * PI) / 4) {
  py += map(sin(angle), 0, 1, 0, this.meltAmount);
}
The if condition selects only the lower arc of the circle. map(sin(angle), 0, 1, 0, this.meltAmount) converts the gradient from sin(angle) into a vertical offset that increases toward the bottom of the circle. So basically, the value returned by sin(angle) in the range 0 to 1 is mapped proportionally to a value between 0 and the melt amount I set, and by doing py +=, I am able to pull the y-coordinate downwards.
- Figuring out the ideas for each room. It took some time, but here’s the reason each room has what it has:
- Room 1: Clocks and the sound of a ticking clock just makes me really uncomfortable, a room full of them is just terrible.
- Room 2: Self-explanatory.
- Room 3: Needle-in-a-haystack kind of a situation. I grew up hearing this phrase a lot, and I don’t like it.
- Room 4: I hate the feeling of disorientation. I wanted people to go through a similar feeling but making them deal with a reversed cursor.
- Room 5: I think there are some sounds that make you want to pull your hair out. I made sure they’re all in this one room.
- Room 6: The idea of being watched all the time is so uncomfortable.
- Room 7: Some words in the English language just feel so wrong. I thought a collection of them floating around in a one place would be nice.

- Room 8: This room was technically to give people a break. So while they can relax and watch rain pouring for 15 seconds, that last 5 seconds the rain turns red, and I think red rain definitely doesn’t make anyone feel more comfortable.
Areas for improvement:
- A friend suggested a really cool idea: jumbling up the doors every time someone returns to the hallway. This would make the whole experience so much worse.
- Currently, the rooms aren't manipulated by much user input. What I mean is that, yes, the user interacts with the elements in each room, but only by moving the cursor around or clicking. In the future, I would like to add more direct user interaction, such as text input. I would also like to experiment with machine learning tools like computer vision and use the audience's bodily movement or facial expressions as inputs.
- I purposely chose not to have background music that runs throughout the game, but I think if I really found the perfect music for the ambience I’m going for, I would be open to using it.
- In room 5, the sounds stop bluntly when crossing regions. In the future I would implement smooth crossfades, to create a more cohesive soundscape and reduce abruptness, which will make transitions feel intentional and less jarring.
Week 6 – Game Progress
Project concept
Last week, when I first thought of my game idea, I had almost the same concept as now, but it was originally a candy-themed game. Then I decided to make it closer to home, more customizable, and more traditional. I was trying to think about what should fall from the sky instead of candy, but I really couldn’t come up with anything at first.
Then one day, I was at my grandpa's house, and I saw him drinking coffee with my little brother. It made me laugh because in our culture, especially in Arab families, they always tell kids not to drink soft drinks, but somehow Gahwa (Arabic coffee) is always allowed, even though it's full of caffeine! What's even funnier is that my grandpa used to give my little brother Gahwa when he was literally a baby. He could only drink milk, but my grandpa would secretly give him Gahwa without my parents knowing. In his mind, Gahwa was totally fine, but soft drinks were bad.
That moment gave me the idea to make the game about Gahwa. So now, in my game, the goal is to catch the Gahwa and avoid the Pepsi; if you catch the Pepsi, the game is over. That twist made the game feel really traditional and personal to me.
I also decided to name the characters with Arab names and base them on my younger brother and sister. I really enjoyed that part because it made the game feel more meaningful and connected to my culture. Even though it’s a simple game and my technical experience isn’t that advanced yet, I really love the creative side of it. It feels fun and personal, and that’s what I enjoyed most.
What I am proud of
For my Week 6 submission, what I’m most proud of is that I made this entire game from scratch; it’s my first time ever creating a full game, and that alone makes me really proud of myself. I’m also proud of how resourceful I was throughout the process. Even though the YouTube videos I found weren’t about a game like mine at all, I still managed to understand their concepts and figure out how to apply them to my own project. I feel like I found really good sources that helped me learn and improve, and it showed me that I’m capable of teaching myself new things and solving problems independently.
I’m also really proud of my concept because it has a story behind it that makes it personal and meaningful. I feel like I did really well in both the creative and technical parts of the project. On the technical side, I worked hard to understand new concepts and used different sources to make my game function the way I wanted. On the creative side, it all came naturally because I was genuinely passionate about the idea. I really enjoyed bringing my story to life through the game, and that made the whole process even more special for me.
Bug
The only thing that’s bothering me about my game right now is that sometimes you lose even when the Pepsi is quite far from the character. It doesn’t actually touch or collide with the player, but the game still ends as if it did. I think the collision check isn’t lined up properly with the character, so the hitbox ends up bigger than the sprite you actually see; I’m planning to fix that bug (a possible fix is sketched below). I also haven’t added any sounds yet, but I really want to; I just need to watch some tutorials first to learn how to do it correctly.
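One way I might approach the fix is a distance-based hit test measured from the centers of the character and the falling Pepsi, with radii slightly smaller than the drawn images; player, pepsi, and their properties here are illustrative names, not my exact variables.
// A rough sketch of a tighter, circle-based collision check in p5.js.
// player and pepsi are assumed to store their center x, y and a size in pixels.
function hitsPlayer(player, pepsi) {
  let playerRadius = player.size * 0.4; // slightly smaller than the sprite
  let pepsiRadius = pepsi.size * 0.4;
  return dist(player.x, player.y, pepsi.x, pepsi.y) < playerRadius + pepsiRadius;
}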
Code Highlight
This part of the code is one of the main things we learned in class, and it really became the foundation for my whole project. It basically controls what shows up on the screen depending on which stage of the game you’re in, like the home screen, instructions, character select, gameplay, or game over. This structure made everything so much easier for me because it helped organize my code and made it clear what each part should do. I started with this as my base and then kept building on it, adding more functions and features as I went along. It really helped me understand how to manage different screens in a game.
function draw() {
  background(0);
  // conditions for screens
  if (currentScreen === 'home') drawHome();
  else if (currentScreen === 'instructions') drawInstructions();
  else if (currentScreen === 'character') drawCharacterSelect();
  else if (currentScreen === 'game') drawGame();
  else if (currentScreen === 'gameover') drawGameOver();
}
Some Images I Used
Midterm Project: Worm vs Sanity
Concept
Food has always been something deeply emotional for me: a way to heal, connect, and recharge after long or lonely days. Whether it’s sharing a meal with friends and family or eating quietly in solitude, food always finds a way to lift the spirit. For me, food is more than just fuel; it’s comfort, joy, and sometimes even memory itself. Every dish I eat reminds me of a moment, a feeling, or a person.
But, of course, there’s another side to that relationship: those unforgettable moments when something unexpected shows up in your food, a hair, a fly, a worm, even a tiny stone. It’s disgusting, sometimes shocking, and yet over time it becomes something you laugh about. The idea for this project actually struck me when I once found a fly in my food. In that split second, my emotions bounced between anger, disgust, and disbelief, and then, later, laughter. I realized how something so small could completely shift my mood and turn an ordinary meal into a story I’d never forget.
It also reminded me of moments with my grandmother. She used to cook for the whole family, and occasionally, there would be a stray hair in the food. Instead of getting angry, I’d turn it into a lighthearted joke so everyone could laugh. Those moments became cherished not because the food was perfect, but because the imperfections made it real, made it ours. They were messy, human, and full of love.
Through my project, I wanted to recreate those shifting emotions from disgust and frustration to humor and warmth. I tried to capture the entire emotional cycle we experience in those moments: the anger when something feels ruined, the creepiness of noticing something “off,” and the humor that comes when you finally laugh it off.
- Anger is portrayed through intense, chaotic visuals: the “deadly” appearance of the dining hall and the harsh red tones.
- Creepiness comes through the eerie atmosphere: the bloody dining hall textures, dim lighting, and a strange, almost horror-like visual style that makes you feel uneasy, the same way you feel when you find something in your food that shouldn’t be there.
- Humor ties it all together. I added funny instructions like “Ultimate Guide to impress Worm” that turn disgust into comedy. It’s a playful reminder that these moments, while annoying, are also absurd and relatable, something we can laugh about later.

To make it more personal, I brought in imagery from NYUAD, specifically D2 and the Library, two of my favorite places on campus. They hold countless memories of food, laughter, and friendship, so I wanted to reimagine them in my project. I took photos and used ChatGPT to generate artistic, surreal versions of these spaces, blending reality and imagination. The result is an environment that feels both familiar and eerie, mirroring that strange feeling of discovering something unexpected in what you love.
Lastly, I chose to use hand gestures as one of the interaction methods because I wanted the experience to feel physical and expressive, not just mechanical. In real life, our hands are what connect us to food. We cook with them, eat with them, react with them. So, using gestures like moving the left hand to go left, the right hand to go right, and closing the fist to jump feels symbolically and emotionally right. It mirrors how our hands instinctively respond when we’re disgusted or startled: we pull back, push away, or clench up.
While it might not be the most conventional control scheme, that’s precisely what makes this project unique and artistic rather than a simple computer game. The goal wasn’t to make a polished arcade game, but to create a more embodied experience, one that makes the player aware of their own physical reactions.
How to Play:
At its core, the project is an interactive game centered around a simple but expressive idea: defeat the worms that spawn from the right side of the screen before they reach the left edge.
Players can interact with the game in two different ways:
Keyboard controls — using the arrow keys to move and jump: → to go right, ← to go left, and ↑ to jump.
Hand gesture controls — raise your left hand to go left and raise your right hand to go right. By “raise” I mean make your hand visible to the camera; when you don’t want to keep moving left, simply move your left hand out of the camera’s view. If you make a fist or close your fingers, the girl will jump.
The basic rule is simple: jump over the worms to eliminate them before they cross the screen. Players have three lives, and if they manage to survive until time >= 900 (meaning the draw function has run 900 times, roughly 15 seconds at the default 60 frames per second) with at least one life left, they win.
At first, it might feel unintuitive, but as you play, it becomes surprisingly fun and natural, as if you’re physically fighting off those unwanted “guests” in your meal.
Parts I’m Proud Of
The part I’m most proud of is integrating Machine Learning into a project that’s not only technical but emotional and personal. As a Computer Science major, I’m always drawn to exploring how technology can express feeling and creativity. Implementing gesture control allowed me to bridge art and code to make something that doesn’t just work, but feels alive.
I’m also proud of how I personalized the experience. By using NYUAD-specific places like D2 and the Library, I rooted the project in a world I know and love. It gives the game a familiar atmosphere, one that other NYUAD students can relate to, while still transforming it into something strange and artistic.
Areas for Improvement
While I’m proud of how the game turned out, there are several areas I’d like to refine. The hand gesture control, though innovative, can feel slightly clunky at first. I’d like to make it more responsive and intuitive, perhaps by training the ML model with more data, or by using body-pose detection to tell whether the player is leaning left or right and moving the character accordingly.
I’d also love to expand the visual storytelling. Right now, the “bloody” D2 gives the right kind of creepiness, but I imagine adding more levels or moods, maybe transitioning from a calm dining scene to a chaotic food fight as the difficulty increases.
Problems I Ran Into
While building the project, I faced a few interesting technical challenges that pushed me to think creatively about how motion and input are detected and processed.
1. Detecting when the hand is closed (fist gesture):
My first major challenge was figuring out how to detect when the user’s hand is closed. I wanted the “fist” gesture to trigger a jump action, but at first, I wasn’t sure which hand landmarks to compare. Eventually, I decided to track the index fingertip (keypoint 8) and the base of the index finger (keypoint 5).
The idea was simple: if the y-coordinate of the fingertip (hand.keypoints[8].y) becomes greater than that of the finger base (hand.keypoints[5].y), it means the fingertip is lower in the camera frame; in other words, the finger is curled in, forming a fist.
I used console.log(hand.keypoints[8].y, hand.keypoints[5].y) to visualize the values and experimented by opening and closing my hand repeatedly to see when the condition triggered. This trial-and-error approach helped me fine-tune the threshold for reliable gesture recognition. It was satisfying to see the jump action respond accurately once the logic clicked.
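To make that concrete, here is a minimal, hedged sketch of the check, assuming the ml5.js handPose model fills a global hands array through its callback; the variable names are illustrative rather than my exact code.
let handPose, video;
let hands = [];

function preload() {
  // Load the ml5.js handPose model
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  // Keep the latest detections in the hands array
  handPose.detectStart(video, (results) => { hands = results; });
}

function isFist(hand) {
  // Fingertip (keypoint 8) lower in the frame than the finger base (keypoint 5)
  // means the index finger is curled in, i.e. a closed hand
  return hand.keypoints[8].y > hand.keypoints[5].y;
}

function draw() {
  image(video, 0, 0);
  if (hands.length > 0 && isFist(hands[0])) {
    // trigger the jump here, e.g. girl1.jump()
  }
}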
2. Managing repeated function calls with hand gestures:
The second issue was repeated trigger events when using gesture control. Unlike pressing a key, which calls the action just once per press, raising a hand is a continuous motion, so the detection function kept firing dozens of times per second.
For example, calling girl1.jump() or movement functions using hand gestures caused the action to repeat uncontrollably fast. To solve this, I implemented a counter-based system and used a modulus condition to limit how often the action executes. Essentially, if the function was being called too rapidly, I only allowed it to execute once every ten calls.
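A minimal sketch of that throttling idea; gestureCounter is an illustrative name, not my exact variable.
let gestureCounter = 0;

function handleJumpGesture() {
  gestureCounter++;
  // A held gesture calls this every frame, so only act on one call in ten
  if (gestureCounter % 10 === 0) {
    girl1.jump();
  }
}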
Similarly, I adjusted the character’s movement speed when controlled by gestures. Instead of moving by this.speed_x each frame (which made her move unrealistically fast), I scaled it down to this.speed_x * 0.005 inside the update_Ml() function. This made her movement smooth and proportional to the natural pace of a hand gesture, giving the game a more balanced and controlled feeling.
This also applied to animation strip changes: by updating them every tenth frame, the animation stayed visually consistent without flickering or overloading.
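Putting those two adjustments together, the gesture-driven update looks roughly like this; it is a simplified sketch rather than my exact update_Ml() code, and frameIndex and numFrames are illustrative property names.
class Girl {
  constructor() {
    this.x = 0;
    this.speed_x = 5;
    this.frameIndex = 0;
    this.numFrames = 4;
  }

  update_Ml(direction) {
    // Move a small fraction of the keyboard speed each frame,
    // since the gesture fires continuously rather than once per key press
    this.x += direction * this.speed_x * 0.005;

    // Advance the sprite strip only every tenth frame to avoid flicker
    if (frameCount % 10 === 0) {
      this.frameIndex = (this.frameIndex + 1) % this.numFrames;
    }
  }
}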
My Sketch:
view only screen link: https://editor.p5js.org/aa11972/full/b224cudrh