Midterm Project – Barbie Dreamhouse

Barbie’s Dreamhouse 

Link to sketch: https://editor.p5js.org/rma9603/full/y2y3-M4zC

Whenever I play with interactive art, I try to build something that invites slow exploration instead of a single-goal game. For my midterm project I built Barbie’s Dreamhouse: a small interactive world with rooms to explore (Outside → Inside → Closet, Kitchen, Bedroom, Living Room), each containing subtle objects the user can click or interact with. The goal was to create a calm, nostalgic environment that encourages clicking, discovering, and lingering.

Concept

The Dreamhouse is not a “win/lose” game; it’s an exploratory scene. The idea was to capture the cozy, pastel vibe you expect from a dreamhouse and layer in small interactive details:

  • An exterior view with a theme song and a door that rings a real doorbell when clicked.

  • An interior view with hotspots for the closet, kitchen, bedroom, and living room.

  • A Closet (Wardrobe) with multiple outfit sets and selectable frames, accompanied by sparkle effects and sound.

  • A Kitchen where you pick a cupcake base, bake it, then decorate it with frosting.

  • A Bedroom with a clickable book that opens a reader overlay.

  • A Living Room with a TV area and a remote control that flips through channel-like images, each paired with music that matches the picture shown.

On a personal note: I loved Barbie as a kid, and some of my favorite Barbie movies directly inspired the look and feel of the living room — the pastel decor, playful props, and the idea of a tiny TV full of different “channels” came straight from that nostalgia.

The focus was on atmosphere: soft pastel visuals, gentle audio, and small surprises that reward clicking around.

Here is a rough sketch of what I envisioned:

Key features and interactions

Room transitions

  • Click the door from the exterior to enter the house.

  • From the interior, click room hotspots to open that room full-screen.

  • Scene state management makes switching easy (a minimal sketch follows).
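
Under the hood that state management is conceptually very simple. Here is a minimal sketch of the idea (helper names like drawOutside() or overDoor() are illustrative, not my exact code):

let scene = "outside"; // one global state drives everything draw() renders

function draw() {
  if (scene === "outside") drawOutside();
  else if (scene === "inside") drawInside();
  else if (scene === "closet") drawCloset();
  // ...kitchen, bedroom, and living room follow the same pattern
}

function mousePressed() {
  // e.g., clicking the door hotspot while outside enters the house
  if (scene === "outside" && overDoor(mouseX, mouseY)) scene = "inside";
}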

Closet 

  • Pages of outfit frames sliced from larger sprite sheets (it took me a while to slice them correctly).

  • Click dots to switch pages/sets, arrows to cycle frames, and a ✓ button to confirm selection.

  • Sparkle overlay + sparkle sound on selection when outfit is confirmed.

  • I added a wardrobe reset so pressing R reliably restarts the closet to its initial state.

Kitchen

  • A mini workflow: pick base → bake (progress bar) → confirm → decorate (frosting).

  • Cupcake base and frosting are separate sprites; frosting is aligned to the base bottom using computed offsets so different frosted overlays sit properly (a rough sketch of the idea follows this list).

  • Tweaked the base preview size so the cupcake base doesn’t dominate the scene.
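
The alignment mentioned above boils down to anchoring each sprite by its visible bottom rather than by its image edge. A rough sketch of the idea (assuming each trimmed frame stores a maxY, its lowest opaque row, from the slicing step; the function and parameter names are illustrative):

function drawCupcake(cx, plateY, baseImg, baseOff, frostImg, frostOff, s) {
  // place the base so its lowest opaque row sits on the plate line
  const baseTop = plateY - baseOff.maxY * s;
  image(baseImg, cx - (baseImg.width * s) / 2, baseTop, baseImg.width * s, baseImg.height * s);

  // place the frosting so its lowest opaque row meets the top of the base
  const frostTop = baseTop - frostOff.maxY * s;
  image(frostImg, cx - (frostImg.width * s) / 2, frostTop, frostImg.width * s, frostImg.height * s);
}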

Bedroom

  • A book hotspot that opens a reader overlay with pages.

  • Prev/Next page buttons and R to close.

Living room

  • The living room is where my childhood Barbie inspiration shows most — pastel furniture, framed photos, and a playful TV nook. I built a small TV area with channel-like images and a responsive remote so users can flip through visuals like changing channels in a cozy movie night.

  • TV image area and remote hotspots scale responsively with the canvas; the living room’s color and props riff off my favorite Barbie movies.

Audio

  • Background theme for the exterior (looping Barbie Life in the Dreamhouse theme song).

  • Doorbell sound that plays when clicking the door — the theme song stops when the door is pressed so the bell is audible and the audio doesn’t overlap.

  • Special audio for sparkle, baking ding, closet music, and bedroom lullaby — all are conditionally played/stopped when entering/exiting rooms or selecting items.

Workflow & what went well

  1. Scene manager + state machine
    Making a small SceneManager (global state variable with states like outside, inside, closet, bedroom, living room) made it trivial to manage transitions and keep room-specific logic isolated.
  2. Drawing and assets
    I drew assets in Canva and used AI to generate some of the visuals at the same size as the p5 canvas where possible — this hugely simplified positioning and saved time. For sprite sheets (like cupcake bases/frostings and outfits) I sliced frames programmatically so I could treat them like tiled sprites.
  3. Small polish details
  • Preventing continuous hover sounds (door bell) by gating the knock with a boolean.

  • Ensuring music doesn’t layer (check .isPlaying() and .pause() before starting a new track; see the sketch after this list).

  • Adding a “sparkle” overlay and stopping closet music when confirming a selection so the sparkle sound can be heard.
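
In code, both of those audio fixes come down to a couple of small guards. A minimal sketch (the sound variable names here are made up, not my exact ones):

let doorbellPlayed = false; // boolean gate so repeated clicks/hovers don't retrigger the bell

function playDoorbell() {
  if (doorbellPlayed) return;
  if (themeSong && themeSong.isPlaying()) themeSong.pause(); // stop the theme so tracks don't layer
  doorbellSound.play();
  doorbellPlayed = true;
}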

What coding I’m proud of

The piece of code I’m proudest of is honestly the whole kitchen, specifically its sprite-slicing and alignment system. I wrote utilities that trim transparent pixels from sprite frames, compute each frame’s visual center/bottom, and then use those offsets to automatically align frosting to the cupcake base across many different sprite sheets and sizes. That makes wildly different art assets behave as a single cohesive object without manual per-frame positioning. It also required careful handling of canvas scaling, timing (bake/ding), and audio overlap — a lot of little edge cases that had to work together cleanly.

 // slice sprites  — trims each frame and computes offsets on trimmed images
  _sliceSprites() {
    // base frames
    this.baseFrames = [];
    this.baseOffsets = [];
    if (
      this.cupcakeImg &&
      this.cupcakeImg.width &&
      this.cupcakeImg.height &&
      this.baseCount > 0
    ) {
      const g = this._bestGridFor(this.cupcakeImg, this.baseCount);
      const fw = Math.round(this.cupcakeImg.width / g.cols);
      const fh = Math.round(this.cupcakeImg.height / g.rows);
      let idx = 0;
      for (let r = 0; r < g.rows; r++) {
        for (let c = 0; c < g.cols; c++) {
          if (idx >= this.baseCount) break;
          const sx = c * fw,
            sy = r * fh;
          try {
            const raw = this.cupcakeImg.get(sx, sy, fw, fh);
            const trimmed = this._trimTransparent(raw) || raw;
            this.baseFrames[idx] = trimmed;
            this.baseOffsets[idx] = this._computeContentBounds(trimmed);
          } catch (e) {
            this.baseFrames[idx] = null;
            this.baseOffsets[idx] = {
              xOffset: 0,
              yOffset: 0,
              maxY: Math.floor(fh / 2),
            };
          }
          idx++;
        }
      }
    }

    // frosting frames
    this.frostingFrames = [];
    this.frostingOffsets = [];
    if (
      this.frostingImg &&
      this.frostingImg.width &&
      this.frostingImg.height &&
      this.frostingCount > 0
    ) {
      const g = this._bestGridFor(this.frostingImg, this.frostingCount);
      const fw = Math.round(this.frostingImg.width / g.cols);
      const fh = Math.round(this.frostingImg.height / g.rows);
      let idx = 0;
      for (let r = 0; r < g.rows; r++) {
        for (let c = 0; c < g.cols; c++) {
          if (idx >= this.frostingCount) break;
          const sx = c * fw,
            sy = r * fh;
          try {
            const raw = this.frostingImg.get(sx, sy, fw, fh);
            const trimmed = this._trimTransparent(raw) || raw;
            this.frostingFrames[idx] = trimmed;
            this.frostingOffsets[idx] = this._computeContentBounds(trimmed);
          } catch (e) {
            this.frostingFrames[idx] = null;
            this.frostingOffsets[idx] = {
              xOffset: 0,
              yOffset: 0,
              maxY: Math.floor(fh / 2),
            };
          }
          idx++;
        }
      }
    }
  }
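
The helpers referenced above (_bestGridFor, _trimTransparent, _computeContentBounds) aren't shown here, but the offsets they produce are what make the alignment work. A simplified sketch of what _computeContentBounds does (my real version differs in details): scan the trimmed frame for opaque pixels and report the content-center offsets plus the lowest visible row, matching the { xOffset, yOffset, maxY } shape used in the fallbacks above.

  _computeContentBounds(img) {
    img.loadPixels();
    let minX = img.width, maxX = -1, minY = img.height, maxY = -1;
    for (let y = 0; y < img.height; y++) {
      for (let x = 0; x < img.width; x++) {
        const alpha = img.pixels[4 * (y * img.width + x) + 3];
        if (alpha > 0) {
          if (x < minX) minX = x;
          if (x > maxX) maxX = x;
          if (y < minY) minY = y;
          if (y > maxY) maxY = y;
        }
      }
    }
    // fully transparent frame: fall back to neutral offsets
    if (maxX < 0) return { xOffset: 0, yOffset: 0, maxY: Math.floor(img.height / 2) };
    return {
      xOffset: (minX + maxX) / 2 - img.width / 2,  // horizontal offset of the content's center
      yOffset: (minY + maxY) / 2 - img.height / 2, // vertical offset of the content's center
      maxY,                                        // lowest opaque row: the sprite's visual bottom
    };
  }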

 

Areas for improvement / future work

  • Add instructions or an optional guided mode (right now the experience is intentionally exploratory, but an in-game menu could help some users).

  • Refine click detection for non-rectangular images (pixel-perfect hit testing for PNGs with transparency).

  • Add more kitchen interactions: coffee machine, more decoration options, or an inventory for outfits.


What I learned

  • Breaking the app into small room controllers (Wardrobe, Kitchen, Bedroom, LivingRoom) makes the codebase much easier to maintain and debug.

  • Small details matter: gating hover sounds, preventing overlapping music, and subtle visual feedback (sparkle, dots) make the experience feel much more polished.

  • Drawing assets at canvas scale saves tons of time when positioning interactive pieces.

Closing

I loved Barbie when I was a kid, and designing this project felt like a grown-up, interactive love letter to those movies, especially when building out the living room. I enjoyed making something soft and low-pressure that rewards clicking and exploration. The Dreamhouse was a great exercise in scene management, responsive layout, and polishing interactions that make users want to hang out in a piece of art.

Midterm Project – Music Vent

What is Music Vent?

 

So I created this music visualizer called Music Vent, and the whole idea came from thinking about how we use music when we’re feeling emotional – especially when we’re sad or need to vent. You know how sometimes you just want to put on some music and let it all out? That’s exactly what this project is about.

 

The point of Music Vent is to create an immersive experience for music listening, especially for those moments when you want to vent through sad music. But here’s the thing – while you’re going through those emotions, there are these cute, colorful elements that somehow bring you ease and comfort.

 

The Concept Behind It:

 

I wanted to capture this duality that happens when we listen to music emotionally. On one hand, you have these really comforting, almost therapeutic elements:

 

– Flying radio-cloud birds: These little radios attached to clouds that float across the screen in the most adorable way. They’re like digital companions that keep you company while you’re listening.
– A beautiful galaxy background: I created this artistic Milky Way galaxy with twinkling stars and colorful dust clouds that creates this peaceful, cosmic atmosphere.
– Soft colors and smooth animations: Everything flows gently and uses calming colors that make you feel at ease.

 

But then on the other hand, you have the more intense, cathartic elements:

 

– Beat-responsive visualizations: These are the NCS-style spectrum analyzers (those green bar graphs you see in the middle) that react aggressively to the music’s beats. They can feel a bit disruptive to the peaceful vibe, but that’s intentional – they represent the raw emotional energy you’re releasing.

 

How I Built It

 

The Technical Setup

 

I built this using p5.js and JavaScript, and I tried to keep the code organized using classes so it wouldn’t become a complete mess. Here’s basically how it’s structured:

 

```javascript
class MusicVisualizerApp {
  constructor() {
    this.audioManager = new AudioManager();
    this.visualManager = new VisualizationManager();
    this.uiManager = new UIManager();
  }
}
```
I have separate managers for handling the audio, the visuals, and the user interface. This way, if I want to change how the audio analysis works, I don’t have to mess with the visual code.

 

The Audio Analysis Part

 

This was probably the trickiest part. I needed the system to actually “understand” the music and respond to it emotionally. So I created this mood detection algorithm:

 

```javascript
class MoodProfile {
  analyzeMood() {
    const avgEnergy = this.average(this.analysisBuffer.map(d => d.energy));
    const avgBass = this.average(this.analysisBuffer.map(d => d.frequencyBands.bass));
    const avgHigh = this.average(this.analysisBuffer.map(d => d.frequencyBands.high));
    // assumption: the spectral centroid is averaged the same way as the bands
    const avgCentroid = this.average(this.analysisBuffer.map(d => d.spectralCentroid));

    // Calculate emotional characteristics
    this.currentMood.energy = Math.min(avgEnergy * 2, 1.0);
    this.currentMood.danceability = Math.min((avgBass + this.currentMood.energy) * 0.8, 1.0);
    this.currentMood.valence = Math.min((avgHigh + avgCentroid) * 0.9, 1.0);
  }
}
```

 

Basically, the system listens to the music and analyzes different frequency bands – like how much bass there is, how much high-frequency content, the overall energy level. Then it tries to figure out the “mood” of the song and adapts the visuals accordingly.

 

The cool thing is that it can detect beats in real-time and make the black hole effect happen right when the beat hits. I spent way too much time getting the beat detection algorithm right!
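
The core of the detector is simpler than it sounds: compare the current energy to a short rolling average and call it a beat when there’s a big enough spike. Here’s a simplified sketch (the history length and threshold are illustrative; my real detector has more tuning):

```javascript
// Simplified energy-spike beat detector
class BeatDetector {
  constructor() {
    this.history = [];      // recent energy readings
    this.historySize = 43;  // roughly 0.7 seconds at 60 fps
    this.threshold = 1.4;   // current energy must exceed the average by 40%
  }

  update(energy) {
    this.history.push(energy);
    if (this.history.length > this.historySize) this.history.shift();
    const avg = this.history.reduce((a, b) => a + b, 0) / this.history.length;
    return energy > avg * this.threshold; // true on a beat
  }
}
```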

 

Creating the Galaxy Background

 

I wanted something that felt cosmic and peaceful, so I created this Milky Way galaxy effect. It has about 500 twinkling stars, colorful dust clouds, and these spiral arms that slowly rotate. But here’s the cool part – when a beat hits in the music, the whole galaxy gets sucked into a black hole!

 

```javascript
// When beats are detected, everything spirals inward
if (beatDetected) {
  this.targetBlackHoleIntensity = 1.0;
  // Stars and particles get pulled toward the center
}
```

 

The black hole effect was inspired by how intense emotions can feel like they’re pulling everything into them. When the beat drops, you see this dramatic transformation where all the peaceful elements get drawn into this swirling vortex with orange and purple colors.

 

The Flying Radio-Cloud Birds

 

This was probably my favorite part to code. I took inspiration from a radio drawing I had made before and turned it into these little geometric radios that fly around attached to fluffy clouds. They spawn randomly from either side of the screen and just float across peacefully.

 

```javascript
class RadioCloudBird {
  constructor(x, y, direction = 1) {
    this.cloudColor = random(['white', 'lightblue', 'pink', 'purple']);
    this.radioColor = random(['brown', 'black', 'silver', 'gold']);
    this.bobSpeed = random(0.02, 0.05); // Makes them bob gently
  }
}
```

 

Each radio is drawn using basic geometric shapes – rectangles for the body, circles for the speakers and knobs, lines for the antenna. I had to figure out how to scale everything properly so they’d look right when flying around, but once I got it working, they became these adorable little companions that make the whole experience feel less lonely.

 

What I Learned and Challenges I Faced

 

Making Everything Feel Smooth

 

One thing I really focused on was making sure all the animations felt organic and not jarring. I used a lot of interpolation to smooth out the transitions:

 

```javascript
// Instead of sudden changes, everything gradually transitions
this.values.bass = lerp(this.values.bass, newBassValue, 0.1);
this.values.energy = lerp(this.values.energy, newEnergyValue, 0.1);
```

 

This makes the whole experience feel more natural and less like you’re watching a computer program.

 

A Small Touch: Conversation Detection

 

I also added a feature where, if the system detects that you’re talking (through the microphone), it automatically lowers the music volume. I included this because it’s a feature I always wished music party-listening software had. As someone who used to listen to music bots on Discord a lot, I always found it annoying to manually lower or mute the bot whenever I wanted to talk to my friends while listening. That was actually the initial inspiration for this project, but then I came up with the concept behind this visualizing experience and focused more on it.
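
The mechanism itself is just volume ducking driven by the mic level. A simplified sketch (the threshold and fade time are illustrative):

```javascript
// Lower the music while the mic level suggests someone is talking
function updateDucking(mic, song) {
  const speaking = mic.getLevel() > 0.05; // rough speech threshold
  const target = speaking ? 0.2 : 1.0;    // duck to 20% volume while talking
  song.setVolume(target, 0.3);            // ramp over 0.3s so it doesn't pop
}
```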

Here is the project on p5, have fun experiencing it!

 

Midterm – Pitchy Bird

Following my initial progress on a backend-backed idea, I faced challenges managing the complexities of API usage and LM output, so I switched to another idea briefly mentioned at the start of my last documentation.

Core Concept

For my midterm project, I wanted to explore how a fundamental change in user input could completely transform a familiar experience. Flappy Bird, known for its simple tap-based mechanics, was the game I chose to re-imagine with voice control. Instead of tapping a button, the player controls the bird’s height by changing the pitch of their voice. Singing a high note makes the bird fly high, and a low note brings it down.

The goal was to create something both intuitive and novel. Using voice as a controller is a personal and expressive form of interaction. I hoped this would turn the game from a test of reflexes into a more playful (and, honestly, sillier) challenge.

How It Works (and What I’m Proud Of)

The project uses p5.js for all the visuals and game logic, combined with the ml5.js library to handle pitch detection. When the game starts, the browser’s microphone listens for my voice. The ml5.js pitchDetection model (surprisingly lightweight) analyzes the audio stream in real-time and spits out a frequency value in Hertz. My code then takes that frequency and maps it to a vertical position on the game canvas. A higher frequency means a lower Y-coordinate, sending the bird soaring upwards.
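
In essence, the mapping is a clamp plus a p5 map() call. Simplified (minPitch and maxPitch come from the calibration step described below):

// higher frequency -> smaller y -> the bird flies higher
function pitchToY(freq) {
  let clamped = constrain(freq, minPitch, maxPitch);
  return map(clamped, minPitch, maxPitch, height - 50, 50);
}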

Click here to access the game, as the embed is not functional.

I’m particularly proud of two key decisions I made that really improved the game feel.

First was the dynamic calibration for both noise and pitch. Before the game starts, it asks you to be quiet for a moment to measure the ambient background noise, which is used to set a volume threshold so the game doesn’t react to the hum of a fan or distant chatter. Then, it has you sing your lowest and highest comfortable notes. This personalizes the control scheme for every player, adapting to their unique vocal range, which could be an important design choice for a voice-controlled game. I conceptualized this “calibration” idea and used AI to explore ways to implement it, finally coding up the components in the game.

setTimeout(() => {
  // Set noise threshold (average level + a buffer)
  noiseThreshold = mic.getLevel() * 1.5 + 0.01;
  console.log("Noise threshold set to: " + noiseThreshold);
  
  gameState = 'calibratePitch';
  isCalibratingLow = true;
  // Capture lowest pitch after a short delay
  setTimeout(() => {
    minPitch = smoothedFreq > 50 ? smoothedFreq : 100;
    console.log("Min pitch set to: " + minPitch);
    isCalibratingLow = false;
    isCalibratingHigh = true;
    // Capture highest pitch after another delay
    setTimeout(() => {
      maxPitch = smoothedFreq > minPitch ? smoothedFreq : minPitch + 400;
      console.log("Max pitch set to: " + maxPitch);
      isCalibratingHigh = false;
      // Ensure model is loaded before starting game loop with getPitch
      if (pitch) {
          gameState = 'playing';
      } else {
          console.log("Pitch model not ready, waiting...");
          // Add a fallback or wait mechanism if needed
      }
    }, 3000);
  }, 3000);
}, 2000); // 2 seconds for noise calibration

Another technical decision I’m happy with was implementing a smoothing algorithm for the pitch input. Early on, the bird was incredibly jittery because the pitch detection is so sensitive. To fix this, I stored the last five frequency readings in an array and used their average to position the bird. This filtered out the noise and made the bird’s movement feel much more fluid and intentional. Additionally, instead of making the bird fall like a rock when you stop singing, I gave it a gentle downward drift. This “breath break” mechanism hopefully makes the game feel like air.
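
The smoothing itself is tiny: keep the last five readings and average them. Roughly, this is what produces the smoothedFreq value used in the calibration snippet above:

let recentFreqs = [];

function smoothPitch(freq) {
  recentFreqs.push(freq);
  if (recentFreqs.length > 5) recentFreqs.shift(); // keep only the last five readings
  let sum = 0;
  for (let f of recentFreqs) sum += f;
  return sum / recentFreqs.length; // the average positions the bird
}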

Challenges and Future

My biggest technical obstacle was a recurring bug where the game would crash on replay. It took a lot of console-logging and head-scratching, but it ultimately turned out that stopping and restarting the microphone doesn’t work the way I’d thought. The audio stream becomes invalid after the microphone stops, and I couldn’t reuse it. The solution was to completely discard the old microphone object and create a brand new one every time a new game starts.

In addition, there are definitely areas I’d love to improve. The calibration process, while functional, is still based on setTimeout, which can be sort of rigid. A more interactive approach, where the player clicks to confirm their high and low notes, would be an alternative user experience I could test and compare with. Additionally, the game currently only responds to pitch. It might be fascinating to incorporate volume as another control dimension—perhaps making the bird dash forward or shrink to fit through tight gaps if you sing louder. 

A more ambitious improvement would be to design the game in a way that encourages the player to sing unconsciously. Right now, the player is very aware that they’re just “controlling” the bird. But what if the game’s pipe gaps prompted them to perform a simple melody? The pipes could be timed to appear at moments that correspond to the melody’s high and low notes. This might subtly prompt the player to hum along with the music, and in doing so, they would be controlling the bird without even thinking.

Week 6: Pong² (Midterm Project)

Overall Concept:

This is Pong². It’s called Pong² because it isn’t just a spiritual successor to the first ever video game; it’s a dimension beyond. Regular pong only moves you along one axis, so it only has one dimension; this game lets you move left to right AND back and forth, creating a very amusing experience with much more depth than the original.

Pong² features two modes. Let’s start with Versus. Versus is a continuation of what I worked on all the way back in Week 3. In Versus, you and another player play a 1v1 duel of up to 9 rounds, first to 5 wins. It has the same win conditions as classic pong but with a lot more strategic depth. Moving your paddle toward the incoming ball will “smash” or “spike” it into your opponent’s side, while moving in the same direction as the ball as you receive it will help you “control” or “trap” it.

Pong²’s second mode is its best: co-op. In Co-Op mode, you and another player work together to guard the bottom goal against multiple balls that speed up with every return. You have 3 lives and each ball that slips past you takes a life from your shared life total. You have to coordinate with your teammate to make sure you keep a watch on every ball and not just respond as fast as you can to each one, because that’s only going to take you so far (my bet is on 50 seconds).

 

Feedback: Color Scheme Consistency

On our Monday checkup, Professor Ang told me to use consistent colors for highlighting the gameplay instructions; when I use different colors to highlight, I risk confusing players. I also went on to keep the game mode colors consistent with the menu instead of having the menu remain exclusively white and purple.

CO-OP Mode Design

I originally planned to make a player vs AI mode but realized that I really didn’t know how to make it respond similarly to how a real player would move the paddle. I had received feedback from Professor Ang to make the concept more interesting than just the usual 1v1 pong, and that’s when it hit me: what if I made 2 Player survival pong?

I had a lot of fun designing the co-op mode on paper, but I made my code a huge mess by duplicating my versus game mode JavaScript file instead of working from scratch. I THOUGHT that would make the process faster since I would be working from my previous framework, but I ended up having to modify so many things that it just convoluted the code.

I originally had the game start with just one ball, and you wouldn’t get the second ball until the 15th hit on the top side; however, I realized I wanted players to naturally get the idea to coordinate with each other. For instance, player one might tell his teammate “I’ll take the red ball, you take the blue one” to strategize. So I decided to make the 15th hit spawn the third ball and the 2nd top bounce spawn the second ball, which presents the challenge to players at a good pace.

I was very proud of this for loop I built to manage the many colorful balls that you have to defend against. Between this week and last week, I also had to add a canHitPaddle() method and timer to make sure you wouldn’t accidentally double-hit the same ball while trying to move in the same direction as it.

for (let b of coopBalls){ //runs thru all the balls stored in the array
    b.display();
    b.move();
    b.scoreDetection();
    
    if (coopBallCollidePaddle(b, player1paddle1, "Top")&& b.canHitPaddle()) {
      paddleBounceNeutralSFX.play();
      b.lastHitTime = millis() //records hit time using canHitPaddle()
      b.ballYspeed *= -1;
      b.yPos = player1paddle1.yPos - player1paddle1.paddleHeight / 2 - b.radius - 5;
      
      if (keyIsDown(83)) { // 'S'key 
        paddleBounceControlSFX.play();
        b.ballYspeed *= COOPbackControlFactor;
      } else {
        paddleBounceNeutralSFX.play();
      }
    }

    if (coopBallCollidePaddle(b, player2paddle1, "Top")&& b.canHitPaddle()) {
      paddleBounceNeutralSFX.play();
      b.lastHitTime = millis()//records hit time using canHitPaddle()
      b.ballYspeed *= -1; 
      b.yPos = player2paddle1.yPos - player2paddle1.paddleHeight / 2 - b.radius - 5; //the 5 helps make a microadjustment to prevent funky collisions
      
      if (keyIsDown(DOWN_ARROW) || keyIsDown(75)) { // '↓' or 'K' key 
        paddleBounceControlSFX.play();
        b.ballYspeed *= COOPbackControlFactor;
      } else {
        paddleBounceNeutralSFX.play();
      }
      
    }
    
  }//closes for b loop  
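
For reference, the cooldown check itself is conceptually just a time comparison inside the ball class (a sketch; the exact window in my code differs):

canHitPaddle() {
  // ignore new collisions for a short window after the last recorded hit
  return millis() - this.lastHitTime > 200;
}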

I think the co-op mode overall turned out very well. It was surprisingly challenging but also a good amount of fun for the friends that I asked to test it.

 

VERSUS Mode Improvements

I had the idea to add directional input with forward and backward velocity. Say you’re on the bottom side: if you receive the ball while holding the down arrow (the same direction the ball is heading), you slow it down, but if you receive it while holding the up arrow, you add momentum to it.

I had so much fun implementing this layer of complexity to the game. A main criticism I received at first was that the extra dimension of movement for the paddles didn’t offer a significant difference to the usual pong gameplay loop. 

This was my response to that– adding a much needed strategic depth to the game. You weren’t just competing on who was better at defense now, you were seeing who had the better attack strategy.

In fact I originally didn’t plan to add this forward and backward velocity effect to the co-op mode but I loved how it felt so much that I knew I had to add it in some way. This posed a balancing challenge as well; if we were to keep the backControlFactor from the versus mode then it would prove far too easy. 

 

Sound Design

I originally wanted to implement the sounds into my versus mode but then I realized building the infrastructure for all the menus first would make my life a lot easier. So sound design ended up as the last thing on the to-do list.

I usually got sound effects from silly button websites when I needed to edit videos, so I never knew where to get properly licensed SFX for indie games. I eventually found a site called Pixabay that had a good list of sound effects.

Most of the menu button SFX and game over SFX were pretty straightforward, but I really wanted the SFX for the gameplay loop itself to sound very satisfying. My favorite ones were definitely the ball control SFX and the countdown SFX. For the ball control SFX, I was inspired by the sound design in the Japanese anime Blue Lock, which has some incredible sound effects, like the bass-y effect when one of the characters controls the ball midair. I found this super nice sounding bass sound and that’s how the idea of using acoustic instrument sounds stuck. The other two ended up being a snare and a kick drum sound.

 

Difficulties Endured

The Great Screen Changing Mystery

There was this incredibly confusing bug that I never truly figured out, but the gist was that it would randomly switch screens, specifically to the versus game mode, SOMETIMES when I clicked into Co-Op.

I dug and I dug through the code endlessly until I saw that I had left one of the screen-related if-statements with an else branch that defaulted the screen to versus. It was there from earlier tests, but this seriously still doesn’t make sense to me: there was never a call to the function that switches to versus, and nothing that would leave the screen state invalid for any of the available screens to take over.

This was a frustrating gap in my knowledge that I never fully understood, but I did fix it.

 

Hit “K” to Let the Ball Through (and look stupid)

For some reason, the bottom paddle kept letting the ball straight through whenever I blocked it while holding “K” or the down arrow. This took hours of looking for the problem, going “I’ll deal with that later,” then running into it again and wondering what on earth was causing it.

It was very specifically the bottom paddle, and it HAD to be related to backward movement input detection. Then I realized that it was ONLY the bottom paddle because I had set the ball speed to minSpeed, which is always positive. So all I had to do was convert it to a negative value equivalent to minSpeed.

if (keyIsDown(UP_ARROW) || keyIsDown(73)) { // '↑' or 'I' key 
  paddleBounceSmashSFX.play();
  ballforVS.ballYspeed *= fwdVelocityFactor;
} else if (keyIsDown(DOWN_ARROW) || keyIsDown(75)) { // '↓' or 'K' key 
  paddleBounceControlSFX.play();
  ballforVS.ballXspeed = minSpeed;
  ballforVS.ballYspeed = -minSpeed; // negative so the controlled ball heads back up off the bottom paddle
} else {
  paddleBounceNeutralSFX.play();
}
Cursor Changing

When I was developing the buttons, I realized that they didn’t really feel clickable. So I wanted to add some code to change the cursor to the hand whenever it hovered over a button. This proved surprisingly complicated for what seems like an easy task. The most efficient way I found was to append all the buttons for each menu to an array and check it in a for loop every time, then close off with a simple if-statement.

function updateCursorFor(buttonArray) {
  let hovering = false;
  
  for (let btn of buttonArray) {
    if (btn.isHovered(mouseX, mouseY)){
      hovering = true;
    } 
  }
  if (hovering) {
    cursor(HAND);
  } else {
    cursor(ARROW);
  }
}
Conclusion & Areas for Improvement

As for areas of improvement, I would probably make the ball-paddle collision logic even more stable and consistent. It usually behaves predictably and as intended, but it sometimes shows that the way I coded it has flaws, with collisions on unintended surfaces.

However, despite all this, I am really proud of my project– both from the UI design standpoint and the replayability of the gameplay loop. Once my other friends are done with their midterms I’m gonna have a much better opportunity to see flaws in the design.

 

 

Week 7 — Final Game (Midterm Project)

https://editor.p5js.org/hajar.alk.2007/sketches/

Concept Recap

Now that I’ve finished my game, I just want to restate my concept briefly (which I explained in detail in my Week 6 submission). The original idea started as a candy game, but I decided to make it more traditional and personal by turning it into a Gahwa Rush theme. The inspiration came from my grandpa and my little brother: in our culture, gahwa is always seen as fine for kids while soft drinks aren’t. I found that really funny, so I made the goal of the game to catch the gahwa and avoid the Pepsi. It’s a simple idea, but it means a lot to me because it connects to my family and culture, and I’m really happy with how it turned out.

Week 7 Progress

This week, I was mainly working on the final details to perfect the game. Most of it was already done since last week the whole structure and idea were already there. What I focused on this week was adding sounds to make the game feel more alive and interactive. I also let my friend play it, and I noticed that it was really hard for her, even though for me it felt easy. That’s when I realized I had been increasing the difficulty too much without noticing because I kept testing and improving my own gameplay, so I was getting better while others weren’t. To fix that, I decided to lower the difficulty level to make the game more beginner-friendly and enjoyable for everyone.

I also found a small bug where the player would lose even when the character didn’t actually touch the Pepsi. To fix this, I adjusted the collision detection by making smaller rectangle frames around the Pepsi so that it only counts as a collision when it really touches the character.

Sources/Credit

https://youtu.be/enLvg0VTsAo?si=mPNyWkxCoWeOn3CG

https://youtu.be/bMYQbU01u04?si=KDpfq1w9eC_Bifax

https://youtu.be/Z57hx4ey5RY?si=ruAPhn2WmEeyHKXG

https://youtu.be/MuOcdYjF2F4?si=Z160JD3BE2VQnpvr

Although these YouTube links aren’t the same concept as my game and are actually very different, I used them to help me with the technical parts of creating my game. Since I’m still a beginner at coding, some things, especially the math equations, were really hard for me to figure out on my own. These videos really helped me understand how to make objects fall, how to make them appear randomly, and how to control different elements in the game. Even though their games had completely different themes, they included similar components that I could learn from, and I applied those ideas to make my own game work.

Code Highlight

// player hitbox, shrunk by the scale factors so it's smaller than the drawn sprite
const pw = p.w * PLAYER_HIT_SCALE.w;
const ph = p.h * PLAYER_HIT_SCALE.h;
// item hitbox scale, with a default if this item kind has no entry
const s  = ITEM_HIT_SCALE[it.kind] || { w: 0.5, h: 0.7 };
const iw = it.w * s.w, ih = it.h * s.h;

// overlap test on both axes: the centers must be closer than half the combined widths/heights
return (Math.abs(p.x - it.x) * 2 < (pw + iw)) &&
       (Math.abs(p.y - it.y) * 2 < (ph + ih));

When I was fixing the collision bug, it took me many tries to get it right. Even though I found YouTube videos about collision detection, I still couldn’t really figure it out at first because my game’s sprites and hitboxes didn’t match properly. The player would lose even when the Pepsi was far away. I kept testing and adjusting the numbers, but it wasn’t working. Then I found one YouTube video that explained hitboxes in a really simple way, and it finally made sense to me. That video helped me understand how to scale the hitboxes separately for each object, so I created smaller hitboxes around the Pepsi and the gahwa, and after that, the collisions finally worked perfectly. https://youtu.be/HK_oG_ev8FQ?si=BqtCL3WpHv3UpPQ0

End Of Project Reflection

Overall, I really enjoyed this project; I genuinely did. I loved developing my idea and adding more to it every single week. It’s crazy to see how much it changed from week 5, when I first thought of it, to how it looks now. It’s such an inspiring project for me because I got to be creative and technical at the same time. I also really enjoyed sharing it with my friends and family; everyone wanted to try it and play.

Even though it was a heavy project that took hours of work and a lot of effort, the result was completely worth it. I felt so accomplished when I finally finished it. It took a lot of trial and error, but that’s honestly what helped me learn the most. This project pushed me to apply everything I learned in class: not just follow instructions, but actually take risks, test ideas, and build something real. It also made me go beyond what we learned and look for new solutions from other sources, like YouTube tutorials.

In the end, it was a very in-depth and challenging project, but I truly enjoyed every step of it, not just the outcome. I loved the process of testing, debugging, and improving. It was fun, creative, and one of the most rewarding projects I’ve ever done.

There were definitely moments when I found bugs that I just couldn’t fix, and it felt so overwhelming. It was really frustrating because I would go through the code again and again and still couldn’t figure out what was wrong. But I started using the debugging techniques we learned in class, and that really helped me calm down and approach the problem more logically instead of panicking. There were also days when I spent hours trying to fix one thing, and after a while, my brain would just stop functioning. I couldn’t think straight anymore. But whenever I took a break and came back later, it was like my mind was refreshed: I could suddenly see the problem so much more clearly and finally debug it.

At some points, I honestly wanted to delete the whole code and just make a simpler game because I was so frustrated. But I’m really glad I didn’t. Finishing it made me feel so accomplished, and it really boosted my confidence in coding. I kept going even when I wanted to give up, and I pushed myself to find answers and look for external resources when I got stuck. That persistence made me realize that even if something feels impossible at first, I can figure it out if I stay patient and keep trying.

This project definitely required a lot of patience, and I think that’s a skill I’m really starting to develop. I realized that when you’re coding, you’re not always going to get it right the first time, and that’s completely okay. There’s nothing wrong with making mistakes or needing multiple tries. Especially when you’re creating a game or something complex, patience is everything. You have to be willing to try again and again, test small changes, and stay calm even when things don’t work right away. This project really taught me that. It helped me understand that patience isn’t just something nice to have; it’s one of the most important skills in programming.

Midterm Project – Balloon Pop

Game link:

https://editor.p5js.org/mm13942/full/lU_ibrAn2

Concept

My project is a fun and lighthearted balloon survival game where the player controls a balloon flying through the sky. The goal is simple — avoid the falling bombs, collect hearts for extra points, and stay alive as long as possible. The balloon can only move left and right, while the background scrolls continuously to give the feeling of rising into the clouds. The moment the balloon hits a bomb or the timer bar runs out, it pops and it’s game over.

Inspiration

I was inspired by classic arcade-style games where the goal is to survive for as long as possible while dodging obstacles. I wanted to make something cheerful and colorful but still challenging. The idea of having a balloon as the main character felt fitting because it’s fragile yet expressive, and visually, it works well with soft, moving clouds and bright skies.

Production and Main Features

Initially, I was planning to have the obstacles appear randomly from the sides:

but then I realized that it wouldn’t be a great decision considering the size of a laptop screen, so I decided to stick with my other idea, where objects fall randomly from the sky. After creating the draft of the game, I wrote down everything that needed to be added to my work to complete the game.

Key Features:

  1. Object-Oriented Design – Classes for Balloon, Bomb, Heart, and Button
  2. Pages:
    • Main page with right-aligned buttons
    • Instructions page
    • Weather choice page (Rainy, Sunny, Snowy)
    • Skin choice page (Pink, Yellow, Orange balloons)
    • Game over page with left-aligned content
  3. Gameplay Mechanics:
    • Bombs spawn with increasing difficulty (max 7)
    • Hearts spawn every 8 points (+5 score bonus)
    • Hearts spawn away from bombs to avoid interference
    • Proper collision detection with circular hitboxes (no false collisions)
    • Infinitely scrolling backgrounds based on weather
    • Score tracking with high score display
  4. Controls:
    • Arrow keys or A/D to move
    • C key for fullscreen toggle
    • Mouse clicks for all buttons
  5. Audio Support – Sound functions
  6. Highlighted Buttons – Selected weather/skin buttons get highlighted
  7. Back Buttons – On every sub-page to return to main menu

Code I’m Proud Of

The part of the code that I’m really proud of might not seem too hard, but it was definitely the most time-consuming and took a lot of trial and error: removing the white background from the uploaded pictures. When it first worked out, I thought everything was good, but it turned out p5 was re-processing the same image 60 times per second, and when I played for more than 10 seconds it kept shutting down. I had to do a lot of debugging to understand what the actual problem was, and I was finally able to make it work without lagging by processing each image once up front, which really made me happy.

  // tail of setup(): images are already loaded (e.g., in preload), so each one
  // is processed once and cached here, and draw() never re-processes them
  processedBombImg = removeWhiteBackground(bombImg);
  processedHeartImg = removeWhiteBackground(heartImg);
  processedPinkBalloonImg = removeWhiteBackground(pinkBalloonImg);
  processedYellowBalloonImg = removeWhiteBackground(yellowBalloonImg);
  processedOrangeBalloonImg = removeWhiteBackground(orangeBalloonImg);
  
  // Initialize player balloon
  player = new Balloon(width/2, height - 80 * scaleFactor);
  
  // Create all button objects with proper positions
  setupButtons();
  
  // Set the pixel font for all text
  textFont(pixelFont);
}


// remove White Background 
function removeWhiteBackground(img) {
  // Create a graphics buffer to manipulate pixels
  let pg = createGraphics(img.width, img.height);
  pg.image(img, 0, 0);
  pg.loadPixels();
  
  // Loop through all pixels and make white ones transparent
  for (let i = 0; i < pg.pixels.length; i += 4) {
    let r = pg.pixels[i];     // Red channel
    let g = pg.pixels[i + 1]; // Green channel
    let b = pg.pixels[i + 2]; // Blue channel
    
    // If pixel is mostly white (R, G, B all > 200), make it transparent
    if (r > 200 && g > 200 && b > 200) {
      pg.pixels[i + 3] = 0; // Set alpha to 0 (transparent)
    }
  }
  pg.updatePixels();
  return pg; // Return the processed image
}

I also really liked the energy bar idea. I was really struggling to come up with ideas, and my friend Nigina gave some feedback on my game and suggested adding this feature, which prevents players from skipping the hearts.

function drawEnergyBar() {
  // Position in top-right corner
  let barX = width - energyBarWidth * scaleFactor - 20 * scaleFactor;
  let barY = 20 * scaleFactor;
  let barW = energyBarWidth * scaleFactor;
  let barH = energyBarHeight * scaleFactor;
  
  // Draw outer frame 
  stroke(255); // White border
  strokeWeight(3 * scaleFactor);
  noFill();
  rect(barX, barY, barW, barH);
  
  // Calculate fill width based on current energy percentage
  let fillWidth = (energyBar / 100) * barW;
  
  // Determine fill color based on energy level
  let barColor;
  if (energyBar > 60) {
    barColor = color(0, 255, 0); // Green when high
  } else if (energyBar > 30) {
    barColor = color(255, 255, 0); // Yellow when medium
  } else {
    barColor = color(255, 0, 0); // Red when low 
  }
  
  // Draw filled portion of energy bar
  noStroke();
  fill(barColor);
  rect(barX, barY, fillWidth, barH);
  
  // Draw "ENERGY" label above bar
  fill(255);
  textAlign(CENTER, BOTTOM);
  textSize(16 * scaleFactor);
  text("ENERGY", barX + barW / 2, barY - 5 * scaleFactor);
}

 

Design

Visually, I wanted it to feel airy and positive, so I used soft pastel colors, smooth cloud movement, and rounded buttons. Each page has its own layout — right-aligned buttons on the main page and left-aligned elements on the Game Over screen — to make navigation easy.

Challenges

The hardest part of my code was definitely managing how all the game elements work together (the bombs, hearts, clouds, timer bar, and different pages). Getting everything to appear, move, and disappear smoothly without glitches took a lot of trial and error. Sometimes bombs appeared too frequently or hearts overlapped with them. I fixed this by randomizing positions with distance checks.

if (bombs.length < maxBombs && frameCount % 50 === 0) {
  bombs.push(new Bomb(random(width), -20));
}
if (score % 10 === 0 && !heartExists) {
  hearts.push(new Heart(random(width), -20));
  heartExists = true;
}
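
The distance check I mentioned looks roughly like this (a sketch; the margin and retry count are illustrative):

// pick an x for the new heart that isn't too close to any existing bomb
function safeHeartX() {
  let x = random(width);
  for (let tries = 0; tries < 20; tries++) {
    let tooClose = false;
    for (let b of bombs) {
      if (dist(x, -20, b.x, b.y) < 100 * scaleFactor) tooClose = true;
    }
    if (!tooClose) break;
    x = random(width); // too close to a bomb, so re-roll and try again
  }
  return x;
}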

The collision detection between the balloon and falling bombs was tricky too, since I had to make sure it felt fair and accurate using circular hitboxes. Another challenging part was balancing the gameplay, making bombs fall fast enough to be fun but not impossible, while also keeping the hearts from overlapping with them. On top of that, managing all the page transitions (main menu, instructions, weather, skins, game over) and keeping the selected settings consistent made the logic even more complex. Overall, the hardest part was making everything work together in a way that felt natural and didn’t break the flow of the game.
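
The circular hitbox test itself is a single comparison: two circles collide when the distance between their centers is less than the sum of their radii (names here are illustrative):

function circlesCollide(a, b) {
  return dist(a.x, a.y, b.x, b.y) < a.radius + b.radius;
}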

Future Improvements

In the future, I’d like to make the game feel more complete by adding real background music and more sound effects for popping, collecting hearts, and clicking buttons. Another improvement would be to make the difficulty change as the score increases, for example, bombs could fall faster or spawn more frequently the longer you survive. I’m also thinking of adding new power-ups like shields or magnets to make the gameplay more interesting. On the design side, animated buttons and smoother page transitions could make the menus feel more polished. Eventually, I’d love to include a high score system to track progress and make players more competitive.

 

Midterm Project: OOO – EEE

I made a game before that was controlled with keyboard inputs, so this time I wanted to create a game that used a different kind of input.
As I was scrolling through YouTube Shorts to find inspiration for my project, I came across a simple game playable with the user’s pitch. In the video I watched, the character moved up and forward depending on the pitch level. With this in mind, I tried making simple programs that took voice input.

 

First, I built a program that detected a specific pitch, in this case “C”. If the user sings the pitch, the block moves upwards, and if the user maintains the same level for a certain amount of time, the block permanently moves upwards. I made this because my initial plan was an adventure game where the character travels a 2D map and makes certain interactions that trigger things, such as lifting a boulder with a certain note. This little exercise allowed me to get familiar with sound input and how I could utilize it later.

For my midterm, I decided to create a simple game that uses paddles that move left and right. The goal of the game is to catch the falling objects with these moving paddles. The hardest part about the game was obviously moving the paddles based on the user’s pitch. At first, the paddles were so sensitive that their movement was all over the place even with a slight input of sound. Adjusting that and making the movement smooth was the key to my game.

While I was testing the movement, I realized that I was making sounds that resembled a monkey: OOO for the low pitches and EEE for the high-pitched noises. So I came up with the idea to make the game monkey-themed, with the falling objects being bananas and the paddles being monkey hands. It made me laugh thinking that users would have to imitate monkeys in order to play the game. I also added a little feature at the end that replays the sounds the players made while playing, so they can feel a bit humiliated. I thought this was a great way to bring some humor into the game. I also had to test this multiple times while making the game, so I got to experience it beforehand.

My game is divided into 4 major stages: start screen, instructions, gameplay, and game over screen. As explained in class, I utilized the different stages so that resetting was easier.

The start screen is the title screen. It has 3 buttons: instructions, play, and fullscreen. Clicking a button makes a clicking sound. That is the only sound feature I have, since my gameplay is hugely affected by sound: any background music or sound effects affect how the game is played, so I kept it to a minimum. Also, making the game fullscreen affects the gameplay, so I had the fullscreen feature fill everywhere outside the play area with black.

Before playing, users can open the instructions page to find out the controls and calibrate their pitch range. I deliberately say to use OOOs and EEEs for the pitches so that they sound like monkeys. These pitch ranges are adjustable with the up and down arrows and are stored in local storage so that the settings remain even after resetting the game. I also show a live paddle so that users can see how their voice will move the hands.

Once they hit play, the core loop is simple: bananas spawn at the top and fall; the goal is to catch them with the monkey “hands” (the paddle) at the bottom. I map the detected pitch to an x-position using the calibrated min/max from the instructions page: I clamp the raw frequency into that window, map it to the screen’s left/right bounds (so the hands never leave the canvas), then smooth it. To keep control stable I added a small noise gate (ignore very quiet input), a frequency deadzone (ignore tiny wiggles), linear smoothing with lerp, and a max step cap so sudden jumps don’t overshoot. The result feels responsive without the jittery movement I had early on. The player scores when a banana touches the hands and loses a life on a miss; three misses end the round.

When the run ends, the game over screen appears with the background art, a big line like “you got x bananas!”, and two buttons: “play again” and “did you sound like a monkey?”. During gameplay I record the same mic that powers pitch detection; on game over I stop recording and let the player play/stop that clip. It’s a tiny feature, but it adds a fun (and slightly embarrassing) payoff that matches the monkey concept.

 

I’m especially proud of how I handled pitch jumps. Early on, tiny jitters made the hands twitchy, but big interval jumps still felt sluggish. I fixed this by combining a few tricks: a small deadzone to ignore micro-wiggles, smoothing with lerp for steady motion, and a speed boost that scales with the size of the pitch change. When the detected frequency jumps a lot in one frame (like an “ooo” to a sharp “eee”), I temporarily raise the max movement per frame, then let it settle back down. That way, small fluctuations don’t move the paddle, normal singing is smooth, and deliberate leaps produce a satisfying snap across the screen without overshooting. Getting this balance right made the controls feel musical.
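
Put together, the control pipeline looks roughly like this (a sketch with made-up variable names and constants, not my exact code):

function updateHandX(freq) {
  if (abs(freq - lastFreq) < 3) freq = lastFreq;  // deadzone: ignore micro-wiggles
  let jump = abs(freq - lastFreq);                // size of this frame's pitch change
  lastFreq = freq;

  // clamp into the calibrated range and map to the playable width
  let targetX = map(constrain(freq, minFreq, maxFreq), minFreq, maxFreq, handHalf, width - handHalf);

  let desired = lerp(handX, targetX, 0.15);       // smooth toward the target
  let maxStep = 8 + jump * 0.2;                   // big, deliberate leaps allow a faster snap
  handX += constrain(desired - handX, -maxStep, maxStep); // cap per-frame movement
  return handX;
}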

For future improvements on the game itself, I want to smooth the frustration without losing the funny chaos. Bananas don’t stack, but several can arrive in different lanes at the same moment, and with smoothing plus a max step on the hands, some patterns are effectively unreachable. I kept a bit of that because the panic is part of the joke, but I’d like the spawner to reason about landing time instead of just spawn time, spacing arrivals so that at least one of the simultaneous drops is realistically catchable. I can still sprinkle in deliberate “double-arrival” moments as set pieces, but the baseline should feel fair.

Midterm Project Documentation: All Day Breakfast

Sketch(f for fullscreen): https://editor.p5js.org/joyzheng/full/tb0uwj2nP

Overall Concept

As a visiting student at NYUAD, I found the made-to-order dining system, particularly at the All Day Breakfast counter, very confusing. Unlike the pre-made options I was used to, the text-only menus made it difficult to visualize my order. I was always confused about what and how much I had ordered when there were no pictures (some items are Arabic dishes I don’t know), and I often found myself pulling out a calculator to see if my selections added up to a full meal plan.

These frictions made me want to digitalize the experience into an interactive game that gamifies the ordering process. The core goal is to provide a more intuitive and visual way for players to assemble a meal, manage inventory, understand the costs, and manage their spending. By turning the process into a game with clear steps and rewards (badges), the project transforms a problem/demand I discovered in my own life into an engaging and replayable experience.

How It Works

The game guides the player through a six-scene narrative that mirrors the real-life process and menu of getting food at the D2 dining hall A LA BRASA All Day Breakfast Counter.

UI Prototype:

UE:

Scene 1:

Start Screen: The player is presented with the All Day Breakfast counter and prompted to “Ready to Order?”. Clicking the triangle button begins the game. The badge board is also displayed here, showing the player’s progress.

Scene 2:

Choose Food: The player is shown a grill with all available food items. They must first click to pick up a pair of tongs, which then attaches to their mouse. They can then click on food items to pick them up and click on the plate to add them to their meal. The total cost is updated in real-time.

Scene 3:

Scan Items: The player takes their plate to the cashier. They must pick up the scanner tool and move it over each food item on the plate. As each item is scanned, a beep sound plays, and the item is added to a virtual receipt.

Scene 4:

Payment: The cashier opens, revealing a coin tray. The player must pay the total amount shown on the receipt by clicking on coins from a palette and dropping them into the tray.

Scene 5:

Eat: The player sits down to eat. They must pick up a fork and use it to pick up food from their plate and bring it to the character(NYUAD Girl)’s mouth to “eat” it, which plays a sound and makes the food disappear.

Scene 6:

End Screen & Badges: After the meal, the game checks if the player’s actions have met the conditions for any new badges. If so, a special animation plays. The player is then given the option to “Dine in AGAIN!”, which resets the game and starts a new session.

Technical Decisions & Game Design I’m Proud of

I am proud of completing a fully functional and well-designed game within the project timeline, especially after iterating on the initial idea. A key technical challenge was to build the entire game to be fully responsive. The core of the responsive design is a set of helper functions (updateLayoutDimensions, scaleRectangle, scaleValue) that calculate scaling factors based on the current window size versus the original 700×500 design grid. This allows every element to reposition and resize dynamically, ensuring the game is playable on any screen.
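
Conceptually, those helpers boil down to something like this (a simplified sketch; the real functions handle a few more cases):

const DESIGN_W = 700, DESIGN_H = 500; // the original design grid

let scaleX, scaleY;

function updateLayoutDimensions() {
  // how much the canvas has grown or shrunk relative to the design grid
  scaleX = width / DESIGN_W;
  scaleY = height / DESIGN_H;
}

function scaleValue(v) {
  return v * min(scaleX, scaleY); // uniform scaling for sizes (text, radii, padding)
}

function scaleRectangle(r) {
  // reposition and resize a rect defined in design-grid coordinates
  return { x: r.x * scaleX, y: r.y * scaleY, w: r.w * scaleX, h: r.h * scaleY };
}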

It was also helpful to discuss the project with Professor Mang to improve its interactivity and replayability. We came up with the ideas of implementing the stock management system and humorous badge rewards that every NYUAD student who has been to this dining hall can resonate with (e.g., never being able to spend a whole meal plan; why is 1 meal plan 33.6? Is that 0.1 for a service fee?). I designed the inventory to match how it usually is at the counter; for instance, there are always only a few avocado toasts, and I have never managed to get the tofu omelet even now. Overall, it is also quite meditative and educational (in some sense): it reminds people to feed themselves well in the dining hall even when they are rushing between classes, and it encourages users to eat a balanced meal with enough fiber every day.

// =======================================
// SCENE 2: CHOOSE FOOD
// this function calculates the responsive positions for all food items in scene 2
function buildScene2FoodGrid() {
  // clears the array of food objects to ensure a fresh start each time the grid is rebuilt (e.g., window resize)
  scene2FoodObjects = [];

  // constants that define the original pixel dimensions of the background art and the specific rectangular area within it where the food is displayed
  const sourceImageSize = { w: 1536, h: 1024 };
  const sourceFoodArea = { x: 124, y: 138, w: 1284, h: 584 };
  
  // responsive calculation
  // current on-screen position and size of the food area
  // by finding the scaling ratio between the current canvas and the original background image
  // so the grid always perfectly overlays the correct part of the background art
  const foodGridRect = {
      x: sourceFoodArea.x * (canvasWidth / sourceImageSize.w),
      y: sourceFoodArea.y * (canvasHeight / sourceImageSize.h),
      w: sourceFoodArea.w * (canvasWidth / sourceImageSize.w),
      h: sourceFoodArea.h * (canvasHeight / sourceImageSize.h)
  };
  
  // the calculated grid area is then divided into cells (8 columns by 2 rows) to position each food item
  const columns = 8;
  const rows = 2;
  const cellWidth = foodGridRect.w / columns;
  const cellHeight = foodGridRect.h / rows;
  
  // the size of each food item is based on the smaller dimension (width or height) of a grid cell
  // this prevents the food images from looking stretched
  // scaled by 70% to add padding
  const itemSize = min(cellWidth, cellHeight) * 0.7;
  
  // this loop iterates through every food item defined
  for (let i = 0; i < ALL_FOOD_ITEMS.length; i++) {
    // math.floor() and % convert the 1d loop index (i) into a 2d (row, col) grid coordinate
    let row = Math.floor(i / columns);
    let col = i % columns;
    
    // calculates the final top left (x, y) coordinate for each food item
    // starts at the grid's origin
    // adds the offset for the column/row
    // adds a centering offset
    let itemX = foodGridRect.x + col * cellWidth + (cellWidth - itemSize) / 2;
    let itemY = foodGridRect.y + row * cellHeight + (cellHeight - itemSize) / 2;
    
    // a new food object is created with its calculated position and size
    // added to the array to be drawn
    scene2FoodObjects.push(new FoodItem(ALL_FOOD_ITEMS[i], itemX, itemY, itemSize));
  }
}

 

The most complex piece of code, and the one I’m most proud of, is the logic in the buildScene2FoodGrid() function. Unlike other elements that scale relative to the canvas, this grid must scale relative to the background image itself to ensure the food items are perfectly aligned with the artwork.

This logic calculates a scaling ratio based on how the background image has been stretched to fit the screen, and then applies that same ratio to the coordinates of the food grid. It’s a powerful piece of code that makes the experience feel seamless.

Challenges & Improvements

The development process was a valuable re-introduction to game development. I was surprised by the amount of free asset resources and tutorials for game development online. I was also inspired by the Coffee Shop Experience example of how to use p5.js to manage a game and toggle between scenes.

One of the most surprisingly time-consuming challenges was a simple debugging session that lasted hours, only to discover I had misspelled “diarrhea” as “diarreah” or “diareah” in different locations. This taught me the importance of meticulous checking and creating simple debugging tools to isolate issues early.

I also got the opportunity to explore AI-created assets through this project. For this huge number of assets, AI generation was probably the only way for me to finish on time. However, I still spent at least half of the development time going back and forth, hoping to “draw a good card” with the generated images. To be honest, Nano Banana wasn’t worth the hype for image creation. For game asset development, ChatGPT might be the best choice after trying a few different tools like Midjourney and Canva: it is lightweight and supports transparent PNG backgrounds, so the output can be used directly without manually removing the background.

For the future, I have several ideas for improvement:

  1. Expand to Other Counters: I would like to implement a similar ordering system for the D1 dining hall, which also has a confusing menu.
  2. UI Enhancements: I plan to add a toggle to hide or show the badge board, giving the player more control over their screen space.
  3. More Badges: Adding more creative badges would further increase the incentive for players to try different food combinations and spending strategies.
  4. Scene Refinement: Some scenes are quite dense with assets. In a future version, I might split complex steps into more scenes to make the layout feel cleaner and less cluttered.
  5. Real Implementation: After the midterm, I will demo this to the dining hall manager to see if they want to adopt this ordering system, or just use a more intuitive and interactive menu to run the dining hall more efficiently.