Week 11: Reading Response

Design Meets Disability

I never really thought about disability and design in the same breath until reading Design Meets Disability. But the more I read, the more I realized something: design isn’t just about making things pretty or functional. It’s about shaping identity, dignity, and belonging. And disability, instead of limiting design, has actually pushed it forward in powerful ways.

One of the first images that stayed with me was the Eames leg splint. It surprised me that something meant for wartime medical emergencies became a milestone in modern furniture design. It made me rethink how innovation often begins in uncomfortable or overlooked places. It reminded me that creativity isn’t always glamorous, sometimes it grows out of necessity, pain, or urgency. And yet, from that, something beautiful emerges.

Week 11: Exercise

1st Exercise

// Serial port object
let port;

// Latest sensor value
let sensorVal = 0;

function setup() {
  createCanvas(400, 300);
  background(220);

  // Create the serial connection object
  port = createSerial();

  // If a port was used before, auto-reconnect
  let used = usedSerialPorts();
  if (used.length > 0) {
    port.open(used[0], 9600);
  }
}

function draw() {
  background(220);

  // Read one line of text, up to the newline "\n"
  let str = port.readUntil("\n");

  // Update the stored value only when a full line actually arrived
  if (str.length > 0) {
    // Convert the string into an integer
    sensorVal = int(str.trim());
  }

  // Map the sensor value (0–1023) to a screen X position (0–width)
  let x = map(sensorVal, 0, 1023, 0, width);

  // Draw the circle every frame using the last known value, so it stays
  // visible even on frames when no new serial data comes in
  ellipse(x, height / 2, 40, 40);
}

Arduino Code:

const int sensor = A2;
int sensorValue;

void setup() {
  pinMode(sensor, INPUT); // optional: analogRead() does not require this
  Serial.begin(9600);
  Serial.println(0); // send a starting value so the p5.js side has a first line
}

void loop() {
  sensorValue = analogRead(sensor);
  delay(60); // pace the stream so the serial buffer doesn't flood
  Serial.println(sensorValue);
}


Week 10: Reading Response

A Brief Rant on the Future of Interaction Design

After finishing reading the article, I caught myself staring at my hands and realizing how much I take them for granted. It’s strange how technology, which is supposed to make us feel more connected, can also make us feel so detached from the world right in front of us.

I’ve spent years glued to glowing screens, typing, swiping, scrolling, without thinking about how unnatural it all feels. Victor’s phrase “Pictures Under Glass” stuck with me like a quiet accusation. That’s exactly what these devices are: smooth, cold panes that flatten our world into something we can only look at, not touch. I thought about how I used to build things as a kid: Legos, paper circuits, even sandcastles. I remember the feeling of resistance when stacking wet sand, the satisfaction of shaping something that pushed back. Now, most of my creations live behind glass, where nothing pushes back.

What really struck me was Victor’s idea that true tools amplify human capabilities. We’ve built technologies that expand what we can see and compute, but not what we can feel. I realized that so many “futuristic” designs we celebrate are really just shinier versions of what we already have: more pixels, more gestures, less connection. It’s like we’ve mistaken slickness for progress.

Reading this made me wonder what kind of future I want to help design as someone in tech. I don’t just want to make tools people use. I want to make tools people feel. Tools that understand how deeply human interaction is tied to touch, weight, and presence. Maybe that means haptic design, maybe something beyond it. But I know it means remembering that we are not just eyes and brains; we are hands, bodies, and motion.


Responses: A Brief Rant on the Future of Interaction Design

When I read Bret Victor’s Responses to A Brief Rant on the Future of Interaction Design, it felt like listening to a passionate inventor defending an idea most people didn’t quite understand. His frustration was almost funny at times, like how he keeps saying “No! iPad good! For now!”, but underneath that humor was something deeper: a kind of hope that we can still dream bigger about how we interact with technology.

What struck me most was his insistence that the problem isn’t the devices themselves. It’s our lack of imagination. We celebrate flat screens as the height of innovation, but Victor reminds us they’re only a step in a much longer journey. Reading this, I realized how often I accept technology as inevitable, as if it just happens to us. But Victor insists that people choose the future. We decide what gets built, what gets funded, what gets normalized. That realization hit me hard, because it turned passive admiration into personal responsibility.

His section on “finger-blindness” especially stuck with me. The idea that children could lose part of their sensory intelligence by only swiping at glass felt unsettling. I thought about how often I see kids with tablets, and how natural it looks, yet maybe that’s exactly the danger. Our hands were made to shape, feel, and learn through texture. If we stop using them that way, we’re not just changing how we play; we’re changing how we think.

What I admire about Victor’s writing is that he doesn’t reject technology. He just wants it to grow up with us. He’s not nostalgic for the past or obsessed with sci-fi fantasies; he’s practical, grounded in the body. When he says that the computer should adapt to the human body, not the other way around, it reminded me that innovation should honor what makes us human, not erase it.

Reading his response made me feel both small and inspired. Small, because it reminded me how little we really understand about designing for human experience. Inspired, because it reminded me that design is not just about what looks cool. It’s about what feels alive.

Maybe the real future of interaction design isn’t in inventing smarter machines, but in rediscovering the intelligence already built into our own bodies.

Week 10: Musical Instrument

For this week’s assignment, we had to make a musical instrument involving at least one digital sensor and one analog sensor. Aditi and I decided to create a simple piano-like instrument with lights whose pitch level can be controlled by a potentiometer. There are 4 buttons (switches) that act as the piano “keys” and play different sounds, while the potentiometer has been mapped to three different levels so that the keys produce a high-pitched, middle-pitched, and low-pitched sound.

Materials

  • Analog sensor: 10K Trimpot
  • Digital Switch: Red, Blue, Yellow, and Green tactile buttons
  • Output: Piezo Buzzer to produce the sound and LEDs for light output

Schematic

Video

link: drive

Code

const int button_yellow = 8;
const int yellow_light = 7;
const int button_blue = 9;
const int blue_light = 6;
const int green_light = 5;
const int button_green = 10;
const int red_light = 4;
const int button_red = 11;
#define BUZZER 12
int potValue = 0;
const int potPin = A0;
int melody[]  = {262, 294, 330, 349, 392, 440, 494, 523};        // notes from C4–C5
int melody2[] = {523, 587, 659, 698, 784, 880, 988, 1047};       // notes from C5–C6
int melody3[] = {1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093}; // C6–C7
int noteDurations = 4; // currently unused
int increase = 0;      // currently unused
int potValue_p;        // currently unused

void setup() {
  // put your setup code here, to run once:
  pinMode(button_yellow, INPUT);
  pinMode(yellow_light,OUTPUT);
  pinMode(button_blue, INPUT);
  pinMode(blue_light, OUTPUT);
  pinMode(button_green, INPUT);
  pinMode(green_light,OUTPUT);
  pinMode(button_red, INPUT);
  pinMode(red_light, OUTPUT);

  pinMode(BUZZER, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  potValue = analogRead(potPin);

  if (digitalRead(button_yellow) == HIGH) {
    if (potValue < 300) {
      tone(BUZZER, melody[1]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[1]);
    } else {
      tone(BUZZER, melody3[1]);
    }
    digitalWrite(yellow_light, HIGH);

  } else if (digitalRead(button_blue) == HIGH) {
    if (potValue < 300) {
      tone(BUZZER, melody[2]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[2]);
    } else {
      tone(BUZZER, melody3[2]);
    }
    digitalWrite(blue_light, HIGH);

  } else if (digitalRead(button_green) == HIGH) {
    Serial.print("green"); // debug
    if (potValue < 300) {
      tone(BUZZER, melody[3]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[3]);
    } else {
      tone(BUZZER, melody3[3]);
    }
    digitalWrite(green_light, HIGH);

  } else if (digitalRead(button_red) == HIGH) {
    if (potValue < 300) {
      tone(BUZZER, melody[4]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[4]);
    } else {
      tone(BUZZER, melody3[4]);
    }
    digitalWrite(red_light, HIGH);

  } else {
    // no key pressed: silence the buzzer and turn off all lights
    Serial.println("Pin is LOW"); // debug
    noTone(BUZZER);
    digitalWrite(yellow_light, LOW);
    digitalWrite(blue_light, LOW);
    digitalWrite(green_light, LOW);
    digitalWrite(red_light, LOW);
  }
}
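Since the four button branches all repeat the same three-way pitch selection, that logic could be factored into a pair of helpers. This is only a sketch of the idea; the function names `pitchBand` and `noteFor` are my own, not part of the project code:

```cpp
// Sketch: map a 10-bit potentiometer reading to one of three pitch bands.
// The band boundaries (300 and 550) follow the thresholds used above.
int pitchBand(int potValue) {
  if (potValue < 300) return 0; // low octave  (melody)
  if (potValue < 550) return 1; // mid octave  (melody2)
  return 2;                     // high octave (melody3)
}

// Given a band and a key index, return the frequency to play.
int noteFor(int band, int key) {
  static const int low[]  = {262, 294, 330, 349, 392, 440, 494, 523};
  static const int mid[]  = {523, 587, 659, 698, 784, 880, 988, 1047};
  static const int high[] = {1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093};
  const int *bands[] = {low, mid, high};
  return bands[band][key];
}
```

Each button branch would then shrink to something like `tone(BUZZER, noteFor(pitchBand(potValue), 1));`, which also makes it easier to add more keys later.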

Reflection & Future Improvements

We had fun making this project; however, it was also challenging to make all of the connections. Initially, we planned to have at least 6–7 switches, but the setup became crowded with just 4. I started drawing the schematic diagram on paper before we began to build the connections physically, and this made the process much more manageable. We definitely could not have done it without having the diagram in front of us first. Sometimes, the setup for a particular switch would not work, and we would have trouble figuring out whether the issue was in the code or the wiring. In future versions, we would love to have more keys, as well as a tiny screen that displays the letter of the current note being played. We would also want to research more “melodic” or pleasant-sounding notes to add to the code.

Week 9: Reading Response


Physical Computing’s Greatest Hits (and Misses)

When I read Physical Computing’s Greatest Hits (and Misses) by Tom Igoe, I found it fascinating how creativity often repeats itself, not out of laziness, but because some ideas are simply too human to let go of. Whether it’s waving your hand to make music like a theremin, or building dancing floor pads inspired by Dance Dance Revolution, these projects remind me that interaction is a language our bodies already know.

Igoe’s writing made me realize that physical computing isn’t just about wiring sensors or programming microcontrollers. It’s about finding meaning in movement and connection. His examples, like gloves that make drum sounds or video mirrors that reflect your gestures, all blur the line between play and design. I like how he admits that some of these projects are “pretty but shallow.” It’s honest. It’s easy to get lost in flashing LEDs and forget to ask: what experience am I creating for the user?

The part that stayed with me most was his reminder that originality doesn’t mean doing something no one’s ever done before; it means doing something familiar in a new way. That’s comforting. As someone who’s just learning, I often feel like my ideas are too simple or already “done.” But after reading this, I see that even a basic project, a light sensor or a glove, can become unique if I bring my own perspective, story, or purpose to it.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

When I was reading Tom Igoe’s Making Interactive Art: Set the Stage, Then Shut Up and Listen, I was struck by how simple yet challenging his message was. He tells artists and creators to stop explaining their own work, to resist the urge to direct, interpret, or control what others should feel. It’s funny how that sounds easy, but for anyone who’s ever made something personal, it’s actually the hardest thing to do. We want to be understood. We want people to “get it.”

Igoe reminds us that interactive art isn’t about showing, it’s about inviting. The artist’s job is not to tell a story, but to create a space where others can find their own. When he says, “Once you’ve built it, shut up,” it hit me like a creative reality check. In most traditional art forms (painting, photography, sculpture), the work speaks through form and color. But with interactive art, the audience literally completes the piece through their movement, curiosity, and touch. That’s terrifying and beautiful at the same time.

I also loved how Igoe compares designing interaction to directing actors. You can’t tell an actor how to feel; you just set the stage, give them a prop, and trust that they’ll find emotion in the act itself. That idea applies perfectly to physical computing and interactive design. The best projects don’t just respond to people; they encourage discovery. They leave space for mistakes, surprises, and personal interpretation.

Reading this made me think differently about my own work. I realized that in some of my projects, I’ve tried to control the experience too tightly, giving too many instructions or forcing a specific reaction. But now, I see that a good interactive project is a conversation, not a lecture. It listens as much as it speaks.

Week 9: Smart Pedestrian Light System

Concept:

In my hometown, we don’t usually have automatic traffic lights for pedestrian (zebra) crossings. When I came to the UAE, I found it really interesting that many crossings have a touch-sensitive button on the pole: when a pedestrian presses it, the system detects the request and changes the traffic light to red for vehicles, allowing people to cross safely.


I wanted to mimic that concept using simple electronic components. In my prototype, a light sensor (LDR) acts as the pedestrian touch detector. When I place my finger over the sensor, the “green” light turns on (in my case, a blue LED, since my green ones were damaged), signaling that a pedestrian wants to cross. When the sensor is not covered, meaning the LDR value stays above a certain threshold (around 500 in my code), no one is waiting to cross, so the “red” light for pedestrians remains on.

Additionally, I included a digital switch that simulates a traffic officer controlling the lights manually. When they press the red button, the red light turns on and forces vehicles to stop.

Video:


Code:

const int LDR = A2;
const int LDR_LED = 10;
const int red_LDR = 8;

void setup() {
  // Without these, digitalWrite() only toggles the internal pull-up
  pinMode(LDR_LED, OUTPUT);
  pinMode(red_LDR, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int value = analogRead(LDR);
  Serial.println(value);

  if (value <= 500) {
    // sensor covered: pedestrian request detected
    digitalWrite(LDR_LED, HIGH);
    digitalWrite(red_LDR, LOW);
    Serial.println("yes"); // debug
  } else {
    digitalWrite(LDR_LED, LOW);
    digitalWrite(red_LDR, HIGH);
  }
}
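One practical refinement: near the 500 cutoff, the LDR reading can jitter and make the LEDs flicker rapidly. A common fix is hysteresis, switching the request on below one threshold and off only above a higher one. This is a sketch of the idea, not the project code; the thresholds 450/550 and the function `updateRequest` are my own illustrative choices:

```cpp
// Sketch: hysteresis around the LDR cutoff so the lights don't flicker
// when the reading hovers near the threshold. Values are illustrative.
const int ON_BELOW  = 450; // covered: pedestrian request begins
const int OFF_ABOVE = 550; // clearly uncovered: request ends

// Returns the updated request state for a new reading.
bool updateRequest(int ldrValue, bool current) {
  if (ldrValue <= ON_BELOW)  return true;
  if (ldrValue >= OFF_ABOVE) return false;
  return current; // in the dead band between thresholds: keep previous state
}
```

In loop(), the `if (value <= 500)` test would become `if (updateRequest(value, crossingRequested))`, with `crossingRequested` stored as a global.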


Schematics:

Week 8: HeartFelt Connection

There is something profoundly magical about receiving affection: a hand that reaches out, a presence that wraps you in warmth, a small gesture that makes the world feel less heavy. Love, in its quietest form, has the power to make us feel seen, valued, and alive. Through my project, I wanted to explore that invisible current: the emotional electricity that flows between two people who care for each other.

The interactive piece symbolizes the way love and connection illuminate us. When you touch the hand of someone you love, or even move close enough to feel their presence, your heart lights up, not just metaphorically, but literally within the work. The glowing heart represents how affection has the power to activate something deep within us, a light that reminds us we are not alone.

Demo Video:


Future Improvement:

If I had more time, I would have expanded the project to include sound and more expressive lighting to create a deeper emotional experience. I would have added soft heartbeats, ambient sounds, or even gentle whispers that respond to movement and proximity. As two people approach each other, the soundscape could evolve, symbolizing how connection changes both our inner feelings and the atmosphere around us.

I also would have made the heart light more dynamic by allowing it to change color and rhythm based on the kind of relationship or emotion being expressed. Warm tones could represent comfort and love, cooler shades could suggest distance or nostalgia, and rhythmic pulses could express moments of excitement and joy. These additions would transform the project into a poetic language of sound and light, a more immersive way to express how love and presence illuminate our lives.

Week 8: Reading Reflection

Attractive things Work Better

I’ve always noticed how looks can open doors. It’s something people like to call pretty privilege. In most industries, employers prefer someone with good looks. You see it everywhere: at the airport check-in counter, where the smiling, well-dressed staff make you feel calm, or in a company that hires someone polished to sit at the front desk because, somehow, they represent the brand better.

We are naturally drawn to beauty, whether it’s a charming smile or a sleek phone screen. Companies know this; that’s why a tech gadget with a glowing logo or a car with smooth curves often sells better than a clunky but equally powerful competitor. Don Norman’s idea that “attractive things work better” isn’t just about vanity; it’s about psychology.

A MacBook isn’t the fastest laptop in the world, and yet when we open that smooth aluminum case and feel the satisfying click of the keys, we tend to believe it’s faster, smarter, and somehow more capable. The elegant design creates a sense of trust and delight that shapes how we experience the product. The beauty of it literally makes us think it works better.

Attractive design doesn’t just sell. It softens us. It makes us more patient with flaws, more forgiving of errors, and more willing to explore. That’s the secret power of beauty: it opens the door for better experiences.

Norman explains this beautifully in “Attractive Things Work Better.” He talks about how our emotions influence the way we interact with the world. When something looks good, it makes us feel good, and when we feel good, we actually perform better. He even gives a simple example: imagine walking on a wooden plank that’s safely on the ground. You’d stroll across without a second thought. But put that same plank high in the air, and suddenly, your fear takes over, your body stiffens, and it becomes ten times harder. The situation didn’t change; your feelings did.

That’s what happens with design too. When we see something beautiful, like a teapot with elegant curves or a smartphone that feels right in our hands, it puts us in a positive mood. We become more relaxed, more open, and surprisingly, even more creative. Norman calls this the power of positive affect. It’s why we forgive a beautiful product for small flaws but get frustrated quickly with an ugly one, even if it works just as well.

So when I think about “pretty privilege,” it’s not just about faces or appearances; it’s about how beauty changes behavior. Attractive people, like attractive designs, create comfort and trust before a single word is spoken or a single button is pressed. This text helped me see that aesthetics aren’t shallow; they’re psychological. Beauty works because it changes us somehow.

Her Code Got Humans On The Moon — And Invented Software Itself

When I was in eighth grade, our school hosted a movie event through a club called Global Female in STEM (Science, Technology, Engineering, and Mathematics). They showed a movie called “Hidden Figures,” the story of three brilliant African-American women, Katherine Johnson, Dorothy Vaughan, and Mary Jackson, whose work made NASA’s early space missions possible. I remember sitting there, completely mesmerized, realizing for the first time how many women had shaped history from behind the scenes, only to have their names forgotten.

Since that day, I’ve carried a quiet admiration, and maybe a little fire, for women in STEM who were never given the recognition they deserved. Now, as a Computer Science major myself, I’ve felt small moments of that same bias. Sometimes, it’s a look that says “are you sure you can do this?” It’s subtle, but it stings like being told you have to prove your worth twice before anyone believes it once.

That’s why reading about Margaret Hamilton hit differently and filled me with pride. Here was a woman in the 1960s, leading a team at MIT, writing code that would land humans on the moon, all while raising her daughter.

The image of her bringing her little girl to the lab late at night felt so familiar. My mom, also a working mother in STEM, used to take me to her computer lab when I would otherwise have been alone. Margaret’s story reminded me of my mother, of every woman who’s ever juggled passion, work, and motherhood with quiet strength.

What moved me most was how Margaret didn’t just write code; she invented software engineering itself. She gave structure, respect, and permanence to something the world didn’t even consider a real science yet. And when her code saved the Apollo 11 mission, it wasn’t just a victory for NASA. It was proof that brilliance has no gender.

Reading her story filled me with pride. It made me realize that every time a woman like Margaret Hamilton, or my mother, or even I sit behind a computer and type, we’re not just writing code. We’re continuing a legacy.

Midterm Project: Worm vs Sanity

Concept

Food has always been something deeply emotional for me: a way to heal, connect, and recharge after long or lonely days. Whether it’s sharing a meal with friends and family or eating quietly in solitude, food always finds a way to lift the spirit. For me, food is more than just fuel; it’s comfort, joy, and sometimes even memory itself. Every dish I eat reminds me of a moment, a feeling, or a person.

But, of course, there’s another side to that relationship: those unforgettable moments when something unexpected shows up in your food, like a hair, a fly, a worm, or even a tiny stone. It’s disgusting, sometimes shocking, and yet, over time, it becomes something you laugh about. The idea for this project actually struck me when I once found a fly in my food. In that split second, my emotions bounced between anger, disgust, and disbelief, and then, later, laughter. I realized how something so small could completely shift my mood and turn an ordinary meal into a story I’d never forget.

It also reminded me of moments with my grandmother. She used to cook for the whole family, and occasionally, there would be a stray hair in the food. Instead of getting angry, I’d turn it into a lighthearted joke so everyone could laugh. Those moments became cherished, not because the food was perfect, but because the imperfections made it real, made it ours. They were messy, human, and full of love.

Through my project, I wanted to recreate those shifting emotions, from disgust and frustration to humor and warmth. I tried to capture the entire emotional cycle we experience in those moments: the anger when something feels ruined, the creepiness of noticing something “off,” and the humor that comes when you finally laugh it off.

  • Anger is portrayed through intense, chaotic visuals, like the “deadly” appearance of the dining hall and the harsh red tones.

  • Creepiness comes through the eerie atmosphere: the bloody dining hall textures, dim lighting, and a strange, almost horror-like visual style that makes you feel uneasy, the same way you feel when you find something in your food that shouldn’t be there.

  • Humor ties it all together. I added funny instructions, like the “Ultimate Guide to impress Worm,” that turn disgust into comedy. It’s a playful reminder that these moments, while annoying, are also absurd and relatable, something we can laugh about later.

To make it more personal, I brought in imagery from NYUAD, specifically D2 and the Library, two of my favorite places on campus. They hold countless memories of food, laughter, and friendship, so I wanted to reimagine them in my project. I took photos and used ChatGPT to generate artistic, surreal versions of these spaces, blending reality and imagination. The result is an environment that feels both familiar and eerie, mirroring that strange feeling of discovering something unexpected in what you love.

Lastly, I chose to use hand gestures as one of the interaction methods because I wanted the experience to feel physical and expressive, not just mechanical. In real life, our hands are what connect us to food. We cook with them, eat with them, react with them. So, using gestures like moving the left hand to go left, the right hand to go right, and closing the fist to jump feels symbolically and emotionally right. It mirrors how our hands instinctively respond when we’re disgusted or startled. We pull back, push away, or clench up.

While it might not be the most conventional control scheme, that’s precisely what makes this project unique and artistic rather than a simple computer game. The goal wasn’t to make a polished arcade game, but to create a more embodied experience, one that makes the player aware of their own physical reactions.


How to Play:

At its core, the project is an interactive game centered around a simple but expressive idea: defeat the worms generated at the right of the screen before they reach the left edge.

Players can interact with the game in two different ways:

Keyboard controls — using the arrow keys to move and jump: → to go right, ← to go left, and ↑ to jump.

Hand gesture controls — raise your left hand to go left and your right hand to go right. Raising a hand simply means making it visible to the camera; to stop moving left, for example, move your left hand out of the camera’s view. If you make a fist (close your fingers), the girl will jump.

The basic rule is simple: jump over the worms to eliminate them before they cross the screen. Players have three lives, and if they manage to survive until time >= 900 (meaning the draw function has run 900 times) with at least one life left, they win.

At first, it might feel unintuitive, but as you play, it becomes surprisingly fun and natural, like you’re physically fighting off those unwanted “guests” in your meal.

Parts I’m Proud Of

The part I’m most proud of is integrating machine learning into a project that’s not only technical but emotional and personal. As a Computer Science major, I’m always drawn to exploring how technology can express feeling and creativity. Implementing gesture control allowed me to bridge art and code, to make something that doesn’t just work, but feels alive.

I’m also proud of how I personalized the experience. By using NYUAD-specific places like D2 and the Library, I rooted the project in a world I know and love. It gives the game a familiar atmosphere, one that other NYUAD students can relate to, while still transforming it into something strange and artistic.

Areas for Improvement 

While I’m proud of how the game turned out, there are several areas I’d like to refine. The hand gesture control, though innovative, can feel slightly clunky at first. I’d like to make it more responsive and intuitive, perhaps by training the ML model with more data, or by using body-pose detection so that when the player leans left or right, the character moves in that direction.

I’d also love to expand the visual storytelling. Right now, the “bloody” D2 gives the right kind of creepiness, but I imagine adding more levels or moods, maybe transitioning from a calm dining scene to a chaotic food fight as the difficulty increases.

Problems I Ran Into

While building the project, I faced a few interesting technical challenges that pushed me to think creatively about how motion and input are detected and processed.

1. Detecting when the hand is closed (fist gesture):
My first major challenge was figuring out how to detect when the user’s hand is closed. I wanted the “fist” gesture to trigger a jump action, but at first, I wasn’t sure which hand landmarks to compare. Eventually, I decided to track the index fingertip (keypoint 8) and the base of the index finger (keypoint 5).

The idea was simple: if the y-coordinate of the fingertip (hand.keypoints[8].y) becomes greater than that of the finger base (hand.keypoints[5].y), it means the fingertip is lower in the camera frame  in other words, the finger is curled in, forming a fist.

I used console.log(hand.keypoints[8].y, hand.keypoints[5].y) to visualize the values and experimented by opening and closing my hand repeatedly to see when the condition triggered. This trial-and-error approach helped me fine-tune the threshold for reliable gesture recognition. It was satisfying to see the jump action respond accurately once the logic clicked.
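That comparison can be isolated into a small predicate. This is a sketch of the logic described above, not the project’s exact code; the function name `isFist` is my own, and I’m assuming ml5 handPose-style keypoints given as `{x, y}` objects in image coordinates, where y grows downward:

```javascript
// Sketch: detect a closed fist by checking whether the index fingertip
// (keypoint 8) has dropped below the base of the index finger (keypoint 5).
// In image coordinates, a larger y means lower on screen, so a curled
// finger puts the tip below its base.
function isFist(keypoints) {
  const tip = keypoints[8];  // index fingertip
  const base = keypoints[5]; // base of the index finger
  return tip.y > base.y;
}
```

In draw(), something like `if (hands.length > 0 && isFist(hands[0].keypoints)) { girl1.jump(); }` would then trigger the jump (variable names hypothetical).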


2. Managing repeated function calls with hand gestures:
The second issue was repeated trigger events when using gesture control. Unlike pressing a key, which fires the action just once per press, raising a hand is a continuous motion, so the detection function kept firing dozens of times per second.

For example, calling girl1.jump() or movement functions using hand gestures caused the action to repeat uncontrollably fast. To solve this, I implemented a counter-based system and used a modulus condition to limit how often the action executes. Essentially, if the function was being called too rapidly, I only allowed it to execute once every ten calls.

Similarly, I adjusted the character’s movement speed when controlled by gestures. Instead of moving by this.speed_x each frame (which made her move unrealistically fast), I scaled it down to this.speed_x * 0.005 inside the update_Ml() function. This made her movement smooth and proportional to the natural pace of a hand gesture, giving the game a more balanced and controlled feeling.

This also applied to animation strip changes: by updating them every tenth frame, the animation stayed visually consistent without flickering or overloading.
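The counter-plus-modulus idea can be sketched as a tiny reusable helper. The name `makeThrottle` and the closure structure are my own illustration of the technique, not the project’s code:

```javascript
// Sketch: let a continuously-firing gesture trigger its action only
// once every `interval` detections, using a counter and a modulus check.
function makeThrottle(interval) {
  let count = 0;
  return function shouldFire() {
    count++;
    return count % interval === 0; // true on every interval-th call
  };
}
```

Usage would look like `const jumpGate = makeThrottle(10);` created once in setup, then `if (fistDetected && jumpGate()) girl1.jump();` inside the detection callback, so the jump fires roughly once per sustained gesture instead of every frame.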


My Sketch:

view only screen link: https://editor.p5js.org/aa11972/full/b224cudrh


Week 5: Midterm progress

My Midterm Project Concept

Last week, after a long and tiring day, I decided to take a short break and treat myself to a simple dinner. I made a fresh salad, seasoned it well, and added a generous scoop of hummus. I thought that a good meal would help me feel better. However, halfway through eating, I noticed a fly lying right in my food. The sight instantly ruined my appetite and left me feeling uneasy, worried I might end up with a stomach ache. I couldn’t help but think how much better the evening would have been if that fly hadn’t landed in my meal.

Interestingly, a friend later shared a similar unpleasant experience of finding a worm in their food. That conversation sparked an unusual but fun idea for a game: Worm Against Sanity. In this game, the player goes around the campus, covering spots like the Library, D1, D2, the Marketplace, and the Palms, eliminating worms before they ruin the food.

One of the most challenging parts of building Worm Against Sanity was making the game seamlessly switch between multiple screens while also animating the girl and worm sprites so that they moved realistically across the canvas. I wanted the opening screen, the play area, and the menu to feel like distinct spaces, but still connect smoothly when the player clicked a button. To achieve this, I kept track of a screen variable that updates whenever a mouse click falls within certain button coordinates. In the draw() function, I check the current value of screen and display the correct background and elements for that state.

At the same time, I focused on fluid character and enemy movement. For the girl, I downloaded a running GIF and converted it into a sprite sheet, then wrote logic to cycle through the sprite frames every time an arrow key is pressed, flipping the image when she moves left. The worm uses a similar sprite-sheet approach, but it continuously advances across the screen on its own, updating its frame at regular time intervals and reducing the player’s life if it escapes.

Coordinating these mechanics (screen transitions, sprite-sheet animation, and frame-by-frame movement) took careful planning and debugging, but it created a smooth and lively gameplay experience once everything clicked together.
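The click-routing part of that description boils down to a point-in-rectangle test per button, plus a lookup of which screen the button leads to. Here is a minimal sketch of that pattern; the button coordinates, screen names, and function names are all my own illustrative choices, not taken from the actual sketch:

```javascript
// Sketch: route mouse clicks to screen changes via rectangular buttons.
// Each button stores its rectangle and the screen it leads to.
const buttons = [
  { x: 120, y: 200, w: 160, h: 50, goTo: "play" },
  { x: 120, y: 270, w: 160, h: 50, goTo: "menu" },
];

// Point-in-rectangle hit test for one button.
function hitButton(mx, my, b) {
  return mx >= b.x && mx <= b.x + b.w && my >= b.y && my <= b.y + b.h;
}

// Called from p5's mousePressed(); returns the new screen name,
// or the current one if the click missed every button.
function routeClick(mx, my, current) {
  for (const b of buttons) {
    if (hitButton(mx, my, b)) return b.goTo;
  }
  return current;
}
```

In p5.js, `mousePressed()` would then just do `screen = routeClick(mouseX, mouseY, screen);`, and draw() switches on `screen` to render the right background and elements.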

I also experimented with adding interactive features, such as having the character jump on a worm when I move my hand or make a fist. Although I haven’t fully figured out how to implement motion-based controls yet, I’m actively exploring solutions and refining the concept.

In terms of visuals, I wanted the game to feel lively and unique, so I used AI tools to generate a cartoony illustration of the NYUAD campus to serve as the background for the different screens. This gives the game a playful, campus-specific atmosphere and saves time that would have gone into manual drawing.


My Work so Far