Week 10 – Musical instrument (Group Project with Isabella)

Concept

Mini Piano

Given that we had to make a musical instrument, we were inspired by a project we found on YouTube that looked like a piano. From this, we aimed to build a smaller-scale version of that project while following the requirements for the assignment. The circuit consists of one piezo speaker, a potentiometer as an analog sensor, and three buttons as digital sensors.

Circuit Illustration

Figure 2: Circuit design with code and simulation on “Magnificent Jaiks” by Abdelrahman

Final Results

VIDEO


What I’m Proud Of

One aspect of this project I’m particularly proud of is the octave multiplier implementation. Instead of having fixed notes, I used the potentiometer to create a continuous pitch control that multiplies the base frequencies by a factor between 0.5x and 2.0x. This simple line of code:

float octaveMultiplier = map(potValue, 0, 1023, 50, 200) / 100.0;

transforms a basic three-note instrument into something much more expressive and fun to play with. You can play the same melody in different registers, which makes the instrument feel more like a real musical tool rather than just a tech demo.
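To make the mapping concrete, here's a small standalone sketch (plain C++, with Arduino's integer `map()` re-implemented so it can run off-board; the 262 Hz base note, middle C, is my own example value, not one from our sketch):

```cpp
// Re-implementation of Arduino's integer map() so this runs off-board.
long arduinoMap(long x, long inMin, long inMax, long outMin, long outMax) {
  return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Pot reading (0-1023) -> multiplier (0.5x-2.0x), applied to a base note.
float scaledNote(int potValue, float baseFreq) {
  float octaveMultiplier = arduinoMap(potValue, 0, 1023, 50, 200) / 100.0;
  return baseFreq * octaveMultiplier;
}
```

At the knob's two extremes, a 262 Hz base note becomes 131 Hz and 524 Hz, exactly one octave down and one octave up.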

Challenges and Further Improvements

Despite the complexity of the circuit, Abdelrahman produced a design in Magnificent Jaiks that I was able to follow without any issues. After arranging the jumper wires and all the other pieces, the final result was what we had aimed for. One of the biggest challenges we faced was finding a way to divide the work, since we were in different locations. Nevertheless, we made a plan after meeting on Zoom and finished our circuit on time. For future projects, one thing I would like to improve on is building an instrument of this kind with more buttons, a larger version of the mini piano made for this assignment.

Week 10 Reading Response

What I found most interesting in this reading was how the author explained that the point of his rant was not to give answers but to make people notice a problem. He wanted readers to realize that the way we design technology often ignores the human body and how it naturally interacts with the world. I liked how he compared the iPad to early black-and-white photography, saying that while it was revolutionary, it was also incomplete. That comparison made sense to me because it showed that something can be both amazing and limited at the same time. The author’s honesty about not having a solution made the whole thing feel more genuine. It felt less like a complaint and more like a challenge for people to think differently.

The part that stayed with me most was when he questioned what kind of future we are choosing if we keep creating tools that bypass the body. He described how we already spend so much of our lives sitting still in front of screens and how that could become permanent if we are not careful. I thought that was a powerful warning, especially when he said that children might become “finger-blind” by using touchscreens instead of exploring the world with their hands. It made me think about how technology can quietly change what it means to grow, learn, and create. By the end, I understood that his real message was about balance. Innovation should not mean leaving behind what makes us human.

Week 10 Reading Response

What really stuck with me from this reading is how much we’ve lost touch with using our hands in meaningful ways. The author points out that most of our modern “future” tech ignores what our hands are actually capable of, and that made me pause. I never thought about how numb the experience of using a phone screen really is until it was compared to holding a book or a cup. When I read that part, I actually picked up the notebook on my desk and noticed how I could feel its weight shift as I moved it. I don’t think I’ve ever noticed something that small before.

It reminded me of when I used to cook during a leave of absence, while living alone in Abu Dhabi. Chopping vegetables always felt natural, like my hands just knew what to do without thinking. It’s such a different kind of attention from what happens when I’m tapping on a touchscreen. That difference feels important now. I think the reading made me realize that technology doesn’t have to pull us away from the physical world. It can work with it. Maybe the future shouldn’t just look sleek or advanced, but should also let us feel connected to what we’re doing again.

Week 9 assignment

Concept

So basically, I made this little setup where I can control two LEDs using a knob and a button. One fades in and out smoothly when you twist the knob, and the other just pops on and off when you hit the button. It’s honestly pretty straightforward, but it was actually kind of fun to put together.

Demo

How It Works

The Knob: There’s a potentiometer; when you turn it, one of the LEDs gets brighter or dimmer. Twist one way and the light fades down, twist the other way and it gets brighter.

The Button: The other LED is even simpler. Press the button, light’s on. Let go, light’s off. That’s it.

Code

int pushButton = 2;
int led1 = 4;
int potentiometer = A5;
int led2 = 3;

void setup() {
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  pinMode(pushButton, INPUT);
  pinMode(potentiometer, INPUT);
}

void loop() {
  // digital sensor (switch) controlling ON/OFF state of LED
  int buttonState = digitalRead(pushButton);
  if (buttonState == HIGH) {
    digitalWrite(led1, HIGH);
  } else {
    digitalWrite(led1, LOW);
  }

  // analog sensor (potentiometer) controlling the brightness of LED
  int potValue = analogRead(potentiometer);
  int brightness = map(potValue, 0, 1023, 0, 255);  // scale 0-1023 down to 0-255 for PWM
  analogWrite(led2, brightness);
}

Future Improvements

Making It More Interactive: I could add more sensors to make it do cooler stuff. Like maybe a light sensor so the LEDs automatically adjust based on how bright the room is, or a temperature sensor that changes the colors based on how hot or cold it is.

Adding Colors: Right now it’s just basic LEDs, but I could swap them out for RGB LEDs. Then the potentiometer could control not just brightness but also what color shows up. Turn the knob and watch it cycle through the rainbow. That’d be way more visually interesting.
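As a rough sketch of that idea (plain C++ so the math is checkable off-board; the three-ramp hue wheel and function names are my own illustration, not tested hardware):

```cpp
// Map a 0-1023 pot reading onto a 0-255 position on a simple hue wheel.
int potToHue(int potValue) {
  return potValue * 255L / 1023;  // long math avoids 16-bit int overflow on AVR
}

// Expand a hue position into R/G/B duty cycles for three PWM pins.
// The wheel is three linear ramps: red->green, green->blue, blue->red.
void hueToRgb(int hue, int &r, int &g, int &b) {
  if (hue > 254) hue = 254;       // keep the segment index in 0..2
  int seg = hue / 85;             // which third of the wheel
  int off = (hue % 85) * 3;       // position inside that third, 0..252
  if (seg == 0)      { r = 255 - off; g = off;       b = 0;         }
  else if (seg == 1) { r = 0;         g = 255 - off; b = off;       }
  else               { r = off;       g = 0;         b = 255 - off; }
}
```

In a real sketch, the three outputs would each feed an `analogWrite()` on a PWM pin wired to one leg of the RGB LED.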

Week 9: Making Interactive Art: Set the Stage, Then Shut Up and Listen

Tom Igoe’s Making Interactive Art: Set the Stage, Then Shut Up and Listen made me think differently about what it really means to create something interactive. I liked how straightforward and almost blunt his advice was; don’t interpret your own work. That simple statement hit harder than I expected. I’ve noticed how easy it is to over-explain a project, especially when you’ve spent so much time building it. You want people to “get it,” so you tell them what it means, how to use it, and what to feel. But Igoe’s point is that doing that takes away the audience’s role completely. It turns interaction into instruction. That idea made me reflect on my own creative habits, especially how often I try to control an outcome rather than trust that people will find their own meaning through what I make.

I like how he compares interactive art to directing actors. A good director doesn’t feed emotions line by line; they set up the environment and let the actor discover the truth of the scene themselves. In the same way, an artist should set up a space or a system that invites exploration without dictating what’s supposed to happen. Igoe’s line about how the audience “completes the work” stayed with me because it reframes what success means in interactive art. It’s not about whether people interpret your piece exactly as you intended; it’s about whether they engage, react, and create their own story inside it. I think that takes a certain amount of humility and patience as an artist. You have to build something that’s open enough to allow surprise, even misunderstanding, and see that as part of the process instead of a failure.

Week 9: Physical Computing’s Greatest Hits (and misses)

Reading Tom Igoe’s Physical Computing’s Greatest Hits (and Misses) made me think about how creative patterns tend to repeat, especially in hands-on, tech-driven projects. Igoe doesn’t frame repetition as a lack of originality, which I appreciated. Instead, he treats these recurring themes—like theremin-like instruments, drum gloves, or video mirrors—as open-ended ideas that people keep revisiting because they’re fun, expressive, and adaptable. I related to that idea a lot because I’ve definitely hesitated before, thinking something wasn’t worth doing since it had “been done before.” But the more I read, the more I agreed with his point that originality isn’t about inventing from scratch; it’s about finding your own way into an existing form. What stood out to me were the projects that relied on the body as the main input—like gloves that create rhythm through tapping or instruments that react to gestures. Those projects feel personal and direct, and I like how they blend instinct with technology. Igoe’s descriptions made me realize that the best physical computing ideas don’t just respond to touch or movement; they build a relationship with the person using them.

Some parts of the reading also made me laugh or nod along because I’ve seen those same trends pop up in classes or exhibits. The “video mirrors,” for instance, are always visually striking but usually shallow in interaction—you wave, something moves, and that’s it. Igoe’s critique there made sense. It reminded me that while technology can catch attention, meaning comes from how people connect to it, not just how it looks. I was also drawn to the more poetic examples like “Mechanical Pixels” or “Fields of Grass,” where simple mechanisms create quiet, almost meditative experiences. Those pieces blur the line between machine and nature, which I find really compelling. Even the sillier categories, like “Things You Yell At,” showed how emotional interaction can be when it’s physical and immediate. Overall, the article made me think about how I might approach projects differently: not trying to avoid what’s been done, but trying to make it feel a bit more like me.

Week 8 – Unusual Switch

Concept

I’ve been thinking a lot about how we usually interact with electronics. It’s almost always with our hands: pressing buttons, turning knobs, typing. So I’m glad we got to try something different. My idea for this project is a foot-activated Arduino switch made with nothing more than aluminum foil, tape, socks, and a little curiosity.

The idea is simple. When your feet touch, they complete a circuit, and when they move apart, the circuit opens again. Your body becomes the bridge that carries a tiny signal. I wrapped strips of foil around my feet (over socks), taped them so they wouldn’t slip, and connected one to pin 2 on the Arduino and the other to ground. With the pin’s internal pull-up resistor enabled, the pin normally sits at HIGH; when the two foils touch, the pin is pulled to ground and the Arduino reads LOW, meaning the circuit is complete.

Demonstration

IMG_5452

Code

int footSwitchPin = 2;   
int ledPin = 13;         

void setup() {
  // INPUT_PULLUP keeps the pin at a stable HIGH when the foils are apart,
  // so it doesn't float and flicker randomly.
  pinMode(footSwitchPin, INPUT_PULLUP);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int footState = digitalRead(footSwitchPin);  //check if feet are touching

  if (footState == LOW) {
    digitalWrite(ledPin, HIGH);   //feet together = LED ON
  } else {
    digitalWrite(ledPin, LOW);    //feet apart = LED OFF
  }
}

Challenges and Improvements 

The biggest challenge was stability. The foil sometimes slipped or wrinkled, breaking contact even when my feet were touching. The tape would loosen after a few tries, so I had to adjust it constantly.

On the creative side, it would be fun to connect it to a sound or visual program on a computer. For example, every time your feet meet, a sound plays or a color changes on screen. That could turn this tiny experiment into a music or art performance piece.

Week 8 Reading Reflection – Attractive Things Work Better

I didn’t expect to agree so much with Don Norman when I first read Attractive Things Work Better. The title itself almost sounds like clickbait; it feels like he’s just saying “make things pretty.” But as I went through the reading, I realized he wasn’t talking about surface-level beauty at all. He was talking about how our emotions actually change the way we think and interact with design. One thing that really stuck with me was his story about the three teapots. He describes one that’s intentionally unusable (the “coffeepot for masochists”), another that’s kind of ugly but works well, and a third that’s beautifully designed and practical at the same time. It sounds funny at first, but it captures something real about how we connect with the things we use.

Norman also connects emotion and cognition in a way that made a lot of sense. He explains how positive affect (basically being in a good mood) helps people think more creatively and handle problems better, while negative affect makes us focus and think more carefully but also more rigidly. I liked his point that when people are relaxed or happy, they become more flexible, patient, and open-minded; they’re even more forgiving of little design flaws. That feels true in life too. When I’m in a good mood, I don’t mind if my phone glitches or my M1 MacBook gets loud; but, as you can imagine, when I’m stressed, every small thing feels like a disaster. When something looks and feels good to use, it changes our attitude toward it; we engage more willingly and think more clearly. I liked how Norman ended by saying that good design balances beauty and usability. A product shouldn’t be just functional or just nice to look at; it should feel right to use.

Week 8 Reading Reflection – Her Code Got Humans on the Moon

Well, that photo of Margaret Hamilton standing next to a pile of papers taller than she is has amazed me since I was a kid. I remember seeing it everywhere online when I was younger; it circulated a lot in the Arab world, on Facebook and Tumblr especially. People shared it with captions about how her code took humans to the moon. Even before I knew who she was, I could feel that the image meant something special: this one person beside a literal mountain of her own work.

After reading Her Code Got Humans on the Moon, I finally understood why that picture had such an impact. Hamilton wasn’t just part of the Apollo program; she helped define what software engineering even was. Back in the 1960s, people didn’t really think of “software” as an important part of space missions; it wasn’t even included in NASA’s original budgets. But Hamilton and her team at MIT’s Instrumentation Lab changed that. They wrote the code that ran on the Apollo spacecraft, and their work made it possible for Neil Armstrong and Buzz Aldrin to land safely on the moon. What struck me most in the reading was how she handled the pressure. There’s a part where she talks about staying up late to fix a single line of code because she was afraid it could cause an error during a mission. And later, that actually happened—an error almost caused chaos during Apollo 11, but because of how Hamilton had designed the software to prioritize important tasks, the system recovered and saved the mission. That’s insane to think about; one person’s attention to detail made the difference between failure and success.

I also liked how the reading mentioned her bringing her daughter, Lauren, to the lab on weekends. It was such a human detail—this image of a mother working on code that would go to space while her kid slept next to her. People back then questioned her for it, asking how she could leave her daughter to work, but she just did what she believed in. That kind of dedication hit me.

Midterm Project – Operation: Campus Cat

Project Concept
Operation: Campus Cat is a fast-paced game inspired by the beloved community cats of NYU Abu Dhabi. Set against the backdrop of a stylized campus map, players must protect their food from a hungry, mischievous orange cat who roams freely and relentlessly across the scene. It’s a tongue-in-cheek interpretation of a very real situation many NYUAD students have experienced: trying to eat while a campus cat watches… and slowly approaches.

While planning this game, I intended to blend light strategy, reflex-based mechanics, and playful visuals based on NYUAD’s campus. The tone is humorous but still grounded in campus life, and, quite frankly, don’t expect a fantasy game about fighting cats, but rather a funny tribute to the cats who rule the Interactive Media garden and food court. Operation: Campus Cat aims to turn a slice of real NYUAD culture into an accessible, replayable p5.js browser game. So, if you happen to be one of our campus cats’ victims, if they ever stole your food, I hope this makes you feel better in some way!

How the Game Works
Well, the core loop is pretty simple: food spawns randomly on the screen every few seconds, and the player must click each food item before the cat reaches it. Each successful click earns 5 points. But if the cat eats a food item, the player loses 2 points and the “cat ate” counter goes up. Once the cat eats 5 items in a round, or if the round timer hits 0, the player loses one of their 3 lives. Once all lives are gone, the game ends with a final score.

The cat’s movement isn’t passive; it actively chases the nearest food using simple vector math. It glides across the campus map toward its next target, making the player prioritize which items to save. Clicking on the cat itself instead of the food will make it temporarily disappear (a “Signal Lost” message appears in its place), but doing so costs 3 points. Imagine you’re using a satellite to track the cats’ movement. This is basically it! This mechanic creates a high-stakes trade-off: delay the cat briefly, or focus on clearing food? Rounds last 60 seconds, and the player must keep moving fast and making strategic decisions.
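The scoring rules described above can be collected into one small function (a sketch with illustrative names; the point values are the ones from this write-up, but in the real game this logic is spread across the Game and Cat classes):

```javascript
// One state-update step for the scoring rules described above.
// state: { score, catAte, lives }
function applyEvent(state, event) {
  if (event === "playerClickedFood") state.score += 5;
  if (event === "catAteFood") { state.score -= 2; state.catAte += 1; }
  if (event === "playerClickedCat") state.score -= 3; // the "Signal Lost" trade-off
  if (state.catAte >= 5) { state.lives -= 1; state.catAte = 0; } // cat ate 5: lose a life
  return state;
}
```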

A full-screen, responsive HUD shows the score, remaining lives (as hearts), the number of missed food items in the current round, and a countdown timer. The game also features a start screen, an instruction screen, and a game over screen, with appropriate transitions and buttons to replay or restart the session.

Code Snippet
Here’s the logic for the cat’s food-chasing behavior, which uses distance checks and angle math:

const angle = atan2(nearestFood.y - this.y, nearestFood.x - this.x);
this.x += cos(angle) * this.speed;
this.y += sin(angle) * this.speed;
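The “distance check” half, picking which food item to chase, isn’t shown above; a minimal version looks something like this (`dist()` mirrors p5.js’s helper, and `findNearestFood` is an illustrative name, not necessarily the method in my Cat class):

```javascript
// Same formula as p5.js's dist(): straight-line distance between two points.
function dist(x1, y1, x2, y2) {
  return Math.hypot(x2 - x1, y2 - y1);
}

// Scan all food items and return the one closest to the cat (null if none).
function findNearestFood(cat, foods) {
  let nearest = null;
  let best = Infinity;
  for (const food of foods) {
    const d = dist(cat.x, cat.y, food.x, food.y);
    if (d < best) { best = d; nearest = food; }
  }
  return nearest;
}
```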

The Food class includes a pulse animation using a sine wave to make items feel more alive and clickable:

const pulse = map(sin(frameCount * 0.1 + index), -1, 1, 0.85, 1.15);

The game is organized using object-oriented design. The core classes are:

  1. Game: Overall state manager (start, play, game over)
  2. Cat: Handles cat behavior, movement, hiding state
  3. Food: Controls food spawning, visuals, and interaction
  4. HUD: Manages the interface and gameplay data display
  5. Button: A reusable UI component for menus and controls

Assets like images and sound effects are loaded via the Assets object, with fallback logic in case of load failure (e.g., drawing simple shapes instead of broken images). This ensures that even with missing files, the game still runs and remains playable.

What I’m Proud Of

  1. Game’s background

    I made this in freshman year while taking a core design class here at NYUAD. When I was looking for a drawing that shows our campus from above, this one worked perfectly! The only time-consuming part was finding the right palette, one that is both relatable and playful and suits the mood of the game. I decided to make it greener and to make the top of the Campus Center resemble the head of a cat (not sure if it shows). As I said, choosing the colors was challenging, so ChatGPT helped me with the color choice as well.
  2. One section of the code I’m particularly proud of is the pulsing animation inside the Food class. It’s a small visual detail, but it adds a lot of liveliness to the screen. Each food item subtly “breathes” using a sine wave function, making it feel dynamic and easy to spot. This animation helps guide player attention and makes the gameplay feel more polished.

    // Inside Food.draw()
    const pulse = map(sin(frameCount * 0.1 + index), -1, 1, 0.85, 1.15);
    const img = Assets.img.food[this.imageKey];
    
    if (img && img.width > 0) {
      push();
      translate(this.x, this.y);
      scale(pulse);
      imageMode(CENTER);
      image(img, 0, 0, this.size * 2, this.size * 2);
      pop();
    }
    

    This little animation uses sin(frameCount * 0.1) to smoothly oscillate each food’s scale over time, creating a soft pulsing effect. I like this snippet because it shows how much visual impact can come from just a few lines of math and thoughtful timing, no extra assets or libraries needed. It makes the entire game feel more animated and alive without adding any performance cost.

Challenges & Areas for Improvement

One of the biggest challenges was cat movement; initially the cat was too fast or too slow, or would teleport unexpectedly. I had to tune the speed and collision radius multiple times to make it feel fair. Similarly, I ran into trouble with image preloading: sometimes food items or the campus map would fail to load. I added fallback logic so that the game shows a colored circle if images fail.

In terms of gameplay, it currently doesn’t scale difficulty; every round is the same length, and the cat moves at a constant speed. In future updates, I’d like to introduce progressive rounds where spawn intervals shorten and the cat gets faster. Other ideas include adding multiple cats, special food items, or power-ups like “cat repellent” or “freeze time.”
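One way to sketch that progressive difficulty (the specific numbers and names here are placeholders, not tuned values from the game):

```javascript
// Per-round difficulty: shorter spawn gaps and a faster cat, with a floor
// and a ceiling so late rounds stay physically playable.
function difficultyForRound(round) {
  return {
    spawnIntervalMs: Math.max(800, 3000 - (round - 1) * 250),
    catSpeed: Math.min(6, 2 + (round - 1) * 0.5),
  };
}
```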

Lastly, while the game runs in fullscreen and resizes on window change, it’s not yet optimized for mobile/touch input, which would make it more accessible to a wider audience. Touch support and gesture input would be a major next step.