Week 10 Assignment

Concept

This week, Jayden and I made a small musical instrument inspired by the piano. The piano is easy to understand and fun to play, so it was a natural fit for our Arduino project. We used three switches to play notes, and we added a potentiometer that changes the pitch of the notes while playing, so the player can make higher or lower sounds with the same switches. Using the switches and potentiometer together makes the instrument more interactive and fun, giving the player control over both the notes and their frequency.

Video: Link

Schematic

Code:

// Mini Piano: 3 switches, Piezo buzzer, optional pitch adjustment

const int buzzerPin = 8;      // Piezo buzzer
const int switch1 = 2;        // Key 1 (C)
const int switch2 = 3;        // Key 2 (E)
const int switch3 = 4;        // Key 3 (G)
const int potPin = A0;        // Optional: pitch adjust

// Base frequencies for the notes (Hz)
int note1 = 262;  // C4
int note2 = 330;  // E4
int note3 = 392;  // G4

void setup() {
  pinMode(buzzerPin, OUTPUT);
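  // assumes external pull-down resistors on the switches, so a press reads HIGH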
  pinMode(switch1, INPUT);
  pinMode(switch2, INPUT);
  pinMode(switch3, INPUT);
  Serial.begin(9600); // optional for debugging
}

void loop() {
  // Read potentiometer to adjust pitch
  int potValue = analogRead(potPin);        // 0-1023
  float multiplier = map(potValue, 0, 1023, 80, 120) / 100.0;  // 0.8x to 1.2x

  bool anyKeyPressed = false;

  // Check switches and play corresponding notes
  if (digitalRead(switch1) == HIGH) {
    tone(buzzerPin, note1 * multiplier);
    anyKeyPressed = true;
  }

  if (digitalRead(switch2) == HIGH) {
    tone(buzzerPin, note2 * multiplier);
    anyKeyPressed = true;
  }

  if (digitalRead(switch3) == HIGH) {
    tone(buzzerPin, note3 * multiplier);
    anyKeyPressed = true;
  }

  // Stop sound if no switch is pressed
  if (!anyKeyPressed) {
    noTone(buzzerPin);
  }

  delay(10); // short delay for stability
}

GitHub link: Piano

Reflection

This week’s assignment felt highly interactive, building upon previous projects while introducing multiple input elements and user-controlled parameters. We learned how to combine both digital and analog inputs to create a responsive musical instrument. For future improvements, we would like to implement a more realistic note duration system, where each note fades out naturally after being played, similar to a real piano. Additionally, adding more switches and possibly multiple buzzers could allow for more complex melodies and chords, enhancing the expressive possibilities.

Week 10: Musical Instrument

Concept

The musical instrument that inspired our project for the week is the xylophone, an instrument that is both interesting and easy to navigate, which made it the perfect interactive musical instrument to create using an Arduino. We built it using three tinfoil strips that each play a different note when pressed, with a potentiometer controlling the frequency of every note to add dimension. The ability to control two different elements, the note played and its frequency, gives the participant the ability to create a range of sounds, diversifying the possible experiences that one can create with our instrument.

Demonstration

Schematic

Code Snippet

if (activeIndex >= 0) {
  int freq = int(baseFreqs[activeIndex] * multiplier);
  freq = constrain(freq, 100, 5000);

  tone(buzzerPin, freq);
  Serial.print("Strip "); Serial.print(activeIndex + 1);
  Serial.print(" freq=");
  Serial.println(freq);

  delay(50);
} else {
  noTone(buzzerPin);
}

This part of the code was the most challenging, as it controls all the important elements of our instrument. I would say it is where all the magic happens, bringing together both the analog and digital input to create the sound intended by the player of the instrument. It calculates the final tone by multiplying the base frequency by the potentiometer-derived multiplier, keeps the result within a safe audible range with constrain(), and then sends this signal to the buzzer. When no input is detected, the sound stops. Essentially, it acts as the core logic that translates user interaction into audible output.
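For context, activeIndex comes from scanning the three foil strips each loop. A minimal sketch of how that scan might look, assuming the strips are on digital pins with pull-ups so a touch reads LOW (the pin numbers and wiring here are illustrative, not necessarily our exact circuit):

// Illustrative only: find which foil strip is pressed
const int stripPins[3] = {2, 3, 4};  // assumed pin assignment (INPUT_PULLUP)

int readActiveStrip() {
  for (int i = 0; i < 3; i++) {
    if (digitalRead(stripPins[i]) == LOW) {
      return i;   // index of the first touched strip
    }
  }
  return -1;      // no strip touched, so the caller runs noTone()
}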

Full Code

Github Link

Reflection

This week’s assignment felt like a more interactive work, building on our previous projects and adding dimension with more elements. For future improvements we would like to limit the duration a note plays, similar to an actual xylophone, where a note stops a short time after being struck. This imitates a real instrument and gives the work a more natural, realistic feel. We would also like to allow playing multiple notes at once by connecting more foil strips and more buzzers. For future projects, I’d like to focus more on aesthetics and add more visual elements to the works.

Week 10 – The Diet Drum (Deema and Rawan)

Our Concept: 

Our project drew inspiration from last week’s readings on human-computer interaction, particularly the ways in which technology can respond to subtle human behaviors. We explored how interactive systems often mediate our engagement with the environment and even with ourselves, creating experiences that feel responsive, social, or even playful.

With this perspective, we asked ourselves: what if an instrument didn’t just make sound, but responded directly to human behavior? Instead of rewarding interaction, it could intervene. Instead of passive engagement, it could create a performative, almost social response.

From this idea, the Diet Drum emerged — a device that reacts whenever someone reaches for a snack. The system is both humorous and relatable, externalizing the human struggle of self-control. When a hand approaches the snack bowl, a servo-powered drumstick strikes, accompanied by a short melody from a passive buzzer. The result is a playful, judgmental interaction that transforms a familiar, internal tension into an amusing and performative experience.

How It Works

  • Photoresistor (LDR): Detects hand movements by monitoring changes in light. As a hand blocks the sensor, the reading drops.

  • Servo motor: Moves a drumstick to perform a percussive strike, physically reinforcing the “warning” aspect of the interaction.

  • Passive buzzer: Plays a short melody as a playful, auditory cue.

  • Arduino Uno: Continuously monitors the sensor and triggers both motion and sound.

When the LDR senses that a hand has blocked the light, the Arduino plays the melody on the buzzer and swings the servo to hit the drum (a minimal sketch of the trigger check follows). This creates a clear, immediate connection between what a person does and how the system responds, showing ideas from our readings about how devices can react to gestures and sensor input.
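A minimal sketch of that trigger logic, assuming a baseline light level is captured at startup and compared against the live reading (the pin, names, and threshold are illustrative):

// Illustrative trigger check; pin, names, and threshold are assumptions
const int ldrPin = A0;
const int MIN_DROP = 80;  // smallest drop in light that counts as a hand
int baseline;             // ambient level with nothing over the sensor

void setup() {
  baseline = analogRead(ldrPin);  // capture the baseline once at startup
}

void loop() {
  int drop = baseline - analogRead(ldrPin);  // how much darker it got
  if (drop >= MIN_DROP) {
    // hand detected: play the melody and strike (see the code highlight below)
  }
}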

Video Demonstration

assignment10

Challenges

Throughout development, we encountered several challenges that required both technical problem-solving and design iteration:

  • System reliability: While the setup initially worked smoothly, leaving it for some time caused it to fail. Diagnosing the problem took us a while because we didn’t know whether the fault was in the setup or the code. We used AI to help us narrow down whether the problem came from the wiring or the code, and after learning it was the wiring, we partially rebuilt and retuned the system to restore functionality.

  • Mechanical stability: Keeping the drumstick steady during strikes was more difficult than anticipated. Any slight movement or misalignment affected the accuracy and consistency of the strikes, requiring several adjustments.

  • Audio timing: The melody initially played too long, delaying servo motion and disrupting the intended interaction. Shortening the audio ensured that the strike and sound remained synchronized, preserving the playful effect.

  • We also used AI to help with some code difficulties, so the behavior would fit our original idea.

Code Highlights

One part of the code we’re especially proud of is how the sensor input is mapped to the servo’s movement.

float d = constrain(drop, MIN_DROP, MAX_DROP);
float k = (d - MIN_DROP) / float(MAX_DROP - MIN_DROP); 
int hitAngle = SERVO_HIT_MIN + int((SERVO_HIT_MAX - SERVO_HIT_MIN) * k);
unsigned long downMs = STRIKE_DOWN_MS_MAX - (unsigned long)((STRIKE_DOWN_MS_MAX - STRIKE_DOWN_MS_MIN) * k);

strikeOnce(hitAngle, downMs);

This makes the drumstick respond based on how close the hand is, so each action feels deliberate rather than just an on/off hit. It lets the system capture subtle gestures, supporting our goal of reflecting nuanced human behavior. AI helped us here in working out exactly when and how hard the strike should land.
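For reference, strikeOnce() itself can be as simple as a swing, hold, and return sequence. A minimal sketch, assuming a Servo object and a resting angle (the names and values are illustrative, not our full code):

#include <Servo.h>

Servo drumServo;            // assumed servo object, attached in setup()
const int SERVO_REST = 20;  // assumed resting angle in degrees

void strikeOnce(int hitAngle, unsigned long downMs) {
  drumServo.write(hitAngle);    // swing the stick down to the computed angle
  delay(downMs);                // hold briefly; a shorter hold reads as a sharper hit
  drumServo.write(SERVO_REST);  // lift the stick back to rest
}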

Future Improvements

Looking forward, we see several ways to expand and refine the Diet Drum:

  • Adaptive audio: Varying the melody or warning tone based on how close the hand is could enhance the playfulness and expressiveness.

  • Mechanical refinement: Improving the stability of the drumstick and optimizing servo speed could create smoother strikes and more consistent feedback.

  • Compact design: Reducing the size of the device for easier placement would make it more practical for everyday use.

  • Visual cues: Adding optional LEDs or visual signals could enhance the feedback, making the system even more engaging.

Github Link:

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/assignment10.ino

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/pitches.h


Week 10: Group Project “NYUAD DJ Kit”

Main Concept:

The main concept for our group project is a DJ, since we wanted to experience what it feels like to be one. A DJ needs to handle many sounds and instruments using their unique artistic skills to create music that makes people happy and excited. Thus, we crafted this device called “NYUAD DJ Kit.” By using it, you can choose different kinds of songs with various speeds and a bass sound produced by a wooden stick. This is a unique way to compose new kinds of songs as a DJ.

Demonstration Video


Schematic:


Code we’re particularly proud of:

The part of the code we’re most proud of is the one shown below. The if/else statement allows us to move to the next song and play it. When the button is pressed, meaning the pin reads LOW, we set buttonPressed to true and noteIndex to 0 so that the song plays from the beginning. We also used the modulo operator to ensure that we always wrap back to the first song after the last one. The else if branch resets buttonPressed to false, so that the next press plays the next song.

//handle music switching using modulo
if (digitalRead(BUTTON_PIN) == LOW && !buttonPressed) {
    buttonPressed = true;
    //move to the next song
    currentSong = (currentSong + 1) % 3; 
    //set note to 0 so that a song plays from the beginning
    noteIndex = 0; 
    isPlaying = false;
    noTone(BUZZER_PIN);
    delay(250); //delay for 250 milliseconds
  } else if (digitalRead(BUTTON_PIN) == HIGH) {
    //set buttonPressed to false to play next song 
    buttonPressed = false;
  }


The second snippet of code allows the servo to move every servoDelay milliseconds, controlling its speed and angle. We applied the concept we learned in class called “non-blocking” to ensure that this operation does not affect the rest of the program. Inside the if statement, we use the write() function from the Servo library to change the servo’s angle each time it runs. This way, the servo continues changing its angle until it reaches either 180° or 0°, incrementing by a step each time servoDelay milliseconds have passed. We’re happy that we were able to apply multiple concepts we learned in class, such as non-blocking and modulo, to our DJ project. As references, we used the example file ToneMelody from the Digital section and the Knob example from the Servo library. We also used ChatGPT to help us figure out how to apply the non-blocking concept so that the melody can move to the next one without affecting the rest of the program, which allows the servo to continue moving smoothly.

//Use non-blocking timing so the sweep doesn't stall the rest of the program
if (currentTime - lastServoUpdate >= servoDelay) { //if servoDelay ms have passed
    lastServoUpdate = currentTime;
    //Move the servo to the current angle, then advance by servoStep
    myservo.write(servoPos);
    servoPos += servoStep;
    //Reverse direction when the servo reaches 0 or 180 degrees
    if (servoPos >= 180 || servoPos <= 0) servoStep = -servoStep;
  }
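For context, this pattern relies on a few supporting variables. A minimal sketch of how they might be declared (the values are illustrative, not our exact tuning):

#include <Servo.h>

Servo myservo;
unsigned long lastServoUpdate = 0;  // time of the last servo update (ms)
unsigned long servoDelay = 15;      // ms between updates; smaller = faster sweep (assumed)
int servoPos = 0;                   // current servo angle, 0 to 180
int servoStep = 2;                  // degrees moved per update (assumed)

void loop() {
  unsigned long currentTime = millis();  // sampled once per pass
  // ...song switching and melody playback run here without being blocked,
  // then the snippet above updates the servo once servoDelay has elapsed...
}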


Link to GitHub: https://github.com/KimShota/Intro-to-IM/blob/13b494508781fc36c9b95d3b46e5145d18c06808/nyuad_dj.ino

Reflections & Future Improvements:

In terms of reflections, we struggled a lot to make the bass work because we needed to attach the wooden stick to the servo, and at first it was not stable at all. We attached it with tape, which was the primary reason it was unstable. As a result, every time we ran the servo fast, the stick fell off or the servo stopped working. We eventually made the stick and servo stable by placing some weight on top of the setup, so that no matter how fast the servo moved, the stick stayed in place.

As for future improvements, we want to enhance the quality of the bass, because right now we’re just using a wooden stick and it doesn’t produce a loud enough sound for a party situation. Furthermore, as the stick swings faster, its swing range becomes smaller, so we need to move the bottle manually to allow the stick to reach it. We believe this happens because servoDelay becomes too small, reaching about 1 ms, so the servo can’t physically keep up. Therefore, next time we should use constrain() on the mapped value so that noisy readings can’t push servoDelay below a floor the servo can actually follow. This way, the servo can keep up with the speed that we want.
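A small sketch of that fix, assuming the speed comes from a potentiometer (the pin and range values are illustrative):

// Illustrative: clamp the mapped reading so servoDelay never falls below
// a floor the servo can physically follow (servoDelay declared elsewhere)
const int SPEED_POT = A1;  // assumed pin

void updateServoDelay() {
  int raw = analogRead(SPEED_POT);                          // 0-1023, can be noisy
  servoDelay = constrain(map(raw, 0, 1023, 30, 1), 5, 30);  // floor at 5 ms
}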

Reading Reflection – Week 10

Bret Victor’s rant on the future of technology offered a very interesting take on technological design, and it made me realize the things we often ignore. He mentions how a glass screen completely misses the purpose of what our hands can do, which is to feel and manipulate, and how visions of future interaction seem to neglect that entirely by reducing interaction to swiping fingers across a flat screen.

Honestly, I can’t help but agree with him on this, but I also find it difficult to come up with a clear solution to this problem. Designing future technology in the “Pictures Under Glass” way is probably the easiest way to do so, which is likely why it became the default design. If we did take the abilities of our hands into account, depending on the purpose of the device you’re using, the design would differ drastically from one to another, making it extremely complex. Although I do agree with Victor’s frustration, I also think his critique shows how difficult innovation can be once a certain design dominates the industry.

In his follow-up, Victor admits that he didn’t offer a solution because the purpose of his rant was to encourage research rather than provide answers. I actually respect his honesty; it made his rant read more like an invitation than a complaint. I’m on board with his criticism of limiting designs and his call to instead imagine greater possibilities. Even though we don’t yet know what a “dynamic tactile medium” will look like, I think Victor’s ideas push us as readers to think deeply about how our bodies and technology should work together.

Week 10 – Reading Reflection

A Brief Rant on the Future of Interaction Design:

When I was reading Bret Victor’s “A Brief Rant on the Future of Interaction Design” I thought about how I perceive technology and design. I realized how easily I accept “innovations” like touchscreens as the peak of progress, even though, as Victor argues, they often limit our potential rather than expand it. His critique of “pictures under glass” especially resonated with me, because I use my phone and laptop every day, but I rarely think about how numb those interactions actually are. There is no real feeling, no texture, no sense of connection between my hands and what I’m creating.

I think this reading challenged me to imagine interfaces that feel alive, that respond to our touch and movement in meaningful ways. Victor’s idea that tools should “amplify human capabilities” made me wonder whether I am designing for convenience or for human expression. I started thinking about how interaction could involve more of the body, maybe through gestures, pressure, or sound, so that users could experience technology in a fuller, more emotional way. I also liked Victor’s reminder that “the future is a choice.” It gave me a sense of agency and responsibility as a future designer. Instead of waiting for big tech companies to define how we interact, I can be part of shaping alternatives that are more tactile, intuitive, and human-centered. This reading did not just critique existing designs, it inspired me to dream bigger and to treat design as a way of expanding what people can truly feel and do.

A follow-up article

These responses challenged the way I think about technology and its relationship to the human body. His insistence that our current interfaces are “flat and glassy” made me realize how limited most digital experiences truly are. I started questioning how often I accept these limitations without noticing them. The idea that our tools should adapt to us, not the other way around, feels both radical and necessary. What I found most striking was his defense of our hands and bodies as essential to understanding and creativity. It made me see touch not as something trivial, but as a form of intelligence. The thought that we could lose part of that richness by constantly interacting with lifeless screens feels unsettling.

As someone studying Interactive Media, I see this as a call to design technologies that reconnect people with the physical world. Instead of chasing the newest gadget, I want to think about how digital experiences could feel more alive, how they could move, resist, or respond in ways that make us aware of our own presence. The reflections did not just critique modern design, they opened a space for imagining interaction as something deeply human, sensory, and expressive.

Week 10 – Reading Reflection

Bret Victor’s rant made me rethink what we even mean when we call something “the future.” He argues that touchscreens, gesture controls, and all these “advanced” interfaces are actually making us less connected to our own abilities. Our hands are one of the deepest ways we understand the world. They know tension, pressure, texture. They think with us. But we’ve decided progress means tapping around on cold glass. When I read that, the first thing I thought of was LEGO. There is this unspoken language when you build: the way your fingers already know which brick fits, the tiny resistance before a perfect click. That sound. That feeling. It’s not just play; it is intelligence happening through the body. No screen has ever replicated that.

I’ve tried the digital LEGO builders before, and they always feel wrong. You can assemble something on the screen, sure, but there is no weight, no friction, no small ritual of digging through pieces and recognizing one by touch alone. Same with crocheting. The yarn runs differently through your fingers depending on tension, mood, the hook, your posture. You feel progress. You feel mistakes. Your hands correct before your mind catches up. Victor’s point clicked for me here: creativity is not just in the mind. It is in the wrists, fingertips, joints, and muscle memory. When interfaces ignore the body, they are not futuristic. They are incomplete.

The responses page made it clear he is not saying we need to go backwards. He is saying we should refuse a future that flattens our senses. There are richer, more human possibilities if we let our full selves participate in the interaction. For me, the future I want is textured, clickable, tuggable, threaded, snapped together. A future that feels like LEGO: discovery through touch, play, accident, correction, and joy. Innovation that doesn’t just live on a screen, but lives in your hands.

W10: Instrument

Inspiration

The xylophone has always fascinated me. I loved watching the vibrant melodies come to life as each bar was tapped. This inspired me to create a digital version using everyday materials, giving the classic xylophone a modern, interactive twist.

Concept

The idea was simple yet playful: use aluminum foil as the xylophone buttons. Each strip of foil represents a note, and tapping it triggers a sound. To bring in the concept of tuning (something I deeply appreciate from my experience playing the violin), we incorporated a potentiometer. This allows the user to adjust the pitch of each note in real time, just as a musician tunes their instrument before performing. By combining tactile interaction with the flexibility of pitch control, we aimed to create an instrument that feels both familiar and innovative.


Code I’m Most Proud Of

int potVal = analogRead(potPin);
float multiplier = map(potVal, 0, 1023, 60, 180) / 100.0;

if (activeIndex >= 0) {
    int freq = int(baseFreqs[activeIndex] * multiplier);
    freq = constrain(freq, 100, 5000);

    tone(buzzerPin, freq);
    delay(50);
} else {
    noTone(buzzerPin);
}

What makes this snippet special is how it turns a simple analog input into musical expression. By mapping the potentiometer to a frequency multiplier of 0.6x to 1.8x, each foil strip produces a different tone that can be adjusted on the fly; with the pot centered, for example, C4’s 262 Hz base rises to roughly 314 Hz. Using constrain() ensures the sounds remain within a safe, audible range. It was rewarding to see how these functions, which we learned in class, could be combined to create a tactile, musical experience.

Future Improvements

Right now, the instrument plays a sound as long as the foil is touched. In the future, I’d like to add note duration control so that pressing a strip produces a single tone, similar to how a piano note behaves, with a possible fade-out effect when the note ends. This would make the interaction feel more natural and musical.

Another exciting improvement could be a wireless “stick” that triggers the foil strips remotely. This would allow the musician to move freely and perform more expressively, opening up new possibilities for live interaction and playability.


Week 10 – Music instrument

Concept:

As we began planning how to present our musical instrument and the sound produced by the buzzer while using both digital and analog sensors, we decided to use a button as our digital sensor and a distance-measuring sensor as our analog sensor. The main concept is that the distance sensor detects how far away an object or hand is and, based on that distance, produces different notes through the buzzer. When the button is pressed, the system pauses for 300 milliseconds (0.3 seconds) and temporarily stops reading distance values, effectively muting the instrument. Pressing the button again reactivates the sensor, allowing the instrument to continue playing notes according to the object’s position. This simple toggle system makes it easy to control when the instrument is active, giving users time to experiment with the sound and movement (a minimal sketch of the toggle follows).
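A minimal sketch of that toggle, assuming the button is on a pull-up input so a press reads LOW (the pin and helper name are illustrative):

// Illustrative mute toggle; pin and helper name are assumptions
const int BUTTON = 2;      // assumed pin, wired with INPUT_PULLUP
bool sensorActive = true;  // whether the instrument is currently live

void checkMuteButton() {
  if (digitalRead(BUTTON) == LOW) {  // button pressed
    sensorActive = !sensorActive;    // mute or unmute the instrument
    delay(300);                      // 300 ms pause so one press toggles once
  }
}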

Arduino Setup and Demonstration: 

Schematic Diagram:
Setup:

Video:

Highlight of the code:
Full code on GitHub

if (sensorActive) {
    int distance = getDistance();
    Serial.println(distance);
    // 5 cm bands; >= on the lower bound closes the gaps at exactly 5, 10, ... cm
    if (distance >= 1 && distance < 5) {
      tone(BUZZER, NOTE_C4);
    } else if (distance >= 5 && distance < 10) {
      tone(BUZZER, NOTE_D4);
    } else if (distance >= 10 && distance < 15) {
      tone(BUZZER, NOTE_E4);
    } else if (distance >= 15 && distance < 20) {
      tone(BUZZER, NOTE_F4);
    } else if (distance >= 20 && distance < 25) {
      tone(BUZZER, NOTE_G4);
    } else if (distance >= 25 && distance < 30) {
      tone(BUZZER, NOTE_A4);
    } else if (distance >= 30 && distance < 35) {
      tone(BUZZER, NOTE_B4);
    } else {
      noTone(BUZZER);
    }
  }

We would say the highlight of the code is the part where the distance-measuring sensor interacts with the buzzer to produce different musical notes based on how close or far an object is. We were excited to experiment with the distance-measuring sensor and explore how it could be programmed to produce sound through the buzzer. To get a better understanding of integrating the notes, we referred to Arduino tutorials.

In the code above, the sensor continuously measures the distance of an object and converts the time it takes for the sound wave to bounce back into centimeters. Depending on the measured distance, the buzzer plays different musical notes, creating a simple melody that changes as the object moves closer or farther away.
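As a reference, getDistance() follows the usual ultrasonic pattern of timing the echo pulse. A minimal sketch, assuming an HC-SR04-style sensor (the pin numbers are illustrative):

// Illustrative getDistance() for an HC-SR04-style sensor; pins are assumptions
const int TRIG = 9;
const int ECHO = 10;

int getDistance() {
  // send a 10 microsecond trigger pulse
  digitalWrite(TRIG, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG, LOW);

  // echo pulse width in microseconds; sound travels about 0.034 cm per
  // microsecond, halved because the wave travels out and back
  long duration = pulseIn(ECHO, HIGH, 30000UL);  // 30 ms timeout returns 0
  return int(duration * 0.034 / 2);              // distance in cm
}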

Reflection:

We enjoyed experimenting with the distance-measuring sensor as it taught us how precise sensor readings can be transformed into meaningful outputs like sound, and combining it with the button control helped us manage the instrument’s activation smoothly. For future improvements, we would like to expand the range of notes to create more complex melodies and add LED lights that change color with the pitch to make the instrument more visually engaging. We could also experiment with different sensors, such as touch or motion sensors, to add more layers of interactivity. Finally, refining the accuracy and response speed of the sensor would make the sound transitions smoother and enhance the overall experience.