Week 10: Musical Instrument

Concept

The instrument that inspired this week’s project is the xylophone: interesting to play yet easy to navigate, which made it a perfect candidate for an interactive instrument built with Arduino. Our version uses three tinfoil strips that each play a different note when pressed, while a potentiometer scales the frequency of every note, adding another dimension. Controlling two elements at once, the note played and its frequency, gives the participant a wide range of sounds and diversifies the experiences they can create with the instrument.

Demonstration

Schematic

Code Snippet

if (activeIndex >= 0) {
    // Scale the strip's base note by the potentiometer multiplier
    int freq = int(baseFreqs[activeIndex] * multiplier);
    freq = constrain(freq, 100, 5000); // keep the tone in a safe, audible range

    tone(buzzerPin, freq);
    Serial.print("Strip ");
    Serial.print(activeIndex + 1);
    Serial.print(" freq=");
    Serial.println(freq);

    delay(50); // brief hold so each press registers cleanly
  } else {
    noTone(buzzerPin); // no strip pressed: silence the buzzer
}

This was the most challenging part of the code, since it controls every important element of our instrument. It is where all the magic happens, bringing the analog and digital inputs together to produce the sound the player intends. It calculates the final tone by multiplying the base frequency by the potentiometer-derived multiplier, keeps the result within a safe range, and sends the signal to the buzzer. When no input is detected, the sound stops. Essentially, it is the core logic that translates user interaction into audible output.
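
For context, the snippet above depends on activeIndex and multiplier being set earlier in loop(). The sketch below is a hypothetical version of that input stage; the pin numbers, the pull-up wiring, and the 0.5x–2.0x multiplier range are assumptions rather than our exact values.

// Hypothetical input stage feeding the snippet above (pins and ranges assumed)
const int stripPins[3] = {2, 3, 4}; // tinfoil strips wired as digital inputs
const int potPin = A0;              // potentiometer controlling the multiplier

void setupInputs() {
  for (int i = 0; i < 3; i++) pinMode(stripPins[i], INPUT_PULLUP);
}

int readActiveStrip() {
  // Return the index of the first pressed strip, or -1 when none is pressed
  for (int i = 0; i < 3; i++) {
    if (digitalRead(stripPins[i]) == LOW) return i;
  }
  return -1;
}

float readMultiplier() {
  // Map the raw 0-1023 pot reading onto a 0.5x-2.0x frequency multiplier
  return 0.5 + (analogRead(potPin) / 1023.0) * 1.5;
}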

Full Code

GitHub Link

Reflection

This week’s assignment felt more interactive than our previous projects, building on them and adding dimension through more elements. For future improvements, we would like to limit how long a note sustains, similar to an actual xylophone, where a note fades a short time after being struck. That would give the work a more natural, realistic feel. We would also like to support playing multiple notes at once by connecting more foil strips and additional buzzers. For future projects, I’d like to focus more on aesthetics and add more visual elements to the work.

Week 10 – The Diet Drum (Deema and Rawan)

Our Concept: 

Our project drew inspiration from last week’s readings on human-computer interaction, particularly the ways in which technology can respond to subtle human behaviors. We explored how interactive systems often mediate our engagement with the environment and even with ourselves, creating experiences that feel responsive, social, or even playful.

With this perspective, we asked ourselves: what if an instrument didn’t just make sound, but responded directly to human behavior? Instead of rewarding interaction, it could intervene. Instead of passive engagement, it could create a performative, almost social response.

From this idea, the Diet Drum emerged — a device that reacts whenever someone reaches for a snack. The system is both humorous and relatable, externalizing the human struggle of self-control. When a hand approaches the snack bowl, a servo-powered drumstick strikes, accompanied by a short melody from a passive buzzer. The result is a playful, judgmental interaction that transforms a familiar, internal tension into an amusing and performative experience.

How It Works

  • Photoresistor (LDR): Detects hand movements by monitoring changes in light. As a hand blocks the sensor, the reading drops.

  • Servo motor: Moves a drumstick to perform a percussive strike, physically reinforcing the “warning” aspect of the interaction.

  • Passive buzzer: Plays a short melody as a playful, auditory cue.

  • Arduino Uno: Continuously monitors the sensor and triggers both motion and sound.

When the LDR senses that a hand has blocked the light, the Arduino triggers the short buzzer melody and swings the servo-mounted drumstick to hit the drum. This creates a clear, immediate connection between what a person does and how the system responds, illustrating ideas from our readings about how devices can react to gestures and sensor input.
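
A minimal sketch of that trigger loop might look like the following; the pin numbers, the baseline light level, and the MIN_DROP threshold are assumptions, and the single tone() call stands in for the actual melody.

#include <Servo.h>

const int ldrPin = A0;     // assumption: LDR voltage divider on A0
const int buzzerPin = 8;   // assumption
const int servoPin = 9;    // assumption
const int baseline = 800;  // light reading with no hand present (calibrated)
const int MIN_DROP = 150;  // smallest drop in light that counts as a reach

Servo drumServo;

void setup() {
  pinMode(buzzerPin, OUTPUT);
  drumServo.attach(servoPin);
  drumServo.write(0); // drumstick raised
}

void loop() {
  int drop = baseline - analogRead(ldrPin); // how much light the hand blocks
  if (drop > MIN_DROP) {
    tone(buzzerPin, 440, 150); // stand-in for the short warning melody
    drumServo.write(90);       // strike
    delay(120);
    drumServo.write(0);        // return to rest
    delay(200);                // brief cooldown before re-triggering
  }
}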

Video Demonstration

assignment10

Challenges

Throughout development, we encountered several challenges that required both technical problem-solving and design iteration:

  • System reliability: While the setup initially worked smoothly, leaving it for some time caused it to fail. Diagnosing the problem took a while because we didn’t know whether the fault lay in the wiring or in the code. We used AI to help us devise checks that could isolate the source, and once we confirmed the wiring was at fault, we partially rebuilt and retuned the system to restore functionality.

  • Mechanical stability: Keeping the drumstick steady during strikes was more difficult than anticipated. Any slight movement or misalignment affected the accuracy and consistency of the strikes, requiring several adjustments.

  • Audio timing: The melody initially played too long, delaying servo motion and disrupting the intended interaction. Shortening the audio ensured that the strike and sound remained synchronized, preserving the playful effect.

  • We also used AI to work through some code difficulties so the final behavior would fit our original idea.

Code Highlights

One part of the code we’re especially proud of is how the sensor input is mapped to the servo’s movement.

// Normalize the light drop to a 0-1 factor, then scale the strike with it
float d = constrain(drop, MIN_DROP, MAX_DROP);
float k = (d - MIN_DROP) / float(MAX_DROP - MIN_DROP); // 0 = faint reach, 1 = hand right over the bowl
int hitAngle = SERVO_HIT_MIN + int((SERVO_HIT_MAX - SERVO_HIT_MIN) * k); // closer hand, wider swing
unsigned long downMs = STRIKE_DOWN_MS_MAX - (unsigned long)((STRIKE_DOWN_MS_MAX - STRIKE_DOWN_MS_MIN) * k); // closer hand, faster strike

strikeOnce(hitAngle, downMs);

This makes the drumstick respond based on how close the hand is, so each action feels deliberate rather than like a binary on/off hit. It lets the system capture subtle gestures, supporting our goal of reflecting nuanced human behavior. AI helped us work out exactly when, and how hard, the strike should land.
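
The post doesn’t show strikeOnce() itself, but given how it is called, one plausible implementation (with a global Servo object and a resting angle, both assumed) would be:

#include <Servo.h>

Servo drumServo;          // assumed global servo object
const int SERVO_REST = 0; // assumed resting drumstick angle

void strikeOnce(int hitAngle, unsigned long downMs) {
  drumServo.write(hitAngle); // swing down; a larger angle means a harder hit
  delay(downMs);             // a shorter hold makes the strike snappier
  drumServo.write(SERVO_REST);
}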

Future Improvements

Looking forward, we see several ways to expand and refine the Diet Drum:

  • Adaptive audio: Varying the melody or warning tone based on how close the hand is could enhance the playfulness and expressiveness.

  • Mechanical refinement: Improving the stability of the drumstick and optimizing servo speed could create smoother strikes and more consistent feedback.

  • Compact design: Reducing the size of the device for easier placement would make it more practical for everyday use.

  • Visual cues: Adding optional LEDs or visual signals could enhance the feedback, making the system even more engaging.

GitHub Link:

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/assignment10.ino

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/pitches.h

Week 10: Group Project “NYUAD DJ Kit”

Main Concept:

The main concept for our group project is a DJ setup, since we wanted to experience what it feels like to be one. A DJ handles many sounds and instruments, using unique artistic skills to create music that makes people happy and excited. So we crafted this device, the “NYUAD DJ Kit.” With it, you can choose different kinds of songs at various speeds, layered over a bass beat produced by a wooden stick: a unique way to compose new kinds of songs as a DJ.

Demonstration Video

Schematic:

Code we’re particularly proud of:

The part of the code we’re most proud of is shown below. The if-else statement lets us move to the next song and play it. When the button is pressed (the pin reads LOW), we set buttonPressed to true and noteIndex to 0 so the song plays from the beginning. We also used the modulo operator to ensure we always wrap back to the first song after the last one. The else-if branch resets buttonPressed to false, so the next press advances to the next song.

//handle music switching using modulo
//handle music switching using modulo
if (digitalRead(BUTTON_PIN) == LOW && !buttonPressed) {
  buttonPressed = true;
  //move to the next song, wrapping back to song 0 after the last
  currentSong = (currentSong + 1) % 3;
  //reset to note 0 so the song plays from the beginning
  noteIndex = 0;
  isPlaying = false;
  noTone(BUZZER_PIN);
  delay(250); //crude 250 ms debounce
} else if (digitalRead(BUTTON_PIN) == HIGH) {
  //button released: re-arm so the next press advances the song
  buttonPressed = false;
}

The second snippet of code allows the servo to move every servoDelay milliseconds, controlling its speed and angle. We applied the concept we learned in class called “non-blocking” to ensure that this operation does not affect the rest of the program. Inside the if statement, we use the write() function from the Servo library to change the servo’s angle each time it runs. This way, the servo continues changing its angle until it reaches either 180° or 0°, incrementing by a step each time servoDelay milliseconds have passed. We’re happy that we were able to apply multiple concepts we learned in class, such as non-blocking and modulo, to our DJ project. As references, we used the example file ToneMelody from the Digital section and the Knob example from the Servo library. We also used ChatGPT to help us figure out how to apply the non-blocking concept so that the melody can move to the next one without affecting the rest of the program, which allows the servo to continue moving smoothly.

//Non-blocking timing so the servo update doesn't stall the rest of the program
if (currentTime - lastServoUpdate >= servoDelay) { //servoDelay millis have passed
  lastServoUpdate = currentTime;
  //Move the servo to the current angle, then advance by one step
  myservo.write(servoPos);
  servoPos += servoStep;
  //Reverse direction when the servo reaches 0 or 180 degrees
  if (servoPos >= 180 || servoPos <= 0) servoStep = -servoStep;
}

Reflections & Future Improvements:

In terms of reflections, we struggled a lot to make the bass work because we needed to attach the wooden stick to the servo, and at first it was not stable at all. We attached it with tape, which was the primary cause of the instability: every time we ran the servo fast, the stick fell off or the servo stopped working. We eventually stabilized the stick and servo by placing some weight on top of the setup, so that no matter how fast the servo moved, the stick stayed in place.

As for future improvements, we want to enhance the quality of the bass, because right now we’re just using a wooden stick and it doesn’t produce a sound loud enough for a party. Furthermore, as the stick swings faster, its swing range becomes smaller, so we had to move the bottle manually for the stick to reach it. We believe this happens because servoDelay becomes too small, reaching about 1 ms, so the servo can’t physically keep up. Therefore, next time we should use constrain() on the mapped value, so that noise in the reading can’t push servoDelay below a speed the servo can physically follow.
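
A sketch of that fix, assuming the speed comes from a potentiometer on potPin and that 5–50 ms is a range the servo can follow (both assumptions):

// Clamp the mapped value so noise can't push servoDelay toward 1 ms,
// a speed the servo can't physically keep up with (bounds are assumptions)
int raw = analogRead(potPin);
servoDelay = constrain(map(raw, 0, 1023, 50, 1), 5, 50);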

Link to GitHub: https://github.com/KimShota/Intro-to-IM/blob/13b494508781fc36c9b95d3b46e5145d18c06808/nyuad_dj.ino


Reading Reflection – Week 10

Bret Victor’s rant on the future of technology takes a very interesting stance on future interaction design, and it made me notice things we often ignore. He points out that a glass screen completely misses the purpose of what our hands can do, which is to feel and manipulate, and that visions of future interaction neglect this entirely by reducing interaction to swiping fingers across a flat screen.

Honestly, I can’t help but agree with him, but I also find it difficult to come up with a clear solution to the problem. Designing future technology in the “Pictures Under Glass” way is probably the easiest approach, which is likely why it became the default. If we truly took the abilities of our hands into account, the design would differ drastically from one device to another depending on its purpose, making everything far more complex. Although I share Victor’s frustration, I also think his critique shows how difficult innovation becomes once a certain design dominates the industry.

In his follow-up, Victor admits that he didn’t offer a solution because the purpose of his rant was to encourage research rather than provide answers. I respect that honesty; it makes the rant read more like an invitation than a complaint. I’m on board with his criticism of limiting designs and his push to imagine greater possibilities instead. Even though we don’t yet know what a “dynamic tactile medium” will look like, Victor’s ideas push us as readers to think deeply about how our bodies and technology should work together.

Week 10 — Reading Response

Bret Victor argues that hands do two things, feel and manipulate, and that most screen-first products ignore both. At the counter I judge texture, resistance, and weight; I adjust heat by feel; I correct errors through immediate tactile feedback. On the screen I scroll and tap with one finger, converting rich physical cues into flat sequences of steps; accuracy falls, and attention shifts from the food to the interface.

Fitness tracking shows a similar pattern. A watch counts reps and time, yet it cannot teach grip pressure, bar path, stance, or breath. Effective coaching speaks through the body; the right cue is a change in force or timing, not another chart. A better tool would offer variable resistance and haptic prompts: small vibrations for tempo, pressure feedback for grip, and state you can feel without looking.

Even productivity tools illustrate the loss in translation. Physical sticky notes on a whiteboard build spatial memory; clusters are recalled by location and reach; the body encodes the arrangement. Dragging cards on a screen removes proprioception; scanning columns replaces simple recall by place. Tangible controllers and deformable surfaces could restore some of that embodied structure, carrying information in texture and force, not only pixels.

To improve this, I propose we treat touch as information, not just input: design for affordances that speak through force, texture, and spatial arrangement. If a tool mediates physical tasks or spatial understanding, add haptic and tangible feedback before adding new visual layers.

Week 10 Group Music Instrument

For our interactive media sound project, my partner, Yiyang, and I decided to create a simple yet expressive instrument with a few sensors and a buzzer on an Arduino Uno. We wanted to build something that was intuitive to play and produced a unique, percussive sound. The result is this force-sensitive drum: tapping different pads creates different notes, and a toggle switch shifts the entire instrument into a higher-pitched mode.

Concept

Our initial idea was inspired by the force sensors used in class to control sound. We thought, what if we could use multiple sensors to combine frequencies and create rhythms? We brainstormed a few possibilities. Could we assign different chords to each sensor, where pressing harder makes a certain chord more prominent? Or could the sensors act as modifiers for a continuous track?

Ultimately, we settled on a more direct approach for a playable instrument. We decided to have three Force Sensitive Resistors (FSRs) that would each trigger a distinct note, like pads on a drum machine. To meet the project requirements and add another layer of interactivity, we incorporated a digital two-way switch. Flipping this switch would transpose the notes of all three pads to a higher octave, giving the player two different sound palettes to work with.

Schematic

The build was straightforward, centered around an Arduino Uno and a breadboard.

Components Used:

  • 1x Arduino Uno

  • 1x Breadboard

  • 3x Force Sensitive Resistors (FSRs) – our analog sensors

  • 1x Two-way toggle switch – our digital sensor

  • 1x Piezo Buzzer

  • Resistors (for the FSRs and switch)

  • Jumper wires and Alligator clips

Each of the three FSRs was connected to a separate analog input pin on the Arduino. This allows the Arduino to read a range of values based on how much pressure is applied. The toggle switch was connected to a digital pin to give us a simple ON/OFF (or in our case, Mode 1/Mode 2) reading. Finally, the piezo buzzer was connected to a digital pin capable of PWM (Pulse Width Modulation) to produce the tones.

The Arduino code continuously checks the state of our mode switch and reads the pressure on each of the three force sensors. If a sensor is pressed hard enough to cross a defined hitThreshold, it calls a function to play a corresponding sound.
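
A minimal sketch of that loop is below; the pin assignments, the hitThreshold value, and the note frequencies are assumptions rather than our exact settings.

const int fsrPins[3] = {A0, A1, A2}; // force sensors (assumed pins)
const int switchPin = 2;             // mode toggle (assumed pin)
const int buzzer = 8;                // piezo buzzer (assumed pin)
const int hitThreshold = 300;        // reading that counts as a hit (assumed)
const int lowNotes[3]  = {262, 330, 392}; // Mode 1 pads: C4, E4, G4
const int highNotes[3] = {523, 659, 784}; // Mode 2: same pads, one octave up

void setup() {
  pinMode(switchPin, INPUT_PULLUP);
  pinMode(buzzer, OUTPUT);
}

void loop() {
  bool highMode = (digitalRead(switchPin) == LOW); // check the mode switch
  for (int i = 0; i < 3; i++) {
    if (analogRead(fsrPins[i]) > hitThreshold) {   // pad pressed hard enough?
      tone(buzzer, highMode ? highNotes[i] : lowNotes[i], 100);
    }
  }
  delay(20); // short loop delay keeps the pads responsive
}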

Our instrument evolved over several versions. We started with a basic concept (v0.1), then refined it by adjusting the frequency gaps between the sensors for a more distinct and musical sound (v1.0a). Finally, we tweaked the delay to give it a more responsive, percussive, drum-like feel (v1.0b).

Video/Image Documentation

Code Snippet I’m proud of

To make it sound more like a drum, I wrote this for loop to create a pitch decay effect:

// drum pitch decay effect: start 40 Hz above the target note and slide down
for (int f = baseFreq + 40; f > baseFreq; f -= 5) {
  tone(buzzer, f);
  delay(10); // each step lasts 10 ms, roughly an 80 ms sweep
}

Future Improvements/ Problems Encountered

Our biggest physical challenge was the alligator clips. They were handy for prototyping, but their exposed metal heads made it easy to accidentally create a short circuit if they touched. We learned to check meticulously that the rubber insulators covered the clips before powering on the Arduino.

On the software side, getting the sound right was an iterative process. First, we spent time exploring the pitch gaps. Initially, the pitches were too close together and didn’t sound very musical; by trial and error, we adjusted the base frequencies to create a more noticeable and pleasant musical gap between the pads. Second, the rhythm and feel in hand needed to match those of a “drum machine,” so we played with the delay() value in the main loop. A shorter delay made the instrument feel much more responsive and rhythmic.

If we were to continue this project, we could add more sensors for a full octave, or perhaps use the analog pressure value to control the volume (amplitude) of the note in addition to triggering it. It would also be interesting to experiment with different waveforms or sound profiles beyond the simple tones.

Week 10 – Reading Reflection

Reading “Making Interactive Art” made me realize that what I created this week needs prior explanation before the user can figure out what the device is about. The buttons I made have no signs or words attached, so users will need some time to process and play around with the project before realizing that it is a beat memorizer. However, since I accounted for the possible actions a user might take, the system won’t crash. I can, essentially, set the stage, shut up, and listen to what the user does when given my project. In those terms, I can say I created a successful project that follows what the reading describes.

For the physical computing reading, I related to many of his projects, but especially “Things You Yell At.” It reminded me of my midterm project, which also used voice to control the system. Pitch detection and voice recognition are hard at first, but the result is worth the process.

Week 10 — Electronic Drum

Concept

For our interactive media sound project, my partner, Joy Zheng, and I decided to create a simple yet expressive instrument with a few sensors and a buzzer on an Arduino Uno. We wanted to build something that was intuitive to play and produced a unique, percussive sound. The result is this force-sensitive drum: tapping different pads creates different notes, and a toggle switch shifts the entire instrument into a higher-pitched mode.

Our initial idea was inspired by the force sensors used in class to control sound. We thought, what if we could use multiple sensors to combine frequencies and create rhythms? We brainstormed a few possibilities. Could we assign different chords to each sensor, where pressing harder makes a certain chord more prominent? Or could the sensors act as modifiers for a continuous track?

We settled on a more direct approach for a playable instrument. We decided to have three Force Sensitive Resistors (FSRs) that would each trigger a distinct note, like pads on a drum machine. To meet the project requirements and add another layer of interactivity, we incorporated a digital two-way switch. Flipping this switch would transpose the notes of all three pads to a higher octave, giving the player two different sound palettes to work with.

Arduino Build

The build was straightforward, centered around an Arduino Uno and a breadboard.

Components Used:

  • 1x Arduino Uno

  • 1x Breadboard

  • 3x Force Sensitive Resistors (FSRs), our analog sensors

  • 1x Two-way toggle switch, our digital sensor

  • 1x Piezo Buzzer

  • Resistors (for the FSRs and switch)

  • Jumper wires and Alligator clips

Each of the three FSRs was connected to a separate analog input pin on the Arduino. This allows the Arduino to read a range of values based on how much pressure is applied. The toggle switch was connected to a digital pin to give us a simple ON/OFF (or in our case, Mode 1/Mode 2) reading. Finally, the piezo buzzer was connected to a digital pin capable of PWM (Pulse Width Modulation) to produce the tones.

The Arduino code continuously checks the state of our mode switch and reads the pressure on each of the three force sensors. If a sensor is pressed hard enough to cross a defined hitThreshold, it calls a function to play a corresponding sound.

To make it sound more like a drum, we wrote this for loop to create a pitch decay effect:

// drum pitch decay effect: start 40 Hz above the target note and slide down
for (int f = baseFreq + 40; f > baseFreq; f -= 5) {
  tone(buzzer, f);
  delay(10); // each step lasts 10 ms, roughly an 80 ms sweep
}

Challenges and Improvement

Our instrument evolved over several versions. We started with a basic concept (v0.1), then refined it by adjusting the frequency gaps between the sensors for a more distinct and musical sound (v1.0a). Finally, we tweaked the delay to give it a more responsive, percussive, drum-like feel (v1.0b).

Our biggest physical challenge was the alligator clips. They were handy for prototyping, but their exposed metal heads made it easy to accidentally create a short circuit if they touched. We learned to check meticulously that the rubber insulators covered the clips before powering on the Arduino.

On the software side, getting the sound right was an iterative process. First, we spent time exploring the pitch gaps. Initially, the pitches were too close together and didn’t sound very musical; by trial and error, we adjusted the base frequencies to create a more noticeable and pleasant musical gap between the pads. Second, the rhythm and feel in hand needed to match those of a “drum machine,” so we played with the delay() value in the main loop. A shorter delay made the instrument feel much more responsive and rhythmic.

If we were to continue this project, we could add more sensors for a full octave, or perhaps use the analog pressure value to control the volume (amplitude) of the note in addition to triggering it. It would also be interesting to experiment with different waveforms or sound profiles beyond the simple tones.

Week 10: Arduino Loopstation (Musical Instrument)

This week Yongje and I paired up to make our very own musical instrument.

I thought about the capabilities of the Arduino speaker and was unimpressed with its sound “texture,” so we discussed what we could do with the rather limited range of sounds we could generate. I’m not much of a musician, so I suggested: what if we made a simple beat recorder, kinda like a metronome of sorts? Yongje informed me that what I was describing is called a “loopstation,” and we got to designing.

Concept (With Visuals) – Hubert

After we planned what we wanted to do, I decided to visualize the user interaction side of the project first before designing the schematics and technical side.

The red button would be to start/stop the recording process. A red LED would indicate whether it was currently recording.

The blue button would be there for the user to tap in their beat.

When you are done with your beat, you can save it by clicking the red button once again. You can see whether it was properly stopped by the indicator turning off. Then you can press the green button to play your recorded beat.

Schematics & Planning – Hubert

Before we started connecting metal to metal, I made a schematic to quickly map out everything we needed to connect.

 

Code & Difficulties Encountered – Yongje

There are 3 main parts to the code.

The first is the debouncing logic, which filters out the rapid flipping between true and false that occurs when the switch is pressed. The second is playback: actually playing back the recorded sound. The third, and hardest, part is figuring out how to store the beat recording.

I’ll start by explaining the hardest part first, which is storing the beat recording.
The beat recording logic works by tracking the time of each button press and release while the device is in recording mode. Every time the beat button is pressed, the program calculates the gap since the previous press (gap = now – tRef) to capture the spacing between beats. When the button is released, it measures the duration the button was held (dur = now – lastPressTime) to record how long that beat lasted. Both values are stored in arrays (gaps[] and durs[]), building a timeline of when each beat starts and how long it plays. Figuring out this logic was the most difficult part.
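
A condensed sketch of that recording logic, using the variable names from the description where given (the array size and the press/release helper structure are assumptions):

const int MAX_BEATS = 32;        // assumed capacity
unsigned long gaps[MAX_BEATS];   // time from the previous press to this one
unsigned long durs[MAX_BEATS];   // how long each press was held
int beatCount = 0;
unsigned long tRef = 0;          // time of the previous press
unsigned long lastPressTime = 0;

void onBeatPress(unsigned long now) {
  if (beatCount < MAX_BEATS) {
    gaps[beatCount] = now - tRef; // spacing since the previous beat
    tRef = now;
    lastPressTime = now;
  }
}

void onBeatRelease(unsigned long now) {
  if (beatCount < MAX_BEATS) {
    durs[beatCount] = now - lastPressTime; // duration the beat was held
    beatCount++;
  }
}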

Now onto the playback logic, which is responsible for reproducing the rhythm that was recorded. It reads through the stored arrays of gaps and durations in order. For each beat, the program first waits for the gap time, the delay before the next beat begins, and then plays a tone on the speaker for the duration that was originally recorded. Because each recorded gap includes the previous beat’s duration, the playback code subtracts the previous duration from the current gap to get the true silent time between beats. This ensures that the playback matches the timing and spacing of the user’s original input, reproducing both the rhythm and the length of each beat. I had to add logic to handle negative silence times, because the subtraction sometimes produced errors when presses and beat durations were very short; this is explained in depth in the comments of the code.
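
Using the gaps[]/durs[] arrays from the sketch above, the playback pass might look like this; SPEAKER_PIN and the fixed 440 Hz tone are assumptions, and clamping at zero is one way to handle the negative silence times mentioned:

const int SPEAKER_PIN = 8; // assumption

void playBack() {
  unsigned long prevDur = 0;
  for (int i = 0; i < beatCount; i++) {
    // Each gap includes the previous beat's duration, so subtract it
    long silence = (long)gaps[i] - (long)prevDur;
    if (silence < 0) silence = 0; // guard against negative silent time
    delay(silence);
    tone(SPEAKER_PIN, 440);
    delay(durs[i]);               // hold the tone as long as the press lasted
    noTone(SPEAKER_PIN);
    prevDur = durs[i];
  }
}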

Finally, the debounce logic ensures that each button press or release is detected only once, even though mechanical switches naturally produce rapid, noisy fluctuations when pressed. When a button’s state changes, the program records the current time and waits a short period to confirm that the signal has stabilized. Only if the input remains steady for longer than this debounce delay does the program treat it as a valid press or release event. This filtering prevents false triggers caused by electrical noise or contact bounce, giving the system clean, reliable button inputs for recording and playback control. At first, I didn’t have this debounce logic implemented and had a hard time figuring out why the system sometimes failed to recognize button presses or seemed to trigger multiple times for a single press. Once the debounce logic was added, the button responses became stable and consistent.
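
The pattern described is the standard Arduino debounce idiom; a sketch of it for the beat button follows, with the pin and the 20 ms settle time as assumptions:

const int BEAT_PIN = 3;               // assumed button pin (INPUT_PULLUP)
const unsigned long DEBOUNCE_MS = 20; // assumed settle time
int lastReading = HIGH;
int stableState = HIGH;
unsigned long lastChangeTime = 0;

bool pressedEvent() {
  int reading = digitalRead(BEAT_PIN);
  if (reading != lastReading) {
    lastChangeTime = millis(); // input changed, restart the settle timer
    lastReading = reading;
  }
  if (millis() - lastChangeTime > DEBOUNCE_MS && reading != stableState) {
    stableState = reading;       // signal held steady long enough to trust
    return (stableState == LOW); // report true only on a confirmed press
  }
  return false;
}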

Reflection

I believe this project turned out really well, and it was very interesting to work on our first group project of the semester.