Week 10 – Musical Instrument

Concept

For this week’s assignment, we decided to create a makeshift piano using the DIY aluminum foil compression buttons we used in previous assignments. The piano only plays one song; each time you press one of the piano “keys,” the next note in the sequence plays.

For the analog input, we used a potentiometer to control the pitch/frequency of the notes.

Demo:

Implementation:

We created 3 foil switches, taping one side of each switch to GND and the other to a digital pin. Then, we connected the buzzer and potentiometer.

For the code, we used sound files from the same GitHub repo we used in class. I used the notes for the song “It’s a Small World.”

On every loop iteration, the code reads the state of all three buttons and checks whether any of them just transitioned from unpressed to pressed: if a button was HIGH last loop and is LOW now, that counts as a single press.

void loop() {
  // ...

  bool justPressed = (state_yellow == LOW && prev_yellow == HIGH)
                  || (state_blue == LOW && prev_blue == HIGH)
                  || (state_red == LOW && prev_red == HIGH);

  // ...
}

When a press is detected, it looks up the current note’s frequency and duration from two parallel arrays, scales the frequency up or down based on the potentiometer reading, plays it through the buzzer, waits for it to finish, then advances to the next note in the melody. The previous button states are saved at the end of every loop so the next iteration can do the same comparison, ensuring each physical press only ever triggers one note no matter how long you hold it down.
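The same one-note-per-press logic can be sketched outside the Arduino. Below is a quick Python simulation of the idea (the melody and duration values are made up for illustration; with this pull-up style wiring, 1 means unpressed and 0 means pressed): holding a button across many loop iterations still advances the melody by only one note.

```python
# Python simulation of the one-note-per-press logic (not the actual Arduino sketch).
# Illustrative frequencies (Hz) and durations (ms); the real melody arrays differ.
melody = [392, 392, 440, 392, 330, 294]
durations = [300, 300, 300, 300, 300, 600]

def run(button_samples):
    """button_samples: one reading per loop iteration; 1 = unpressed (HIGH), 0 = pressed (LOW)."""
    played = []
    note_index = 0
    prev = 1  # assume unpressed before the first loop
    for state in button_samples:
        just_pressed = (state == 0 and prev == 1)  # HIGH last loop, LOW now
        if just_pressed:
            played.append((melody[note_index], durations[note_index]))
            note_index = (note_index + 1) % len(melody)  # advance to the next note
        prev = state  # save state for the next iteration's comparison
    return played

# Two distinct presses (with a release in between) play exactly two notes,
# even though the button stays held for several iterations:
print(run([1, 0, 0, 0, 1, 0, 1]))  # [(392, 300), (392, 300)]
```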

Initially, I had the code play the note every time the button read as pressed; however, this caused the entire melody to play continuously for as long as the button was held. So we had to come up with a different way to ensure a note only plays when the button changes from unpressed to pressed.

Due to the way the foil switches are made, it doesn’t always behave as though only one note plays per press. Sometimes when I press the foil button, the foil touches on one side by accident, or the wires shift, which closes and opens the circuit without my intent. As a result, the device sometimes plays several notes in quick succession before it eventually pauses.

We had to figure out a way to change the pitch of the notes without completely changing what the note was. There is a useful pattern here: to shift a note down or up an octave, its frequency needs to be divided by 2 or multiplied by 2.

For example: the standard Do sound is C4; a lower-pitched version is C3. The difference between the two is that C4’s frequency is double C3’s.

For that, we mapped the potentiometer’s input range (0, 1023) to (50, 200) and divided by 100; this lets us shift the notes’ pitches smoothly using the potentiometer.

We multiplied this factor to the original note that would have been played and set the tone to include the multiplied value as the frequency in the tone statement.

void loop() {
  // ...
  // Map 0–1023 to 50–200, then divide by 100.0 for a 0.5x–2.0x multiplier
  float pitchMultiplier = map(pot_value, 0, 1023, 50, 200) / 100.0;
  // ...
}
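To sanity-check the range, here is a quick Python check using a floor-division re-implementation of Arduino’s integer map() (all values here are positive, so floor division matches C’s truncation). The potentiometer extremes land exactly one octave below and one octave above the original note, matching the divide-by-2/multiply-by-2 pattern described above.

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    # Re-implementation of Arduino's map(): integer arithmetic, truncating division
    # (operands are positive here, so Python's floor division behaves the same).
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

# Potentiometer extremes give a 0.5x–2.0x multiplier:
print(arduino_map(0, 0, 1023, 50, 200) / 100.0)     # 0.5  (one octave down)
print(arduino_map(1023, 0, 1023, 50, 200) / 100.0)  # 2.0  (one octave up)

# Applied to C4 (~262 Hz), the extremes land near C3 (~131 Hz) and C5 (~524 Hz):
print(262 * arduino_map(0, 0, 1023, 50, 200) / 100.0)  # 131.0
```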

Reflection:

Reflecting on the issue of the foils touching and playing more notes than I want them to, I thought of a way to fix them using clothespins. I can attach two foils to the inside of the end you squeeze to open the pin, so when you are opening the pin, the foils touch and it counts as a “press.” I imagine this method would function more smoothly than the sponge compression I built for this assignment because the clothespin is stable and forces the foils apart after every press.

Week 10 Assignment – Partner work Mhara & Mariam

Concept:

In this assignment, we had to create a musical instrument using digital and analog sensors. We used push buttons as our digital sensors and a potentiometer as our analog sensor. We then decided to create a mini piano-like device that plays the four basic piano notes C, D, E, and F, and allows the user to adjust the pitch of these notes. In this project, there are four push buttons, each assigned to one note that only plays when the button is pressed, and a potentiometer that changes the pitch of the notes when it is turned.

Code:

Arduino File on GitHub

Setup:

Mariam’s Setup

Mhara’s Setup

 

Demonstration:

Mhara’s video demo:

Mariam’s video demo:

Digital Circuit:

Schematic:

Process:

In the process of this assignment, we decided to combine four buttons (digital) with one potentiometer (analog) to control a piezo buzzer. Each button plays a different note, and the potentiometer slightly adjusts the pitch so the sound changes depending on how much it’s turned. We worked together on the idea and the wiring, but we divided the coding so each of us focused on one part. Mariam handled the digital part (the buttons and the notes), and Mhara worked on the analog part (the potentiometer and the pitch control). After both parts were working separately, we combined them into one full sketch of code.

We then tested the circuit in Tinkercad to make sure all the wiring and logic of the code were correct. This helped us confirm that the buttons were reading properly and that the potentiometer was giving smooth values. Running it in Tinkercad also made it easier to fix small mistakes before trying it on the physical Arduino board.

At first, the audio wasn’t changing when the potentiometer was turned because the mapping was happening after the tone was already being played, so we rearranged the order of the code and that finally made the pitch respond. After that, the sound became too noisy and robotic, so we added a small adjustment range (90 – 105) to each note to make the pitch change smoother and less harsh.

 

Code Snippets:

While building the project, there were a couple of code snippets that stood out to us because they played an important role in making the instrument work the way we wanted it to. 

tone(buzzerPin, noteC * map(sensorValue, 0, 1023, 90, 105) / 100);

This was the part we were most proud of because it solved the “robotic” and “noisy” sound problem. Instead of letting the potentiometer completely change the note, we used a small adjustment range (90–105) to bend the pitch smoothly. This showed how the digital and analog inputs can work together in one line of code.
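To see what that one line actually produces, here is a small Python check that mirrors its integer arithmetic (noteC is assumed to be about 262 Hz for C4; the sketch’s actual constant may differ). The bend stays within roughly −10% to +5% of the base note, which is why it sounds like a smooth pitch adjustment rather than a jump to a different note.

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    # Re-implementation of Arduino's map(); positive operands, so floor
    # division matches C's truncating integer division.
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

NOTE_C = 262  # C4 is roughly 262 Hz (assumption; not taken from the actual sketch)

def bent_pitch(sensor_value):
    # Mirrors: tone(buzzerPin, noteC * map(sensorValue, 0, 1023, 90, 105) / 100);
    return NOTE_C * arduino_map(sensor_value, 0, 1023, 90, 105) // 100

print(bent_pitch(0))     # 235 -> about 10% below the base note
print(bent_pitch(1023))  # 275 -> about 5% above the base note
```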

 

Another part of the code is:

pitch = map(sensorValue, 0, 1023, 200, 2000);

This line shows how the analog input (the potentiometer) controls the sound. It takes the raw value from 0-1023 and maps it into a pitch range that the buzzer can actually play. This was important because the potentiometer originally wasn’t affecting the sound at all, and fixing the order of the code made this line finally work the way we wanted it to. 

Areas of Improvement and Reflection:

After completing this assignment, we were able to learn and explore different sensors and sounds. It was easy and smooth to work as a pair, as each person focused on one part and then we combined our work together. As for areas of improvement, we could make the sound of the notes smoother and more musical, since it still sounds slightly robotic, or add more notes to make it closer to a real piano. Another idea is to implement other sensors, such as an ultrasonic sensor, to play different notes or even melodies based on motion or distance. Working with audio and sensors is a fun part of Arduino, and it allows us to create many different ideas for various purposes. Overall, we are satisfied with our final outcome and the entire process of this project.

 

References:

Looked back at Week 10 slides about sound to recap what we learned.

Reviewed specific code concepts using the Arduino documentation:

https://docs.arduino.cc/language-reference/en/functions/math/map/ 

  • How we used it: We used this to convert the potentiometer’s range into a smaller pitch-adjustment range that works smoothly with the buzzer.

https://docs.arduino.cc/language-reference/en/functions/advanced-io/tone/ 

  • How we used it: We used this page to understand how the tone() function works and how to send different frequencies to the buzzer. 

https://docs.arduino.cc/built-in-examples/digital/toneMelody/ 

  • How we used it: We looked at this example to understand how notes from pitches.h are used and how tone() can be combined with different frequencies to create musical sounds.

AI usage: We used ChatGPT to help navigate and resolve a major issue where the tones sounded too robotic and noisy. From this, we learned that using the map() function with a smaller range for each note helps create smoother, more controlled pitch changes.


Reading Reflection – Week 10

In A Brief Rant on the Future of Interaction Design, it honestly feels like Bret Victor is calling everyone out for settling. He’s basically saying that what we think is “advanced” interaction is actually kind of limited, and the line about an interface being “less expressive than a sandwich” made it sound funny at first but also a bit embarrassing once you think about it. Like our hands can do so much in real life, but then we go to screens and everything becomes flat, just tapping and swiping on what he calls “pictures under glass.” It made me realize how normal that feels even though it’s actually such a reduced version of interaction.

Then in Responses to A Brief Rant on the Future of Interaction Design, it feels more grounded because people are basically asking, okay but what now? And he admits that “the solution isn’t known,” which I actually liked because it didn’t feel like he was pretending to have everything figured out. It made the whole thing feel less like complaining and more like pushing people to think further instead of just accepting what already works. When he says things like the iPad is “good! For now!” it kind of shows that he’s not rejecting current tech, just refusing to treat it like the final version.

Putting both together, it feels less like he’s saying everything is wrong and more like we got comfortable too fast. The idea that “the future is a choice” stuck with me because it makes it feel like if interaction stays limited, it’s not by accident, it’s because people stopped questioning it.

Week 10: two-player air piano

For this week’s assignment, I made a two-player musical instrument using Arduino. One person controls the pitch by moving their hand in front of the ultrasonic sensor, and the other person uses two buttons to trigger two different drum-like sounds. I liked this idea because it made the instrument feel more interactive, and it also made the difference between analog and digital input much clearer to me. For the demo, my sister acted as the second player so I could show how the instrument works with two people. While I controlled the ultrasonic sensor, she used the buttons to add the drum-like sounds.

video demo

Arduino File on GitHub

I used the ultrasonic sensor to control the pitch of the instrument. The closer the hand gets to the sensor, the higher the pitch becomes, and the farther the hand is, the lower the pitch becomes. The second player uses two buttons to trigger two different sounds. I did not use uploaded sound files for these. Instead, I created them directly in code by assigning different frequency values in hertz to the piezo speaker. One button produces a lower drum-like sound, and the other produces a higher and sharper drum-like sound.

The way the project works is that the ultrasonic sensor acts as the analog input because it gives a changing range of distance values, while the buttons act as digital inputs because they are either pressed or not pressed. The Arduino reads the distance from the ultrasonic sensor directly inside the loop() function and uses the map() function to convert that distance into pitch. At the same time, it also reads the two buttons and triggers the two drum-like sounds when they are pressed. All of that is sent to the piezo speaker, which acts as the output. I think this project made the difference between continuous input and on and off input much easier to understand because both were working in the same project.

For the circuit, the ultrasonic sensor is connected to 5V and GND, with the TRIG pin connected to pin 9 and the ECHO pin connected to pin 10. The two buttons are connected as digital inputs on pins 2 and 3, and they are also connected to ground. The piezo speaker is connected to pin 8 and ground, so it can play both the changing pitch and the two drum-like sounds. One problem I ran into was with the button wiring. At first, I connected both button wires to the top row of the breadboard, and the buttons did not work. I was confused because I thought the code was the issue, but later I realized the problem was actually the physical connection. Once I fixed the wiring and moved the ground connection to the correct place, the buttons started working properly.

The part of the code I am most proud of is the section where the ultrasonic sensor reading gets turned into pitch in real time. I like this part because it made the instrument feel much more alive and interactive. Instead of just choosing from fixed sounds, the player can move their hand and hear the pitch change immediately. That made the whole project feel closer to an actual instrument instead of just separate parts making sound.

if (blueButtonState == HIGH && redButtonState == HIGH) {
  
  // ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // echo time
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2;
  
  Serial.println(distance);

  if (distance > 5 && distance < 50) {
    int pitch = map(distance, 5, 50, 1500, 200);
    pitch = constrain(pitch, 200, 1500);
    tone(speakerPin, pitch);
  } else {
    noTone(speakerPin);
  }
}

I am proud of this part because it is what made the ultrasonic sensor actually useful in the project. The code triggers the sensor, measures how long the echo takes to return, and then converts that into distance. After that, the map() function changes the distance into a frequency range, so hand movement becomes sound. The constrain() function was also important because it kept the pitch within a usable range and stopped it from becoming too extreme. I think this was the part that made the project make the most sense.
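The echo-to-pitch pipeline described above can be condensed into one pure function. The sketch below is a Python simulation (not the actual Arduino code) that reproduces the same arithmetic, including Arduino map()’s truncating integer division, which matters here because the output range is inverted:

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    # Re-implementation of Arduino's map(). C long division truncates toward
    # zero, while Python's // floors, so emulate truncation explicitly.
    num = (x - in_min) * (out_max - out_min)
    den = in_max - in_min
    q = abs(num) // abs(den)
    if (num < 0) != (den < 0):
        q = -q
    return q + out_min

def constrain(x, lo, hi):
    return max(lo, min(hi, x))

def pitch_for_echo(duration_us):
    """duration_us: echo pulse width in microseconds. Returns a frequency in Hz,
    or None when the hand is out of range (the Arduino calls noTone() instead)."""
    distance = int(duration_us * 0.034 / 2)  # speed of sound ~0.034 cm/us, halved for the round trip
    if 5 < distance < 50:
        pitch = arduino_map(distance, 5, 50, 1500, 200)  # closer hand -> higher pitch
        return constrain(pitch, 200, 1500)
    return None

print(pitch_for_echo(600))   # 1356  (hand ~10 cm away -> high pitch)
print(pitch_for_echo(5000))  # None  (too far -> silence)
```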

If I had more time, I would improve the project by adding more sounds and making the instrument feel a little more musical overall instead of only experimental. I also think adding some visuals, like LEDs flashing when the buttons are pressed, would make it feel like a DJ game. Still, I think the final project worked well.

Week10 Reading

The first thing that came to my mind after reading both was how accepting we are of the new tech thrown at us: not objecting to the pace of development, even though the limited interaction capabilities do have an effect on us. I never thought about touch screens the way the author describes them, as “Pictures Under Glass.” It made me realize how much interaction is reduced to simple swipes and taps, which is negligible compared to what our hands do every day. It made me rethink whether convenience has come at the cost of richness in interaction.

I felt a bit conflicted about the lack of concrete direction. While I understand that the goal is to inspire research rather than propose a fixed solution, it still leaves a gap between critique and application. Still, the reading succeeds in making me more critical of existing designs and more aware that the “future” of interaction is not inevitable, but something we actively choose to shape.

Lastly, the comparison between what humans can do and what technology can do was a meaningful debate because of the way it focuses on everyday tasks, which are often deemed small.

Keyboard Piano

Concept

For this production assignment, I was working with Zaid. We came up with a simple design: making a piano using the keyboard and the buzzer. The idea was to assign notes to each key and read the input from the keyboard. We also added an LCD to see which note is being played, and a potentiometer to control the contrast of the LCD.

Code

https://github.com/MuhammadArhumSraw/IntrotoIM-MuhammadArhumAzeem

There are two files for this assignment, labelled week9file1 and week9file2. The Python file reads input from the keyboard, while the other contains the Arduino logic. The part I am most proud of is the serial communication at 9600 baud.

import serial

try:
    arduino = serial.Serial('COM5', 9600, timeout=1)
except serial.SerialException:
    arduino = None  # port unavailable (this except branch completes the snippet)

void setup() {
  Serial.begin(9600);
}
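Taken together, the two snippets sketch a simple protocol: the Python side turns a keypress into a byte and writes it to the port, and the Arduino side turns that byte back into a tone. The actual key-to-note mapping lives in the repo files; the sketch below is a hypothetical illustration of the Python side’s encoding step (the serial.write call itself is left as a comment so the snippet stays self-contained):

```python
# Hypothetical key-to-note table; the real mapping in week9file1 may differ.
KEY_TO_NOTE = {
    'a': 262,  # C4
    's': 294,  # D4
    'd': 330,  # E4
    'f': 349,  # F4
}

def encode_keypress(key):
    """Return the bytes to send over serial for a key, or None if unmapped.
    A one-byte message keeps the Arduino-side parsing trivial."""
    if key in KEY_TO_NOTE:
        return key.encode('ascii')
    return None

# The main loop would then do something like:
#   arduino.write(encode_keypress(key))   # arduino = serial.Serial('COM5', 9600)
print(encode_keypress('a'))  # b'a'
print(encode_keypress('z'))  # None
```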

 

At the moment it’s a single-click piano; it can’t handle multiple keys pressed at once. This could be a future improvement.

Demo

Week 10: Reading Response

This week’s first reading, A Brief Rant On The Future Of Interaction Design, made me realize something deeper about the things I do, and what everyone does every day. Starting by watching the video, I initially thought it was just about technology, and I found it interesting, especially with all the futuristic functions they included such as the window screen and others. What I didn’t notice or really think about was the fact that everything was done on screens, because that is our reality and something we do every day for almost everything. After reading the text, I started to realize how true this is, and how most of what I do happens through flat screens using my fingers, even for things that could be done physically, such as the reading example. I did agree with that point, that some things do not have to be done on a screen. However, I also thought about how people do not actually do everything through screens, as we still move, go outside, and use our bodies for different tasks. But since the author is talking about the future, I do agree that this should be considered so it does not reach a point where everything is done through a flat screen.

This week’s second reading, the responses to the brief rant, clarified many of the thoughts I had while reading, and also introduced aspects I had not considered, which made me go from partially agreeing to agreeing more with the author’s concern. I did think about the common idea that devices can be harmful, especially for children, but the responses helped me understand that while current technology is useful, it could become problematic if it becomes too dominant in the future. I also found the hologram example very helpful, as it made the main idea clearer, that technology should continue to develop, but in a way that is more interactive and three-dimensional, matching the environment we live in. Additionally, the quote by neuroscientist Matti Bergström about the effects of constant touchscreen use from a young age made me realize that this could become harmful if it reaches the future vision being discussed.

Both readings were interesting to read, especially because they address real situations and possible future developments. They made me think more about how we use screens in everyday life and how many things have already shifted to digital formats, such as borrowing books or using services that were once physical. I also thought about how it would be interesting and beneficial if future technologies included more physical interaction and engagement with the human body. This connects to the work we do in this class, where our Arduino projects involve physical interaction, while our p5 sketches are mostly screen-based using buttons and touchpads. Since our final project will combine both, I feel like that is a strong example of how technology can be improved by balancing physical interaction with screen-based systems.

Reading Reflection – Week 10

A Brief Rant on the Future of Interaction Design + follow-up

Reading this article made me rethink how I usually imagine the future of technology. I realized that I tend to accept touchscreens and the sleek interfaces without really questioning whether they actually improve human interactions. I really liked the author’s idea of pictures under glass, because it reframes devices like phones and tablets as limiting rather than advanced. I had never really considered how much tactile feedback shapes my everyday actions, like holding a cup or turning a page, and how it completely disappears when interacting with digital screens. This made me more aware of how much current technology prioritizes visual simplicity over physical engagement.

I wonder whether designers will continue to rely on interfaces that ignore the full capabilities of the human body. If our hands and bodies are so complex and expressive, why is most technology reduced to tapping and swiping? His point about finger blindness really is scary to think about, especially if we lose the ability to feel and understand our objects. This makes me wonder whether convenience and market trends are prioritized over innovation, or if designing for full-body interactions is simply too difficult to do for every single person. It is interesting to see how we adapt to our devices rather than devices adapting to us. Overall, the reading challenged my assumption that technological progress is always linear and improving.

I think the follow-up was quite interesting. I understand that his goal was to highlight a problem and push others to explore it. I am not sure why people expect an immediate answer rather than seeing critique as the starting point for innovation. I was also surprised that removing the body from interaction, like through voice or brain interfaces, might actually reduce human experience. I had not thought about how much physical interaction shapes understanding. To be honest, this made me reflect on how passive I have become in using technology and whether easier always means better.

Reading Reflection – Week 10

When I thought about touchscreens before this assignment, I always felt they were fun and easy to use. I liked exploring things by touching the screen instead of using buttons or a mouse. Because of that, I assumed touchscreens were already the best version of future technology. After reading the texts, I started to notice how limited screens actually are. They only let me use one finger, even though my hands and body can do much more. This made me think about how technology does not really use our full abilities.

One idea that made me think was how the author described touchscreens as flat and numb. I never thought about how everything feels the same when I touch my phone. There is no real physical feedback. When I read that, I realized that I also feel limited by screens and I hope technology develops into something more than just glass surfaces. It made me wonder what it would feel like if my phone could respond physically, not just visually.

The readings also made me think about creativity. I believe creativity can be fully digital, like movies, where you do not need to touch anything to understand or enjoy it. But at the same time, I noticed that when I work on physical computing projects in class, using sensors and real materials feels more engaging. It uses more of my body, not just my eyes.

Another part that stayed with me was the idea of children using iPads. I think it can be bad because of radiation, but it can also be very educational if used the right way. The reading made me think more deeply about how screens might affect development, especially if kids only interact with flat surfaces.

Overall, these readings made me question my assumptions about future technology. I used to imagine better screens, but now I imagine tools that involve more of the body. It made me think that maybe the future should not be limited to touchscreens, and that we should explore new ways of interacting that feel more natural and physical.