Week 10 – Piano keys

Reflection on My Schematic

For this project, I decided to use three pushbuttons and a potentiometer because I wanted to keep the design simple but still interactive. My idea was to make the buttons act like keys—each one triggering a different tone on the piezo speaker. I also included the potentiometer to control the volume of the sound, which made the circuit more dynamic and gave me a sense of how analog and digital components can work together.

Designing the schematic was a fun and challenging part of the process. I had to carefully think about where each component should go and how to connect everything so it would be electrically correct. The piezo speaker was, of course, essential for producing sound; it’s what brings the “instrument” to life.

What I enjoyed most was the problem-solving aspect: figuring out how all the parts would communicate through the Arduino. It required patience and logical thinking, but once I understood the flow of current and how each input affects the output, it all made sense. Seeing the schematic come together gave me a real sense of accomplishment because it represented both my planning and my creativity.

Reflection on the Arduino Build

The Arduino part of this project was honestly very easy and smooth, especially because I had already planned everything out in my schematic. All I had to do was follow my own plan carefully and make sure that every wire and component was connected in the correct place.

One of the first things I did was connect the GND pin on the Arduino to the negative rail on the breadboard. That allowed me to connect all my components that needed ground, like the potentiometer, buttons, and piezo speaker, to the same shared ground.

I had to be precise and double-check my connections, especially where the potentiometer linked to the speaker and where each button was wired. To make the process easier and more organized, I actually color-coded my wires. For example, I used red wires for all the connections that went to ground. That made it much simpler to trace the circuit visually and fix small mistakes when I needed to adjust something later.

Overall, the Arduino part went very smoothly because I had a solid plan and followed it carefully. Seeing everything come together and actually work just like I designed it in the schematic felt really rewarding.

//set the pins for the button and buzzer
int firstKeyPin = 2;
int secondKeyPin = 3;
int thirdKeyPin = 4;

int buzzerPin = 10;


void setup() {
  //set the button pins as inputs
  pinMode(firstKeyPin, INPUT_PULLUP);
  pinMode(secondKeyPin, INPUT_PULLUP);
  pinMode(thirdKeyPin, INPUT_PULLUP);

  //set the buzzer pin as an output
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  
  if(digitalRead(firstKeyPin) == LOW){        //if the first key is pressed
    tone(buzzerPin, 262);                     //play the frequency for c
  }
  else if(digitalRead(secondKeyPin) == LOW){  //if the second key is pressed
    tone(buzzerPin, 330);                     //play the frequency for e
  }
  else if(digitalRead(thirdKeyPin) == LOW){   //if the third key is pressed
    tone(buzzerPin, 392);                     //play the frequency for g
  }
  else{
    noTone(buzzerPin);                        //if no key is pressed turn the buzzer off
  }
}

Reflection on the Coding

The coding part of this project was very straightforward. I already had my circuit ready, so I just needed to make sure that everything in my code matched the pins I used in my schematic and breadboard. I started by defining all of my pins clearly: one for each of the three buttons and one for the piezo speaker.

I wanted each button to act as a different key, so I chose three different frequencies (in hertz) to create distinct sounds: 262, 330, and 392 Hz, roughly the notes C4, E4, and G4 of a C-major chord. I adjusted the values to my liking until each key sounded right and had its own tone. This made the instrument feel more realistic and musical.

I also made sure to define all the variables and constants I needed in my code so that everything was organized and easy to change later if I wanted to experiment with new sounds or pin connections. Overall, the coding was simple but satisfying because it brought the entire project to life: the schematic, the Arduino wiring, and the code all worked together perfectly.
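
Since everything is already defined as variables, a natural next step would be a table-driven version where adding a key is just one more array entry. Here is a minimal sketch of that idea (not the code I actually ran, but the same behavior):

//sketch: table-driven keys; add a pin and a frequency to add a key
const int NUM_KEYS = 3;
int keyPins[NUM_KEYS] = {2, 3, 4};         //same pins as firstKeyPin..thirdKeyPin
int keyFreqs[NUM_KEYS] = {262, 330, 392};  //C4, E4, G4
int buzzerPin = 10;

void setup() {
  for (int i = 0; i < NUM_KEYS; i++) {
    pinMode(keyPins[i], INPUT_PULLUP);
  }
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  bool pressed = false;
  for (int i = 0; i < NUM_KEYS; i++) {
    if (digitalRead(keyPins[i]) == LOW) {  //LOW means pressed with INPUT_PULLUP
      tone(buzzerPin, keyFreqs[i]);
      pressed = true;
      break;                               //first pressed key wins, like the if/else chain
    }
  }
  if (!pressed) {
    noTone(buzzerPin);                     //silence when nothing is held down
  }
}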

Week 10 – Musical instrument

Group members: Aizhan and Darmen
Concept:

As we began planning how to present our musical instrument and the sound produced by the buzzer while using both digital and analog sensors, we decided to use a button as our digital sensor and a distance-measuring sensor as our analog sensor. The main concept is that the distance sensor detects how far an object or hand is, and based on that distance, it produces different notes and sounds through the buzzer.  When the button is pressed, the system pauses for 300 milliseconds (0.3 seconds) and temporarily stops reading distance values, effectively muting the instrument. Pressing the button again reactivates the sensor, allowing the instrument to continue playing notes according to the object’s position. This simple toggle system makes it easy to control when the instrument is active, giving users time to experiment with the sound and movement.

Arduino Setup and Demonstration: 

Schematic Diagram: 

Setup:

Video Demonstration

Coding:

Arduino file on Github

if (sensorActive) {
    int distance = getDistance();
    Serial.println(distance);
    if (1 < distance && distance < 5) {
      tone(BUZZER, NOTE_C4);
    } else if (5 < distance && distance < 10) {
      tone(BUZZER, NOTE_D4);
    } else if (10 < distance && distance < 15) {
      tone(BUZZER, NOTE_E4);
    } else if (15 < distance && distance < 20) {
      tone(BUZZER, NOTE_F4);
    } else if (20 < distance && distance < 25) {
      tone(BUZZER, NOTE_G4);
    } else if (25 < distance && distance < 30) {
      tone(BUZZER, NOTE_A4);
    } else if (30 < distance && distance < 35) {
      tone(BUZZER, NOTE_B4);
    } else {
      noTone(BUZZER);
    }
  }
}

float getDistance() {
  float echoTime;                   //variable to store the time it takes for a ping to bounce off an object
  float calculatedDistance;         //variable to store the distance calculated from the echo time
  //send out an ultrasonic pulse that's 10 microseconds long
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  echoTime = pulseIn(ECHO_PIN, HIGH);      //pulsein command to see how long it takes for pulse to bounce back to sensor
  calculatedDistance = echoTime / 55.2;    //calculate the distance of the object that reflected the pulse, in cm
  return calculatedDistance;
}
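
The button toggle described in the concept isn’t shown in the snippets above; it lives in the full Arduino file linked earlier. A minimal sketch of the idea, assuming the button is wired with a pull-up on a pin we call BUTTON_PIN (an assumed name), would look something like this:

if (digitalRead(BUTTON_PIN) == LOW) {  //button pressed (pull-up wiring assumed)
  sensorActive = !sensorActive;        //flip between playing and muted
  if (!sensorActive) {
    noTone(BUZZER);                    //silence any note still sounding
  }
  delay(300);                          //the 300 ms pause, which also acts as a crude debounce
}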

We would say the highlight of the coding is the part where the distance-measuring sensor interacts with the buzzer to produce different musical notes based on how close or far an object is. We were excited to experiment with the sensor and explore how it could be programmed to produce sound through the buzzer. To get a better understanding of integrating the notes, we referred to Arduino tutorials.

In the code above, the sensor continuously measures the distance of an object and converts the time it takes for the sound wave to bounce back into centimeters. Depending on the measured distance, the buzzer plays different musical notes, creating a simple melody that changes as the object moves closer or farther away.

Reflection:

We enjoyed experimenting with the distance-measuring sensor as it taught us how precise sensor readings can be transformed into meaningful outputs like sound, and combining it with the button control helped us manage the instrument’s activation smoothly. For future improvements, we would like to expand the range of notes to create more complex melodies and add LED lights that change color with the pitch to make the instrument more visually engaging. We could also experiment with different sensors, such as touch or motion sensors, to add more layers of interactivity. Finally, refining the accuracy and response speed of the sensor would make the sound transitions smoother and enhance the overall experience.

Week 10 – reading

A Brief Rant on the Future of Interaction Design – initial article

The part of the reading that I found most interesting was the idea that we should stop limiting human interaction to just the tips of our fingers. It made me question why so many interactive designs and technologies focus almost entirely on finger-based gestures. Why not explore other parts of the body as tools for interaction?

I think part of the reason is that we, as humans, feel we have the most control over our fingers – they’re precise, sensitive, and easy to coordinate. But that doesn’t mean interaction should stop there. How could we design experiences that involve other body parts, ones that are still easy to control, but that go beyond what’s been unimaginatively repeated in current design?

The article emphasises how much of our natural ability to sense and manipulate the world through touch has been lost to “Pictures Under Glass”: flat screens that give us no tactile feedback. That really stood out to me, because it highlights how technology, despite its innovation, can sometimes strip away what makes interaction human.

Overall, the reading made me realise how limited our current designs are compared to the potential of the human body. I think it helped me challenge myself and imagine more creative ways to interact with my projects.

Responses

I appreciate his point that the iPad is revolutionary now, but that if it still acts the same in 20 years it won’t be, because no change big enough to stay revolutionary will have happened. But what are the next steps? Mind control? Hands and arms used altogether with holograms? It is difficult to come up with answers or suggestions to the rants that were mentioned, but I also see the finger as a cap on how we interact with everyday tech objects.

Shahram Chaudhry – Week 10 – Reading Response

Reading A Brief Rant on the Future of Interaction Design honestly made me think about how much I take my hands for granted. The author talks about how our hands are designed to feel and manipulate things, and it really clicked for me when he compared using a touchscreen to “pictures under glass.” That idea stuck with me because it perfectly captures what phones and tablets feel like: flat, smooth, and completely disconnected from the sense of touch. I never thought about it before, but it’s true: we don’t really “feel” anything when we use them. It’s weird to realize that the same hands that can tie shoelaces or shape clay are now mostly used to swipe on a flat screen.

The part that resonated most with me was his point that technology should adapt to us, not the other way around, especially when he says that technology can change but human nature cannot change as much. I see it all the time: people getting frustrated because a phone gesture doesn’t work or because an app “expects” you to use it a certain way. It’s like we’re the ones bending to fit the machine, instead of the machine fitting how we naturally act. Also, with older objects like books or physical instruments, there’s something really satisfying about physically turning a page or pressing real keys. You just feel connected to what you’re doing. With touchscreens, everything feels the same no matter what you’re interacting with.

One part I didn’t completely agree with was how strongly he dismissed “Pictures Under Glass.” I get his point, but I think those devices have opened up creativity in new ways, like drawing on an iPad or making music digitally. It’s not tactile in the same sense, but it’s still expressive.

The follow-up also made me think about how disconnected technology can make us: sitting all day, barely moving, everything done with a single finger. I guess I am a little scared of a future where we become immobile because there’s too much automation and we don’t need to do much anymore. I guess we shouldn’t be losing touch entirely with the world around us (pun intended).

Week 10 – Shahram Chaudhry – Musical Instrument

Concept

For this week’s assignment, Khatira and I made a small musical instrument using an Arduino. We decided to create a pressure-sensitive drum pad that lets you play different drum sounds depending on how hard you press it.

The main part of the project is a force-sensitive resistor (FSR) that acts as the drum pad. When you press on it, the Arduino reads how much pressure you applied and plays a sound through a small buzzer. The harder you hit it, the longer the sound lasts, kind of like playing a real drum.

We also added a button that lets you switch between three drum sounds: a kick, a snare, and a hi-hat. So pressing the pad feels interactive, and you can change the type of drum as you play. It’s a really simple setup, but it was fun to experiment with.

Schematic

Video Demo

const int FSR_PIN = A0;        
const int BUTTON_PIN = 2;     
const int PIEZO_PIN = 8;     

// Drum sounds 
int kickDrum = 80;             // low drum
int snareDrum = 200;           // mid drum
int hiHat = 1000;              // high drum

int currentDrum = 0;         
int lastButtonState = HIGH;    

void setup() {
  pinMode(BUTTON_PIN, INPUT);  // pressed reads LOW, so the circuit is assumed to provide a pull-up
  pinMode(PIEZO_PIN, OUTPUT);
  Serial.begin(9600); 
}

void loop() {
  int pressure = analogRead(FSR_PIN);
  if (pressure > 20) {
    int duration = map(pressure, 10, 1023, 10, 200);
    // Play the drum sound
    if (currentDrum == 0) {
      tone(PIEZO_PIN, kickDrum, duration);
    } else if (currentDrum == 1) {
      tone(PIEZO_PIN, snareDrum, duration);
    } else {
      tone(PIEZO_PIN, hiHat, duration);
    }
    delay(50);  
  }
  int buttonState = digitalRead(BUTTON_PIN);
  //if button was just pressed, we need to change drum sound
  if (buttonState == LOW && lastButtonState == HIGH) {
    currentDrum = currentDrum + 1;
    if (currentDrum > 2) {
      currentDrum = 0;
    }
    delay(200); 
  }
  lastButtonState = buttonState;  // Store button state
}

Future Improvements

For future improvements, we’d like to add a potentiometer to control the sound more precisely, allowing the player to adjust tone or volume in real time while drumming. We could also include LEDs that light up based on which drum sound is active and how hard the pad is hit. These additions would make the drum pad feel more dynamic and visually engaging.
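
As a rough sketch of the potentiometer idea (hypothetical, not part of the current build, with the wiper assumed on A1), the reading could scale the active drum’s pitch before calling tone():

const int POT_PIN = A1;  // assumption: potentiometer wiper wired to analog pin A1

// inside loop(), before playing the drum sound:
int potValue = analogRead(POT_PIN);              // 0-1023
int percent = map(potValue, 0, 1023, 50, 150);   // 50%-150% of the base pitch
long freq = (long)snareDrum * percent / 100;     // long avoids int overflow on AVR
tone(PIEZO_PIN, freq, duration);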

Week 10: Musical Instrument

Concept

For this week’s assignment, we had to make a musical instrument involving at least one digital sensor and one analog sensor. Aditi and I decided to create a simple piano-like instrument with lights whose pitch level can be controlled by a potentiometer. There are 4 buttons (switches) that act as the piano “keys” and play different sounds, while the potentiometer has been mapped to three different levels so that the keys produce a high-pitched, middle-pitched, and low-pitched sound.

Materials

  • Analog sensor: 10K Trimpot
  • Digital Switch: Red, Blue, Yellow, and Green tactile buttons
  • Output: Piezo Buzzer to produce the sound and LEDs for light output

Schematic

Video Documentation

The Code

const int button_yellow=8;
const int yellow_light=7;
const int button_blue=9;
const int blue_light=6;
const int green_light=5;
const int button_green=10;
const int red_light=4;
const int button_red=11;
#define BUZZER 12
int potValue=0;
const int potPin=A0;
int melody[]={262,294,330,349,392,440,494,523}; // notes from C4-C5
int melody2[] = {523, 587, 659, 698, 784, 880, 988, 1047}; // notes from C5–C6
int melody3[] = {1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093}; // C6–C7

void setup() {
  // put your setup code here, to run once:
  pinMode(button_yellow, INPUT);
  pinMode(yellow_light,OUTPUT);
  pinMode(button_blue, INPUT);
  pinMode(blue_light, OUTPUT);
  pinMode(button_green, INPUT);
  pinMode(green_light,OUTPUT);
  pinMode(button_red, INPUT);
  pinMode(red_light, OUTPUT);

  pinMode(BUZZER, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  potValue = analogRead(potPin);
  
  if (digitalRead(button_yellow) == HIGH) {
    if (potValue <= 300) {
      tone(BUZZER, melody[1]);
    } else if (potValue >= 300 && potValue <= 550) {
      tone(BUZZER, melody2[1]);
    } else if (potValue >= 550 && potValue <= 1023) {
      tone(BUZZER, melody3[1]);
    }
    digitalWrite(yellow_light, HIGH);

  } else if (digitalRead(button_blue) == HIGH) {
    if (potValue <= 300) {
      tone(BUZZER, melody[2]);
    } else if (potValue >= 300 && potValue <= 550) {
      tone(BUZZER, melody2[2]);
    } else if (potValue >= 550 && potValue <= 1023) {
      tone(BUZZER, melody3[2]);
    }
    digitalWrite(blue_light, HIGH);

  } else if (digitalRead(button_green) == HIGH) {
    Serial.print("green");
    if (potValue <= 300) {
      tone(BUZZER, melody[3]);
    } else if (potValue >= 300 && potValue <= 550) {
      tone(BUZZER, melody2[3]);
    } else if (potValue >= 550 && potValue <= 1023) {
      tone(BUZZER, melody3[3]);
    }
    digitalWrite(green_light, HIGH);

  } else if (digitalRead(button_red) == HIGH) {
    if (potValue <= 300) {
      tone(BUZZER, melody[4]);
    } else if (potValue >= 300 && potValue <= 550) {
      tone(BUZZER, melody2[4]);
    } else if (potValue >= 550 && potValue <= 1023) {
      tone(BUZZER, melody3[4]);
    }
    digitalWrite(red_light, HIGH);

  } else if (digitalRead(button_yellow) == LOW && digitalRead(button_blue) == LOW && digitalRead(button_green) == LOW && digitalRead(button_red) == LOW) {
    Serial.println("Pin is LOW");
    noTone(BUZZER);
    digitalWrite(yellow_light, LOW);
    digitalWrite(blue_light, LOW);
    digitalWrite(green_light, LOW);
    digitalWrite(red_light, LOW);
  }
}
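
The three potentiometer ranges repeat for every button; a condensed sketch (behaviorally equivalent, but not the code we ran) could pick the pitch level once and index a note table inside loop():

// sketch: choose the octave from the pot once, then look the note up
int octaves[3][8] = {
  {262, 294, 330, 349, 392, 440, 494, 523},         // C4-C5 (melody)
  {523, 587, 659, 698, 784, 880, 988, 1047},        // C5-C6 (melody2)
  {1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093}  // C6-C7 (melody3)
};

int level = (potValue <= 300) ? 0 : (potValue <= 550) ? 1 : 2;

if (digitalRead(button_yellow) == HIGH) {
  tone(BUZZER, octaves[level][1]);   // same note index as melody[1] above
  digitalWrite(yellow_light, HIGH);
}
// ...and the same pattern for the blue, green, and red buttons

The same level variable then works for every key, so adding more keys or octaves only touches the arrays.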

Reflection & Future Improvements

We had fun making this project; however, it was also challenging to make all of the connections. Initially, we planned to have at least 6-7 switches, but the setup became crowded with just 4. Aditi drew the schematic diagram on paper before we began to build the connections physically, and this made the process much more manageable. We definitely could not have done it without having the diagram in front of us first. Sometimes the setup for a particular switch would not work, and we would have trouble figuring out whether the issue was in the code or in the wiring. In future versions, we would love to have more keys, as well as a tiny screen that displays the letter of the current note being played. We would also want to research more “melodic” or pleasant-sounding notes to add to the code.

Week 10 – Reading Reflection

My initial impression of the “Brief Rant on the Future of Interaction Design” reading was that it was formatted in a very engaging way. As someone with ADHD, I tend to read long paragraphs a lot slower than individual sentences structured this way; this made me trust the credibility of the author’s perspective on interaction design.

The way the author transitions from explaining the importance of touch to the complete lack of it in our touchscreen technology was really well done, and I couldn’t agree more. Using touchscreens is really un-stimulating, and anyone who has tried texting without haptics will know it feels incredibly unresponsive – but that also seems to be the future we’re headed towards. The images Bret Victor chooses to accompany his text are hilarious too; there really aren’t many things we naturally need our hands to swipe on other than manmade touchscreens. Victor’s explanation of how humans naturally transition between grip-styles is pretty interesting to me too.

This reading gave me a lot to think about. One of the things that came to mind was the experience of using a controller versus mouse & keyboard when playing video games. For those unaware, let me explain the main differences between the two.

When you use a controller (or gamepad, as some call it), you’re primarily using your thumbs for everything, from the analog sticks to the surface buttons. Using just your thumb for camera control can be quite difficult when precise and delicate movements are required.

When you use a keyboard and mouse, your arm and wrist are capable of microadjustments while holding the mouse to input much more precise and delicate movements; not to mention your keyboard hand is using way more than just your thumb to control what’s happening on screen.

So going by what I’ve said, many would probably wonder why anyone would ever use a controller, but that’s the thing– I haven’t explained the one thing that makes this remotely a difficult decision.
Controllers give an extra layer of immersion by letting the user relax their arms and lean back, while also providing haptic feedback and vibrations in response to what’s happening in-game. Imagine you’re playing a game where explosions are involved – the controller would vibrate violently in your hands as explosions affect your character. This is why you turn to keyboard and mouse for precision but controllers for immersion.

Now onto Victor’s follow-up article – I thought his response to voice input was pretty amusing: “I have a hard time imagining Monet saying to his canvas, ‘Give me some water lilies. Make ’em impressionistic.’” It’s amusing because that’s literally how our modern generation approaches stuff they don’t know how to do.

One other thing that really caught my attention in the follow-up was this quote, “The density of nerve endings in our fingertips is enormous. Their discrimination is almost as good as that of our eyes. If we don’t use our fingers, if in childhood and youth we become “finger-blind”, this rich network of nerves is impoverished.” I wonder if late-gen Z and gen alpha have any indicators of finger blindness as so many of us grew up with touchscreen devices as our main source of entertainment.

Reading Reflection – Week 10

A Brief Rant on the Future of Interaction Design and A follow-up article

I have never really thought much about it, but if someone were to put me on the spot and ask me which is more important, the visual or the tactile senses, I would probably choose the visual. This rant by Bret Victor has successfully changed my mind, though. I think what really convinced me was the example he gave on tying shoelaces. Indeed, I could easily tie my laces with my eyes closed, but if I numbed my hands or fingertips, I wouldn’t be able to do it anymore. This enlightenment has made me realize that I’ve never even considered the tactile aspect of any of my works, even though the first part of the course didn’t have room for it. I’m excited to take this into account for my upcoming projects since they involve physical human interaction.

I really liked the way Victor described today’s technology of iPads and phones as Pictures Under Glass. When phrased in this way, it actually makes the technology seem so boring and unintuitive. Like what do you mean our “revolutionary” technology doesn’t use the human tactile sense at all, probably one of our greatest assets? It actually gets more absurd the more I think about it.

When I moved on to the follow-up article, I found some of the comments and his responses to them quite funny. But amongst all of this, the line that really stood out to me was, “Channeling all interaction through a single finger is like restricting all literature to Dr Seuss’s vocabulary.” I believe this line perfectly sums up the point Bret was trying to bring across about iPad technology only using your fingers. Where real literature’s scope goes to Shakespeare and beyond, leaving a human being with words only from a Dr. Seuss book is the dumbest thing; anyone can see that. So why can’t people see that limiting one’s fingertips – the body part with the highest density of nerve endings – to a flat, smooth screen is dumb too? Overall, this has been one of my favorite readings so far, very eye-opening.

W10: Reading Reflection

The Future of Interaction Design

Reading this piece immediately took me back to Steve Jobs’ keynote when he unveiled the iPhone and boldly declared that we don’t need a stylus; that our fingers are the best pointing device we’ll ever have. Jobs’ vision, at that time, was revolutionary because it simplified interaction and made technology more accessible. He recognised how naturally intuitive our sense of touch is, which is the same quality the author values, but he focused on usability rather than physical feel.

While the author criticises “Pictures Under Glass” for robbing us of sensory depth, I see it as a meaningful trade-off. It allowed us to consolidate multiple tools into one, replacing the clutter of physical devices with a single screen that could transform into anything we needed. The flatness of the glass became the canvas of endless interfaces. Even if it dulled the sensation of texture, it heightened the sense of control, mobility, and creative possibility.

That said, I agree that the future can move beyond this limitation. The author’s call to embrace our full tactile and bodily potential opens an exciting direction for technology. What if screens could morph in texture, shape, and resistance depending on the app in use: a photo that feels like paper, a drum pad that vibrates? That would merge Jobs’ vision of simplicity with the author’s longing for physical depth.

Perhaps, then, “Pictures Under Glass” wasn’t the end of interaction design but a stepping stone.

Moving forward from his response to the comments, I really agreed with the author’s take on the “iPad bad” comment. I liked how he clarified that the iPad is actually good for now. It was a revolutionary invention that changed how we interact with technology. But I also agree with his warning that if, twenty years from now, all we have is the same flat, glassy device with minor haptic improvements, then it would be bad. His comparison to black-and-white film before color photography made a lot of sense to me. It’s a reminder that innovation should keep evolving rather than settling for what feels advanced in the moment.