Shahram Chaudhry – Week 10 – Reading Response

Reading A Brief Rant on the Future of Interaction Design honestly made me think about how much I take my hands for granted. The author talks about how our hands are designed to feel and manipulate things, and it really clicked for me when he compared using a touchscreen to “pictures under glass.” That idea stuck with me because it perfectly captures what phones and tablets feel like: flat, smooth, and completely disconnected from the sense of touch. I never thought about it before, but it’s true: we don’t really “feel” anything when we use them. It’s weird to realize that the same hands that can tie shoelaces or shape clay are now mostly used to swipe on a flat screen.

The part that resonated most with me was his point that technology should adapt to us, not the other way around, especially when he says that technology can change but human nature cannot change as much. I see it all the time: people getting frustrated because a phone gesture doesn’t work or because an app “expects” you to use it a certain way. It’s like we’re the ones bending to fit the machine, instead of the machine fitting how we naturally act. Also, with older objects like books or physical instruments, there’s something really satisfying about physically turning a page or pressing real keys. You just feel connected to what you’re doing. With touchscreens, everything feels the same no matter what you’re interacting with.

One part I didn’t completely agree with was how strongly he dismissed “Pictures Under Glass.” I get his point, but I think those devices have opened up creativity in new ways, like drawing on an iPad or making music digitally. It’s not tactile in the same sense, but it’s still expressive.

The follow-up also made me think about how disconnected technology can make us: sitting all day, barely moving, everything done with a single finger. I guess I am a little scared of a future where we become immobile because there’s too much automation and we don’t need to do much anymore. We shouldn’t lose touch entirely with the world around us (pun intended).

Week 10 – Shahram Chaudhry – Musical Instrument


Concept

For this week’s assignment, Khatira and I made a small musical instrument using an Arduino. We decided to create a pressure-sensitive drum pad that lets you play different drum sounds depending on how hard you press it.

The main part of the project is a force-sensitive resistor (FSR) that acts as the drum pad. When you press on it, the Arduino reads how much pressure you applied and plays a sound through a small buzzer. The harder you hit it, the longer the sound lasts, kind of like playing a real drum.

We also added a button that lets you switch between three drum sounds: a kick, a snare, and a hi-hat. So pressing the pad feels interactive, and you can change the type of drum as you play. It’s a really simple setup, but it was fun to experiment with.


Schematic

Video Demo


const int FSR_PIN = A0;        
const int BUTTON_PIN = 2;     
const int PIEZO_PIN = 8;     

// Drum sounds 
int kickDrum = 80;             // low drum
int snareDrum = 200;           // mid drum
int hiHat = 1000;              // high drum

int currentDrum = 0;         
int lastButtonState = HIGH;    

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);  // internal pull-up so the pin reads HIGH until pressed
  pinMode(PIEZO_PIN, OUTPUT);
  Serial.begin(9600); 
}

void loop() {
  int pressure = analogRead(FSR_PIN);
  if (pressure > 20) {
    // map the reading to a tone length: harder hits sustain longer
    int duration = map(pressure, 20, 1023, 10, 200);
    // Play the drum sound
    if (currentDrum == 0) {
      tone(PIEZO_PIN, kickDrum, duration);
    } else if (currentDrum == 1) {
      tone(PIEZO_PIN, snareDrum, duration);
    } else {
      tone(PIEZO_PIN, hiHat, duration);
    }
    delay(50);  // short pause so a single hit doesn't retrigger
  }
  int buttonState = digitalRead(BUTTON_PIN);
  //if button was just pressed, we need to change drum sound
  if (buttonState == LOW && lastButtonState == HIGH) {
    currentDrum = currentDrum + 1;
    if (currentDrum > 2) {
      currentDrum = 0;
    }
    delay(200);  // simple debounce
  }
  lastButtonState = buttonState;  // store button state
}

Future Improvements

For future improvements, we’d like to add a potentiometer to control the sound more precisely, allowing the player to adjust tone or volume in real time while drumming. We could also include LEDs that light up based on which drum sound is active and how hard the pad is hit. These additions would make the drum pad feel more dynamic and visually engaging.
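As a rough sketch of how those additions might look in code, one option is a helper that replaces the tone() calls in loop(). This assumes a potentiometer on A1 and an LED on PWM pin 9, neither of which is part of the current build, so the pin choices are hypothetical:

// Hypothetical additions: potentiometer on A1, LED on PWM pin 9
// (pinMode(LED_PIN, OUTPUT) would also go in setup())
const int POT_PIN = A1;
const int LED_PIN = 9;

void playDrum(int baseFreq, int duration) {
  // shift the drum pitch up or down by up to 100 Hz based on the pot position
  int offset = map(analogRead(POT_PIN), 0, 1023, -100, 100);
  int freq = baseFreq + offset;
  if (freq < 31) freq = 31;  // tone() can't go below ~31 Hz on an Uno
  tone(PIEZO_PIN, freq, duration);

  // brighter flash for harder hits (duration already encodes pressure)
  analogWrite(LED_PIN, map(duration, 10, 200, 30, 255));
}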

Week 10: Musical Instrument

Concept

For this week’s assignment, we had to make a musical instrument involving at least one digital sensor and one analog sensor. Aditi and I decided to create a simple piano-like instrument, with lights, whose pitch can be controlled by a potentiometer. There are 4 buttons (switches) that act as the piano “keys” and play different sounds, while the potentiometer is mapped to three ranges so that the keys produce high-pitched, middle-pitched, or low-pitched sounds.

Materials

  • Analog sensor: 10K Trimpot
  • Digital Switch: Red, Blue, Yellow, and Green tactile buttons
  • Output: Piezo Buzzer to produce the sound and LEDs for light output

Schematic

Video Documentation

The Code

const int button_yellow = 8;
const int yellow_light = 7;
const int button_blue = 9;
const int blue_light = 6;
const int green_light = 5;
const int button_green = 10;
const int red_light = 4;
const int button_red = 11;
#define BUZZER 12
int potValue = 0;
const int potPin = A0;
int melody[] = {262, 294, 330, 349, 392, 440, 494, 523};          // notes from C4-C5
int melody2[] = {523, 587, 659, 698, 784, 880, 988, 1047};        // notes from C5-C6
int melody3[] = {1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093}; // notes from C6-C7

void setup() {
  // put your setup code here, to run once:
  pinMode(button_yellow, INPUT);
  pinMode(yellow_light,OUTPUT);
  pinMode(button_blue, INPUT);
  pinMode(blue_light, OUTPUT);
  pinMode(button_green, INPUT);
  pinMode(green_light,OUTPUT);
  pinMode(button_red, INPUT);
  pinMode(red_light, OUTPUT);

  pinMode(BUZZER, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  potValue = analogRead(potPin);

  if (digitalRead(button_yellow) == HIGH) {
    if (potValue < 300) {
      tone(BUZZER, melody[1]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[1]);
    } else {
      tone(BUZZER, melody3[1]);
    }
    digitalWrite(yellow_light, HIGH);

  } else if (digitalRead(button_blue) == HIGH) {
    if (potValue < 300) {
      tone(BUZZER, melody[2]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[2]);
    } else {
      tone(BUZZER, melody3[2]);
    }
    digitalWrite(blue_light, HIGH);

  } else if (digitalRead(button_green) == HIGH) {
    if (potValue < 300) {
      tone(BUZZER, melody[3]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[3]);
    } else {
      tone(BUZZER, melody3[3]);
    }
    digitalWrite(green_light, HIGH);

  } else if (digitalRead(button_red) == HIGH) {
    if (potValue < 300) {
      tone(BUZZER, melody[4]);
    } else if (potValue < 550) {
      tone(BUZZER, melody2[4]);
    } else {
      tone(BUZZER, melody3[4]);
    }
    digitalWrite(red_light, HIGH);

  } else {
    // no button pressed: silence the buzzer and turn all lights off
    noTone(BUZZER);
    digitalWrite(yellow_light, LOW);
    digitalWrite(blue_light, LOW);
    digitalWrite(green_light, LOW);
    digitalWrite(red_light, LOW);
  }
}

Reflection & Future Improvements

We had fun making this project; however, it was also challenging to make all of the connections. Initially, we planned to have at least 6-7 switches, but the setup became crowded with just 4. Aditi drew the schematic diagram on paper before we began to physically build the connections, and this made the process much more manageable. We definitely could not have done it without having the diagram in front of us first. Sometimes the setup for a particular switch would not work, and we would have trouble figuring out whether the issue was in the code or the wiring. In future versions, we would love to have more keys, as well as a tiny screen that displays the letter of the current note being played (a sketch of that idea is below). We would also want to research more “melodic” or pleasant-sounding notes to add to the code.
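A minimal sketch of the note-display idea, assuming a standard 16x2 character LCD driven by Arduino’s built-in LiquidCrystal library; the pin assignments below are hypothetical, chosen only to avoid the pins the instrument already uses:

#include <LiquidCrystal.h>

// Hypothetical wiring: RS=A1, E=A2, D4-D7=A3, A4, A5, 3
LiquidCrystal lcd(A1, A2, A3, A4, A5, 3);

// note letters matching the order of the melody arrays (a C major scale)
const char noteNames[] = {'C', 'D', 'E', 'F', 'G', 'A', 'B', 'C'};

// lcd.begin(16, 2); goes in setup(); then this helper can be called
// next to each tone() call with the melody index being played
void showNote(int index) {
  lcd.setCursor(0, 0);
  lcd.print("Note: ");
  lcd.print(noteNames[index]);
}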

Week 10 – Reading Reflection

My initial impression of the “Brief Rant on the Future of Interaction Design” reading was that it was formatted in a very engaging way. As someone with ADHD, I tend to read long paragraphs a lot more slowly than individual sentences structured this way, and this made me trust the credibility of the author’s perspective on interaction design.

The way the author transitions from explaining the importance of touch to the complete lack of it in our touchscreen technology was really well done, and I couldn’t agree more. Using touchscreens is really unstimulating, and anyone who has tried texting without haptics will know it feels incredibly unresponsive, yet that also seems to be the future we’re headed towards. The images Bret Victor chooses to accompany his text are hilarious too; there really aren’t many things we naturally need our hands to swipe on other than manmade touchscreens. Victor’s explanation of how humans naturally transition between grip styles is pretty interesting to me too.

This reading gave me a lot to think about. One of the things that came to mind was the experience of using a controller versus mouse & keyboard when playing video games. For those unaware, let me explain the main differences between the two.

When you use a controller (or gamepad, as some call it), you’re primarily using your thumbs for everything from the analog sticks to the face buttons. Using just your thumb for camera control can be quite difficult if precise and delicate movements are required.

When you use a keyboard and mouse, your arm and wrist are capable of microadjustments while holding the mouse to input much more precise and delicate movements; not to mention your keyboard hand is using way more than just your thumb to control what’s happening on screen.

So going by what I’ve said, many would probably wonder why anyone would ever use a controller, but that’s the thing: I haven’t yet explained the one factor that makes this remotely a difficult decision.

Controllers give an extra layer of immersion, both by letting the user relax their arms and lean back, and by providing haptic feedback and vibrations in response to what’s happening in-game. Imagine you’re playing a game where explosions are involved: the controller would vibrate violently in your hands as explosions affect your character. This is why you turn to keyboard and mouse for precision but controllers for immersion.

Now onto Victor’s follow-up article. I thought his response to voice input was pretty amusing: “I have a hard time imagining Monet saying to his canvas, “Give me some water lilies. Make ’em impressionistic.”” It’s amusing because that’s literally how our modern generation approaches stuff they don’t know how to do.

One other thing that really caught my attention in the follow-up was this quote, “The density of nerve endings in our fingertips is enormous. Their discrimination is almost as good as that of our eyes. If we don’t use our fingers, if in childhood and youth we become “finger-blind”, this rich network of nerves is impoverished.” I wonder if late-gen Z and gen alpha have any indicators of finger blindness as so many of us grew up with touchscreen devices as our main source of entertainment.


Reading Reflection – Week 10

A Brief Rant on the Future of Interaction Design and A follow-up article

I have never really thought much about it, but if someone were to put me on the spot and ask me which is more important, the visual or the tactile sense, I would probably choose the visual. This rant by Bret Victor has successfully changed my mind, though. I think what really convinced me was the example he gave on tying shoelaces. Indeed, I could easily tie my laces with my eyes closed, but if I numbed my hands or fingertips, I wouldn’t be able to do it anymore. This enlightenment has made me realize that I’ve never even considered the tactile aspect of any of my works, even though the first part of the course didn’t have room for it. I’m excited to take this into account for my upcoming projects since they involve physical human interaction.

I really liked the way Victor described today’s technology of iPads and phones as Pictures Under Glass. When phrased in this way, it actually makes the technology seem so boring and unintuitive. Like what do you mean our “revolutionary” technology doesn’t use the human tactile sense at all, probably one of our greatest assets? It actually gets more absurd the more I think about it.

When I moved on to the follow-up article, I found some of the comments and his responses to them quite funny. But amongst all of this, the line that really stood out to me was, “Channeling all interaction through a single finger is like restricting all literature to Dr Seuss’s vocabulary.” I believe this line perfectly sums up the point Bret was trying to bring across about iPad technology only using your fingers. Where real literature’s scope goes to Shakespeare and beyond, leaving a human being with words only from a Dr. Seuss book is the dumbest thing; anyone can see that. So why can’t people see that limiting one’s fingertips, the body part with the highest density of nerve endings, to a flat, smooth screen is dumb too? Overall, this has been one of my favorite readings so far, very eye-opening.

W10: Reading Reflection

The Future of Interaction Design

Reading this piece immediately took me back to Steve Jobs’ keynote when he unveiled the iPhone and boldly declared that we don’t need a stylus, that our fingers are the best pointing device we’ll ever have. Jobs’ vision at that time was revolutionary because it simplified interaction and made technology more accessible. He recognised how naturally intuitive our sense of touch is, which is the same quality the author values, but he focused on usability rather than physical feel.

While the author criticises “Pictures Under Glass” for robbing us of sensory depth, I see it as a meaningful trade-off. It allowed us to consolidate multiple tools into one, replacing the clutter of physical devices with a single screen that could transform into anything we needed. The flatness of the glass became the canvas of endless interfaces. Even if it dulled the sensation of texture, it heightened the sense of control, mobility, and creative possibility.

That said, I agree that the future can move beyond this limitation. The author’s call to embrace our full tactile and bodily potential opens an exciting direction for technology. What if screens could morph in texture, shape, and resistance depending on the app in use: a photo that feels like paper, a drum pad that vibrates? That would merge Jobs’ vision of simplicity with the author’s longing for physical depth.

Perhaps, then, “Pictures Under Glass” wasn’t the end of interaction design but a stepping stone.

Moving on to his responses to the comments, I really agreed with the author’s take on the “iPad bad” comment. I liked how he clarified that the iPad is actually good, for now. It was a revolutionary invention that changed how we interact with technology. But I also agree with his warning that if, twenty years from now, all we have is the same flat, glassy device with minor haptic improvements, then it would be bad. His comparison to black-and-white film before color photography made a lot of sense to me. It’s a reminder that innovation should keep evolving rather than settling for what feels advanced in the moment.

Reading Reflection Week 10

One of the examples I found most interesting in “A Brief Rant on the Future of Interaction Design” was the author’s take on voice, especially when he described a fictional situation in which Monet uses his voice to obtain the painting he wants. It may sound ridiculous, but I really do agree with and understand his argument: active exploration is necessary in some cases, and voice simply doesn’t apply there. This challenged my own way of thinking and pointed to something essential the author may be trying to express, whether consciously or not: some forms of interaction will not suit some cases, and this is why the designer or programmer must understand what is most relevant and effective for the experience they are making. For example, in an interactive experience designed to be engaged with while standing up, it might not be the best idea to make certain responses require the user to also go and click on something on a laptop every time. Allowing the user to focus on a few things at a time will improve their experience and help them understand the relevance of the interactive elements within the main objective of the project.


Something that caught my attention in Bret Victor’s follow-up article was the neglected role of hands in tools that are meant to amplify human capabilities. I think I speak for many when I say that we often take the skills of our hands for granted. Cooking, drawing, writing, building, organizing: some of the best activities, done both for pleasure and to complete a task, are done with our hands. And I agree with the author that there is a unique satisfaction that comes from doing these things with our hands. For example, I do love drawing digitally, as the result of my efforts matches my intention to draw realistic yet slightly cartoonish objects, landscapes, and characters. However, this process doesn’t match the precision and delicacy of using a brush and applying paint to a canvas, stroke by stroke.

Despite this, I must disagree with the author on one thing. Even though the connection between what I am doing with my hands and the outcome on a screen might not be as strong as it would be with another, more tangible medium, this does not mean there is no connection at all. Before technology took over most of our ordinary activities such as writing, I enjoyed writing stories with a pen or pencil in my notebook. However, this process became exhausting whenever I had to erase or cross something out, or my fingers started to ache from writing for so long. Now, even though I am staring at a screen instead of brushing my fingers over a piece of paper, and typing on a keyboard instead of using a pen, I still enjoy writing just as much as before, and the activity has become more manageable for me.


Week 10 – Reading Reflection

Bret Victor’s “A Brief Rant on the Future of Interaction Design” reads less like a complaint and more like a plea for imagination. His frustration with “pictures under glass” isn’t really about touchscreens but about how easily we mistake convenience for progress. Victor’s argument that our hands are not just pointers but thinking tools is relatable. I’d never thought of a touchscreen as “numb,” but he’s right: sliding a finger across glass is nothing like twisting, folding, or shaping an object. He’s asking designers to respect the body’s intelligence, the millions of years of evolution that made us good at feeling, gripping, and sensing the world. I agree with him almost entirely. What unsettles me is how quickly we’ve accepted numbness as normal. We’ve trained ourselves to think that friction is a flaw when it comes to UX.

The Responses piece adds an interesting layer of humility. Victor doesn’t pretend to have the solution. He admits he was ranting, not designing one. Still, his answers to critics are telling. He pushes back against gimmicks like voice control and “waving your hands in the air,” arguing that real interaction should involve the whole body and the tactile richness of touch. I found myself nodding at his line about computers needing to adapt to our bodies, not the other way around. That’s such a simple reversal, yet it cuts right through decades of design laziness. When he compares touchscreen culture to “restricting all literature to Dr. Seuss’s vocabulary,” it’s funny, but it also nails the deeper loss: we’re settling for tools built for children, not adults.

If there’s one thing I’d question, it’s Victor’s nostalgia for physicality. I agree that touch and movement matter, but I also think the human imagination adapts. The digital world is training new forms of dexterity that are more mental than physical. Coding, multitasking, navigating layered interfaces: these, too, are forms of “touch,” just less visible. Maybe the future of interaction design isn’t about replacing glass with holographic clay, but about balancing sensory depth with cognitive range. Victor’s rant reminds me that design should evolve with both the hand and the mind.

Week 10 – Reading Reflection

I think the idea of moving beyond flat screens is important. Bret Victor’s rant from 2011 makes me think about this a lot. He says our hands are amazing tools. I believe he is right. We should use our hands’ full abilities, not just for poking glass. I see a problem with how we use technology today. We call things interactive when they are not. Tapping a picture on a screen is a limited way to interact. I think true interaction should involve more of our senses and our physical body. It should feel like we are handling real things. Victor’s follow-up answers show that some people missed his point. They focused on new screen technology. But I believe his main idea was about our hands. He wanted us to think about touch and grip, not just better displays.

This makes me consider the role of voice commands. Some people think voice is the future. But I think it has limits. Speaking is not always practical or private. Our hands are often faster and more precise. The best future might combine voice, touch, and physical movement. I feel that the real challenge is for designers. They need to imagine tools that use our whole bodies. I believe we need technology that feels less like a window and more like a material. We need to build things we can truly feel and shape with our hands. This is a difficult but exciting goal.

Week 10: Musical Instrument (Ling and Mariam)

Concept

For this assignment, I worked together with Ling to create an instrument using the ultrasonic sensor. He came up with the idea to use the distance sensor, and the way we executed it reminded me of a theremin because there’s no physical contact needed to play the instrument. We also used a red button to turn the instrument on and off, as well as a green button that controls the tempo of the piezo buzzer (based on how many times you click on it in 5 seconds).

Schematic Diagram


Code

const int trigPin = 9;  
const int echoPin = 10; 
const int buzzerPin = 8; 
const int buttonPin = 4;  // system ON/OFF button
const int tempoButtonPin = 2;  // tempo setting button

//sensor variables
float duration, distance; 
bool systemOn = false; // system on or off state

// tempo variables
bool tempoSettingActive = false; 
int  tempoPressCount    = 0;   
unsigned long tempoStartTime = 0;
unsigned long tempoInterval  = 500; 

// for edge detection on tempo button (to count presses cleanly)
int lastTempoButtonState = HIGH;

// for edge detection on system button (cleaner)
int lastSystemButtonState = HIGH;


// for controlling when next beep happens
unsigned long lastBeatTime = 0;



void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(buzzerPin, OUTPUT);
  pinMode(buttonPin, INPUT_PULLUP);       // system ON/OFF button
  pinMode(tempoButtonPin, INPUT_PULLUP);  // tempo button
  Serial.begin(9600);
}

void loop() {
  // system on/off state
  int systemButtonState = digitalRead(buttonPin);

  // detects a press: HIGH -> LOW 
  if (systemButtonState == LOW && lastSystemButtonState == HIGH) {
    systemOn = !systemOn;  // Toggle system state

    Serial.print("System is now ");
    Serial.println(systemOn ? "ON" : "OFF");

    delay(200); // basic debounce
  }
  lastSystemButtonState = systemButtonState;
  
  // if system is OFF: make sure buzzer is silent and skip rest
  if (!systemOn) {
    noTone(buzzerPin);
    delay(50);
    return;
  }

  // tempo button code
  int tempoButtonState = digitalRead(tempoButtonPin);

  // detects a press 
  if (tempoButtonState == LOW && lastTempoButtonState == HIGH) {
    if (!tempoSettingActive) {
      
      //first press (starts 5 second capture window)
      tempoSettingActive = true;
      tempoStartTime = millis();
      tempoPressCount = 1;  // Count this first press
      Serial.println("Tempo setting started: press multiple times within 5 seconds.");
    } else {
      // additional presses inside active window (5 secs)
      tempoPressCount++;
    }

    delay(40); // debounce for tempo button
  }
  lastTempoButtonState = tempoButtonState;

  // if we are in tempo mode, check whether 5 seconds passed
  if (tempoSettingActive && (millis() - tempoStartTime >= 5000)) {
    tempoSettingActive = false;

    if (tempoPressCount > 0) {
      // tempo interval = 1000 ms / (number of presses in 5 secs)
      tempoInterval = 1000UL / tempoPressCount;

      // avoids unrealistically fast intervals
      if (tempoInterval < 50) {
        tempoInterval = 50;
      }

      Serial.print("Tempo set: ");
      Serial.print(tempoPressCount);
      Serial.print(" presses in 5s -> interval ");
      Serial.print(tempoInterval);
      Serial.println(" ms between beeps.");
    } else {
      Serial.println("No tempo detected.");
    }
  }
 
  // ultrasonic sensor distance measurement
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);

  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  duration = pulseIn(echoPin, HIGH);
  distance = (duration * 0.0343) / 2.0;  // in cm

  Serial.print("Distance: ");
  Serial.println(distance);

  
  //buzzer pitch and tempo
  // checks if distance is valid and in a usable range
  if (distance > 0 && distance < 200) {
    float d = distance;

    // limit distances for stable mapping
    if (d < 5)  d = 5;
    if (d > 50) d = 50;

    // maps distance to frequency (closer = higher pitch)
    int frequency = map((int)d, 5, 50, 2000, 200);

    // uses tempoInterval to define how often we "tick" the sound (creates short beeps at each interval using current frequency)
    unsigned long now = millis();

    if (now - lastBeatTime >= tempoInterval) {
      lastBeatTime = now;

      // plays a short beep (half of the interval duration)
      unsigned long beepDuration = tempoInterval / 2;
      if (beepDuration < 20) {
        beepDuration = 20;  // minimum hearable blip
      }

      tone(buzzerPin, frequency, beepDuration);
    }

  } else {
    // if bad/no reading, silence the buzzer between ticks
    noTone(buzzerPin);
  }

  delay(50);  // delay to reduce noise
}
Favorite Code
//buzzer pitch and tempo
// checks if distance is valid and in a usable range
if (distance > 0 && distance < 200) {
  float d = distance;

  // limit distances for stable mapping
  if (d < 5)  d = 5;
  if (d > 50) d = 50;

  // maps distance to frequency (closer = higher pitch)
  int frequency = map((int)d, 5, 50, 2000, 200);

This is a pretty simple part of the code, but I loved how the beeping sounds turned out. It’s so satisfying to see how my hand movements directly control the sound, and it made me realize how significant even a small part of the code could be.

The code was completed with assistance from ChatGPT.

Video Demo

Reflection and Future Improvements

Mapping the distance to control the sound was pretty simple. The challenging part was controlling the tempo: we had to figure out what value to divide by in order to get a clear change in the sound of the piezo buzzer after clicking the green button. Overall, this was a really cool project, and I hope to continue to improve and expand my knowledge in physical computing!
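To make that division concrete: with the rule in the code above (interval = 1000 ms divided by the number of presses), 10 presses inside the 5-second window give 1000 / 10 = 100 ms between beeps, while 2 presses give 1000 / 2 = 500 ms, so slow versus fast clicking produces a clearly audible difference in tick rate (with 50 ms enforced as a floor so the beeps never blur together).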