Week 10: Reading Response

A Brief Rant on the Future of Interaction Design

After finishing the article, I caught myself staring at my hands and realizing how much I take them for granted. It’s strange how technology, which is supposed to make us feel more connected, can also make us feel so detached from the world right in front of us.

I’ve spent years glued to glowing screens, typing, swiping, and scrolling without thinking about how unnatural it all feels. Victor’s phrase “Pictures Under Glass” stuck with me like a quiet accusation. That’s exactly what these devices are: smooth, cold panes that flatten our world into something we can only look at, not touch. I thought about how I used to build things as a kid—Legos, paper circuits, even sandcastles. I remember the feeling of resistance when stacking wet sand, the satisfaction of shaping something that pushed back. Now, most of my creations live behind glass, where nothing pushes back.

What really struck me was Victor’s idea that true tools amplify human capabilities. We’ve built technologies that expand what we can see and compute, but not what we can feel. I realized that so many “futuristic” designs we celebrate are really just shinier versions of what we already have: more pixels, more gestures, less connection. It’s like we’ve mistaken slickness for progress.

Reading this made me wonder what kind of future I want to help design as someone in tech. I don’t just want to make tools people use. I want to make tools people feel. Tools that understand how deeply human interaction is tied to touch, weight, and presence. Maybe that means haptic design, maybe something beyond it. But I know it means remembering that we are not just eyes and brains; we are hands, bodies, and motion.

 

Responses: A Brief Rant on the Future of Interaction Design

When I read Bret Victor’s Responses to A Brief Rant on the Future of Interaction Design, it felt like listening to a passionate inventor defending an idea most people didn’t quite understand. His frustration was almost funny at times, like when he keeps saying “No! iPad good! For now!”, but underneath that humor was something deeper: a kind of hope that we can still dream bigger about how we interact with technology.

What struck me most was his insistence that the problem isn’t the devices themselves. It’s our lack of imagination. We celebrate flat screens as the height of innovation, but Victor reminds us they’re only a step in a much longer journey. Reading this, I realized how often I accept technology as inevitable, as if it just happens to us. But Victor insists that people choose the future. We decide what gets built, what gets funded, what gets normalized. That realization hit me hard, because it turned passive admiration into personal responsibility.

His section on “finger-blindness” especially stuck with me. The idea that children could lose part of their sensory intelligence by only swiping at glass felt unsettling. I thought about how often I see kids with tablets, and how natural it looks, yet maybe that’s exactly the danger. Our hands were made to shape, feel, and learn through texture. If we stop using them that way, we’re not just changing how we play; we’re changing how we think.

What I admire about Victor’s writing is that he doesn’t reject technology. He just wants it to grow up with us. He’s not nostalgic for the past or obsessed with sci-fi fantasies; he’s practical, grounded in the body. When he says that the computer should adapt to the human body, not the other way around, it reminded me that innovation should honor what makes us human, not erase it.

Reading his response made me feel both small and inspired. Small, because it reminded me how little we really understand about designing for human experience. Inspired, because it reminded me that design is not just about what looks cool. It’s about what feels alive.

Maybe the real future of interaction design isn’t in inventing smarter machines, but in rediscovering the intelligence already built into our own bodies.

Week 10: Musical Instrument

For this week’s assignment, we had to make a musical instrument involving at least one digital sensor and one analog sensor. Aditi and I decided to create a simple piano-like instrument with lights, whose pitch can be controlled by a potentiometer. There are 4 buttons (switches) that act as the piano “keys” and play different sounds, while the potentiometer is mapped to three ranges so that the keys produce a high-pitched, middle-pitched, or low-pitched sound.

Materials

  • Analog sensor: 10K Trimpot
  • Digital Switch: Red, Blue, Yellow, and Green tactile buttons
  • Output: Piezo Buzzer to produce the sound and LEDs for light output

Schematic

Video

link: drive

Code

const int button_yellow=8;
const int yellow_light=7;
const int button_blue=9;
const int blue_light=6;
const int green_light=5;
const int button_green=10;
const int red_light=4;
const int button_red=11;
#define BUZZER 12
int potValue=0;
const int potPin=A0;
int melody[]  = {262, 294, 330, 349, 392, 440, 494, 523};         // notes from C4-C5
int melody2[] = {523, 587, 659, 698, 784, 880, 988, 1047};        // notes from C5-C6
int melody3[] = {1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093}; // C6-C7

void setup() {
  // put your setup code here, to run once:
  pinMode(button_yellow, INPUT);
  pinMode(yellow_light,OUTPUT);
  pinMode(button_blue, INPUT);
  pinMode(blue_light, OUTPUT);
  pinMode(button_green, INPUT);
  pinMode(green_light,OUTPUT);
  pinMode(button_red, INPUT);
  pinMode(red_light, OUTPUT);

  pinMode(BUZZER, OUTPUT);
  Serial.begin(9600);
}

// Play the melody note at the given index, picking the octave from the pot reading
void playNote(int index) {
  if (potValue <= 300) {
    tone(BUZZER, melody[index]);        // low octave
  } else if (potValue <= 550) {
    tone(BUZZER, melody2[index]);       // middle octave
  } else {
    tone(BUZZER, melody3[index]);       // high octave
  }
}

void loop() {
  potValue = analogRead(potPin);

  if (digitalRead(button_yellow) == HIGH) {
    playNote(1);
    digitalWrite(yellow_light, HIGH);

  } else if (digitalRead(button_blue) == HIGH) {
    playNote(2);
    digitalWrite(blue_light, HIGH);

  } else if (digitalRead(button_green) == HIGH) {
    playNote(3);
    digitalWrite(green_light, HIGH);

  } else if (digitalRead(button_red) == HIGH) {
    playNote(4);
    digitalWrite(red_light, HIGH);

  } else {  // no button pressed: silence the buzzer and turn off all lights
    noTone(BUZZER);
    digitalWrite(yellow_light, LOW);
    digitalWrite(blue_light, LOW);
    digitalWrite(green_light, LOW);
    digitalWrite(red_light, LOW);
  }
}

Reflection & Future Improvements

We had fun making this project; however, it was also challenging to make all of the connections. Initially, we planned to have at least 6-7 switches, but the setup became crowded with just 4. I started drawing the schematic diagram on paper before we began to build the connections physically, and this made the process much more manageable. We definitely could not have done it without having the diagram in front of us first. Sometimes the setup for a particular switch would not work, and we had trouble figuring out whether the issue was in the code or the wiring. In future versions, we would love to have more keys, as well as a tiny screen that displays the letter of the note currently being played. We would also want to research more “melodic,” pleasant-sounding notes to add to the code.

Week 10 Reading

In the early iPhone era, Apple’s design mimicked physical textures (leather stitching in Calendar, green felt in Game Center, the wooden shelves of iBooks). This skeuomorphism, which gives digital things tactile analogs, was arguably the last mainstream attempt to preserve a sense of touch through sight.

Then came iOS 7, led by Jony Ive, where Apple decisively flattened everything: gradients gone, shadows gone, buttons reduced to plain text. It was a move toward visual minimalism, but also toward what Victor warns against. The glass stopped pretending to be wood, or leather, or paper. It simply became glass.

Victor’s essay makes me realize that friction is information. When you open a book or turn a knob, your hand is in dialogue with resistance; you feel the world push back. That pushback is not inefficiency. It’s meaning.

What’s fascinating is that Apple, perhaps subconsciously, has been quietly circling back to Victor’s point. The Apple Vision Pro’s “spatial computing” rhetoric reintroduces physical space and hand gestures, but ironically, without touch. You can see and move objects in 3D, but you can’t feel them. It’s embodiment without embodiment. Victor’s “hands without feedback” problem all over again.

Every major design philosophy of the 2010s has quietly absorbed the “frictionless” ethos. Uber, Tinder, and Amazon all measure success by how little thought or effort stands between desire and fulfillment. You tap once and the world rearranges itself.

But Victor’s warning, when applied here, becomes almost moral: when everything becomes too smooth, we lose the feedback loops that teach us how the world works. Swiping on a screen doesn’t just numb the fingers, it numbs cause and effect. It’s a design culture that erases the material consequences of our actions.

Week 10 – Reading Reflection

Bret Victor’s rant made me rethink what we even mean when we call something “the future.” He argues that touchscreens, gesture controls, and all these “advanced” interfaces are actually making us less connected to our own abilities. Our hands are one of the deepest ways we understand the world. They know tension, pressure, texture. They think with us. But we’ve decided progress means tapping around on cold glass. When I read that, the first thing I thought of was LEGO. There is this unspoken language when you build: the way your fingers already know which brick fits, the tiny resistance before a perfect click. That sound. That feeling. It’s not just play; it is intelligence happening through the body. No screen has ever replicated that.

I’ve tried the digital LEGO builders before, and they always feel wrong. You can assemble something on the screen, sure, but there is no weight, no friction, no small ritual of digging through pieces and recognizing one by touch alone. Same with crocheting. The yarn runs differently through your fingers depending on tension, mood, the hook, your posture. You feel progress. You feel mistakes. Your hands correct before your mind catches up. Victor’s point clicked for me here: creativity is not just in the mind. It is in the wrists, fingertips, joints, and muscle memory. When interfaces ignore the body, they are not futuristic. They are incomplete.

The responses page made it clear he is not saying we need to go backwards. He is saying we should refuse a future that flattens our senses. There are richer, more human possibilities if we let our full selves participate in the interaction. For me, the future I want is textured, clickable, tuggable, threaded, snapped together. A future that feels like LEGO: discovery through touch, play, accident, correction, and joy. Innovation that doesn’t just live on a screen, but lives in your hands.

W10: Instrument

Inspiration

The xylophone has always fascinated me. I loved watching the vibrant melodies come to life as each bar was tapped. This inspired me to create a digital version using everyday materials, giving the classic xylophone a modern, interactive twist.

Concept

The idea was simple yet playful: use aluminum foil as the xylophone buttons. Each strip of foil represents a note, and tapping one triggers a sound. To bring in the concept of tuning (something I deeply appreciate from my experience playing the violin), we incorporated a potentiometer. This allows the user to adjust the pitch of each note in real time, just as a musician tunes their instrument before performing. By combining tactile interaction with the flexibility of pitch control, we aimed to create an instrument that feels both familiar and innovative.

 

Code I’m Most Proud Of

int potVal = analogRead(potPin);
float multiplier = map(potVal, 0, 1023, 60, 180) / 100.0;

if (activeIndex >= 0) {
    int freq = int(baseFreqs[activeIndex] * multiplier);
    freq = constrain(freq, 100, 5000);

    tone(buzzerPin, freq);
    delay(50);
} else {
    noTone(buzzerPin);
}

What makes this snippet special is how it turns a simple analog input into musical expression. By mapping the potentiometer to a frequency multiplier, each foil strip produces a different tone that can be adjusted on the fly. Using constrain() ensures the sounds remain within a safe, audible range. It was rewarding to see how these functions, which we learned in class, could be combined to create a tactile, musical experience.
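The mapping math can be checked outside the Arduino. This is a plain C++ sketch of the same pipeline, with `map()` and `constrain()` reimplemented here since those helpers live in the Arduino core (and with rounding instead of the sketch’s truncation, purely for exactness): the pot sweeps the multiplier from 0.60 to 1.80.

```cpp
#include <cassert>
#include <cmath>

// Reimplementation of Arduino's integer map() for illustration.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Reimplementation of Arduino's constrain() for illustration.
long constrainVal(long x, long lo, long hi) {
    return x < lo ? lo : (x > hi ? hi : x);
}

// Same pipeline as the sketch: pot reading -> multiplier -> clamped frequency.
int scaledFreq(int potVal, int baseFreq) {
    double multiplier = mapRange(potVal, 0, 1023, 60, 180) / 100.0;  // 0.60 .. 1.80
    long f = std::lround(baseFreq * multiplier);
    return (int)constrainVal(f, 100, 5000);
}
```

So a 440 Hz base note ranges from 264 Hz (pot at 0) to 792 Hz (pot at 1023), and a note that would scale past 5000 Hz gets clamped into the audible range.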

Future Improvements

Right now, the instrument plays a sound as long as the foil is touched. In the future, I’d like to add note duration control so that pressing a strip produces a single tone, similar to how a piano note behaves, with a possible fade-out effect when the note ends. This would make the interaction feel more natural and musical.

Another exciting improvement could be a wireless “stick” that triggers the foil strips remotely. This would allow the musician to move freely and perform more expressively, opening up new possibilities for live interaction and playability.
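One way to get the single-note, piano-like behavior described above is edge detection: trigger a note only on the frame where a strip goes from untouched to touched, then let Arduino’s three-argument `tone(pin, freq, duration)` play a fixed-length note. A minimal sketch of the trigger logic, as a hypothetical helper not taken from the project code:

```cpp
#include <cassert>

// Fires once per touch: returns true only on the untouched -> touched
// transition, like a piano key strike, and false while the strip stays held.
struct NoteTrigger {
    bool wasTouched = false;
    bool update(bool touched) {
        bool strike = touched && !wasTouched;
        wasTouched = touched;
        return strike;
    }
};
```

In `loop()`, something like `if (trigger.update(stripTouched)) tone(buzzerPin, freq, 300);` would then play one 300 ms note per touch. A true fade-out would need hardware volume control or a DAC, since `tone()` only switches the pin at a fixed amplitude.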

 

Week 10 — Reading Response

Reading 1

I don’t completely agree with the author’s criticism that modern technologies like touchscreens and digital interfaces make interaction feel “glassy” and disconnected. While it’s true that they remove some tactile feedback, I think their purpose isn’t to replace physical touch but to make life more efficient and accessible. For example, instead of physically going to a post office to send a letter, I can send an email or message in seconds. This doesn’t make the experience meaningless; it just reflects how technology has evolved to save us time and effort. These interfaces have also allowed people with disabilities to communicate, work, and interact in ways that might not have been possible with traditional physical tools. In that sense, “pictures under glass” might actually expand human capability rather than limit it.

However, I understand the author’s point about how important our sense of touch is and that certain interactions lose something when they become purely digital. For example, learning to play a real piano or sculpting clay engages the hands in ways that a touchscreen keyboard or 3D modeling app never could. I think the balance lies in knowing where tactile interaction matters and where digital convenience should take over. For creative or learning experiences, keeping physical feedback is valuable; it builds skill, emotion, and memory. But for communication, organization, and quick access to information, digital tools are the smarter choice. So rather than rejecting “pictures under glass,” I think the future of interaction should combine both worlds, using technology to simplify life without losing the richness of real, physical touch.

Reading 2

After reading Bret Victor’s responses, I actually liked his explanation a lot more because it helped me understand his real point. At first, I thought he wanted us to go “old school” and reject technology completely, like he was against modern progress. That’s why I used the example of sending an email instead of going to the mailbox: I thought he didn’t appreciate how technology saves time and makes life easier. But after reading his clarification, I realized he’s not saying we should stop using touchscreens or digital tools; he’s saying we should build on them and make them better. I especially liked his comparison of the iPad to black-and-white film before color came along; that made so much sense. He wants us to advance technology even further, but in a way that brings back the richness of physical touch and real interaction. I still think that won’t be possible for everything, because the future is definitely digital, but if we can find ways to blend technology with physical sensations, that would be amazing; it would make our interactions more natural, creative, and human.

Week 10 — Piano keys

Reflection on My Schematic

For this project, I decided to use three pushbuttons and a potentiometer because I wanted to keep the design simple but still interactive. My idea was to make the buttons act like keys—each one triggering a different tone on the piezo speaker. I also included the potentiometer to control the volume of the sound, which made the circuit more dynamic and gave me a sense of how analog and digital components can work together.

Designing the schematic was a fun and challenging part of the process. I had to carefully think about where each component should go and how to connect everything so it would be electrically correct. The piezo speaker was, of course, essential for producing sound; it’s what brings the “instrument” to life.

What I enjoyed most was the problem-solving aspect: figuring out how all the parts would communicate through the Arduino. It required patience and logical thinking, but once I understood the flow of current and how each input affects the output, it all made sense. Seeing the schematic come together gave me a real sense of accomplishment because it represented both my planning and my creativity.

IMG_0620

Reflection on the Arduino Build

The Arduino part of this project was honestly very easy and smooth, especially because I had already planned everything out in my schematic. All I had to do was follow my own plan carefully and make sure that every wire and component was connected in the correct place.

One of the first things I did was connect the GND pin on the Arduino to the negative rail on the breadboard. That allowed me to connect all my components that needed ground (the potentiometer, buttons, and piezo speaker) to the same shared ground.

I had to be precise and double-check my connections, especially where the potentiometer linked to the speaker and where each button was wired. To make the process easier and more organized, I actually color-coded my wires. For example, I used red wires for all the connections that went to ground. That made it much simpler to trace the circuit visually and fix small mistakes when I needed to adjust something later.

Overall, the Arduino part went very smoothly because I had a solid plan and followed it carefully. Seeing everything come together and actually work just like I designed it in the schematic felt really rewarding.

//set the pins for the button and buzzer
int firstKeyPin = 2;
int secondKeyPin = 3;
int thirdKeyPin = 4;

int buzzerPin = 10;


void setup() {
  //set the button pins as inputs
  pinMode(firstKeyPin, INPUT_PULLUP);
  pinMode(secondKeyPin, INPUT_PULLUP);
  pinMode(thirdKeyPin, INPUT_PULLUP);

  //set the buzzer pin as an output
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  
  if(digitalRead(firstKeyPin) == LOW){        //if the first key is pressed
    tone(buzzerPin, 262);                     //play the frequency for c
  }
  else if(digitalRead(secondKeyPin) == LOW){  //if the second key is pressed
    tone(buzzerPin, 330);                     //play the frequency for e
  }
  else if(digitalRead(thirdKeyPin) == LOW){   //if the third key is pressed
    tone(buzzerPin, 392);                     //play the frequency for g
  }
  else{
    noTone(buzzerPin);                        //if no key is pressed turn the buzzer off
  }
}

Reflection on the Coding

The coding part of this project was very straightforward. I already had my circuit ready, so I just needed to make sure that everything in my code matched the pins I used in my schematic and breadboard. I started by defining all of my pins clearly one for each of the three buttons and one for the piezo speaker.

I wanted each button to act as a different key, so I chose three different frequencies (in hertz) to create distinct sounds. I adjusted the hertz values to my liking until each key sounded right and had its own tone. This made the instrument feel more realistic and musical.

I also made sure to define all the variables and constants I needed in my code so that everything was organized and easy to change later if I wanted to experiment with new sounds or pin connections. Overall, the coding was simple but satisfying because it brought the entire project to life: the schematic, the Arduino wiring, and the code all worked together perfectly.
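The three frequencies above (262, 330, 392 Hz) are the C major triad, and they don’t have to be tuned by ear: equal temperament gives every MIDI note number n a frequency of 440 × 2^((n−69)/12). A small C++ helper, offered as a sketch of that formula rather than anything from the project code:

```cpp
#include <cassert>
#include <cmath>

// Equal-temperament frequency for a MIDI note number (A4 = note 69 = 440 Hz).
float midiToHz(int note) {
    return 440.0f * std::pow(2.0f, (note - 69) / 12.0f);
}
```

C4 (note 60) comes out to about 261.6 Hz, E4 (note 64) to about 329.6 Hz, and G4 (note 67) to about 392.0 Hz, matching the rounded values used in the sketch.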

Week 10 – Music instrument

Concept:

As we began planning how to present our musical instrument and the sound produced by the buzzer while using both digital and analog sensors, we decided to use a button as our digital sensor and a distance-measuring sensor as our analog sensor. The main concept is that the distance sensor detects how far an object or hand is, and based on that distance, it produces different notes and sounds through the buzzer.  When the button is pressed, the system pauses for 300 milliseconds (0.3 seconds) and temporarily stops reading distance values, effectively muting the instrument. Pressing the button again reactivates the sensor, allowing the instrument to continue playing notes according to the object’s position. This simple toggle system makes it easy to control when the instrument is active, giving users time to experiment with the sound and movement.

Arduino Setup and Demonstration: 

Schematic Diagram:
Setup:
Video:

Highlight of the code:
Full-code on Github

if (sensorActive) {
    int distance = getDistance();
    Serial.println(distance);
    if (1 < distance && distance < 5) {
      tone(BUZZER, NOTE_C4);
    } else if (5 < distance && distance < 10) {
      tone(BUZZER, NOTE_D4);
    } else if (10 < distance && distance < 15) {
      tone(BUZZER, NOTE_E4);
    } else if (15 < distance && distance < 20) {
      tone(BUZZER, NOTE_F4);
    } else if (20 < distance && distance < 25) {
      tone(BUZZER, NOTE_G4);
    } else if (25 < distance && distance < 30) {
      tone(BUZZER, NOTE_A4);
    } else if (30 < distance && distance < 35) {
      tone(BUZZER, NOTE_B4);
    } else {
      noTone(BUZZER);
    }
  }

We would say the highlight of the code is the part where the distance-measuring sensor interacts with the buzzer to produce different musical notes based on how close or far an object is. We were excited to experiment with the distance-measuring sensor and explore how it could be programmed to produce sound through the buzzer. To get a better understanding of integrating notes, we referred to Arduino tutorials.

In the code above, the sensor continuously measures the distance of an object and converts the time it takes for the sound wave to bounce back into centimeters. Depending on the measured distance, the buzzer plays different musical notes, creating a simple melody that changes as the object moves closer or farther away.
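The button’s 300 ms pause doubles as debounce: after a press flips the mute flag, further presses are ignored for 300 ms so one push only registers once. The toggle logic can be sketched on its own; this is a hypothetical plain C++ helper, since the real sketch does this inside loop() with millis():

```cpp
#include <cassert>

// Mute toggle with a 300 ms hold-off after each accepted press.
struct MuteToggle {
    bool sensorActive = true;        // instrument starts active
    unsigned long lastToggleMs = 0;  // time of the last accepted press

    // nowMs: current time in milliseconds; pressed: raw button reading
    void update(unsigned long nowMs, bool pressed) {
        if (pressed && (nowMs - lastToggleMs) >= 300) {
            sensorActive = !sensorActive;
            lastToggleMs = nowMs;
        }
    }
};
```

A press at t = 1000 ms mutes the instrument, a bounce at t = 1100 ms is ignored, and a deliberate second press at t = 1400 ms unmutes it.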

Reflection:

We enjoyed experimenting with the distance-measuring sensor as it taught us how precise sensor readings can be transformed into meaningful outputs like sound, and combining it with the button control helped us manage the instrument’s activation smoothly. For future improvements, we would like to expand the range of notes to create more complex melodies and add LED lights that change color with the pitch to make the instrument more visually engaging. We could also experiment with different sensors, such as touch or motion sensors, to add more layers of interactivity. Finally, refining the accuracy and response speed of the sensor would make the sound transitions smoother and enhance the overall experience.

 

Week 10 – Musical instrument

Group members: Aizhan and Darmen
Concept:

As we began planning how to present our musical instrument and the sound produced by the buzzer while using both digital and analog sensors, we decided to use a button as our digital sensor and a distance-measuring sensor as our analog sensor. The main concept is that the distance sensor detects how far an object or hand is, and based on that distance, it produces different notes and sounds through the buzzer.  When the button is pressed, the system pauses for 300 milliseconds (0.3 seconds) and temporarily stops reading distance values, effectively muting the instrument. Pressing the button again reactivates the sensor, allowing the instrument to continue playing notes according to the object’s position. This simple toggle system makes it easy to control when the instrument is active, giving users time to experiment with the sound and movement.

Arduino Setup and Demonstration: 

Schematic Diagram:

Setup:
Video Demonstration

Coding:

Arduino file on Github

if (sensorActive) {
    int distance = getDistance();
    Serial.println(distance);
    if (1 < distance && distance < 5) {
      tone(BUZZER, NOTE_C4);
    } else if (5 < distance && distance < 10) {
      tone(BUZZER, NOTE_D4);
    } else if (10 < distance && distance < 15) {
      tone(BUZZER, NOTE_E4);
    } else if (15 < distance && distance < 20) {
      tone(BUZZER, NOTE_F4);
    } else if (20 < distance && distance < 25) {
      tone(BUZZER, NOTE_G4);
    } else if (25 < distance && distance < 30) {
      tone(BUZZER, NOTE_A4);
    } else if (30 < distance && distance < 35) {
      tone(BUZZER, NOTE_B4);
    } else {
      noTone(BUZZER);
    }
  }
}

float getDistance() {
  float echoTime;                   //variable to store the time it takes for a ping to bounce off an object
  float calculatedDistance;         //variable to store the distance calculated from the echo time
  //send out an ultrasonic pulse that's 10 microseconds long
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  echoTime = pulseIn(ECHO_PIN, HIGH);      //pulseIn measures how long the pulse takes to bounce back to the sensor
  calculatedDistance = echoTime / 55.2;    //convert the echo time into the object's distance in cm
  return calculatedDistance;
}

We would say the highlight of the code is the part where the distance-measuring sensor interacts with the buzzer to produce different musical notes based on how close or far an object is. We were excited to experiment with the distance-measuring sensor and explore how it could be programmed to produce sound through the buzzer. To get a better understanding of integrating notes, we referred to Arduino tutorials.

In the code above, the sensor continuously measures the distance of an object and converts the time it takes for the sound wave to bounce back into centimeters. Depending on the measured distance, the buzzer plays different musical notes, creating a simple melody that changes as the object moves closer or farther away.
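The conversion in getDistance() can be sanity-checked from first principles: sound travels at roughly 343 m/s, i.e. 0.0343 cm per microsecond, and the echo covers the distance twice, so distance ≈ echoTime × 0.0343 / 2 ≈ echoTime / 58.3. The sketch’s divisor of 55.2 is a slightly different calibration of the same idea. A minimal standalone version of the math:

```cpp
#include <cassert>
#include <cmath>

// Convert a round-trip ultrasonic echo time (microseconds) to centimeters,
// assuming sound travels ~0.0343 cm/us at room temperature.
float echoToCm(float echoTimeUs) {
    return echoTimeUs * 0.0343f / 2.0f;
}
```

An echo of about 583 µs therefore corresponds to an object roughly 10 cm away, which is the scale of the note bands in the code above.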

Reflection:

We enjoyed experimenting with the distance-measuring sensor as it taught us how precise sensor readings can be transformed into meaningful outputs like sound, and combining it with the button control helped us manage the instrument’s activation smoothly. For future improvements, we would like to expand the range of notes to create more complex melodies and add LED lights that change color with the pitch to make the instrument more visually engaging. We could also experiment with different sensors, such as touch or motion sensors, to add more layers of interactivity. Finally, refining the accuracy and response speed of the sensor would make the sound transitions smoother and enhance the overall experience.

Week 10 – reading

A Brief Rant on the Future of Interaction Design – initial article

The part of the reading that I found most interesting was the idea that we should stop limiting human interaction to just the tips of our fingers. It made me question why so many interactive designs and technologies focus almost entirely on finger-based gestures. Why not explore other parts of the body as tools for interaction?

I think part of the reason is that we, as humans, feel we have the most control over our fingers – they’re precise, sensitive, and easy to coordinate. But that doesn’t mean interaction should stop there. How could we design experiences that involve other body parts, ones that are still easy to control, but that go beyond what’s been unimaginatively repeated in current design?

The article emphasises how much of our natural ability to sense and manipulate the world through touch has been lost to “Pictures Under Glass”: flat screens that give us no tactile feedback. That really stood out to me, because it highlights how technology, despite its innovation, can sometimes strip away what makes interaction human.

Overall, the reading made me realise how limited our current designs are compared to the potential of the human body. It challenged me to imagine more creative ways to interact with my projects.

Responses

I appreciate his point that the iPad is revolutionary now, but that if it still acts the same in 20 years it won’t be, because no big enough change will have kept it revolutionary. But what are those next steps? Mind control? Hands and arms working together with holograms? It is difficult to come up with answers or suggestions to the rants he raises, but I also see the finger as a cap on how we interact with daily tech objects.