Week 10 – reading

A Brief Rant on the Future of Interaction Design – initial article

The part of the reading that I found most interesting was the idea that we should stop limiting human interaction to just the tips of our fingers. It made me question why so many interactive designs and technologies focus almost entirely on finger-based gestures. Why not explore other parts of the body as tools for interaction?

I think part of the reason is that we, as humans, feel we have the most control over our fingers – they’re precise, sensitive, and easy to coordinate. But that doesn’t mean interaction should stop there. How could we design experiences that involve other body parts, ones that are still easy to control, but that go beyond what’s been unimaginatively repeated in current design?

The article emphasises how much of our natural ability to sense and manipulate the world through touch has been lost in “Pictures Under Glass”, flat screens that give us no tactile feedback. That really stood out to me, because it highlights how technology, despite its innovation, can sometimes strip away what makes interaction human.

Overall, the reading made me realise how limited our current designs are compared to the potential of the human body. It challenged me to imagine more creative ways to interact with my projects.

Responses

I appreciate his point that the iPad is revolutionary now, but that if it still works the same way in 20 years, it won’t be; staying revolutionary requires a big enough change. But what are those next steps? Mind control? Hands and arms together with holograms? It is difficult to come up with answers or suggestions to the rants that were mentioned, but I also see the finger as a cap on how we interact with daily tech objects.

Shahram Chaudhry – Week 10 – Reading Response

Reading A Brief Rant on the Future of Interaction Design honestly made me think about how much I take my hands for granted. The author talks about how our hands are designed to feel and manipulate things, and it really clicked for me when he compared using a touchscreen to “pictures under glass.” That idea stuck with me because it perfectly captures what phones and tablets feel like: flat, smooth, and completely disconnected from the sense of touch. I never thought about it before, but it’s true: we don’t really “feel” anything when we use them. It’s weird to realize that the same hands that can tie shoelaces or shape clay are now mostly used to swipe on a flat screen.

The part that resonated most with me was his point that technology should adapt to us, not the other way around, especially when he says that technology can change but human nature cannot change as much. I see it all the time: people getting frustrated because a phone gesture doesn’t work or because an app “expects” you to use it a certain way. It’s like we’re the ones bending to fit the machine, instead of the machine fitting how we naturally act. Also, with older objects like books or physical instruments, there’s something really satisfying about physically turning a page or pressing real keys. You just feel connected to what you’re doing. With touchscreens, everything feels the same no matter what you’re interacting with.

One part I didn’t completely agree with was how strongly he dismissed “Pictures Under Glass.” I get his point, but I think those devices have opened up creativity in new ways, like drawing on an iPad or making music digitally. It’s not tactile in the same sense, but it’s still expressive.

The follow-up also made me think about how disconnected technology can make us: sitting all day, barely moving, everything done with a single finger. I guess I am a little scared of a future where we become immobile because there’s too much automation and we don’t need to do much anymore. We shouldn’t lose touch entirely with the world around us (pun intended).

Week 10 – Shahram Chaudhry – Musical Instrument

 

Concept

For this week’s assignment, Khatira and I made a small musical instrument using an Arduino. We decided to create a pressure-sensitive drum pad that lets you play different drum sounds depending on how hard you press it.

The main part of the project is a force-sensitive resistor (FSR) that acts as the drum pad. When you press on it, the Arduino reads how much pressure you applied and plays a sound through a small buzzer. The harder you hit it, the longer the sound lasts, kind of like playing a real drum.

We also added a button that lets you switch between three drum sounds: a kick, a snare, and a hi-hat. So pressing the pad feels interactive, and you can change the type of drum as you play. It’s a really simple setup, but it was fun to experiment with.
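The pad really boils down to two small mappings: press strength to tone length, and button presses cycling kick → snare → hi-hat. A minimal off-board sketch of that logic in plain C++ (the function names and the exact noise threshold are our illustrative assumptions, not part of the original sketch):

```cpp
#include <cassert>

// Arduino's integer map(), re-implemented so the logic runs off-board.
long arduinoMap(long x, long inMin, long inMax, long outMin, long outMax) {
  return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Harder press (higher ADC reading, 0–1023) means a longer tone, 10–200 ms.
int toneDuration(int pressure) {
  return arduinoMap(pressure, 20, 1023, 10, 200);
}

// Each button press advances kick (0) -> snare (1) -> hi-hat (2) -> kick ...
int nextDrum(int currentDrum) {
  return (currentDrum + 1) % 3;
}
```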

 

Schematic

Video Demo

 

const int FSR_PIN = A0;     // force-sensitive resistor (the drum pad)
const int BUTTON_PIN = 2;   // drum-select button
const int PIEZO_PIN = 8;    // piezo buzzer

// Drum sounds 
int kickDrum = 80;             // low drum
int snareDrum = 200;           // mid drum
int hiHat = 1000;              // high drum

int currentDrum = 0;         
int lastButtonState = HIGH;    

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);  // button reads LOW while pressed
  pinMode(PIEZO_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int pressure = analogRead(FSR_PIN);
  if (pressure > 20) {  // ignore readings below a light touch
    int duration = map(pressure, 20, 1023, 10, 200);  // harder hit = longer tone
    // Play the drum sound
    if (currentDrum == 0) {
      tone(PIEZO_PIN, kickDrum, duration);
    } else if (currentDrum == 1) {
      tone(PIEZO_PIN, snareDrum, duration);
    } else {
      tone(PIEZO_PIN, hiHat, duration);
    }
    delay(50);  
  }
  int buttonState = digitalRead(BUTTON_PIN);
  //if button was just pressed, we need to change drum sound
  if (buttonState == LOW && lastButtonState == HIGH) {
    currentDrum = currentDrum + 1;
    if (currentDrum > 2) {
      currentDrum = 0;
    }
    delay(200); 
  }
  lastButtonState = buttonState;  // Store button state
}

Future Improvements

For future improvements, we’d like to add a potentiometer to control the sound more precisely, allowing the player to adjust tone or volume in real time while drumming. We could also include LEDs that light up based on which drum sound is active and how hard the pad is hit. These additions would make the drum pad feel more dynamic,  and visually engaging.

Week 10: Musical Instrument

Concept

For this week’s assignment, we had to make a musical instrument involving at least one digital sensor and one analog sensor. Aditi and I decided to create a simple piano-like instrument with lights, whose pitch can be controlled by a potentiometer. There are 4 buttons (switches) that act as the piano “keys” and play different sounds, while the potentiometer is mapped to three levels so the keys produce high-, middle-, or low-pitched sounds.
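The pot-to-pitch mapping can be checked off-board as a tiny pure function (plain C++; the 300 and 550 thresholds come from the sketch further down, while the function name is just for illustration):

```cpp
#include <cassert>

// Octave selection from the potentiometer reading (0–1023):
// 0 = low (C4 octave), 1 = middle (C5 octave), 2 = high (C6 octave).
int octaveBand(int potValue) {
  if (potValue <= 300) return 0;
  if (potValue <= 550) return 1;
  return 2;
}
```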

Materials

  • Analog sensor: 10K Trimpot
  • Digital Switch: Red, Blue, Yellow, and Green tactile buttons
  • Output: Piezo Buzzer to produce the sound and LEDs for light output

Schematic

Video Documentation

The Code

// Pins: buttons and their matching LEDs (yellow, blue, green, red)
const int buttonPins[4] = {8, 9, 10, 11};
const int lightPins[4]  = {7, 6, 5, 4};
#define BUZZER 12
const int potPin = A0;

int melody[]  = {262, 294, 330, 349, 392, 440, 494, 523};       // C4–C5
int melody2[] = {523, 587, 659, 698, 784, 880, 988, 1047};      // C5–C6
int melody3[] = {1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093}; // C6–C7

void setup() {
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT);   // external pull-downs: HIGH when pressed
    pinMode(lightPins[i], OUTPUT);
  }
  pinMode(BUZZER, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int potValue = analogRead(potPin);

  // First pressed button wins, as in an if/else-if chain
  int pressed = -1;
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {
      pressed = i;
      break;
    }
  }

  if (pressed >= 0) {
    // The potentiometer selects the octave; button i plays note i+1 of the scale
    if (potValue <= 300) {
      tone(BUZZER, melody[pressed + 1]);
    } else if (potValue <= 550) {
      tone(BUZZER, melody2[pressed + 1]);
    } else {
      tone(BUZZER, melody3[pressed + 1]);
    }
    digitalWrite(lightPins[pressed], HIGH);

  } else {
    // No key held: silence the buzzer and turn all lights off
    noTone(BUZZER);
    for (int i = 0; i < 4; i++) {
      digitalWrite(lightPins[i], LOW);
    }
  }
}

Reflection & Future Improvements

We had fun making this project; however, it was also challenging to make all of the connections. Initially, we planned to have at least 6-7 switches, but the setup became crowded with just 4. Aditi drew the schematic diagram on paper before we began to build the connections physically, and this made the process much more manageable. We definitely could not have done it without having the diagram in front of us first. Sometimes, the setup for a particular switch would not work and we had trouble figuring out whether the issue was in the code or the wiring. In future versions, we would love to have more keys, as well as a tiny screen that displays the letter for the current note being played. We would also want to research more “melodic” or pleasant-sounding notes to add to the code.

Week 10 – Reading Reflection

My initial impression of the “A Brief Rant on the Future of Interaction Design” reading was that it was formatted in a very engaging way. As someone with ADHD, I tend to read long paragraphs a lot more slowly than individual sentences structured this way, and this made me trust the credibility of the author’s perspective on interaction design.

The way the author transitions from explaining the importance of touch to the complete lack of it in our touchscreen technology was really well done, and I couldn’t agree more. Using touchscreens is really un-stimulating, and anyone who has tried texting without haptics will know it feels incredibly unresponsive, but that also seems to be the future we’re headed towards. The images Bret Victor chooses to accompany his text are hilarious too; there really aren’t many things we naturally need our hands to swipe on other than manmade touchscreens. Victor’s explanation of how humans naturally transition between grip styles is pretty interesting to me too.

This reading gave me a lot to think about. One of the things that came to mind was the experience of using a controller versus mouse & keyboard when playing video games. For those unaware, let me explain the main differences between the two.

When you use a controller (or gamepad as some call it), you’re primarily using your thumbs for everything from the analog sticks to the surface-buttons. Using just your thumb to control your camera controls can be quite difficult if precise and delicate movements are required.

When you use a keyboard and mouse, your arm and wrist are capable of microadjustments while holding the mouse to input much more precise and delicate movements; not to mention your keyboard hand is using way more than just your thumb to control what’s happening on screen.

So going by what I’ve said, many would probably wonder why anyone would ever use a controller, but that’s the thing: I haven’t explained the one thing that makes this remotely a difficult decision.
Controllers give an extra layer of immersion, both by letting the user relax their arms and lean back, and by providing haptic feedback and vibrations in response to what’s happening in-game. Imagine you’re playing a game where explosions are involved: the controller would vibrate violently in your hands as explosions affect your character. This is why you turn to keyboard and mouse for precision but controllers for immersion.

Now onto Victor’s follow-up article: I thought his response to voice input was pretty amusing, “I have a hard time imagining Monet saying to his canvas, ‘Give me some water lilies. Make ’em impressionistic.’” It’s amusing because that’s literally how our modern generation approaches things they don’t know how to do.

One other thing that really caught my attention in the follow-up was this quote, “The density of nerve endings in our fingertips is enormous. Their discrimination is almost as good as that of our eyes. If we don’t use our fingers, if in childhood and youth we become “finger-blind”, this rich network of nerves is impoverished.” I wonder if late-gen Z and gen alpha have any indicators of finger blindness as so many of us grew up with touchscreen devices as our main source of entertainment.

 

Reading Reflection – Week 10

A Brief Rant on the Future of Interaction Design and A follow-up article

I have never really thought much about it, but if someone were to put me on the spot and ask me which is more important, the visual or the tactile senses, I would probably choose the visual. This rant by Bret Victor has successfully changed my mind, though. I think what really convinced me was the example he gave on tying shoelaces. Indeed, I could easily tie my laces with my eyes closed, but if I numbed my hands or fingertips, I wouldn’t be able to do it anymore. This enlightenment has made me realize that I’ve never even considered the tactile aspect of any of my works, even though the first part of the course didn’t have room for it. I’m excited to take this into account for my upcoming projects since they involve physical human interaction.

I really liked the way Victor described today’s technology of iPads and phones as Pictures Under Glass. When phrased in this way, it actually makes the technology seem so boring and unintuitive. Like what do you mean our “revolutionary” technology doesn’t use the human tactile sense at all, probably one of our greatest assets? It actually gets more absurd the more I think about it.

When I moved on to the follow-up article, I found some of the comments and his responses to them quite funny. But amongst all of this, the line that really stood out to me was, “Channeling all interaction through a single finger is like restricting all literature to Dr Seuss’s vocabulary.” I believe this line perfectly sums up the point Bret was trying to make about iPad technology only using your fingers. Where real literature’s scope goes to Shakespeare and beyond, leaving a human being with words only from a Dr. Seuss book is the dumbest thing, anyone can see that. So why can’t people see that limiting one’s fingertips (the body part with the highest density of nerve endings) to a flat, smooth screen is dumb too? Overall, this has been one of my favorite readings so far, very eye-opening.

Week 10 – Reading Reflection

Bret Victor’s “A Brief Rant on the Future of Interaction Design” reads less like a complaint and more like a plea for imagination. His frustration with “pictures under glass” isn’t really about touchscreens so much as about how easily we mistake convenience for progress. Victor’s argument that our hands are not just pointers but thinking tools is relatable. I’d never thought of a touchscreen as “numb,” but he’s right: sliding a finger across glass is nothing like twisting, folding, or shaping an object. He’s asking designers to respect the body’s intelligence, the millions of years of evolution that made us good at feeling, gripping, and sensing the world. I agree with him almost entirely. What unsettles me is how quickly we’ve accepted numbness as normal. We’ve trained ourselves to think that friction is a flaw when it comes to UX.

The Responses piece adds an interesting layer of humility. Victor doesn’t pretend to have the solution. He admits he was ranting, not designing one. Still, his answers to critics are telling. He pushes back against gimmicks like voice control and “waving your hands in the air,” arguing that real interaction should involve the whole body and the tactile richness of touch. I found myself nodding at his line about computers needing to adapt to our bodies, not the other way around. That’s such a simple reversal, yet it cuts right through decades of design laziness. When he compares touchscreen culture to “restricting all literature to Dr. Seuss’s vocabulary,” it’s funny, but it also nails the deeper loss: we’re settling for tools built for children, not adults.

If there’s one thing I’d question, it’s Victor’s nostalgia for physicality. I agree that touch and movement matter, but I also think the human imagination adapts. The digital world is training new forms of dexterity which are more mental than physical. Coding, multitasking, navigating layered interfaces – these, too, are forms of “touch,” just less visible. Maybe the future of interaction design isn’t about replacing glass with holographic clay, but about balancing sensory depth with cognitive range. Victor’s rant reminds me that design should evolve with both the hand and the mind.

Week 10 – A DIY Musical Instrument (The Ultrasonic Air Piano)

Concept:

The idea behind my instrument was to create a hands-free musical device that transforms invisible gestures into sound. My goal was to design something playful yet technical; a device that reacts to both motion and touch. By combining distance and pressure, the instrument invites intuitive exploration: the closer your hand gets, the higher the note, while pressing the sensor triggers the sound. It merges tactile interaction with sound, making music a physical experience.

 

Method & Materials:

This project was my first time working with two types of sensors on the Arduino Uno:

  • Analog Sensor: Ultrasonic sensor (HC-SR04) to measure distance.
  • Digital Switch: Force Sensitive Resistor (FSR) to detect pressure.
  • Output: Piezo buzzer to produce sound.

I connected the ultrasonic sensor to pins 10 (trig) and 11 (echo), the FSR to analog pin A0, and the buzzer to pin 12.
Each note from the C major scale (C–D–E–F–G–A–B) was assigned to a specific distance range, coded in an array:

int notes[7] = {261, 294, 329, 349, 392, 440, 494};

The system reads distance in real time:

  • When the FSR is pressed and your hand is between 0–50 cm of the sensor, the buzzer plays a tone corresponding to that range.
  • If no pressure is detected or the hand moves out of range, the sound stops.
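Putting the two conditions together, the note-selection logic can be sketched as a pure function (plain C++, outside the Arduino runtime; the FSR threshold of 100 is an illustrative assumption, and `pickNote` is our name, not from the actual sketch):

```cpp
#include <cassert>

// Arduino's integer map(), re-implemented so the logic runs off-board.
long arduinoMap(long x, long inMin, long inMax, long outMin, long outMax) {
  return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

int notes[7] = {261, 294, 329, 349, 392, 440, 494}; // C D E F G A B

// Returns the buzzer frequency for a hand distance, or 0 for silence.
int pickNote(int fsrReading, long distanceCm) {
  if (fsrReading > 100 && distanceCm > 0 && distanceCm <= 50) {
    int index = arduinoMap(distanceCm, 0, 50, 0, 6); // 0–50 cm -> note 0–6
    return notes[index];
  }
  return 0; // FSR released or hand out of range: noTone()
}
```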

Process:

At first, it took time to understand how analog vs. digital inputs work and how to read them simultaneously. I researched how to use pulseIn() for the ultrasonic sensor and experimented with mapping values using the map() function.
To visualize the notes, I placed colored paper strips at different distances (each representing one note of the scale).

Throughout the process, I learned:

  • The importance of wiring correctly (e.g., ensuring the FSR forms a voltage divider).
  • How combining two sensors can create more expressive interaction.

Schematic:

Code:

This sketch combines input from two sensors, an ultrasonic sensor and a force-sensitive resistor (FSR), to generate musical notes through a piezo buzzer. The ultrasonic sensor continuously measures the distance of my hand, while the FSR detects when pressure is applied. When both conditions are met (hand within 50 cm and FSR pressed), the code maps the distance value to a specific note in the C major scale (C, D, E, F, G, A, B). Each distance range corresponds to a different pitch, allowing me to “play” melodies in the air.

The code I’m most proud of is the single line that transforms the project from a simple sensor experiment into a musical instrument. It defines the C major scale, turning numerical frequency values into recognizable notes. I love that such a short line of code gives the device its expressive character; it bridges logic and creativity, translating distance data into melody. It’s the heart of the project, where sound and interaction truly come together.

// --- Define musical notes (C major scale) ---
int notes[7] = {261, 294, 329, 349, 392, 440, 494}; // C D E F G A B

Result:

The final prototype acts like an invisible piano: you play by waving your hand in front of the sensor and pressing lightly on the FSR. Each distance triggers a different musical note. The colored papers made it easier to perform intentionally and visually mark pitch changes.

In the demonstration video, the tones respond smoothly to my gestures, transforming simple components into an expressive interface.

Challenges:

One of the main challenges I faced was understanding how each pin on the ultrasonic sensor worked. At first, I didn’t realize that every pin had a specific purpose, like trig for sending signals and echo for receiving them, so it took me a while to fully grasp how data was actually being measured. I also struggled with wiring the circuit, often making small mistakes that stopped the whole setup from working. Drawing out the schematic was another time-consuming part since there were many components to connect and label correctly. Finally, the coding process was challenging because it was my first time using several of these elements, and I had to learn through trial and error how to make the sensors and buzzer communicate smoothly.

Inspiration + Tools that helped me:

https://projecthub.arduino.cc/theriveroars/simple-hand-controlled-instrument-372bfc

https://learn.adafruit.com/force-sensitive-resistor-fsr/using-an-fsr

Reflection:

This project taught me how code, sensors, and sound can merge into interactive art. The challenge was balancing sensitivity: sometimes the ultrasonic readings fluctuated, and it took some fine tuning. But once it worked, it felt rewarding to hear the instrument. It also made me realize how music can be built from logic, how creative coding allows emotional expression through electronics. I see this as the beginning of exploring computational instruments that combine art and technology.

Week 9 – Two-Degree Safety

My Concept

I decided to build a “two-degree safety guardrail.” The logic is based on two separate actions:

  1. Idle State: A red LED is ON (digital HIGH).
  2. “Armed” State: A button is pressed. This turns the red LED OFF (digital LOW).
  3. “Active” State: While the button is held, a FSR (force-sensing resistor) is pressed, which controls the brightness of a green LED (my analog output).

Challenge

My challenge was getting the red LED to be ON by default and turn OFF only when the button was pressed. I tried to build a hardware-only pull-up or bypass circuit for this but struggled to get it working reliably on the breadboard.

So, I shifted that logic into Arduino code adapted from a template in the IDE.

// constants won't change. They're used here to set pin numbers:
const int buttonPin = 2;  // the number of the pushbutton pin
const int ledPin = 13;    // the number of the red LED pin

// variables will change:
int buttonState = 0;  // variable for reading the pushbutton status

void setup() {
  // initialize the LED pin as an output:
  pinMode(ledPin, OUTPUT);
  // initialize the pushbutton pin as an input; the button is wired with a
  // pull-up, so it reads HIGH when idle and LOW while pressed:
  pinMode(buttonPin, INPUT);
}

void loop() {
  // read the state of the pushbutton:
  buttonState = digitalRead(buttonPin);

  // pressed (LOW) turns the red LED off; idle (HIGH) keeps it on:
  if (buttonState == LOW) {
    digitalWrite(ledPin, LOW);
  } else {
    digitalWrite(ledPin, HIGH);
  }
}

The demo was shot with the Arduino implementation.

Schematic

However, I later figured out the pull-up logic and was able to implement a hardware-only solution. This schematic reflects the updated circuit.

Week 10 – Reading Response (A Brief Rant on the Future of Interaction Design)

One passage that really stayed with me from Bret Victor’s A Brief Rant on the Future of Interaction Design is his statement that screens are “pictures under glass.” That phrase hit me because it’s so ordinary yet so revealing; every day I touch my phone dozens of times, yet I never actually feel anything back. Victor’s argument that we’ve limited human interaction to tapping on cold glass made me realize how passive our so-called “interactive” technologies have become. I started thinking about how my creativity, whether sketching or coding, always feels richer when my hands are physically involved; pressing, folding, shaping. It made me question: why did we let convenience replace tactility? What would technology look like if it honored the intelligence of our hands instead of reducing them to cursors?

In the Responses section, I was fascinated by how defensive many readers became, as if Victor’s critique was anti-progress. But what I sensed in his tone was care, not nostalgia; a desire to expand our sense of what interaction can mean. This really reminded me of Refik Anadol’s Machine Hallucinations, a piece I’m analyzing for another course, where data transforms into movement, color, and emotion. Anadol’s work feels like the future Victor imagines: one where technology engages the body and senses, not just the eyes.

These readings challenged my old assumption that the “best” design is the smoothest and most frictionless. Victor helped me see friction as meaningful; it’s how we feel our way through the world. I now think of design less as creating perfect efficiency and more as crafting moments of connection between body, mind, and machine. The essay left me wondering whether the future of interaction design depends not on faster touchscreens, but on rediscovering touch itself; real, textured, imperfect, human touch.

Ultimately, I completely agree with Victor’s message. His critique felt refreshing, almost like a wake-up call to slow down and rethink what “innovation” actually means. I liked how he exposed the emptiness behind shiny new interfaces and instead celebrated the physical, human side of design. Even though his tone was mainly critical, I didn’t find it negative; I found it hopeful. It made me appreciate the kind of design that makes people feel connected, not just technologically advanced.