Assignment 10 – Reading Response (A Brief Rant on the Future of Interaction Design, and its Follow-up)

Reading this rant and its follow-up made me think about how much and how little has changed since 2011. At the time, touchscreens were really starting to take over: the iPhone was still shiny and new and setting the trend, and the iPad had just started making its way into homes, schools, and workplaces. It was exciting. And yet, looking back from 2025, it’s almost prophetic how spot-on this rant was about where things might go. We did, in many ways, double down on the flat, glassy rectangles.

What really struck me was how much of this still applies today; in fact, it’s even more relevant now. Since 2011, we’ve added smart speakers, VR headsets, and now AI tools like ChatGPT and image generators like DALL·E. The author says Monet couldn’t have painted by saying “Give me some water lilies,” but with generative AI that’s suddenly a real thing, and it keeps resurfacing in debates about human participation in creative work; it’s both exciting and a little unsettling. It made me wonder: are we making creativity more accessible, or are we distancing ourselves from the hands-on, exploratory process that gives creative work its depth and meaning?

The rant also touched on something deeper: the idea that our tools shape how we think, learn, and grow. When we limit interaction to just tapping a screen or giving voice commands, we risk becoming passive users instead of active thinkers. Especially now, when so much of daily life is mediated by screens and digital assistants, it’s easy to forget how valuable physical engagement really is. In the end, this wasn’t just a rant about interface design; it was a call to imagine more ambitious, embodied futures for how we use technology. It made me reflect on my own habits and what kind of tech I want to see (and use) going forward.

Week 10: Musical Instrument

Link to video demo: https://drive.google.com/file/d/1KGj_M7xq6IdsS2Qwq-zbjjspCPPgcaj4/view?usp=sharing

For this assignment, I decided to build a digital trumpet using an Arduino Uno, three push buttons, a potentiometer, and a speaker. My goal was to simulate the behavior of a real trumpet in a fun and creative way, even though I knew the sound would be more electronic than acoustic. It was a great opportunity for me to explore how hardware and code can come together to create music, and I ended up learning a lot about sound generation and analog input in the process.

The concept was simple: each of the three buttons acts like a trumpet valve, and each one triggers a different note — specifically G4, A4, and B4. These are represented in the code as fixed frequencies (392 Hz, 440 Hz, and 494 Hz). When I press one of the buttons, the Arduino sends a signal to the speaker to play the corresponding note. The potentiometer is connected to analog pin A0 and is used to control the volume. This was a really cool addition because it gave the instrument a bit of expressive control — just like how a real musician might vary their breath to change the loudness of a note.

To make the sound a bit more interesting and less robotic, I added a little “vibrato” effect by randomly adjusting the pitch slightly while the note is playing. This gives the tone a subtle wobble that sounds more natural — kind of like the way a real trumpet player might shape a note with their lips. It’s still a square wave, and it’s definitely digital-sounding, but it gives it more character than just playing a flat, unchanging frequency.

If I were to continue developing this project, I have a few ideas for improvements. One would be to add more buttons or allow combinations of the three to create more notes — like a real trumpet with multiple valve positions. I’d also love to add some kind of envelope shaping, so the notes could have a smoother fade-in or fade-out instead of sounding flat and abrupt. It might also be fun to hook the project up to MIDI so it could control a software synthesizer and produce higher quality trumpet sounds. And for an extra visual touch, I could add LEDs that light up in sync with the music.

CODE:

const int potPin = A0;          // Potentiometer for volume
const int speakerPin = 9;       // Speaker on PWM pin 9 (pin 8 isn't a PWM pin on the Uno, so analogWrite couldn't scale the volume there)
const int buttonPins[] = {2, 3, 4}; // 3 buttons = 3 different notes

// Trumpet-like frequencies (roughly G4, A4, B4)
const int trumpetNotes[] = {392, 440, 494}; 

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(buttonPins[i], INPUT); // External pull-down resistors
  }
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int volume = analogRead(potPin) / 4; // scale 0–1023 down to 0–255 for analogWrite

  for (int i = 0; i < 3; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {
      playTrumpetNote(trumpetNotes[i], volume);
    }
  }

  delay(10); 
}

void playTrumpetNote(int baseFreq, int volume) {
  unsigned long duration = 10000; // total note length per call, in microseconds (10 ms)
  unsigned long startTime = micros();

  while (micros() - startTime < duration) {
    // Slight pitch wobble
    int vibrato = random(-3, 4); // offset in [-3, 3] Hz (random()'s upper bound is exclusive)
    int currentFreq = baseFreq + vibrato;
    int halfPeriod = 1000000 / currentFreq / 2;

    analogWrite(speakerPin, volume);
    delayMicroseconds(halfPeriod);
    analogWrite(speakerPin, 0);
    delayMicroseconds(halfPeriod);
  }
}
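If I were to prototype the envelope-shaping idea mentioned above, a minimal, untested sketch might look like the following. It reuses the same bit-banged square wave and pins as my code; playEnvelopedNote is a hypothetical helper, and the quarter-length attack and release are arbitrary choices.

// Hypothetical sketch of the envelope idea: ramp the volume up at the start
// of the note and back down at the end, instead of holding a flat level.
void playEnvelopedNote(int baseFreq, int maxVolume) {
  const unsigned long noteLength = 200000UL; // total note length: 200 ms, in microseconds
  unsigned long startTime = micros();

  while (micros() - startTime < noteLength) {
    unsigned long elapsed = micros() - startTime;

    // Attack over the first quarter, release over the last quarter
    int volume;
    if (elapsed < noteLength / 4) {
      volume = map(elapsed, 0, noteLength / 4, 0, maxVolume);
    } else if (elapsed > 3UL * noteLength / 4) {
      volume = map(elapsed, 3UL * noteLength / 4, noteLength, maxVolume, 0);
    } else {
      volume = maxVolume;
    }

    int halfPeriod = 1000000 / baseFreq / 2;
    analogWrite(speakerPin, volume);  // duty-cycle "volume" on the PWM pin
    delayMicroseconds(halfPeriod);
    analogWrite(speakerPin, 0);
    delayMicroseconds(halfPeriod);
  }
}

Calling playEnvelopedNote(trumpetNotes[i], volume) in loop() instead of playTrumpetNote would give each press a soft rise and fall rather than a flat, abrupt tone.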


Week 10: Musical Instrument

For this assignment, Izza and I worked together to come up with the idea of using the push buttons from our kit as keys for a piano. We used cardboard to create the piano keys and poked the push buttons through the bottom layer. We then used copper tape to cover the push buttons’ pins and give the alligator clips something to attach to, in order to connect the buttons to wires that ran into the breadboard. For our analog sensor, we used a potentiometer to control the length of the sound made once a key was pressed. The result can be seen here:

https://drive.google.com/file/d/187WqUyYvRZ6KFFVMn0NtSO0ycqEzKyXq/view?usp=sharing


Our circuit diagram can also be seen here:

We’re really proud of the fact that we were able to complete the circuit using a variety of tools, like the copper tape and alligator clips, and ended up with a creative, working result. We are also really proud of the code, which was inspired by the toneMelody exercise we did in class for the pitches. The code can be seen below:

#include "pitches.h"

const int speakerPin = 8;
const int potPin = A0;

const int buttonPins[] = {2, 3, 4, 5};
const int numButtons = 4;

// Define the notes for each button
int notes[] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4};

void setup() {
  for (int i = 0; i < numButtons; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);  // Internal pull-up resistor
  }
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int potValue = analogRead(potPin);  // 0–1023
  int noteDuration = map(potValue, 0, 1023, 100, 1000);  // Adjusts the lengths of the notes

  for (int i = 0; i < numButtons; i++) {
    if (digitalRead(buttonPins[i]) == LOW) {  // Button is pressed
      tone(speakerPin, notes[i], noteDuration);
      delay(noteDuration * 1.3);  // hold ~30% past the note length to leave a gap between tones
      noTone(speakerPin);
    }
  }
}

We had some difficulty getting the buttons to connect with the alligator clips using the copper tape, since the pins kept poking through the tape and the connection was fragile whenever it was moved. Even with double reinforcement, the pins would still stick through. If we were to recreate it, we would look for a thicker alternative. We also encountered an unknown issue with “ghost keys,” where sounds would play even when no key was pressed; this could be due to the copper tape issue as well.
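If the ghost keys were electrical noise rather than the tape, one thing we could try next time is a small software debounce: only treat a key as pressed if it still reads LOW after a short settle delay. A rough, untested sketch with a hypothetical isPressed() helper, using our same INPUT_PULLUP wiring:

// Hypothetical debounce helper: accept a press only if the pin still reads
// LOW (pressed, with INPUT_PULLUP) after a short settle delay.
bool isPressed(int pin) {
  if (digitalRead(pin) == LOW) {
    delay(5);                        // wait out contact bounce / noise spikes
    return digitalRead(pin) == LOW;  // confirm it is a real press
  }
  return false;
}

Inside loop(), if (isPressed(buttonPins[i])) would replace the raw digitalRead() check.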

Overall though, we are proud of the fact that the piano keys worked when pressed and the potentiometer properly adjusted the length of the notes as seen in the video.

Week 10 – Reading Response

Reading A Brief Rant on the Future of Interaction Design felt less like a critique of current technology and more like a reminder of how disconnected we’ve become from our own bodies when interacting with digital tools. What stood out to me wasn’t just the fixation on touch or screens, but the larger idea that we’ve built a digital world that doesn’t physically involve us much at all. We sit, swipe, and speak to our devices, but rarely do we engage with them in a way that feels natural or satisfying. That idea made me reflect on how strange it is that we’ve accepted this passive interaction style as normal, even though it barely scratches the surface of what our senses and motor skills are capable of. The rant made me question whether convenience has quietly replaced depth in our relationship with technology.

What also struck me was the underlying urgency — not just to change what we build, but to change how we think about building. It challenged the assumption that progress is purely about making things smaller, faster, or more responsive. Instead, it asked: what if we measured progress by how much it involves us — our movement, our perception, our ability to explore ideas physically, not just conceptually? It reminded me that interaction design isn’t only about the interface; it’s about the experience and how deeply it aligns with our human nature. This reading didn’t just shift my thinking about interfaces — it made me realize that future design needs to be less about controlling machines and more about collaborating with them through a fuller range of expression. That’s a future I’d like to be part of.

Week 10: Reading Assignments

Reading 1: A Brief Rant on the Future of Interaction Design

This article challenged me not just to think about technology differently, but to think about my own body in a way I hadn’t before. I’ve never really considered how much my hands do until I read this, and how trained they are to respond to tactile stimuli. The jar example had never even occurred to me until I read this piece. The idea that we’ve somehow accepted a future where our main way of interacting is through a single flat surface feels, honestly, a little absurd now. Victor’s comparison of tying shoes with numb fingers to the uncomfortable, clumsy feeling of using a touchscreen was also very interesting. It’s like we’ve trained ourselves to tolerate the flat, lifeless interaction because it’s become such an integral part of our daily lives. That realization made me question how many other “innovations” we’ve accepted without thinking critically about what they’re replacing.

Victor’s point that “technology doesn’t just happen” was also really impactful. The future isn’t just something that will occur with no warning, but something we have a say in and control over. We are not, and should not be, just passive consumers of technology; we can and should demand more human-centered, embodied interaction. The piece didn’t just critique existing technology, it kind of made me mourn the tactile feelings being lost in this rush for sleek minimalism.

Reading 2: Follow-Up

I like that the author did not try to fake any sweetness and was extremely straight up in his replies. Still, he reframes the rant not as a complaint, but as a call to arms for future innovation. Rather than proposing a specific fix, he emphasizes the importance of recognizing what’s missing in current interaction design, which, as mentioned before, is physical, tactile, and dynamic engagement with technology. His point isn’t that iPads or voice commands are inherently bad, but that they represent a limited vision if, for lack of tactility, they become the final stop in interface development. Through analogies to historical tech like early cameras and color film, Victor highlights how true change begins with noticing a gap and daring to explore it.


Week 10: Musical Instrument Group Assignment [Izza and Hoor]

For this assignment, Hoor and I worked together to come up with the idea of using the push buttons from our kit as keys for a piano. We used cardboard to create the piano keys and poked the push buttons through the bottom layer. We then used copper tape to cover the push buttons’ pins and give the alligator clips something to attach to, in order to connect the buttons to wires that ran into the breadboard. For our analog sensor, we used a potentiometer to control the length of the sound made once a key was pressed. The result can be seen here:

https://drive.google.com/file/d/187WqUyYvRZ6KFFVMn0NtSO0ycqEzKyXq/view?usp=sharing

Our circuit diagram can also be seen here:

We’re really proud of the fact that we were able to complete the circuit using a variety of tools, like the copper tape and alligator clips, and ended up with a creative, working result. We are also really proud of the code, which was inspired by the toneMelody exercise we did in class for the pitches. The code can be seen below:

#include "pitches.h"

const int speakerPin = 8;
const int potPin = A0;

const int buttonPins[] = {2, 3, 4, 5};
const int numButtons = 4;

// Define the notes for each button
int notes[] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4};

void setup() {
  for (int i = 0; i < numButtons; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);  // Internal pull-up resistor
  }
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int potValue = analogRead(potPin);  // 0–1023
  int noteDuration = map(potValue, 0, 1023, 100, 1000);  // Adjusts the lengths of the notes

  for (int i = 0; i < numButtons; i++) {
    if (digitalRead(buttonPins[i]) == LOW) {  // Button is pressed
      tone(speakerPin, notes[i], noteDuration);
      delay(noteDuration * 1.3);  // hold ~30% past the note length to leave a gap between tones
      noTone(speakerPin);
    }
  }
}

We had some difficulty getting the buttons to connect with the alligator clips using the copper tape, since the pins kept poking through the tape and the connection was fragile whenever it was moved. Even with double reinforcement, the pins would still stick through. If we were to recreate it, we would look for a thicker alternative. We also encountered an unknown issue with “ghost keys,” where sounds would play even when no key was pressed; this could be due to the copper tape issue as well.

Overall though, we are proud of the fact that the piano keys worked when pressed and the potentiometer properly adjusted the length of the notes as seen in the video.

Musical Instrument

Group members: Genesis & Nikita

Concept:

video

“WALL-E’s Violin” is inspired by the character WALL-E from the popular animated movie. The idea behind “WALL-E’s Violin” is to mimic WALL-E expressing emotion through music, specifically through a violin — a symbol of sadness and nostalgia. A cardboard model of WALL-E is built with an ultrasonic distance sensor acting as his eyes. A cardboard cutout of a violin and bow are placed in front of WALL-E to represent him holding and playing the instrument.

As the user moves the cardboard stick (the “bow”) left and right in front of WALL-E’s “eyes” (the ultrasonic sensor), the sensor detects the distance of the bow. Based on the distance readings, different predefined musical notes are played through a piezo buzzer. These notes are chosen for their slightly melancholic tones, capturing the emotional essence of WALL-E’s character.

To add a layer of animation and personality, a push-button is included. When pressed, it activates two servo motors attached to WALL-E’s sides, making his cardboard arms move as if he is emotionally waving or playing along. The mechanical hum of the servos blends into the soundscape, enhancing the feeling of a robot expressing music not just through tones, but with motion — like a heartfelt performance from a lonely machine.

How it Works:

  • Ultrasonic Sensor: Acts as WALL-E’s eyes. It continuously measures the distance between the robot and the cardboard bow moved in front of it.
  • Note Mapping: The Arduino interprets the measured distances and maps them to specific, predefined musical notes. These notes are chosen to sound melancholic and emotional, fitting WALL-E’s character.
  • Piezo Buzzer: Plays the corresponding musical notes as determined by the distance readings. As the user sways the bow, the pitch changes in real time, simulating a violin being played.
  • Servo Motors + Button: Two servo motors are connected to WALL-E’s arms. When the button is pressed, they animate his arms in a waving or playing motion. The sound of the servos adds a robotic texture to the performance, further humanizing the character and blending physical movement with audio expression.

Code:

#include <Servo.h>
#include <math.h>

// Pin Assignments

const int trigPin   = 12;   // Ultrasonic sensor trigger
const int echoPin   = 11;   // Ultrasonic sensor echo
const int buzzerPin = 8;    // Buzzer output
const int buttonPin = 2;    // Button input for servo tap
const int servoPin  = 9;    // First servo control pin (default starts at 0°)
const int servoPin2 = 7;    // Second servo control pin (default starts at 180°)

Servo myServo;    // First servo instance
Servo myServo2;   // Second servo instance

// Bittersweet Melody Settings

int melody[] = {
  294, 294, 370, 392, 440, 392, 370, 294
};
const int notesCount = sizeof(melody) / sizeof(melody[0]);


// Sensor to Note Mapping Parameters

const int sensorMin = 2;     // minimum distance (cm)
const int sensorMax = 30;    // max distance (cm)
int currentNoteIndex = 0;  

// Vibrato Parameters 

const float vibratoFrequency = 4.0;  
const int vibratoDepth = 2;          

// Timer for vibrato modulation

unsigned long noteStartTime = 0;

void setup() {
  // Initialize sensor pins
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);

  // Initialize output pins
  pinMode(buzzerPin, OUTPUT);
  pinMode(buttonPin, INPUT_PULLUP);

  // Initialize the first servo and set it to 0°
  myServo.attach(servoPin);
  myServo.write(0);

  // Initialize the second servo and set it to 180°
  myServo2.attach(servoPin2);
  myServo2.write(180);

  // Initialize serial communication for debugging
  Serial.begin(9600);

  // Get the initial note based on the sensor reading
  currentNoteIndex = getNoteIndex();
  noteStartTime = millis();
}

void loop() {
  // Get the new note index from the sensor
  int newIndex = getNoteIndex();

  // Only update the note if the sensor reading maps to a different index
  if (newIndex != currentNoteIndex) {
    currentNoteIndex = newIndex;
    noteStartTime = millis();
    Serial.print("New Note Index: ");
    Serial.println(currentNoteIndex);
  }

  // Apply vibrato to the current note
  unsigned long elapsed = millis() - noteStartTime;
  float t = elapsed / 1000.0;  // Time in seconds for the sine function
  int baseFreq = melody[currentNoteIndex];
  int modulatedFreq = baseFreq + int(vibratoDepth * sin(2.0 * PI * vibratoFrequency * t));

  // Output the modulated tone
  tone(buzzerPin, modulatedFreq);

  // Check the button press to trigger both servos (movement in opposite directions)
  if (digitalRead(buttonPin) == LOW) {
    tapServos();
  }

  delay(10);
}

// getNoteIndex() measures the sensor distance and maps it to a note index.
int getNoteIndex() {
  int distance = measureDistance();
// Map the distance (between sensorMin and sensorMax) to the range of indices of the array (0 to notesCount-1).
  int index = map(distance, sensorMin, sensorMax, 0, notesCount - 1);
  index = constrain(index, 0, notesCount - 1);
  return index;
}

// measureDistance() triggers the ultrasonic sensor and calculates the distance in cm.
int measureDistance() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  long duration = pulseIn(echoPin, HIGH, 30000UL); // 30 ms timeout so a missed echo can't stall the loop
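  // Speed of sound is ~0.034 cm/µs; the round-trip time is halved for the one-way distance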
  int distance = duration * 0.034 / 2;
  return constrain(distance, sensorMin, sensorMax);
}

// tapServos() performs a tap action on both servos:
// - The first servo moves from 0° to 90° and returns to 0°.
// - The second servo moves from 180° to 90° and returns to 180°.
void tapServos() {
  myServo.write(90);
  myServo2.write(90);
  delay(200);
  myServo.write(0);
  myServo2.write(180);
  delay(200);
}


Circuit:

Future Improvements:

    • Add more notes or an entire scale for more musical complexity (a quick sketch of this follows below).
    • Integrate a servo motor to move the arm or head of WALL-E for animation.
    • Use a better speaker for higher-quality sound output.
    • Use the LED screen to type messages and add to the character.
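As a rough, untested sketch of the first idea, the melody array could simply be swapped for a full D minor scale (standard equal-temperament frequencies); getNoteIndex() already maps the distance onto whatever range the array provides, so nothing else would need to change:

// Hypothetical replacement melody: a full D minor scale from D4 to D5.
// notesCount updates automatically, so getNoteIndex() needs no changes.
int melody[] = {
  294, 330, 349, 392, 440, 466, 523, 587  // D4 E4 F4 G4 A4 Bb4 C5 D5
};
const int notesCount = sizeof(melody) / sizeof(melody[0]);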


Week 3: Generative artwork using Object-Oriented Programming.

Concept

Prior to this assignment being due, I was going through some of my childhood storybooks, and that’s when The Pied Piper of Hamelin inspired me to create this artwork using OOP. In the original story, the Piper plays his magical tune and all the mice follow him (and later, all the children of the town as well).

Pied Piper Illustration

But I wanted to keep things happy and use just that part, the mice following the Piper, in my preferred aesthetic: a night sky. In my version, a glowing dot represents the Piper, and mice randomly appear and start following it. But they’re not just mindless followers; if you move your mouse, you can actually distract them!

When the mice get close to the Piper, they start glowing, like they’re enchanted by some unseen force. To make things even more atmospheric, the background keeps shifting, creating a dreamy, night-sky effect.

 The Artwork

Week 10

const int ldrPin = A0;         // LDR connected to analog pin A0
const int buttonPin = 2;       // Button connected to digital pin 2
const int speakerPin = 9;      // Speaker connected to digital pin 9
const int ledPin = 13;         // LED connected to pin 13

// Dramatically different frequencies (non-musical)
int notes[] = {100, 300, 600, 900, 1200, 2000, 3000};

void setup() {
  pinMode(buttonPin, INPUT);       // HIGH when pressed (assumes an external pull-down resistor)
  pinMode(speakerPin, OUTPUT);     
  pinMode(ledPin, OUTPUT);         
  Serial.begin(9600);              
}

void loop() {
  int buttonState = digitalRead(buttonPin); // Read the button

  if (buttonState == HIGH) {
    int lightLevel = analogRead(ldrPin);         // Read LDR
    int noteIndex = map(lightLevel, 0, 1023, 6, 0); // Bright = low note
    noteIndex = constrain(noteIndex, 0, 6);      // Keep within range
    int frequency = notes[noteIndex];            // Pick frequency

    tone(speakerPin, frequency);                 // Play note
    digitalWrite(ledPin, HIGH);                  // LED on

    Serial.print("Light: ");
    Serial.print(lightLevel);
    Serial.print(" | Frequency: ");
    Serial.println(frequency);
  } else {
    noTone(speakerPin);            // No sound
    digitalWrite(ledPin, LOW);     // LED off
  }

  delay(100);
}



Week 10: Follow-up article

Going through the responses to Victor’s rant, I found it interesting how many people agreed with the idea that touchscreens are a dead end, but still struggled to imagine what a better alternative would look like. It’s almost like we all know something is missing, but we’re too deep inside the current system to clearly picture a different path. I noticed some people pointed to things like haptic feedback or VR as potential improvements, but even those seem to stay within the same basic mindset — still about looking at and manipulating screens, just in fancier ways. It made me realize how hard it is to break out of an existing mental model once it becomes the norm.

What also stood out to me is that a lot of the responses weren’t dismissive or defensive — they were actually pretty hopeful. Even though Victor’s tone was a bit harsh in the original rant, the responses seemed to take it as a genuine challenge rather than just criticism. That feels important, because it shows that many designers do want to think bigger; they just need help finding new tools or ways of thinking. It made me think that maybe progress in interaction design isn’t about inventing some magic new device overnight, but about slowly shifting how we even define interaction in the first place.