Week 10 – Reading Response

While reading A Brief Rant on the Future of Interaction Design, I didn’t really expect it to be this focused on hands. But the author makes a pretty solid point: for all our futuristic tech, we’re still weirdly okay with staring and tapping at flat glass rectangles all day. It’s kind of wild how little we think about how limited that is compared to what our hands are actually capable of.

What really stuck with me was the idea that “Pictures Under Glass” is basically a transition phase. We’re so used to touchscreens that we’ve stopped questioning whether they’re actually good. The essay really challenges that, and it’s kinda refreshing. It reminded me of how good tools should feel natural, like when you’re making a sandwich and your hands just know what to do without you even thinking about it. You don’t get that with a tablet or phone. You’re just poking at digital buttons.

I also liked how the author ties this all back to choice. Like, we’re not just being dragged into a touchscreen future, we’re choosing it, whether we realize it or not. That part made me think about design more like storytelling or world-building. We’re shaping how people live and interact, and that’s a big deal.

Overall, it was a fun read with a strong message to stop settling for tech that dulls our senses. Our bodies are way more capable than what current interfaces allow, and maybe the real futuristic thing would be tech that actually lets us do more and not less.

—————————————————————-

Reading the follow-up article helped me better understand where the author was coming from. It’s not just a complaint about modern tech, it’s a call to action. He’s not saying the iPad or touchscreen interfaces are bad. In fact, he even says they are revolutionary. The point is that we shouldn’t stop there. Just because something works now doesn’t mean it can’t be better.

What really stood out to me was the idea that technology doesn’t just evolve on its own. People make choices on what to build, what to fund, what to imagine. If we don’t invest in more expressive, physical ways to interact with computers, we’re basically choosing a future where everything stays flat and glassy.

I also liked how the author handled the voice interface discussion. Voice has its place (quick commands, looking up information, and so on), but when it comes to creating or understanding complex things, it falls short. You can’t sculpt a statue or sketch a design with just your voice. Some things require our hands, our bodies, our sense of space.

That quote from the neuroscientist about touch and brain development was super eye-opening. It made me think about how much we take our sense of touch for granted, especially when using digital devices. In the end, the response made the original rant feel less like a critique and more like an invitation to dream a little bigger about what our future tech could be.

Week 10 : Musical Instrument

For this week’s assignment, we were tasked with using Arduino to create a musical instrument. Working in pairs, Areeba and I decided to create a piano based on the tone() function we had explored earlier in class. In our project, we wanted each button switch to correspond to a different note from a piano, so that it could be “played”.

We were asked to use both an analogue and a digital component for the assignment; while the digital component was simple enough with the buttons, we decided to use a potentiometer as our analogue component and used it to control the pitch of the notes being produced by each button.

The components we used were:

  • Arduino Uno
  • Breadboards
  • Jumper cables
  • 10k Ohm resistors
  • Push buttons
  • 5V speaker

Here is an image of our project, and the schematic:

Our code:

#include "pitches.h"

const int potPin = A0;          // Potentiometer on A0
const int buzzerPin = 8;        // Speaker on D8
const int buttonPins[] = {2, 3, 4, 5, 6, 7, 9, 10}; // C4-C5 buttons
const int baseNotes[] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_G4, NOTE_A4, NOTE_B4, NOTE_C5};

void setup() {
  for (int i = 0; i < 8; i++) {
    pinMode(buttonPins[i], INPUT); 
  }
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  static int potValues[5] = {0};
  static int potIndex = 0;
  potValues[potIndex] = analogRead(potPin);
  potIndex = (potIndex + 1) % 5;
  int octaveShift = map(
    (potValues[0] + potValues[1] + potValues[2] + potValues[3] + potValues[4]) / 5,
    0, 1023, -2, 2
  );

  // Check buttons 
  bool notePlaying = false;
  for (int i = 0; i < 8; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) { // HIGH when pressed (5V)
      int shiftedNote = baseNotes[i] * pow(2, octaveShift);
      tone(buzzerPin, constrain(shiftedNote, 31, 4000)); 
      notePlaying = true;
      break;
    }
  }

  if (!notePlaying) {
    noTone(buzzerPin);
  }
  delay(10);
}

Videos of our project:

Playing Happy Birthday

Adjusting the Potentiometer

A challenge we faced was definitely our lack of musical knowledge. Neither Areeba nor I are musicians, and as such, we had to do a lot of research to understand the terminology and theory, such as the notes and how to adjust the octave using the potentiometer. We then also had to figure out how to reflect these findings within our code.
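
One thing that finally made the octave part click for us: moving a note up an octave doubles its frequency, and moving it down halves it, which is all the pow(2, octaveShift) line in our code is doing. Here is a tiny sanity check, using the standard values from pitches.h:

#include "pitches.h"

// Shifting by one octave doubles (or halves) the frequency.
// NOTE_C4 is 262 Hz in pitches.h, so:
//   octaveShift = +1  ->  262 * 2 = 524 Hz  (close to NOTE_C5 = 523)
//   octaveShift = -1  ->  262 / 2 = 131 Hz  (close to NOTE_C3 = 131)
int shiftNote(int baseFreq, int octaveShift) {
  return baseFreq * pow(2, octaveShift);
}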

Overall though, we had a great time making this project, and then executing our idea.

Week 10: Instrument

Concept

Our musical instrument is based on a combination of both a mechanical and a digital noise machine – controlled with a surprisingly satisfying potentiometer that lets us modify the beats per minute (BPM), and a button that changes the octave of our instrument. We had fun with the relationship between the buzzer and the mechanical servo that produces the other noise, where the two are inversely proportional! In other words, as the BPM of the buzzer increases, the servo slows down, and vice versa.

Figuring out the right code combination was a bit tricky at first: we initially made the mistake of nesting both the servo trigger and the buzzer tone inside the same beat conditional, which meant that we fundamentally could not separate the motor from the buzzer tones. To resolve this, we ended up using two triggers – a buzzer trigger and a motor trigger – which run independently.

The code below shows how we ended up resolving this problem. Once we implemented the fix, we were able to celebrate with our somewhat perplexing instrument, which requires the user to complement the buzzer’s various beeping with the servo’s mechanical movements.

// --- 3. Perform Actions based on Timers ---

// --- Action Group 1: Metronome Click (Buzzer) ---
if (currentMillis - previousBeatMillis >= beatInterval) {
  previousBeatMillis = currentMillis; // Store the time of this beat

  // Play the click sound
  tone(BUZZER_PIN, currentFreq, CLICK_DUR);

  // Print beat info
  Serial.print("Beat! BPM: ");
  Serial.print(currentBPM);
  Serial.print(" | Freq: ");
  Serial.print(currentFreq);
  Serial.print(" | Beat Interval: ");
  Serial.print(beatInterval);
  Serial.println("ms");
}

// --- Action Group 2: Servo Movement Trigger ---
if (currentMillis - previousServoMillis >= servoInterval) {
  previousServoMillis = currentMillis; // Store the time of this servo trigger

  // Determine Target Angle
  int targetAngle;
  if (servoMovingToEnd) {
    targetAngle = SERVO_POS_END;
  } else {
    targetAngle = SERVO_POS_START;
  }

  // Tell the servo to move (NON-BLOCKING)
  myServo.write(targetAngle);

  // Print servo info
  Serial.print("Servo Trigger! Target: ");
  Serial.print(targetAngle);
  Serial.print("deg | Servo Interval: ");
  Serial.print(servoInterval);
  Serial.println("ms");

  // Update state for the next servo trigger
  servoMovingToEnd = !servoMovingToEnd; // Flip the direction for next time
}
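
For context, the snippet above assumes that currentBPM, beatInterval, servoInterval, and currentFreq have already been computed from the potentiometer and button earlier in loop(). The sketch below is roughly how that can be wired up – the pin numbers and mapping ranges here are placeholders rather than our exact values:

// Illustrative sketch (placeholder pins and ranges, not our exact values):
// how the potentiometer and octave button could feed the two timers above.
const int POT_PIN    = A0;
const int BUTTON_PIN = 2;
const int BASE_FREQ  = 440;          // base click frequency in Hz

int currentBPM;
int currentFreq;
unsigned long beatInterval;          // ms between buzzer clicks
unsigned long servoInterval;         // ms between servo direction changes

void readControls() {
  int potValue = analogRead(POT_PIN);              // 0-1023
  currentBPM   = map(potValue, 0, 1023, 60, 240);  // turning the knob up speeds up the metronome
  beatInterval = 60000UL / currentBPM;

  // Inverse relationship: as the buzzer speeds up, the servo slows down.
  servoInterval = map(currentBPM, 60, 240, 200, 2000);

  // The octave button doubles the click frequency while pressed.
  currentFreq = (digitalRead(BUTTON_PIN) == HIGH) ? BASE_FREQ * 2 : BASE_FREQ;
}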

 

Reading reflection-week 10

Reading A Brief Rant on the Future of Interaction Design honestly made me pause and look around at all the tech I use every day. The author’s frustration really hit home, especially the part about how everything has become a flat screen with no real feedback. It reminded me of how awkward it feels to use a touchscreen on a stove or in a car just to do something simple, like turn down the heat or change the music. Those actions used to be so quick and intuitive. I think the author has a point—we’ve lost touch with how things feel. It’s like we’re designing for appearances now, not for actual human comfort. I didn’t feel like he was biased in a bad way; he just seemed passionate, like someone who really cares about how design affects everyday life.

What stuck with me most was how much we’ve adapted to poor interaction design without even noticing. I catch myself getting used to frustrating interfaces all the time, and this reading made me realize we shouldn’t have to settle for that. It made me wonder: what would technology look like if we designed it based on how our bodies naturally move and react? That question stayed with me after I finished reading. It also raised another thought—how do we balance innovation with simplicity? Just because something’s high-tech doesn’t mean it’s better. I think future designers, myself included, need to remember that. This reading didn’t completely change my beliefs, but it definitely sharpened them.

Week 10 – Reading Response

These readings were very eye-opening in some regards, but I also felt that they were a bit closed-off in others. With our many discussions on what it means for something to be well-designed, we have had to start thinking about our works beyond just their function. Now we are bringing physiology into the equation on top of the prior psychological aspects. I have some idea of how miraculous the human body is and how much work it takes to try and replicate the simplest of actions via technology, but the examples posed about how we use our hands painted a much more intimate picture than I had considered. For example, I might think about how I struggle to play FPS games on a controller whereas several of my friends can’t stand using a mouse and keyboard. The broader significance of tactile feedback would be largely lost in that simple case, let alone the discussion on how input devices might look in 20 years.

Speaking of which, I was much more uncertain about how we might work towards this brighter future. This post from 2011 has plenty of thoughts on the matter, but the landscape has changed greatly since then. For example, a good portion of the reading focuses on how touchscreens fall short of delivering the true experience, but I would argue that many people feel more comfortable with an e-book than a real one, especially so for the younger demographic that grew up with a tablet in hand. The discussion of voice also seems rather shortsighted in hindsight, given that Siri first released in 2010 and was shortly followed by Alexa in 2014. Nowadays you can even push voice-based systems to the point of live-translating speech or leaning on LLMs to achieve goals in the realm of creating and understanding. Lastly, on the subject of brain interfaces, I felt that the author was a bit too pessimistic. Companies like Neuralink still fall short today, but they can be hugely impactful for people who can’t rely on their bodies. I can definitely agree that nature is the ultimate inventor, but I don’t think that brain-computer interfaces should be disregarded out of fear that we will all end up forsaking the real world. Either way, with the rate that certain technologies have progressed over the past few years, it’s hard to even imagine what direction interfaces might evolve in.

Week 10 – Musical Instrument

Design:
I used a slide switch as the digital sensor, and a potentiometer and a photoresistor (light sensor) as the analog sensors. The slide switch was originally just an on/off switch, but now allows the instrument to switch between two modes (which currently shifts the octave). The potentiometer acts as a volume control, while the photoresistor provides an additional input towards the tone frequency.
The physical component is a simple sort of keyboard made from paper and copper tape. It works by passing current through one of five different resistors (1 kΩ, 10 kΩ, 100 kΩ, 470 kΩ, 1 MΩ) and using analogRead() to identify which key was pressed. The five keys correspond to the notes A4/5 through E4/5. The initial steps of wiring the circuit and writing the code were fairly painless, which was not the case for converting it into its final result. I think it definitely suffered from my lack of ability with music as well as arts and crafts.
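
A quick note on where the analogRead() thresholds in the code come from: each key forms a voltage divider with the fixed resistor tied to the analog pin. Assuming that fixed resistor is roughly 10 kΩ (an assumption inferred from the readings rather than a documented value), the expected values line up with the ones noted in the code comments:

// Rough voltage-divider math for the paper keyboard (assumes a ~10k fixed
// resistor from the analog pin to ground; 5V enters through the key resistor).
// expected reading ~= 1023 * R_fixed / (R_fixed + R_key)
//   1k   -> 1023 * 10 / (10 + 1)    ~= 930   (noted as 932)
//   10k  -> 1023 * 10 / (10 + 10)   ~= 512   (noted as 512)
//   100k -> 1023 * 10 / (10 + 100)  ~= 93    (noted as 92)
//   470k -> 1023 * 10 / (10 + 470)  ~= 21    (noted as 20)
//   1M   -> 1023 * 10 / (10 + 1000) ~= 10    (noted as 8)
long expectedReading(long rFixedKOhm, long rKeyKOhm) {
  return 1023L * rFixedKOhm / (rFixedKOhm + rKeyKOhm);
}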

Diagram:

Code:

/*
  Musical Instrument
*/

#include "pitches.h"

// // Global Variables
const int LIGHT_PIN = A0;
const int TOUCH_PIN = A5;
const int MODE_PIN = 2;
const int SOUND_PIN = 8;
int low[] = {NOTE_A4, NOTE_B4, NOTE_C4, NOTE_D4, NOTE_E4};
int high[] = {NOTE_A5, NOTE_B5, NOTE_C5, NOTE_D5, NOTE_E5};



void setup() {
  Serial.begin(9600);
  pinMode(LIGHT_PIN, INPUT);
  pinMode(TOUCH_PIN, INPUT);
  pinMode(MODE_PIN, INPUT);
  pinMode(SOUND_PIN, OUTPUT);
}

void loop() {
  // // Read digital/analog
  int modeVal = digitalRead(MODE_PIN);
  // Serial.println(modeVal);
  int lightVal = analogRead(LIGHT_PIN);
  // Serial.println(lightVal);
  int touchVal = analogRead(TOUCH_PIN);
  Serial.println(touchVal);
  
  // // Choose note based on touch input
  // 1k = 932
  // 10k = 512
  // 100k = 92
  // 470k = 20
  // 1m = 8
  int x = 0;
  int mapped = 0;
  if      (touchVal < 4)   { noTone(SOUND_PIN); return; }  // no key pressed
  else if (touchVal < 16)  { x = 0; }
  else if (touchVal < 64)  { x = 1; }
  else if (touchVal < 128) { x = 2; }
  else if (touchVal < 768) { x = 3; }
  else                     { x = 4; }
  // Serial.println(x);

  // // Choose high/low range based on mode
  int note;
  if (modeVal) {
    note = low[x];
    mapped = map(lightVal, 400, 1200, 0, 2500);
  }
  else {
    note = high[x];
    mapped = map(lightVal, 400, 1200, 2500, 5000);
  }
  // // Play sound (the light reading shifts the tone frequency, as described above)
  tone(SOUND_PIN, constrain(note + mapped, 31, 5000));
}

 

 

Assignment 10: Musical Instrument

For this week’s assignment, we were tasked with using Arduino to create a musical instrument. Working in pairs, Kashish and I decided to create a piano based on the tone() function we had explored earlier in class. In our project, we wanted each button switch to correspond to a different note from a piano, so that it could be “played”.

We were asked to use both an analogue and a digital component for the assignment; while the digital component was simple enough with the buttons, we decided to use a potentiometer as our analogue component and used it to control the pitch of the notes being produced by each button.

The components we used were:

  • Arduino Uno
  • Breadboards
  • Jumper cables
  • 10k Ohm resistors
  • Push buttons
  • 5V speaker

Here is an image of our project, and the schematic:

Our code:

#include "pitches.h"

const int potPin = A0;          // Potentiometer on A0
const int buzzerPin = 8;        // Speaker on D8
const int buttonPins[] = {2, 3, 4, 5, 6, 7, 9, 10}; // C4-C5 buttons
const int baseNotes[] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_G4, NOTE_A4, NOTE_B4, NOTE_C5};

void setup() {
  // External 10k pull-down resistors keep the button pins LOW until pressed
  for (int i = 0; i < 8; i++) {
    pinMode(buttonPins[i], INPUT); 
  }
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  // potentiometer value, smoothed over the last five readings
  static int potValues[5] = {0};  // static so the running average persists between loops
  static int potIndex = 0;
  potValues[potIndex] = analogRead(potPin);
  potIndex = (potIndex + 1) % 5;
  int octaveShift = map( // mapping potentiometer values as a shift in octave
    (potValues[0] + potValues[1] + potValues[2] + potValues[3] + potValues[4]) / 5,
    0, 1023, -2, 2
  );

  // Check buttons to play notes
  bool notePlaying = false;
  for (int i = 0; i < 8; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {  
      int shiftedNote = baseNotes[i] * pow(2, octaveShift);
      tone(buzzerPin, constrain(shiftedNote, 31, 4000)); 
      notePlaying = true;
      break;
    }
  }

  if (!notePlaying) {
    noTone(buzzerPin);
  }
  delay(10); 
}

Videos of our project:

Playing Happy Birthday

Adjusting the Potentiometer

A challenge we faced was definitely our lack of musical knowledge. Neither Kashish nor I are musicians, and as such, we had to do a lot of research to understand the terminology and theory, such as the notes and how to adjust the octave using the potentiometer. We then also had to figure out how to reflect these findings within our code.

Overall though, we had a great time making this project, and then executing our idea.

Week 10 – Reading Response

Bret Victor’s “A Brief Rant on the Future of Interaction Design” calls out the prevailing vision of future interfaces as mere “Pictures Under Glass.” He argues that this approach ignores the amazing capabilities of the human hand. Current trends focus too much on visual interaction, neglecting our sense of touch. Hands are powerful tools for dexterity and manipulation, something currently absent from a lot of touch-based technologies. He takes inspiration from Alan Kay’s early vision of the iPad and encourages designers to explore other mediums that can potentially improve user experiences by engaging us both visually and sensually.

A lot of his argument actually makes sense. It is human nature, from the time we are babies, to be attracted to bright screens, but our immediate reaction to anything is to touch it, to see if it turns or moves, or if it’s a switch or a knob. We always seek some sense of physical manipulation in objects, and I do think modern technologies are limiting this to a far more visual approach. Even when I type essays, I prefer using a laptop keyboard over my iPad screen, because I genuinely like the touch and feel of clicking and clacking on the keyboard keys – it somehow motivates me to write better. Even though this isn’t a full usage of hands and body movements, this minute degree of change in physical interaction itself makes me more interested. “With an entire body at your command, do you seriously think the Future Of Interaction should be a single finger?” – this statement really opened my eyes to how much technology has been reshaped to fit users into a limited physical experience.

In the response article, he answers some of the questions people had. I can admit that I too was unsure about the stylus situation, and I’m not sure how styluses could be made more dynamic. I fully agree with him on the voice input part – I never found that to be appealing or better. Overall, I’d say Victor’s rant on the future of interaction design is completely justifiable: being futuristic doesn’t mean being deprived of sensuality. Rather, embracing the movements of our hands and channeling a sense of physical interaction can definitely aid us in improving the future of interaction.



Analog input & output

Concept

For this project, I tried to build a simple lighting system using an LDR (light sensor), a push-button, and two LEDs. One LED changes brightness depending on how bright the room is, and the other lights up to show when I’ve manually taken control.

I wanted to manually override the automatic light control with the press of a button—so if I want the light to stay at a fixed brightness no matter how bright it is outside, I just hit the button. Press it again, and the automatic behavior comes back.

I used TinkerCad for the circuit simulation.

Video

How It Works

    1. The LDR is connected to pin A0 and tells the Arduino how bright the environment is.

    2. Based on this reading, the Arduino maps the value to a number between 0 and 255 (for LED brightness).

    3. The LED on pin 9 gets brighter when it’s dark and dims when it’s bright—automatically.

    4. I also wired a button to pin 2. When I press it, the system switches to manual mode:

      • In this mode, the LED stays at medium brightness, no matter the light level.

      • An indicator LED on pin 13 lights up to let me know I’m in manual mode.

    5. Pressing the button again switches back to automatic mode.

Code

// Pin Definitions
const int ldrPin = A0;         // LDR sensor connected to A0
const int buttonPin = 2;       // Push-button connected to digital pin D2
const int pwmLedPin = 9;       // PWM LED for the ambient light effect
const int overrideLedPin = 13; // Digital LED for manual override indicator

// Variables
bool manualOverride = false;   // Tracks if the override mode is active
int buttonState = LOW;         // Debounced (stable) button state
int lastButtonState = LOW;     // With external pull-down, default is LOW
unsigned long lastDebounceTime = 0;
const unsigned long debounceDelay = 50; // Debounce time in milliseconds

void setup() {
  pinMode(ldrPin, INPUT);
  pinMode(buttonPin, INPUT);
  pinMode(pwmLedPin, OUTPUT);
  pinMode(overrideLedPin, OUTPUT);
  
  // Start with manual override off, LED off
  digitalWrite(overrideLedPin, LOW);
  Serial.begin(9600);
}

void loop() {
  // Read the LDR Sensor
  int ldrValue = analogRead(ldrPin);
  
  // Map the LDR value to PWM brightness (0-255).
  // Darker environment (low ldrValue) yields a higher brightness.
  int pwmValue = map(ldrValue, 0, 1023, 255, 0);
  
  // Handle the Push-Button for Manual Override with Debouncing
  int reading = digitalRead(buttonPin);
  if (reading != lastButtonState) {
    lastDebounceTime = millis(); // reset the timer on any change (press, release, or bounce)
  }
  if ((millis() - lastDebounceTime) > debounceDelay) {
    // The reading has been stable long enough to trust it.
    if (reading != buttonState) {
      buttonState = reading;
      // Unpressed = LOW, pressed = HIGH.
      if (buttonState == HIGH) { // button press detected
        manualOverride = !manualOverride;
        // Update the indicator LED accordingly
        digitalWrite(overrideLedPin, manualOverride ? HIGH : LOW);
      }
    }
  }
  lastButtonState = reading;
  
  // LED Behavior Based on Mode 
  if (manualOverride) {
    // In manual override mode, set LED to a fixed brightness.
    analogWrite(pwmLedPin, 128);
  } else {
    // Set brightness according to ambient light measured by the LDR.
    analogWrite(pwmLedPin, pwmValue);
  }
  
  // Debug output
  Serial.print("LDR Value: "); Serial.print(ldrValue);
  Serial.print(" | PWM Brightness: "); Serial.print(pwmValue);
  Serial.print(" | Manual Override: "); Serial.println(manualOverride ? "ON" : "OFF");
  
  delay(10);
}

Challenges

    • Balancing Automatic and Manual Modes:
      Getting the right balance between automatic brightness adjustments and a satisfying manual override was a fun challenge. I had to fine-tune the mapping of LDR readings to PWM values until the LED’s response felt right in different lighting conditions (see the sketch after this list for the kind of adjustment I mean).

    • Debugging with Serial Monitor:
      Utilizing the Serial Monitor was incredibly useful. Every time something wasn’t working as expected, I added more Serial prints to understand what was happening.
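
To give a concrete example of what that fine-tuning looked like in spirit: instead of mapping the full 0–1023 range, it helps to map only the range the LDR actually reports in the room and clamp anything outside it. The numbers below are placeholders, not my exact calibration values:

// Calibration sketch (placeholder thresholds, not exact measurements):
// map only the LDR range actually seen in the room, then clamp the result.
const int LDR_DARK   = 200;   // typical reading with the lights off (assumed)
const int LDR_BRIGHT = 850;   // typical reading in full light (assumed)

int ldrToPwm(int ldrValue) {
  int pwm = map(ldrValue, LDR_DARK, LDR_BRIGHT, 255, 0);  // darker room -> brighter LED
  return constrain(pwm, 0, 255);                          // keep within valid PWM range
}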

Week 9 – Reading Response

Physical Computing:
It was really interesting to see all the different forms that physical computing pieces can take. The examples given, despite only being a small glimpse into the genre, cover a whole bunch of mediums and correspondingly provide entirely new experiences. In order to do so, they each have to take into account the physical and digital components for both input and output, and figure out how to blend them all together into one working whole. What stood out to me the most was all the little tricks used to make sense of intuitive human motions such as hand movements, as well as the variations of ways to do so. That particular point was also echoed in the introduction, where it mentioned that the same idea can be produced in varying ways. Hopefully as the semester progresses, my own pieces will similarly become more complex and interactive.

Interactive Art:
After the first half of this class and the resulting midterm project, I can definitely identify with what the reading is getting at. The previous readings on different aspects of design come into play once again, albeit with more emphasis on the physical side this time around. The idea of listening builds on that and turns the creation process into a more iterative version of itself, presumably to beneficial effect. I liked the comparison between telling an actor how to act and giving them the tools to perform their craft, which naturally leads into the next analogy of the piece being a performance involving the audience. Going forward I’d like to keep this perspective in mind, and endeavour to provide a more comprehensive experience in my work.