Afra Binjerais – Week 11 assignment

Arduino Piano

🌟 Inspiration: 

Music possesses a remarkable capacity to bridge cultural and linguistic divides and unite individuals from diverse backgrounds. Motivated by the notion of promoting interactive experiences and democratizing music creation, we set out to build a one-of-a-kind musical instrument out of buttons and an Arduino. Our intention was to enable people to use sound as a creative medium for self-expression, irrespective of their experience with music. We chose the piano as our main inspiration because we could recreate its notes using buttons.

💡 Process:

Using an Arduino Uno, we wired up buttons to serve as interactive triggers for different musical notes. Each button was assigned a specific pitch, allowing users to create melodies by pressing combinations of buttons. We leveraged our programming skills to code the logic behind the instrument, ensuring seamless functionality and an intuitive user interface.

🚀 Difficulties: 

Although our journey was exciting and innovative, it was not without difficulties. A major challenge we encountered was guaranteeing the reliability and consistency of button presses. We had to use careful calibration and filtering techniques to overcome problems like electrical noise and switch bounce (debouncing).
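The debouncing idea can be sketched on its own, away from the hardware. Our actual sketch just uses a short delay() after each press, but the more robust version of the filter we are describing looks roughly like this (a host-runnable C++ sketch, not our Arduino code; the 50 ms window is an assumption):

```cpp
#include <cassert>
#include <cstdint>

// A minimal debounce filter: a raw button reading only becomes the
// accepted state after it has stayed unchanged for `stableMs`
// milliseconds. With INPUT_PULLUP, true = released (HIGH).
struct Debouncer {
    bool stable = true;        // accepted (debounced) state
    bool lastRaw = true;       // last raw reading seen
    uint32_t lastChangeMs = 0; // time the raw reading last flipped
    uint32_t stableMs;         // how long a reading must hold to count

    explicit Debouncer(uint32_t ms) : stableMs(ms) {}

    // Feed one raw sample with a timestamp; returns the debounced state.
    bool update(bool raw, uint32_t nowMs) {
        if (raw != lastRaw) {              // raw input flipped: restart timer
            lastChangeMs = nowMs;
            lastRaw = raw;
        } else if (nowMs - lastChangeMs >= stableMs) {
            stable = raw;                  // held long enough: accept it
        }
        return stable;
    }
};
```

On the Arduino the same logic would be fed from digitalRead() and millis() inside loop(), instead of blocking with delay().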

There were additional difficulties in creating an intuitive user interface. We had to strike a balance between simplicity and utility, so that users could easily grasp how to interact with the instrument while still having the freedom to explore and experiment with various musical compositions.

Arduino illustration:

The code:

const int speakerPin = 9;  // Speaker connected to pin 9
int buttonPins[] = {2, 3, 4, 5};  // Button pins for C, D, E, F
int notes[] = {262, 294, 330, 349};  // Frequencies for C4, D4, E4, F4
int potPin = A0;  // Potentiometer connected to A0
int volume;
int melody[] = {262, 262, 349, 349, 330, 330, 294};  // Melody for Twinkle Twinkle
int noteDurations[] = {500, 500, 500, 500, 500, 500, 500};  // Duration of each note


void setup() {
 for (int i = 0; i < 4; i++) {
   pinMode(buttonPins[i], INPUT_PULLUP);
 }
 pinMode(speakerPin, OUTPUT);
}


void loop() {
 volume = analogRead(potPin) / 4;  // Scale the 0-1023 pot reading down to 0-255


 if (digitalRead(buttonPins[0]) == LOW) {  // Check if the first button is pressed
   playSong();
 }
 for (int i = 0; i < 4; i++) {
   if (digitalRead(buttonPins[i]) == LOW) {  // Check if any button is pressed
     tone(speakerPin, notes[i], volume);  // Note: tone()'s third argument is a duration in ms, so the pot value actually sets note length (tone() cannot change loudness)
     delay(200);  // A short delay to help debounce the button
     while (digitalRead(buttonPins[i]) == LOW);  // Wait for button release
     noTone(speakerPin);  // Stop the note
   }
 }
}


void playSong() {
 for (int thisNote = 0; thisNote < 7; thisNote++) {
   int noteDuration = noteDurations[thisNote];  // Durations are already in milliseconds
   tone(speakerPin, melody[thisNote], noteDuration);
   int pauseBetweenNotes = noteDuration * 1.30;  // 30% gap between notes
   delay(pauseBetweenNotes);
   noTone(speakerPin);
 }
}

The video:

 

Week 11: Our Musical Instrument “Turntable” – Yaakulya & Luke

Concept: This project is a creative exploration into the realm of music and interactivity. Inspired by the dynamic role of a DJ, we have created a turntable that allows users to listen to two globally popular songs: “Despacito” by Luis Fonsi and Daddy Yankee, and the “Mario Theme Song” by Koji Kondo for Nintendo. The project aims to provide a simple yet enjoyable experience, blending Western and Eastern musical influences.

Objective: The primary objective is to design an interactive turntable that allows the user to play and switch between two distinct songs using a digital switch and an analog potentiometer, emulating the experience of a DJ.

Components we used:
  • Arduino UNO
  • Potentiometer (Analog Sensor)
  • Push Button (Digital Sensor)
  • Miniature Speaker
  • 10k Ohm Resistor

Implemented Code with comments:

#include "pitches.h"

#define BUZZER_PIN 9
#define POTENTIOMETER_PIN A0
#define BUTTON_PIN 7

// Melody for the first song
int melody1[] = {
  NOTE_D4, NOTE_CS4, NOTE_B3, NOTE_FS3, NOTE_FS3, NOTE_FS3, NOTE_FS3, NOTE_FS3,
  NOTE_FS3, NOTE_B3, NOTE_B3, NOTE_B3, NOTE_A3, NOTE_B3, NOTE_G3, NOTE_G3, NOTE_G3, NOTE_G3, NOTE_G3,
  NOTE_G3, NOTE_B3, NOTE_B3, NOTE_B3, NOTE_B3, NOTE_CS4, NOTE_D4, NOTE_A3, NOTE_A3, NOTE_A3, NOTE_A3, NOTE_A3,
  NOTE_A3, NOTE_D4, NOTE_D4, NOTE_D4, NOTE_D4, NOTE_E4, NOTE_E4, NOTE_CS4
};

// Note durations for the first song
int noteDurations1[] = {
  4,4,8,8,16,16,16,16,
  16,16,16,8,16,8,16/3,16,16,16,16,
  16,16,16,16,8,16,8,16/3,16,16,16,16,
  16,16,16,16,8,16,8,16/3
};

// Melody for the second song
int melody2[] = {
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_C4, NOTE_E4, NOTE_G4, NOTE_G3, NOTE_C4, NOTE_G3, NOTE_E3, NOTE_A3, NOTE_B3, NOTE_AS3, NOTE_A3,NOTE_G3,NOTE_E4,NOTE_G4,NOTE_A4,NOTE_F4,NOTE_G4,NOTE_E4,NOTE_C4,NOTE_D4,NOTE_B3,NOTE_C4
};

// Note durations for the second song
int noteDurations2[] = {
  16,8,8,16,8,4,4,16/3,16/3,16/3,8,8,16,8,32/3,32/3,32/3,8,16,8,8,16,16,16/3,8
};

// Flag to indicate if a song is currently playing
bool songPlaying = false;
// Variable to store the previous state of the button
bool lastButtonState = false;

// Setup function to initialize the pins
void setup() {
  // Set the buzzer pin as output
  pinMode(BUZZER_PIN, OUTPUT);
  // Set the button pin as input with internal pull-up resistor enabled
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

// Loop function to continuously check for button presses and play melodies
void loop() {
  // Read the current state of the button
  bool buttonState = digitalRead(BUTTON_PIN);

  // Check if the button is pressed and it was not pressed in the previous loop iteration
  if (buttonState == LOW && lastButtonState == HIGH) {
    // Toggle the song playing state
    songPlaying = !songPlaying;

    // Start or stop playing the selected song
    if (songPlaying) {
      // If the potentiometer value is less than 512, play the first melody; otherwise, play the second melody
      if (analogRead(POTENTIOMETER_PIN) < 512) {
        playMelody(melody1, noteDurations1, sizeof(melody1) / sizeof(int));
      } else {
        playMelody(melody2, noteDurations2, sizeof(melody2) / sizeof(int));
      }
    } else {
      // If the song was playing, stop playing it
      noTone(BUZZER_PIN);
    }
  }

  // Update the last button state for the next iteration
  lastButtonState = buttonState;
}

// Function to play a melody
void playMelody(int melody[], int noteDurations[], int size) {
  // Iterate over each note in the melody
  for (int thisNote = 0; thisNote < size; thisNote++) {
    // Calculate the duration of the current note
    int noteDuration = 1000 / noteDurations[thisNote];
    // Play the current note on the buzzer pin
    tone(BUZZER_PIN, melody[thisNote], noteDuration);
    // Wait for the duration of the note
    delay(noteDuration * 2);
    // Stop playing the note
    noTone(BUZZER_PIN);
    // Add a short pause between notes
    delay(noteDuration * 0.1);
  }
}

Code Logic:

The code begins by defining the pin numbers for the buzzer, potentiometer, and button. The buzzer pin is where the speaker is connected, the potentiometer pin is where the potentiometer (for selecting songs) is connected, and the button pin is where the push button (for controlling song playback) is connected.

  • In the setup() function, the buzzer pin is set as an output, and the button pin is set as an input with the internal pull-up resistor enabled. This configuration ensures that the button pin reads HIGH when the button is not pressed and LOW when it is pressed. Two melodies are also defined globally (melody1 and melody2), each represented as an array of note frequencies; two corresponding arrays (noteDurations1 and noteDurations2) define the duration of each note in the melodies.
  • In the loop() function, the code continuously checks the state of the button. When the button transitions from HIGH to LOW (indicating a button press), the code toggles the songPlaying variable to start or stop playing the selected song. If songPlaying is true, the code reads the potentiometer value to determine which melody to play. If the potentiometer value is less than 512, it plays melody1; otherwise, it plays melody2. If songPlaying is false, it stops playing the melody by calling noTone() on the buzzer pin.
  • The playMelody() function is responsible for playing a melody. It takes the melody array, the note-duration array, and the size of the melody as input. It iterates over each note, calculates the note's duration, plays it using tone(), waits for that duration, stops it with noTone(), and adds a short pause before the next note.
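The duration arithmetic in playMelody() can be checked on its own: the divisor encodes the note length (4 = quarter note, 8 = eighth, and so on) against a one-second whole note, and each note's slot also includes the delay(noteDuration * 2) and delay(noteDuration * 0.1) pauses. A host-runnable C++ sketch of that calculation (plain C++, not the Arduino code itself):

```cpp
#include <cassert>

// Convert a note-duration divisor (4 = quarter note, 8 = eighth, ...)
// into milliseconds, as playMelody() does with a one-second whole note.
int noteMs(int divisor) {
    return 1000 / divisor;
}

// Total time one note occupies in playMelody(): tone() is asked to
// sound for noteMs, then delay(noteMs * 2) and delay(noteMs * 0.1)
// pass before the next note starts.
int slotMs(int divisor) {
    int d = noteMs(divisor);
    return d * 2 + d / 10;
}
```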

Video Demonstration:

Images of the circuit:

Schematics Sketch:

Circuit connections from above Schematics diagram.

1) Potentiometer Connections: The potentiometer is crucial for song selection and control. It has three pins: two for power and one for the signal. The outer pins are connected to the 5V and GND (ground) terminals on the Arduino to power it. The middle pin, the wiper, is connected to the analog input A0 on the Arduino. As the user turns the knob, the wiper voltage changes, and the Arduino reads this as an analog value, allowing the user to select between the two songs.
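The wiper reading reduces to a simple threshold on the 10-bit analogRead() value (0-1023). A host-runnable sketch of the selection logic, using the same 512 midpoint as our code:

```cpp
#include <cassert>

// analogRead() returns 0..1023 for a 0..5V wiper voltage.
// The sketch picks a song by comparing against the midpoint (512):
// lower half of the knob's travel -> song 1, upper half -> song 2.
int selectSong(int analogValue) {
    return (analogValue < 512) ? 1 : 2;
}
```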

2) Button Connections: A push button is used to start or stop the song playback. One side of the button is connected to digital pin 7 on the Arduino. The other side is connected to the GND terminal on the breadboard. An external pull-up resistor is not necessary, as the Arduino’s internal pull-up resistor is enabled in the code to avoid floating values when the button is not pressed.

3) Speaker Connections: The speaker, which outputs the sound, has two wires. The positive wire is connected to digital pin 9 on the Arduino. This pin is driven by the tone() function, which outputs a square wave that creates the audio tones for the songs. The negative wire from the speaker is connected directly to the GND terminal on the breadboard to complete the circuit.

4) Power Connections: The breadboard’s power rail is used to distribute the 5V and GND connections to the potentiometer and the button. Wires run from the 5V and GND pins on the Arduino to the power rail on the breadboard.

With this setup, the user interacts with the button and potentiometer to control the playback of “Despacito” and the “Mario Theme Song”. The code on the Arduino processes the input signals and triggers the speaker to play the appropriate song based on the state of the button and the value from the potentiometer.

What does the Analog and Digital Sensor do in this circuit? – Schematics:

The code and circuit are structured to respond to the interactions with the potentiometer and the push button:

1) Button Press: Initiates the playback of “Despacito” or stops the current song.

2) Potentiometer Rotation: Selects between “Despacito” and the “Mario Theme Song.”

The user presses the button to start playing “Despacito.” They can then rotate the potentiometer and press the button again to switch to the “Mario Theme Song.” This interaction can be repeated to alternate between the two songs.

Challenges & Reflections:

1) Connecting the buzzer, potentiometer, and push button to the Arduino board proved trickier than anticipated. We encountered loose connections, incorrect wiring, and misplaced digital pin assignments. We then carefully reviewed the circuit diagram and schematics to double-check each wiring connection, and also referenced the Arduino documentation and class slides to make sure we understood the purpose and behavior of each sensor.

2) When testing the circuit, we noticed inconsistencies in melody playback: some notes were not playing correctly, there were unexpected pauses between notes, and some notes repeated. To troubleshoot, we reviewed the melody-playback code and examined the arrays containing note pitches and durations. We adjusted the timing parameters and experimented with different delay values to ensure smooth transitions between notes.

Future Improvements:

1) To create a vibrant atmosphere that mimics a mini club, we aim to incorporate LEDs that flash in sync with the music notes.

2) To provide a more DJ-like experience, we intend to program more songs and add a crossfade effect, allowing one song to ease into the other as the potentiometer is turned, instead of multiple switches.
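The crossfade we have in mind could be driven by the same potentiometer reading: instead of a hard switch at 512, derive a pair of complementary gains. A host-runnable sketch of that mapping (a hypothetical helper, not part of our current code):

```cpp
#include <cassert>
#include <cmath>

// Map a 10-bit potentiometer reading (0..1023) to a pair of
// complementary gains in [0, 1]: song A fades out as song B fades in.
// Hypothetical helper -- the current sketch only hard-switches at 512.
void crossfadeGains(int analogValue, float &gainA, float &gainB) {
    float t = analogValue / 1023.0f; // 0.0 at one end of the travel, 1.0 at the other
    gainB = t;
    gainA = 1.0f - t;
}
```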

 

Afra Binjerais – Week 11 concept for final

Mood lamp

My idea for the final project is the Interactive Mood Lamp, a user-responsive system that combines hardware interaction through Arduino and software interfacing via P5.js to create an emotionally adaptive environment. This system utilizes a webcam to capture the user’s facial expressions, which are then processed through P5.js to analyze and interpret emotions. Following this analysis, the detected emotions are communicated to an Arduino-controlled RGB LED lamp, which changes its colors accordingly. The primary goal of this project is to provide visual feedback based on the user’s emotional state.

Sensing: The project begins with the webcam capturing live video of the user’s face. This video feed is processed by a P5.js application running on a connected computer, which utilizes a facial recognition algorithm, potentially enhanced by machine learning libraries.

Processing: Upon identifying an emotion, the P5.js application assigns a specific color to represent each mood: blue might indicate sadness, green happiness, red anger, and yellow surprise. This color mapping will be based on common psychological associations between colors and emotions.

Responding: After processing the emotional data, the P5.js application sends the corresponding color command to the Arduino via a serial communication link. The Arduino then adjusts the RGB LED’s color to match the detected emotional state.
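The serial link can stay very simple: p5.js could write one line per detected emotion, e.g. "R,G,B\n", and the Arduino would split it into three channel values for the RGB LED. A host-testable sketch of the parsing step, assuming that comma-separated format (the exact protocol is still to be decided):

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Parse one "R,G,B" line (as p5.js might send over serial) into three
// 0-255 channel values. Returns false on malformed input. The
// comma-separated format is an assumption; the final protocol may differ.
bool parseColor(const std::string &line, int &r, int &g, int &b) {
    std::istringstream in(line);
    char c1, c2;
    if (!(in >> r >> c1 >> g >> c2 >> b)) return false;
    if (c1 != ',' || c2 != ',') return false;
    return r >= 0 && r <= 255 && g >= 0 && g <= 255 && b >= 0 && b <= 255;
}
```

On the Arduino side the same split could be done with Serial.readStringUntil('\n') and the parsed values fed to analogWrite() on the three LED pins.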

Week 11 : Pi’s Moving Castle – A not-so-preliminary concept for final project

“Pi, you are cursed as the Artistic Engineer. Artists and engineers alone don’t share this burden.

But you, your unconscious artist within dreams the idealistic, fictional, perfect beauty, and the conscious engineer within, uses any methods possible, uncompromisingly, to bring that perfect fiction into reality.

Yet, when reality falls short of your dreams, a piece of your soul dies each time.”

~ Pi’s father, after Pi survived a panic attack

Yes, this is my greatest curse. I dream perfection, and kill my soul little by little. I have put myself into life-threatening situations… People warn me: Pi, Pi, life is not always the wholesome Ghibli films.

I mean… Duh 🤷‍♂️ ! I have this poster above my bed.

I religiously worship Hayao Miyazaki, and have always been inspired by the Dutch kinetic sculptor Theo Jansen. Jansen is one of the few people who managed to bring my Ghibli dreams into reality: he developed a planar leg mechanism that converts rotary motion into a lifelike walking gait.

[Jansen’s Linkage Simulation – Source Wiki ]

And Pi’s heart was hurt very, very deeply, with great pain. So we have –

  • Ghibli Movies
  • Walking Mechanism and
  • Pi’s heart pain

And smashing them together gives…. wait for it –

Hence, ladies and gents (and everyone in between), I present …

 

Backstory

In a realm of both steam and green, Pi the artistic engineer lived in a castle on legs of iron and steam. His home, grand and tall, walked on land with ease. Pi, in his chamber of echoes, wove art, spun gears, and melodies spilled like whispers of a stream.

The castle’s pulse, Pi’s own heart encased in a labyrinth of cogs and sparks, pushed it to dance across the valleys and peaks. It moved with the rhythm of his boundless dreams.

One sad day, the heart stopped. The castle stood still, and so did Pi.

Now, you stand before this slumbering giant. To wake it, to heal Pi, you must delve into the heart’s core, where steam circuits lie in wait of knowing hands. Each piece you fit, each puzzle you solve, is a beat returned, a hope rekindled, and will wake up the castle and Pi again.

So the idea is super simple. You play a p5js puzzle game on the computer, which is to repair Pi’s mechanical heart. And we will have an actual physical steampunk robot which is not moving. But as the user solves the puzzle through stages, the castle will wake up, and finally be able to walk again. (Once it walks, the user can program the path of the castle 😉 … i.e. like Scratch)

Pi’s moving castle looks like this.

Q: Alright Pi, stop your arty-farty nonsense and tell us how it’s gonna work.

A: Here’s a not-so-complicated schematic. The diagram should be self-explanatory.

Pi :  So, for the base frame, I began by 3D printing the model I stole (I mean… **cough** **cough** legally obtained) from https://cults3d.com/en/3d-model/gadget/strandbeest-arduino-robot-stl-files

Q: Good. How does it fulfill the… umm… err… the good ol’ listening, thinking, and speaking from both parties?

A: Well, here. The feedback loop is closed.

Q: Cool cool, do you think you can implement it on time and get it working before you graduate?

A: Well, this is a preliminary concept, so I am only talking about the non-tangible ideas and planning here. But I will let you sneak a peek at the actual tangible stuff below.

(3D Printed Castle Parts)

(Black Paint to make it… steam punky)

(Steam Punky 3D Printed Castle Parts)

And below is one of the legs and the linkage working.

Q: Great. What if I do not approve this Final Project?

A:  Pi will cry.

 

 

Week 11: (Pi / Darko) Overengineered Guitar – An Arduino Based Musical Instrument

Introduction

This is the “Overengineered Guitar” by Pi Ko and Darko Skulikj.

We massacred a single-stringed acoustic guitar into an automated musical device that can be played akin to a theremin.

Concept – A Cringy Skit

Darko does not know how to play the Pirates of the Caribbean theme song on guitar, so he decided to turn to Arduino for help.

Demonstration and Media

The video demo of the instrument is shown below.

The sound sucks 😡. Moral of the story : Pi should play the guitar instead of the machine 🫵😠🎸.

Concept

In principle, the “Overengineered Guitar” is a one-stringed acoustic guitar played through an array of sensors and servo motors. It has a digital push-button sensor and an analog ultrasonic sensor. The main idea is to control the musical notes by holding a hand over the ultrasonic sensor, while a propeller does the mechanical plucking, controlled through the push button.

This provides the user the opportunity to play the predefined sequence from “He’s a Pirate” from Pirates of the Caribbean over and over again.

Components

  • Arduino Uno: Serves as the main controller for all the sensors and actuators used.
  • Servo Motors (5x): Five servo motors are attached along the fretboard, each pressing its respective fret to sound the desired note.
  • Ultrasonic Sensor: Toggles whether the motors press the fretboard, based on hand distance.
  • Digital Pushbutton: Pressed to start the propeller, which plucks the string.
  • Propeller on a DC Motor: Provides the mechanical pluck on the guitar string.
  • L293D Motor Driver IC: Handles the high current requirement of the propeller.
  • External Power Supply: Ensures that power is properly distributed among the components without overloading the Arduino.
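The ultrasonic sensor's reading boils down to one conversion, which our sketch uses to gate the servos: pulseIn() returns the round-trip echo time in microseconds, sound travels roughly 0.034 cm/µs, and the result is halved because the pulse travels out and back. A host-runnable version of that conversion:

```cpp
#include <cassert>

// Convert an HC-SR04 echo time (microseconds, round trip) to a
// distance in centimetres: ~0.034 cm/us speed of sound, halved
// because the pulse travels to the hand and back.
int echoToCm(long durationUs) {
    return static_cast<int>(durationUs * 0.034 / 2);
}
```

In the sketch, the servos only advance through the note sequence while this distance is 12 cm or less, i.e. while a hand hovers over the sensor.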

(Arduino, ultrasonic sensor, switch and L293D IC)

(Propellor on DC Motor)

The motors are attached to the Arduino as below.

Arduino Pin | Motor/Music Note ID | Open Servo Angle | Press Servo Angle
11          | 5                   | 180              | 0
10          | 4                   | 0                | 180
6           | 3                   | 180              | 0
5           | 2                   | 180              | 0
3           | 1                   | 180              | 0
N/A         | 0 (open string)     | N/A (no servo)   | N/A (no servo)

Challenges

Our main challenge involved managing power to ensure that each component received the required current. The wiring was also a challenge, since there were a lot of wires.

(Wire management 😎)

Task Allocation

Darko took care of the wiring and the code for the ultrasonic sensor and the switch, using non-blocking code. The rest was done by Pi.

Code

The code is below. It has debouncing to guarantee reliable operation of the switch.

#include <Servo.h>

// Define a struct to hold servo data
struct ServoData {
  int pin;
  int openAngle;
  int pressAngle;
  Servo servo;
};

// Create an array of servos for each note
ServoData servos[] = {
  {3, 180, 0}, // Note 1
  {5, 180, 0}, // Note 2
  {6, 180, 0}, // Note 3
  {10, 0, 180}, // Note 4
  {11, 180, 0} // Note 5
};
const int numServos = sizeof(servos) / sizeof(ServoData);

// Note durations in milliseconds
int noteDurations[] = {500, 500, 2000, 500, 2000, 500, 1000, 500, 500, 1000};
int noteSequence[] = {0, 1, 2, 3, 4, 5, 3, 2, 1, 2};
const int numNotes = sizeof(noteSequence) / sizeof(int);

unsigned long previousMillis = 0;  // Stores last update time
int currentNoteIndex = 0;          // Index of the current note being played

// Push Button and Propeller control
const int buttonPin = 4; // Pushbutton pin
const int ledPin = 2;    // Debug LED pin (moved off pin 13, which the ultrasonic trigPin below uses)
int enA = 9; // Enable (speed) pin for the motor driver
int in1 = 8; // Motor control pin
int buttonState = 0; // Current button state
// Define the pins for the ultrasonic sensor
const int trigPin = 13;
const int echoPin = 12;

// Define variables for the duration and the distance
long duration;
int distance;


void setup() {
  // Setup for servos
  for (int i = 0; i < numServos; i++) {
    servos[i].servo.attach(servos[i].pin);
    servos[i].servo.write(servos[i].openAngle);
  }

 // Define pin modes for ultrasonic
  pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
  pinMode(echoPin, INPUT); // Sets the echoPin as an Input
  // Setup for button and propeller
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT);
  pinMode(enA, OUTPUT);
  pinMode(in1, OUTPUT);
  analogWrite(enA, 255); // Set propeller speed
  digitalWrite(in1, LOW); // Initially disable propeller
}

void loop() {
  unsigned long currentMillis = millis();
  // Darko - Switch
  // Improved button reading with debouncing
  int readButton = digitalRead(buttonPin);
  if (readButton != buttonState) {
    delay(50); // Debounce delay
    readButton = digitalRead(buttonPin);
    if (readButton == HIGH) {
      digitalWrite(ledPin, HIGH);
      digitalWrite(in1, HIGH); // Enable propeller
    } else {
      digitalWrite(ledPin, LOW);
      digitalWrite(in1, LOW); // Disable propeller
    }
    buttonState = readButton;
  }

  // Darko - Ultrasonic
  // Clear the trigPin condition
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  // Sets the trigPin HIGH (ACTIVE) for 10 microseconds
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  // Reads the echoPin, returns the sound wave travel time in microseconds
  duration = pulseIn(echoPin, HIGH);
  
  // Calculating the distance
  distance = duration * 0.034 / 2; // Sound travels ~0.034 cm/us; halve for the round trip
  

  if (distance <= 12) {
    // Handling servo movements based on timing
    if (currentMillis - previousMillis >= noteDurations[currentNoteIndex]) {
      // Move to the next note
      if (noteSequence[currentNoteIndex] != 0) {
        // Release the previous servo, if any
        int prevNote = (currentNoteIndex == 0) ? -1 : noteSequence[currentNoteIndex - 1];
        if (prevNote != -1 && prevNote != 0) {
          servos[prevNote - 1].servo.write(servos[prevNote - 1].openAngle);
        }
        // Press the current servo
        int currentNote = noteSequence[currentNoteIndex];
        if (currentNote != 0) {
          servos[currentNote - 1].servo.write(servos[currentNote - 1].pressAngle);
        }
      } else {
        // Release all servos for the open string
        for (int i = 0; i < numServos; i++) {
          servos[i].servo.write(servos[i].openAngle);
        }
      }

      previousMillis = currentMillis; // Update the last actuated time
      currentNoteIndex++;
      if (currentNoteIndex >= numNotes) {
        currentNoteIndex = 0; // Restart the sequence
      }
    }
  }
}

 

 

Week 11: A Brief Rant on the Future of Interaction Design & Follow-Up Article

In my Spring semester of last year, I took a class called “Touch” that opened my eyes not only to the primacy of touch, but also to our consistent ignorance of the primacy of touch. Our tactile sensory system is the first to develop in the womb and is arguably the last to deteriorate as we get older. It is also the only reciprocal sense – we can only touch something if it touches us – providing us with a truly connective interface to interact with the diverse environments around us. The sense of touch is also extremely vital for normal cognitive and social development. Yet, prototypes of future technological interfaces reduce the full range of interaction that touch affords us or eliminate it altogether. The future vision for AR/XR technologies seems to be doing away with the tactile. “Pictures Under Glass” technologies, to use the moniker Victor aptly coins in this week’s readings, already offer us only the limited tactile feedback that our fingertips enjoy. We now see the future of AR/XR interfaces, such as headsets utilizing eye-movement tracking systems, heading toward a direction where the visual is prioritized. Our rich haptic system is left to atrophy in the process.

It is indeed worrying that we might reach a time when kids struggle to use a pen or open lids. In attempting to connect to the world via modern-day interfaces, it seems like we are disconnecting from it in important ways. There must have been interfaces that we could have envisioned that integrate all of our varied sensory systems, had we invested the time to make that a priority. Victor’s rant and his follow-up article were both written in 2011. 13 years later, his call to invest in research that centers the full range of human capabilities in the design of technological interfaces seems to have gone unanswered. We have yet to make our capabilities a priority in design, but that could change if we collectively dare to choose a different way of envisioning tomorrow’s technologies.

 

 

Afra binjerais – Week 11 reading response

Reading 1:

I found great resonance in this reading's assessment of the shortcomings of modern technology's interfaces, particularly the widespread “Pictures Under Glass” approach. It caused me to reevaluate how I use technology and how these everyday interactions lack tactile depth. I became dissatisfied with the devices that rule my life after reading the author’s claim that our future interfaces will not be visionary; instead, they will only perpetuate the problems of the current system.
The criticism forced me to consider how my devices, which are primarily touchscreens, feel impersonal and distant. My laptop, phone, and even the smart devices at home operate with simple, repetitive motions that lack the complex feedback loop that comes with more tactile interactions. This insight made me uncomfortable, since it brought to light a sensory deprivation I was experiencing when interacting with the digital world. Compared to handling tangible objects like books or tools, I’m now more conscious of the shallowness of the experience and the flatness of the screens I swipe across on a regular basis.

Reading 2:

In reflecting on “A Brief Rant on the Future of Interaction Design,” a key question emerges: How can designers and technologists create interactive technologies that truly enhance human capabilities and foster a deeper connection with our physical environment, rather than simply replicating or replacing traditional physical interactions?

This reading essentially aligns with an argument for a more deliberate approach to technology, one that honors and strengthens rather than ignores our cognitive and physical abilities. It serves as a reminder that, whether we are using, designing, or developing technology, our relationship with it should not be passive. Rather, we should support and push for advancements that genuinely increase human potential in terms of interacting with the digital world. This mode of thinking encourages us to imagine a future where technology supports and enhances human talents in more comprehensive and meaningful ways, which is in line with a larger philosophical dilemma concerning the role technology should play in our lives.

Week 10 Reading Response

I enjoyed Tigoe’s piece about creating one’s work and then giving it the freedom to be experienced by the public. In the world of interactive art, instead of laying out your art with a manual, it’s about setting the stage and letting the audience be the audience. It’s like directing a play where the users aren’t told what to think or feel, but are provided with cues and props to navigate their own journey. I would 100% prefer the chance to figure out a task myself than to read a manual or hear an explanation. It’s like teaching a friend a new card game and saying, “You’ll catch on as we play.” Of course, there will be trial and error and some confusion, but once you’re there, there’s an “ohhh” moment, which is my favourite to experience and witness. This reading made me redefine interactivity in my head: figuring out someone’s piece is so stimulating that it could in itself be the peak of the experience.

In the second reading, I liked the work on the gloves. Just as the author voices the importance of listening to the audience’s reactions and responses to the artwork, the discussion about sensor integration in the gloves underscores the importance of understanding and responding to the gestures and movements of the user. This parallels the dynamic exchange described in the first reading, where meaning unfolds through the engagement between creator and audience.
Overall, both readings focus on the importance of creating interactive experiences that allow users to engage with and contribute to an ongoing painting rather than a final product, whether by exploring an interactive artwork or experimenting with musical expression using gloves.

Pi : Week 11 Reading – The Feel Factor: Escaping the Flatland of Modern Interfaces

Whenever I do the Interactive Media reading assignments, I almost always disagree or criticize the author, or make fun of their claims. But this week will be different. Bret Victor is one of the only two people I idolized when I was a child and one day hope to be like. Whatever those two men (the childhood heroes) say, I will always agree with them 😉 .

I work in the field of haptics – the science of touch. My supervisor once explained that when a human uses a particular tool for long enough, he becomes thoroughly accustomed to it. Think, for example, of carpenters, painters, guitarists, and so on.

This continues to the extent that the tool becomes a part of his body, and his means to sense the world.

But in order for a tool to become an extension of his body, the tool must give the user “good tangible feedback.” And that kind of feedback is something digital devices truly lack. Try using an actual ruler to measure the length of something… now try measuring the same distance using the measurement app on the iPad, which is technically intricate but very cumbersome to use. It just does not feel like a ruler.

For one of my personal projects, I draw circuit diagrams with Chinese traditional ink on rice paper—to imagine an alternate universe, where electricity and the industrial revolution have started from the Orient.

In this project, I tried and evaluated two methods: the iPad method and the literal ink method. Somebody whose eyes are not trained will tell you that the results are nearly the same. Thus, in reality, both methods work. However, for my artistic satisfaction, the iPad approach was a disaster. Though I was using a state-of-the-art Chinese brush simulator, I could not feel the resistance of the paper, the flow of the ink, the scratch of the brush against the paper, and so on, ad infinitum.

The iPad approach was simply soulless. Thus, Bret’s vision, or better said, his criticism of the lack of vision in current design trends, echoed deeply with my experiences. “Our hands feel things and our hands manipulate things. Why shoot for anything less than a dynamic medium that we can see, feel, and manipulate?” This should be written on the wall of every tech company rushing to birth the next iteration of soulless gadgets into existence.

No amount of waving a stylus around is going to recreate the sensory feedback of ink travelling through my brush down onto the paper.

In short, the iPad stylus will never become an extension to my body, like the real paintbrush does.

“Hands feel things, and hands manipulate things,” Bret points out; I couldn’t help but smile at how spectacularly, and in over-the-top fashion, most modern designs reject this.

I really love Bret’s comparison of the whole “Pictures Under Glass” idea to how utterly ridiculous it would have been if someone had once said, “black-and-white is the future of photography.”

In conclusion, Bret’s rant isn’t just a criticism; it’s a roadmap for future innovators.

Bret did not give us a solution, but a question to ponder. This question alone should challenge us to look beyond the conventional in thinking about how we make our interactions with technology as instinctive as moving our limbs, as natural as using our hands.

Week 10 Response – Tom Igoe

Tom Igoe’s articles on physical computing and interactive art describe the evolving relationship between technology and user engagement. In Physical Computing’s Greatest Hits (and Misses), Igoe explores the relationship between creativity and technology, highlighting how simple designs can provoke complex interactions and reflections. In Making Interactive Art: Set the Stage, Then Shut Up and Listen, he advocates a minimalist approach to guiding the audience’s experience, using subtlety in interactive design.

Igoe’s philosophy resonates deeply with me; it celebrates the beauty of discovery within the constraints of design and technology, reminding creators to trust their audience’s intuitive interactions with their work.