Week 11 – Bret Victor’s Rant

In Bret Victor’s rant, A Brief Rant On The Future Of Interaction Design, he points out an element that is commonly forgotten when designing future interfaces: hands! I have never read anything that focuses this passionately on something as mundane as hands, but he is right!

Victor categorizes the function of hands into feeling and manipulation, something I was never fully aware of until the rant pointed it out. Our hands have practiced ways of holding many different items, choreographed so that each grip is both efficient and functional.

So it makes sense for the future of interaction design to focus on our hands and their function, but that does not rule out other sensory channels such as eye tracking or voice user interfaces. Our hands have been our main facilitators of interaction, but that does not mean they will continue to be so.

Final Project Proposal – Saeed Lootah

Russian Roulette

Introduction

Going into the final project I had a few ideas, some of which I had before the midterm. One of them was Russian Roulette, except you get an electric shock instead of getting shot. I felt this was a much safer way of playing Russian Roulette whilst also giving each player the chance to play again if they want to. I chose the electric shock because I felt the game still needed some kind of stake; otherwise it’s not nearly as exciting.

Outline

The plan is a two-player game where the players sit across from each other at a table. To the side of the table is a screen which displays the number of lives each player has, as well as animations that play when the round starts, when the “gun” is fired, and when the shot is a blank. The screen will also show whose turn it is, let the players view a tutorial, and let them start another round. There won’t be a physical gun as such; the players will use buttons to choose who gets shocked, themselves or the other person, and sometimes the chamber will be a blank. I may represent the bullets as resistors in the drawings just to add to the effect of Russian roulette. As for the shock itself: there will be a device on which each player places their hand, and when a player loses the round the device will deliver only a small electric shock. I will not make this device myself (I do not trust myself, nor am I willing to test it), so I will keep searching, as I already have been, for a small off-the-shelf device that can deliver a small, harmless electric shock.
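
A rough sketch of just the trigger logic, assuming two pushbuttons for choosing the target and a relay-style output driving the off-the-shelf shock device (all pin numbers are placeholders, not final wiring):

// Rough sketch of the trigger logic only; pins and the shock output are placeholders
const int shootSelfPin  = 2;   // button: shock yourself
const int shootOtherPin = 3;   // button: shock the other player
const int shockOutPin   = 8;   // drives the off-the-shelf shock device (hypothetical)

int lives[2] = {3, 3};         // lives for player 0 and player 1
int currentPlayer = 0;         // whose turn it is

void setup() {
  pinMode(shootSelfPin, INPUT_PULLUP);
  pinMode(shootOtherPin, INPUT_PULLUP);
  pinMode(shockOutPin, OUTPUT);
  randomSeed(analogRead(A5));  // seed from a floating analog pin
  Serial.begin(9600);          // lives/turn info would go to the screen sketch
}

void loop() {
  int target = -1;
  if (digitalRead(shootSelfPin) == LOW)  target = currentPlayer;
  if (digitalRead(shootOtherPin) == LOW) target = 1 - currentPlayer;

  if (target != -1) {
    if (random(6) == 0) {              // roughly 1-in-6 chance the chamber is "live"
      digitalWrite(shockOutPin, HIGH); // pulse the shock device briefly
      delay(300);
      digitalWrite(shockOutPin, LOW);
      lives[target]--;
    }
    Serial.print(lives[0]); Serial.print(",");
    Serial.println(lives[1]);          // the screen side reads this to update lives
    currentPlayer = 1 - currentPlayer; // pass the turn
    delay(500);                        // crude debounce between turns
  }
}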

Side note: The game will be properly labeled with safety signs to indicate that only people above the age of 18 can play and that there are electric shocks involved.

Week #11 Assignment – Orquid by Jana & Rashed

Concept:

For this assignment, we wanted to go with something simple. Rashed is known for his love of music, so I was excited to see what we would come up with. During our brainstorming session, Rashed was working on his hobby, music production, in GarageBand on his phone. He noticed how he had to tap the “arrow” button to change the pitch of the piano, and we decided we wanted to do something like that, but with a potentiometer.

The pitch-changing arrow I was referring to on GarageBand:

Materials used:

  • x11 Wires
  • x4 Buttons
  • x4 10k Resistors
  • x1 Buzzer
  • x1 Potentiometer

Production:

Writing this code was pretty challenging for us as both of us are still adapting to this new language.

We used the toneMelody example from the examples folder we used in class and took its “pitches.h” tab to get a specific range of notes. We collectively decided that the buttons would, by default, play the most basic notes, “Do, Re, Mi, Fa”, also known as “NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4”, which was pretty simple to implement.

However, we faced one major issue:

For some reason, the potentiometer would not cooperate at all. The buttons were working fine, but the pitch wouldn’t budge. We unfortunately had to ask ChatGPT for help, but it did help us understand what we were doing wrong.

Here’s the code:

#include "pitches.h"

const int speakerPin = 9;  // Speaker connected to pin 9
int buttonPins[] = {2, 3, 4, 5};  // Button pins for notes
int potPin = A0;  // Potentiometer connected to A0

// Initialize an array of notes that will be used depending on the button press
int noteIndices[] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4};  // Starting notes for each button

void setup() {
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);  // Setup button pins as input with pull-up resistor
  }
  pinMode(speakerPin, OUTPUT);  // Set speaker pin as output
}

void loop() {
  int potValue = analogRead(potPin);  // Read the current value from the potentiometer
  int noteRange = NOTE_C6 - NOTE_C4;  // Define the range of notes to span
  int noteOffset = map(potValue, 0, 1023, 0, noteRange);  // Map potentiometer value to note range
  
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == LOW) {  
      int noteToPlay = noteIndices[i] + noteOffset;  // NOTE_ values are frequencies in Hz, so this shifts the pitch upward by the pot offset
      tone(speakerPin, noteToPlay);  // Play the note on the speaker
      delay(200);  // A short delay to help debounce the button
      while (digitalRead(buttonPins[i]) == LOW);  // Wait for button release
      noTone(speakerPin);  // Stop the note
    }
  }
}


Here’s the video of us using it:


Reflection:

I think, despite ChatGPT’s involvement in this, we did pretty well; the concept was there, the energy and effort we put into this are there, and we’re both proud of our little creation. However, we both wish we had taken it a step further, maybe adding more buttons or trying a sensor instead of the potentiometer. We really hope to get used to and adapt to this language, and I think that will come with more practice.

Final Project Concept – Khalifa Alshamsi

Concept: To create a wearable gyroscope controller connected to a spaceship game, which for the final project would gain power-ups and better graphics.

Objectives:
  • Design a Gyroscope Controller: Create a hardware prototype of a gyroscope-based controller that can detect and interpret the user’s hand motions as input commands.
  • Develop Interface Software: Write software that interprets the gyroscope data and translates it into commands the game can use.
  • Integrate with a Space Simulation Game: Modify my existing Midterm game or develop a simple new game designed to work specifically with this controller.
  • User Testing and Feedback: Conduct testing sessions to gather feedback and improve the interface and game interaction based on user responses.

Methodology
Hardware Development: Use a 3-axis gyroscope sensor to capture tilt and rotation. Design the controller’s physical form factor to be strapped around the hands. Integrate Bluetooth or USB for connectivity with the P5js game.

Software Development: Develop software to read data from the gyroscope sensor and convert it into P5js-compatible input commands. Ensure the software supports real-time interaction with minimal latency. (A minimal reading sketch follows below.)
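
Since the exact IMU has not been chosen yet, here is a minimal sketch of the serial side, assuming an MPU-6050-style I2C gyroscope/accelerometer; the register addresses and the tilt mapping are placeholder assumptions to illustrate the idea, not the final implementation:

#include <Wire.h>

const int MPU_ADDR = 0x68;   // default I2C address of an MPU-6050 (assumed part)

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);          // PWR_MGMT_1 register
  Wire.write(0);             // wake the sensor up
  Wire.endTransmission(true);
  Serial.begin(115200);      // the P5js sketch reads these lines over serial
}

void loop() {
  // Request the six accelerometer bytes starting at register 0x3B
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6);

  int16_t ax = Wire.read() << 8 | Wire.read();
  int16_t ay = Wire.read() << 8 | Wire.read();
  Wire.read(); Wire.read();  // discard the Z-axis bytes in this sketch

  // Map raw readings to simple -100..100 "tilt" commands for the game
  int tiltX = map(ax, -17000, 17000, -100, 100);
  int tiltY = map(ay, -17000, 17000, -100, 100);

  Serial.print(tiltX);
  Serial.print(",");
  Serial.println(tiltY);     // e.g. "42,-13" -> the P5js side splits on the comma
  delay(20);                 // ~50 updates per second keeps latency low
}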

Game Integration: Adapt a spaceship simulation game to respond to the new controller inputs. Implement basic gameplay features such as navigation, obstacle avoidance, and speed control using gyroscope inputs.

Testing and Iteration: Test the controller with users of varying gaming experience. Collect qualitative and quantitative feedback on usability, responsiveness, and enjoyment. Refine the hardware and software iteratively based on this feedback.

Week 11: Reading Responses + Final Proposal – Shereena AlNuaimi

Reading Response #1:

The article “A Brief Rant on the Future of Interaction Design” offers a convincing critique of the dominant touchscreen-centric methodology in interaction design, sometimes known as “Pictures Under Glass.” Drawing on his knowledge of futuristic interface design, the author highlights the shortcomings of this paradigm, which underutilizes the tactile and manipulative capacities of human hands. By arguing in favor of interfaces that enhance rather than limit human capabilities, the author calls for a reassessment of design goals and emphasizes the significance of interfaces that physically connect with users.

The argument emphasizes how important physical and tactile contact is for human-computer interfaces and how modern designs frequently ignore these important details. Rather, the author envisions a day when interfaces make use of the whole range of human skills to engage users in a more meaningful and multifaceted way. This forward-thinking perspective questions the status quo and exhorts technology and design stakeholders to give top priority to developing interfaces that are deeply in harmony with human experience, not just functional, letting hands and technology work together harmoniously and enabling us to reach our greatest potential.

Reading Response #2:

The author addresses criticisms and questions arising from their previous paper on interface design in this follow-up. The author makes it clear that their tirade was intended to draw attention to current issues rather than offer quick fixes, highlighting the necessity of continued investigation and learning in the field of interface design. The author makes the case for future innovation that goes beyond present technical constraints, supporting interfaces that more closely match human capabilities—all the while appreciating the usefulness of gadgets like iPhones and iPads.

In response to criticisms, the author dismisses styluses and physical keyboards as static instruments, imagining future technologies that will be able to express and manipulate almost anything tangibly. Moreover, while acknowledging the usefulness of voice interfaces, he supports voice interaction that encourages creativity and subtle involvement beyond basic commands. The author also addresses criticisms about brain and gestural interfaces, advising against designs that circumvent the body or interfere with proprioception. In the end, he rejects the idea that touchscreens are enough and advocates for interfaces that fully take into account the possibilities and difficulties of human interaction.

Final Proposal:

For my final project, I plan on incorporating something that’s been a huge part of my life: archery. I intend to create a game using P5JS and Arduino. With P5JS I plan to display an archery target as the background and make the arrow move left and right. With the Arduino, I plan to use pressure sensors and a button, so that when the button is pressed the arrow stops and releases onto the target (a rough sketch of the Arduino side is below). I also plan on making it all a bit pixelated to make it look more like a game.
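
A minimal sketch of the Arduino side, assuming a force-sensitive resistor on A0 and a pushbutton on pin 2 (both placeholder choices), sending readings to the P5JS sketch over serial:

const int fsrPin = A0;     // pressure sensor (placeholder pin)
const int buttonPin = 2;   // release button (placeholder pin)

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  Serial.begin(9600);      // P5JS reads "pressure,released" lines from this port
}

void loop() {
  int pressure = analogRead(fsrPin);               // 0..1023, could scale shot power
  int released = (digitalRead(buttonPin) == LOW);  // 1 while the button is held

  Serial.print(pressure);
  Serial.print(",");
  Serial.println(released);
  delay(30);               // ~33 updates per second is plenty for the game
}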

Week 11 Music – Saeed, Khalifa, Jihad

Concept:

The concept of this project was to create a mini drum pad, or what is equivalent to one, with the hardware we have available. The device would use buttons to trigger different buzzer sounds, mimicking the functionality of a traditional drum pad. Each button on the device would correspond to a different sound, with the frequency of these sounds adjustable via a potentiometer. This allows the user to modify the pitch of the tones.

Code:

// Defining pins assignments for buttons and buzzers
const int buttonPin1 = 2;
const int buttonPin2 = 3;
const int buttonPin3 = 4;
// Coded with the Aid of ChatGPT
const int buttonPin4 = 5; // Playback button
// Coded with the Aid of ChatGPT
const int buzzerPin1 = 8;
const int buzzerPin2 = 9;
const int buzzerPin3 = 10;
const int potPin = A0; // Potentiometer connected to A0 for frequency control

// Variables to manage button states and debounce timing
int buttonState1 = 0;
int lastButtonState1 = 0;
int buttonState2 = 0;
int lastButtonState2 = 0;
int buttonState3 = 0;
int lastButtonState3 = 0;
int buttonState4 = 0;
int lastButtonState4 = 0;

unsigned long lastDebounceTime1 = 0;
unsigned long lastDebounceTime2 = 0;
unsigned long lastDebounceTime3 = 0;
unsigned long lastDebounceTime4 = 0;
unsigned long debounceDelay = 50; // Debounce delay in milliseconds

// Struct to hold buzzer activation data including the pin and frequency
struct BuzzerAction {
  int buzzerPin;
  int frequency;
};

// Coded with the Aid of ChatGPT
BuzzerAction record[100]; // Array to store each buzzer activation
int recordIndex = 0; // Index for recording array
//Coded with the Aid of ChatGPT

void setup() {
  // Initialize all button and buzzer pins
  pinMode(buttonPin1, INPUT);
  pinMode(buttonPin2, INPUT);
  pinMode(buttonPin3, INPUT);
// Coded with the Aid of ChatGPT
  pinMode(buttonPin4, INPUT);
// Coded with the Aid of ChatGPT
  pinMode(buzzerPin1, OUTPUT);
  pinMode(buzzerPin2, OUTPUT);
  pinMode(buzzerPin3, OUTPUT);
  pinMode(potPin, INPUT); // Sets up the potentiometer pin as input
}

void loop() {
  // Reads current state of buttons
  int reading1 = digitalRead(buttonPin1);
  int reading2 = digitalRead(buttonPin2);
  int reading3 = digitalRead(buttonPin3);
// Coded with the Aid of ChatGPT
  int reading4 = digitalRead(buttonPin4);
// Coded with the Aid of ChatGPT
  int potValue = analogRead(potPin); // Reads potentiometer value
  int frequency = map(potValue, 0, 1023, 200, 2000); // Maps potentiometer value to frequency range

  // Handle button 1 press and recording
  debounceAndRecord(reading1, &lastButtonState1, &buttonState1, &lastDebounceTime1, buzzerPin1, frequency);

  // Handle button 2 press and recording
  debounceAndRecord(reading2, &lastButtonState2, &buttonState2, &lastDebounceTime2, buzzerPin2, frequency);

  // Handle button 3 press and recording
  debounceAndRecord(reading3, &lastButtonState3, &buttonState3, &lastDebounceTime3, buzzerPin3, frequency);

  // Handles button 4 for playback
  if (reading4 != lastButtonState4) {
    lastDebounceTime4 = millis();
  }
  if ((millis() - lastDebounceTime4) > debounceDelay) {
    if (reading4 != buttonState4) {
      buttonState4 = reading4;
      if (buttonState4 == HIGH) {
        for (int i = 0; i < recordIndex; i++) {
          // Play each recorded buzzer action with the specific frequency recorded
          tone(record[i].buzzerPin, record[i].frequency, 200);
          delay(250); // Short delay between each buzzer action for clarity
        }
        recordIndex = 0; // Resets record index after playback
      }
    }
  }

  // Update last button states for next loop iteration
  lastButtonState1 = reading1;
  lastButtonState2 = reading2;
  lastButtonState3 = reading3;
  lastButtonState4 = reading4;
}
// Coded with the Aid of ChatGPT
// Function to handle button debouncing and recording buzzer actions
void debounceAndRecord(int reading, int *lastButtonState, int *buttonState, unsigned long *lastDebounceTime, int buzzerPin, int frequency) {
  if (reading != *lastButtonState) {
    *lastDebounceTime = millis(); // Reset debounce timer
  }
  if ((millis() - *lastDebounceTime) > debounceDelay) {
    if (reading != *buttonState) {
      *buttonState = reading; // Updates button state
      if (*buttonState == HIGH && recordIndex < sizeof(record) / sizeof(record[0])) {
        record[recordIndex++] = {buzzerPin, frequency}; // Records the buzzer activation
        tone(buzzerPin, frequency, 200); // Plays buzzer at recorded frequency
      }
    }
  }
  *lastButtonState = reading; // Updates last button state for debouncing
// Coded with the Aid of ChatGPT
}


Hardware Configuration: The system is designed with four button inputs and three buzzer outputs. Additionally, a potentiometer is used to control the frequency of the buzzer sounds.

Button Functionality: Buttons 1 to 3 are connected to buzzers and are responsible for triggering sounds with variable frequencies determined by the potentiometer. Button 4 is designated for playback. It plays back a sequence of sounds that have been recorded based on earlier interactions with buttons 1 to 3.

Frequency Control: The frequency of the sounds is dynamically adjusted using a potentiometer. The analog value from the potentiometer is mapped to a specified frequency range (200 Hz to 2000 Hz), which determines how the buzzers sound.

Debouncing: To ensure reliable button press detection without noise interference, the code implements debouncing logic. This involves measuring the time since the last button state change and updating the state only if this interval exceeds a predefined threshold (50 milliseconds).

Recording and Playback (Aided by ChatGPT)

Recording: When a button (1 to 3) is pressed, the action (which buzzer is activated and at what frequency) is recorded in an array. This includes storing both the pin of the buzzer and the frequency at which it was activated.

Playback: When button 4 is pressed, the system iterates over the recorded actions and plays them sequentially. Each action triggers the corresponding buzzer to sound at the recorded frequency for a short duration.

Loop and Functions: The main loop continuously checks the state of each button and the potentiometer, updating the frequency accordingly. A helper function, debounceAndRecord, handles the debouncing and recording logic for each of the three sound buttons.

Video of Project:

Reflection and ideas for future work or improvements:

Integrating a small display screen would significantly improve its functionality, further enhancing the project. This screen would provide real-time visual feedback on button presses and frequency outputs, allow users to scroll through and select different sounds or presets, and serve as a simple interface for directly programming the device.
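
As a sketch of that idea, a 16x2 character LCD driven by the standard LiquidCrystal library could show the last button pressed and the current frequency; the pin choices here are placeholders picked to avoid the button (2-5) and buzzer (8-10) pins already in use:

#include <LiquidCrystal.h>

// rs, en, d4, d5, d6, d7 -- placeholder pins avoiding those used in the main sketch
LiquidCrystal lcd(6, 7, 11, 12, 13, A1);

void setup() {
  lcd.begin(16, 2);          // 16x2 character display
  lcd.print("Mini drum pad");
}

// In the real sketch this would be called from debounceAndRecord
void showStatus(int buttonNumber, int frequency) {
  lcd.setCursor(0, 1);       // second row
  lcd.print("B");
  lcd.print(buttonNumber);
  lcd.print(" ");
  lcd.print(frequency);
  lcd.print(" Hz   ");       // trailing spaces clear leftover characters
}

void loop() {
  showStatus(1, 440);        // demo only: show button 1 at 440 Hz
  delay(1000);
}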

The potential for further development and refinement holds exciting prospects. The integration of a display screen and the addition of more customizable buttons are immediate steps that will enhance the device’s usability and versatility. Further innovations could include wireless connectivity for easy integration with other music production software or the addition of sensors to enable gesture-based controls, which would offer an even more dynamic and immersive user experience.

Several key insights stand out after reflecting on what this project has taught us. First, the practical challenges of hardware interfacing taught us the importance of robust design and a solid plan for creating it. There is also a need for effective wire management and physical housing to enhance device durability and aesthetics.

Looking Ahead:

Overall, this project resulted in a functional and entertaining product and served as a significant learning experience, underscoring the importance of patience, precision, and creativity in making it happen. These lessons will guide further improvements and innovations in our future projects.

Project Proposal – Stefania Petre

Hey there!

For this final, I have imagined stepping into a world where art and technology intertwine to create something truly special.

Envision an interactive space where art and technology fuse, transforming physical gestures into a cascade of colors on a digital canvas. My project is an invitation to express creativity through motion, as participants use a paintbrush to navigate a symphony of shapes and hues. This is not just an artwork but an experience that captures the essence of movement and translates it into a personalized digital masterpiece.

Arduino Part:

At the interaction’s core lies the Arduino Uno, paired with a ZX Distance and Gesture Sensor. The sensor is adept at tracking the proximity of the paintbrush and the subtle hand gestures of the artist.

  • Input: Proximity data and gesture commands from the ZX Sensor.
  • Output: Serial communication to relay the sensor data to the computer running the P5 sketch (see the sketch after this list).
  • Data to P5: Real-time Z-axis data for proximity (distance of the hand or brush from the sensor) and X-axis data for lateral movement, along with gesture detections (swipes, taps).
  • From P5: Instructions may be sent back to calibrate gesture sensitivity or toggle the sensor’s active state.
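
A minimal sketch of that Arduino-to-P5 serial link, assuming the SparkFun ZX Distance and Gesture Sensor Arduino library over I2C (the library calls and the 0x10 address are assumptions to be verified against the actual part):

#include <Wire.h>
#include <ZX_Sensor.h>   // SparkFun ZX Distance and Gesture Sensor library (assumed)

const int ZX_ADDR = 0x10;          // default I2C address of the ZX sensor
ZX_Sensor zx = ZX_Sensor(ZX_ADDR);

void setup() {
  Serial.begin(9600);              // the P5 sketch listens on this port
  zx.init();                       // start I2C and configure the sensor
}

void loop() {
  // Position data: X is lateral movement, Z is height above the sensor
  if (zx.positionAvailable()) {
    uint8_t x = zx.readX();
    uint8_t z = zx.readZ();
    Serial.print("P,");            // "P" marks a position message for P5
    Serial.print(x);
    Serial.print(",");
    Serial.println(z);
  }

  // Gesture data: swipes over the sensor
  if (zx.gestureAvailable()) {
    GestureType gesture = zx.readGesture();
    if (gesture == RIGHT_SWIPE)      Serial.println("G,right");
    else if (gesture == LEFT_SWIPE)  Serial.println("G,left");
    else if (gesture == UP_SWIPE)    Serial.println("G,up");
  }
}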

P5JS Part:

On the digital front, P5 will act as the canvas and palette, responsive and mutable. It will process the incoming data from the Arduino, converting it into an array of colors and movements on the screen.

  • Receiving Data: Interpretation of proximity and gesture data from the Arduino.
  • Processing Movements: Real-time mapping of hand movements to strokes and splashes of color, varying in intensity and spread on the digital canvas.
  • Visual Feedback: Dynamic visual changes to represent the flow and dance of the user’s movements.
  • To Arduino: Signals for adjusting the ZX Sensor settings based on real-time performance and user experience feedback.

So far, this is how the painting should look:


Development and User Testing:

The ZX Distance and Gesture Sensor is now integrated with the Arduino. The immediate objective is to create a smooth data flow to the P5 programme. By the time of user testing next week, the system should react to hand motions by displaying appropriate visual changes on the screen. Assessing user interaction, specifically how natural and fulfilling it is to paint in midair, will depend largely on this initial prototype. The documentation of user testing will involve multiple techniques, such as gathering feedback from participants, recording interactions on video, and analysing the responsiveness of the system.


“Overengineered Guitar” by Pi Ko and Darko Skulikj

Introduction

This is the “Overengineered Guitar” by Pi Ko and Darko Skulikj.

We massacred an acoustic guitar with a single string into an automated musical device that can be played akin to a theremin.

Concept – A Cringy Skit

Darko does not know how to play the Pirates of the Caribbean theme song on guitar, so he decided to turn to Arduino for help.

Demonstration and Media

The video demo of the instrument is shown below.

The sound sucks 😡. Moral of the story: Pi should play the guitar instead of the machine 🫵😠🎸.

Concept

In principle, the “Overengineered Guitar” is a single-string acoustic guitar played by an array of sensors and servo motors. It has a digital push-button sensor and an analog ultrasonic sensor. The main idea is to control the musical notes by holding a hand over the ultrasonic sensor, while a propeller does the mechanical plucking, controlled through the push button.

This provides the user the opportunity to play the predefined sequence from “He’s a Pirate” from Pirates of the Caribbean over and over again.

Components

  • Arduino Uno: Serves as the main controlling box for all the sensors and actuators used.
  • Servo Motors (5x): Five servo motors are attached along the fretboard, each pressing its respective fret according to the desired note.
  • Ultrasonic Sensor: Used to toggle the fretboard presses made by the servo motors.
  • Digital pushbutton: Pressed down to start the propeller, which plucks the string.
  • Propeller on a DC motor: Provides the mechanical pluck on the guitar string.
  • L293D motor driver IC: Takes care of the high current requirement of the propeller.
  • External Power Supply: Ensures that power is properly distributed among the various components without overloading the Arduino.

(Arduino, ultrasonic sensor, switch and L293D IC)

(Propellor on DC Motor)

The servo motors are attached to the Arduino as shown in the table below.

Arduino Pin | Motor/Music Note ID | Open Servo Angle | Press Servo Angle
11          | 5                   | 180              | 0
10          | 4                   | 0                | 180
6           | 3                   | 180              | 0
5           | 2                   | 180              | 0
3           | 1                   | 180              | 0
N/A         | 0 (open string)     | N/A (no servo)   | N/A (no servo)

Challenges

Our main challenge involved managing power to ensure that each component received the required current. The wiring was also a challenge, since there were a lot of wires.

(Wire management 😎)

Task Allocation

Darko took care of the wiring and the non-blocking code for the ultrasonic sensor and the switch. Pi did the rest.

Code

The code is below. It has debouncing to guarantee reliable operation of the switch.

#include <Servo.h>

// Define a struct to hold servo data
struct ServoData {
  int pin;
  int openAngle;
  int pressAngle;
  Servo servo;
};

// Create an array of servos for each note
ServoData servos[] = {
  {3, 180, 0}, // Note 1
  {5, 180, 0}, // Note 2
  {6, 180, 0}, // Note 3
  {10, 0, 180}, // Note 4
  {11, 180, 0} // Note 5
};
const int numServos = sizeof(servos) / sizeof(ServoData);

// Note durations in milliseconds
int noteDurations[] = {500, 500, 2000, 500, 2000, 500, 1000, 500, 500, 1000};
int noteSequence[] = {0, 1, 2, 3, 4, 5, 3, 2, 1, 2};
const int numNotes = sizeof(noteSequence) / sizeof(int);

unsigned long previousMillis = 0;  // Stores last update time
int currentNoteIndex = 0;          // Index of the current note being played

// Push Button and Propeller control
const int buttonPin = 4; // Pushbutton pin
const int ledPin = 7; // LED pin (for debugging); moved off pin 13, which the ultrasonic trigPin uses
int enA = 9; // Enable pin for motor
int in1 = 8; // Motor control pin
int buttonState = 0; // Current button state
// Define the pins for the ultrasonic sensor
const int trigPin = 13;
const int echoPin = 12;

// Define variables for the duration and the distance
long duration;
int distance;


void setup() {
  // Setup for servos
  for (int i = 0; i < numServos; i++) {
    servos[i].servo.attach(servos[i].pin);
    servos[i].servo.write(servos[i].openAngle);
  }

 // Define pin modes for ultrasonic
  pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
  pinMode(echoPin, INPUT); // Sets the echoPin as an Input
  // Setup for button and propeller
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT);
  pinMode(enA, OUTPUT);
  pinMode(in1, OUTPUT);
  analogWrite(enA, 255); // Set propeller speed
  digitalWrite(in1, LOW); // Initially disable propeller
}

void loop() {
  unsigned long currentMillis = millis();
  // Darko - Switch
  // Improved button reading with debouncing
  int readButton = digitalRead(buttonPin);
  if (readButton != buttonState) {
    delay(50); // Debounce delay
    readButton = digitalRead(buttonPin);
    if (readButton == HIGH) {
      digitalWrite(ledPin, HIGH);
      digitalWrite(in1, HIGH); // Enable propeller
    } else {
      digitalWrite(ledPin, LOW);
      digitalWrite(in1, LOW); // Disable propeller
    }
    buttonState = readButton;
  }

  // Darko - Ultrasonic
  // Clear the trigPin condition
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  // Sets the trigPin HIGH (ACTIVE) for 10 microseconds
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
 
  // Reads the echoPin, returns the sound wave travel time in microseconds
  duration = pulseIn(echoPin, HIGH);
 
  // Calculating the distance
  distance = duration * 0.034 / 2; // Speed of sound wave divided by 2 (go and back)
 

  if(distance<=12){

  // Handling servo movements based on timing
  if (currentMillis - previousMillis >= noteDurations[currentNoteIndex]) {
    // Move to the next note
    if (noteSequence[currentNoteIndex] != 0) {
      // Release the previous servo, if any
      int prevNote = (currentNoteIndex == 0) ? -1 : noteSequence[currentNoteIndex - 1];
      if (prevNote != -1 && prevNote != 0) {
        servos[prevNote - 1].servo.write(servos[prevNote - 1].openAngle);
      }
      // Press the current servo
      int currentNote = noteSequence[currentNoteIndex];
      if (currentNote != 0) {
        servos[currentNote - 1].servo.write(servos[currentNote - 1].pressAngle);
      }
    } else {
      // Release all servos for open string
      for (int i = 0; i < numServos; i++) {
        servos[i].servo.write(servos[i].openAngle);
      }
    }

    previousMillis = currentMillis; // Update the last actuated time
    currentNoteIndex++;
    if (currentNoteIndex >= numNotes) {
      currentNoteIndex = 0; // Restart the sequence
    }
  }
  }
}


Raya Tabassum: Reading Response 6

Bret Victor’s “A Brief Rant on the Future of Interaction Design” raises some provocative questions about the trajectory of technology design. Why are we limiting ourselves to interactions with technology through mere touches on glass, when the human body has such a wide range of expressive capabilities? Victor challenges us to consider: What if we could design interactions that fully utilize the intricate ballet of our everyday physical movements? How might we transform our interactions with technology if we thought beyond the glass to harness the full potential of our tactile and kinesthetic intelligence? This essay pushes us to rethink how future technologies might enhance rather than restrict our natural interactions with the world. He champions a more expansive view that utilizes the full range of human capabilities, suggesting that true innovation in interaction design should engage more than just our fingertips — it should involve our entire bodies.
In his follow-up he addresses feedback from his initial “rant.” He clarifies that his critique was meant to highlight the limitations of current interface designs, like “Pictures Under Glass,” and to inspire research into more dynamic, tactile interfaces. He defends the potential of iPads while advocating for more dynamic, tactile mediums for interaction. Victor emphasizes that voice, gesture, and brain interfaces still miss crucial tactile feedback, and points out the dangers of technology that ignores the body’s natural abilities. He calls for technology to adapt to human capabilities, rather than bypass them.

Raya Tabassum: Final Project Primary Concept

I’m thinking of building a “Gesture-Based Musical Interface” for the final, which will be an interactive system that allows users to create music through hand movements. Combining Arduino’s sensing capabilities with P5.js’s graphical and sound processing, this project will transform physical gestures into a live audio-visual performance. It aims to provide an intuitive way for users to interact with technology and art. I’ll use two ultrasonic sensors to detect hand movements along the X and Y axes, speakers to output the generated musical notes, and a webcam to visualize the movements in p5. A rough sketch of the sensor side is below.
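
As a starting point, here is a minimal sketch of the two-sensor side, assuming HC-SR04-style ultrasonic sensors on placeholder pins, sending both distances over serial for p5 to map to notes:

// Placeholder pin assignments for two HC-SR04-style ultrasonic sensors
const int trigX = 2, echoX = 3;   // sensor tracking the hand's X position
const int trigY = 4, echoY = 5;   // sensor tracking the hand's Y position

long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);    // 10 us pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000);  // time out if nothing is in range
  return duration * 0.034 / 2;    // convert echo time to centimetres
}

void setup() {
  pinMode(trigX, OUTPUT); pinMode(echoX, INPUT);
  pinMode(trigY, OUTPUT); pinMode(echoY, INPUT);
  Serial.begin(9600);             // p5 maps these values to pitch and volume
}

void loop() {
  long x = readDistanceCm(trigX, echoX);
  long y = readDistanceCm(trigY, echoY);
  Serial.print(x);
  Serial.print(",");
  Serial.println(y);              // e.g. "23,41" -> p5 splits on the comma
  delay(50);
}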