A brief rant on the future of interaction design…

Victor expands on this idea by arguing that the future of interaction design should move beyond just screens and graphical user interfaces. He contends that our current reliance on touchscreens and GUIs is limiting, and that we need to explore more natural and intuitive ways for humans to interact with technology.

Victor argues that interaction design should leverage our innate abilities to manipulate physical objects and navigate three-dimensional space. He suggests that future interfaces should allow users to interact with information and digital content as if they were tangible objects, rather than abstract representations on a flat screen.

The article emphasizes the importance of incorporating more natural hand and body movements into interface design. Victor contends that touchscreens and mice are poor substitutes for the rich expressiveness and dexterity of human hands. He envisions interfaces that can interpret subtle gestures, manipulations, and movements to control digital systems more intuitively. I agree with Victor’s core argument that interaction design needs to evolve beyond just screens and GUIs to create more natural and intuitive interfaces.

However, I would add that while moving beyond touchscreens and traditional GUIs is important for pushing interaction design forward, we shouldn’t completely discard these single-finger technologies. The very simplicity Victor is ranting about can be especially handy for users with certain disabilities or limitations.

For example, touchscreen interfaces with large, easy-to-tap buttons can be very beneficial for users with motor control issues or limited dexterity. The simplicity of tapping a screen with one finger opens up digital experiences to many who might struggle with more complex gestural interfaces. 

Week 10: Reading Response

A Brief Rant on the Future of Interaction Design.

I agree with many points the author makes about the current vision of future technology. The author believes that these “pictures under glass” — screens we interact with using simple gestures like sliding or tapping — limit the true potential of our hands and bodies. This reading made me think more about how I use my hands in ways I usually don’t notice.

The author highlights an important problem, which he also addresses in his response to critics. It’s essential to consider how people actually use technology when creating designs, so that the designs fit naturally into human behavior, not the other way around. Since this is already an issue today, it’s crucial to avoid imagining a future with similar limitations. The author is doing his part by writing about this problem to raise awareness and inspire more research and funding in this area.

This reading has encouraged me to think beyond the devices we have now. Instead of just improving screens, we could push for innovations that respect and enhance our human abilities. It makes me hopeful that the future of technology can be something much richer and more connected to our senses and actions. Looking forward, I hope to create interactive designs that put human capabilities first, rather than adding interaction elements just for the sake of it.

Reading Reflection – Week 10

A new angle on creating meaningful interactions is provided by Tom Igoe’s observations on interactive art and physical computing. One crucial takeaway is his recommendation to let individuals participate in a project in their own way rather than directing every detail. It can be tempting to give directions or explanations in interactive art, but Igoe contends that doing so can restrict the audience’s creativity. This approach emphasizes how crucial it is to give people room to explore and interpret on their own terms, which makes the experience more memorable and intimate.

I became aware of the importance of basic, intuitive actions in design after seeing Igoe’s examples of ordinary gestures—such as tapping or moving through a space—used as interactive features. People can interact with technology naturally when these well-known motions are turned into interesting experiences. A project that combines commonplace activities with artistic involvement, such as one in which a person’s movement or touch activates music or graphics, seems both familiar and unexpected. It helps me consider how I may use such movements in my projects to produce interactions that seem natural and grab viewers’ interest.

My comprehension of user-centered design is further enhanced by his analogy between creating interactive art and directing a play. A skilled director sets the scene yet lets the actors interpret and react freely rather than controlling every step. Similarly, creating a project that allows for user exploration shifts the emphasis from the designer’s intention to the user’s experience, making every interaction special. In the future, I hope to develop designs that lead users through subtle cues, empowering them to come to their own conclusions and derive personal meaning, transforming the encounter into a cooperative dialogue.

Week 10: Musical Instrument

Concept
The concept behind this project is inspired by the theremin, one of the earliest electronic musical instruments. Unlike the traditional theremin that uses electromagnetic fields, this version employs an ultrasonic sensor to detect hand movements and converts them into musical notes from the A minor pentatonic scale.

Schematic

Code snippet
The implementation uses an Arduino with an ultrasonic sensor (HC-SR04) and a piezo buzzer. Here are the key components of the code:

const int SCALE[] = {
    147, 165, 196, 220, 262, 294, 330, 392, 440,
    523, 587, 659, 784, 880, 1047, 1175, 1319, 1568,
    1760, 2093, 2349
};

The scale array contains frequencies in Hertz, representing notes in the A minor pentatonic scale, spanning multiple octaves. This creates a musical range that’s both harmonious and forgiving for experimentation.

The system operates in two phases:

Calibration Phase

// Record the farthest hand position seen during the calibration window
void calibrateSensor() {
    unsigned long startTime = millis();
    while (millis() - startTime < CALIBRATION_TIME) {
        int distance = measureDistance();
        maxDistance = max(maxDistance, distance);  // keep the largest reading seen so far
    }
}

Performance Phase

void loop() {
    int currentDistance = measureDistance();
    // Map the distance onto the frequency range covered by the scale...
    int mappedNote = map(currentDistance, MIN_DISTANCE, maxDistance, 
                        SCALE[0], SCALE[SCALE_LENGTH - 1]);
    // ...then snap to the closest pentatonic note and play it
    int nearestNote = findNearestNote(mappedNote);
    tone(PIEZO_PIN, nearestNote);
    delay(30);  // brief pause before the next sensor reading
}
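
The snippets above rely on two helpers, measureDistance() and findNearestNote(), that aren’t shown. Below is a minimal sketch of how they might look; the TRIG_PIN and ECHO_PIN names are placeholders for whatever pins the HC-SR04 is actually wired to, not the project’s real definitions.

const int TRIG_PIN = 8;   // hypothetical trigger pin for the HC-SR04
const int ECHO_PIN = 7;   // hypothetical echo pin for the HC-SR04

// Trigger the ultrasonic sensor and convert the echo time to centimeters
int measureDistance() {
    digitalWrite(TRIG_PIN, LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH);
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN, LOW);
    long duration = pulseIn(ECHO_PIN, HIGH);
    return duration / 29 / 2;   // ~29 microseconds per centimeter, round trip
}

// Return the scale frequency closest to the mapped target frequency
int findNearestNote(int frequency) {
    int nearest = SCALE[0];
    for (int i = 1; i < SCALE_LENGTH; i++) {
        if (abs(SCALE[i] - frequency) < abs(nearest - frequency)) {
            nearest = SCALE[i];
        }
    }
    return nearest;
}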

Demo

When powered on, the instrument takes 5 seconds to calibrate, determining the maximum distance it will respond to. Moving your hand closer to the sensor produces higher pitches, while moving away produces lower ones. The pentatonic scale ensures that all notes work harmoniously together, making it easier to create pleasing melodies.
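
For completeness, here is one way the constants and setup referenced above could be defined. Everything except the 5-second calibration window is an assumption, so treat the exact values as placeholders.

const int PIEZO_PIN = 9;                       // assumed buzzer pin
const int MIN_DISTANCE = 5;                    // assumed closest usable distance in cm
const unsigned long CALIBRATION_TIME = 5000;   // 5-second calibration window described above
const int SCALE_LENGTH = sizeof(SCALE) / sizeof(SCALE[0]);   // 21 notes in the array
int maxDistance = 0;                           // farthest reading seen during calibration

void setup() {
    pinMode(TRIG_PIN, OUTPUT);
    pinMode(ECHO_PIN, INPUT);
    pinMode(PIEZO_PIN, OUTPUT);
    calibrateSensor();   // runs once at power-on before the performance phase
}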

Reflection and Future Improvements
Current Limitations:

  • The response time has a slight delay due to sensor readings
  • Sound quality is limited by the piezo buzzer
  • Only supports single notes at a time

Potential Enhancements:

  1. Replace the piezo with a better quality speaker
  2. Add an amplifier circuit for improved sound output
  3. Incorporate multiple sensors for more control dimensions

Week 10: Jingle Bells – Speed Variation

Concept 

In this assignment, I collaborated with @Ruslan, and we both love Christmas. The famous song Jingle Bells brings back memories of those times, so we explored various possibilities and decided to create a speed variation of the Jingle Bells melody controlled by distance.

Here is the demonstration Video:

Schematic 

Here is the schematic for our Arduino connections:

Code:

To implement our idea, we searched for combinations of notes and durations that match the Jingle Bells melody and stored them in arrays. We then wrote code that maps the measured distance to the note durations; the variation in duration for each note makes the melody seem to play faster or slower. Here is the code:

#include "pitches.h"
#define ARRAY_LENGTH(array) (sizeof(array) / sizeof(array[0]))

// Notes and Durations to match the Jingle Bells 
int JingleBells[] = 
{
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_G4,
  NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4,
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_D4, NOTE_D4, NOTE_E4,
  NOTE_D4, NOTE_G4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_G4,
  NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4,
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_D4, NOTE_D4, NOTE_E4,
  NOTE_D4, NOTE_G4,
};

int JingleBellsDurations[] = {
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4
};

const int echoPin = 7;
const int trigPin = 8;
const int Speaker1 = 2;
const int Speaker2 = 3;
int volume;

void setup() 
{
// Initialize serial communication:
  Serial.begin(9600);
  pinMode(echoPin, INPUT);
  pinMode(trigPin, OUTPUT);
  pinMode(Speaker1,OUTPUT);
}

void loop() 
{
  long duration,Distance;
  
// Distance Sensor reading
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  Distance = microsecondsToCentimeters(duration);

// Map Distance to volume range (0 to 255)
  volume = map(Distance, 0, 100, 0, 255);  
  volume = constrain(volume, 0, 255); 

// Play melody with adjusted volume


 playMelody(Speaker1 , JingleBells, JingleBellsDurations, ARRAY_LENGTH(JingleBells), volume);
  
// Debug output to Serial Monitor
  Serial.print("Distance: ");
  Serial.print(Distance);
  Serial.print("    Volume: ");
  Serial.print(volume);
  Serial.println();
}
// Get Centimeters from microseconds of Sensor
long microsecondsToCentimeters(long microseconds) 
{
  return microseconds / 29 / 2;
}
// PlayMelody function to accept volume and adjust note duration
void playMelody(int pin, int notes[], int durations[], int length, int volume) 
{
  for (int i = 0; i < length; i++) 
  {
// Adjust the note Duration based on the volume
    int noteDuration = (1000 / durations[i]) * (volume / 255.0);  

// Play the note with adjusted Durations
    tone(pin, notes[i], noteDuration);
// Delay to separate the notes
    delay(noteDuration * 1.3);  
    noTone(pin); 
  }
}
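
The listing assumes the standard pitches.h header from the Arduino toneMelody example; the note constants used above correspond to these frequencies (in Hz):

// Relevant excerpt from the standard Arduino pitches.h
#define NOTE_C4  262
#define NOTE_D4  294
#define NOTE_E4  330
#define NOTE_F4  349
#define NOTE_G4  392
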
Reflections

Reflecting on this project, I learned a lot about working with notes and melodies. I was intrigued by the fact that even complex musical arrangements are made up of simple notes. The song “Jingle Bells” in particular really made me appreciate the structure of music on a new level: each note represents a small part of the song, and adjusting the timing or pitch shapes the whole melody.

Working with @Ruslan made the process even more interesting, as we were both curious and explored various approaches before settling on varying the music’s speed. I hope to continue working with musical notes in future projects.

Week 10 – Reading Reflection

I found Bret Victor’s A Brief Rant on the Future of Interaction Design really interesting because of how it critiques the touchscreen-focused “pictures under glass” model that limits our physical interaction with technology. It definitely changed my perspective on the direction technology is taking. Earlier, I thought touchscreens were revolutionary and easy to use, but now I wonder why we are willing to settle for such a limited interaction, devoid of the tactile feedback that has been essential to human experience for thousands of years. Shouldn’t we be designing technology that truly connects with the way we physically engage with the world?

I was also interested in Victor’s thoughts about the stagnation of interaction design. Instead of visionary advancements, we get small incremental changes that feel innovative but don’t really use the full potential of human capabilities. For example, I often find myself reading on my Kindle or jotting down notes on my iPad for the sake of convenience, even though I’m sacrificing the tactile feel of a physical book or the natural flow of writing on paper. This makes me further question: Are we sacrificing sensory richness for convenience? What would it take for the tech industry to prioritize deeper, more meaningful interactions over merely efficient or visually impressive ones? His argument has led me to reevaluate my own ideas about technology and wonder whether our devices could one day feel like a natural extension of our bodies, rather than just tools for quick tasks.

Week 10: Reading Response

Bret Victor’s “A Brief Rant on the Future of Interaction Design” makes us think about how we describe and imagine “interaction.” Victor criticises the way we create interactions by saying that we should not limit them to finger swipes on touchscreens and instead use our hands and other senses.

He says that one of the main problems with the industry is that it only cares about “pictures under glass,” now that touchscreens are used for everything. He says that this method doesn’t use the full ability of human hands, which can do a lot more than just swipe or tap. This made me think: how often have I used touchscreens or buttons as “interactive” features without considering how they use, or more importantly limit, our physical abilities?

This interpretation also raises a crucial question: how much have we actually improved the ways in which the ‘interactive system’ gives us feedback when we interact with something? In truth, we are not even close to creating a meaningful kind of interactive system, because we have neglected haptic feedback. Our hands are a collection of many kinds of sensors: heat, pressure, force, electrical, and so on. Although Victor’s ideal is employing hands in a whole spectrum of natural movements, I think haptic feedback may help shape interaction design going forward.

Finding substitutes for actual haptic input interests me as an engineering student. To replicate physical feedback, I might use motor vibrations, tension-based devices, or even resistance-based sensors. That is why, in my creative switch project, I used a pulley mechanism to lift the connecting switch, inviting the user to engage with an interactive physical system and to feel a sense of ‘weight’.

Week 10 – Reading Response

Believe it or not, I nearly believed that the Magic Ink link would be “super-brief.”

Well, in terms of disillusioning, I guess the readings did a great job. On the other hand, I wasn’t really interested (at least at the beginning) in the topic – yes, it’s obvious from the first paragraph what the reading is up to. And my mere response to that would be, ‘Okay, maybe that’s not a promising vision, but I still want us to achieve it someday.’ However, beyond the ‘rant’ itself, what intrigued me was the idea of ‘the conventional means of interfacing the brain to the world (i.e., the body).’ Essentially, from my perspective, that was my first impulse to get in touch with IM: how are we going to interact (input and output info) as human beings in the future?

I always told people that I don’t like to read – but I’m forced to do so just because text holds (arguably) the densest info on this planet. That’s in terms of ‘the media’ – whatever delivers information. At least for now, within the scope of info that we can quantify, text still has its edge. (Honestly, it was really sad news for me to acknowledge a couple of years ago that music, based on audio (two parameters), and painting, based on images (arguably three parameters?), the art forms I love, have their inherent limits of expression – even if we haven’t (or maybe already have) reached them.)

On the other hand, what about humans? I mean, what about our bodies? I would say that people who strive to devise new media to convey information and people who strive to plunge into our cognitive system are two teams approaching the same theme from two angles. I cannot say whether ‘bypassing’ the body is a good thing or not. But, for now, I would say the body is still a component of what we call ‘human.’

Week 10 – Reading Response

Bret Victor’s “A Brief Rant on the Future of Interaction Design” and its follow-up article present an interesting, if not particularly deep, critique of current interaction design trends. Victor’s main argument focuses on the limitations of “Pictures Under Glass” – flat touchscreen interfaces that dominate modern technology. He contends these interfaces fail to fully utilize our hands’ tactile capabilities, which are adept at feeling and manipulating objects.

While not groundbreaking, Victor’s observation challenges us to think beyond the status quo of interaction design. He makes a valid point about prioritizing tactile feedback in future interfaces, arguing that our sense of touch is fundamental to how we interact with the world. Victor calls for more research into tangible interfaces, dynamic materials, and haptics, suggesting that truly revolutionary interfaces will come from long-term research efforts rather than incremental improvements to existing technology. This emphasis on pushing boundaries through research is noteworthy, even if not explored in great depth.

The follow-up responses highlight that while devices like the iPad are revolutionary, they shouldn’t be the end goal of interface design. They also suggest that dynamic tactile mediums capable of physically representing almost anything are a worthy aspiration for future interfaces. Overall, Victor’s message prompts us to consider the full range of human sensory capabilities in developing new interaction models, encouraging us to imagine more intuitive and expressive interfaces.

Week 10: A Brief Rant On The Future Of Interaction Design

The reading by Bret Victor describes how interaction design has shifted toward the digital world rather than focusing on the simplicity of tools and the capabilities of physical actions. Through the concept of Pictures Under Glass, he describes a growing disconnect between how we interact with activities on screen and in reality. By focusing on the primal capabilities of human hands, he shows how they allow us to perceive the world around us. Following his points about human hands, I fully agree that humans learn and understand the world through their senses. We can’t fully interact with the world using only one of our senses, whether vision, touch, smell, taste, or hearing.

While I agree that technology has benefited society in many ways, I do not want a completely digital world where everything is behind a glass screen, or what Victor calls pictures under glass. I think it’s critical for humans to understand the world around us; otherwise we lose compassion and a sense of the worth of whatever it is we are interacting with. Without our sense of touch, our capabilities as humans diminish because we cannot grasp or use the tools around us. Likewise, if we don’t physically see an object, it becomes increasingly difficult to learn about and nearly impossible to appreciate.