Week 10 Reading Response

The Future of Interaction

Reading this article really made me rethink how I interact with technology. I’ve always been fascinated by touchscreens and their simplicity, but I never stopped to consider how limiting they actually are. The author’s critique of “Pictures Under Glass” really hit me, especially when they described how flat and numb these interfaces feel. It’s true—I use my phone every day, but when I think about it, the experience of swiping and tapping feels so disconnected compared to how I interact with physical objects.

One part that really stood out to me was the comparison to tying shoelaces. It reminded me of when I was little and struggling to learn how to tie mine. My hands learned by feeling, adjusting, and figuring things out without needing to rely on my eyes all the time. That’s such a natural way for us to interact with the world, and it’s crazy to think how little that’s reflected in our technology today.

The section about making a sandwich was also a moment of realization for me. It’s such a simple, everyday task, but it involves so much coordination and subtle feedback from your hands—how the bread feels, the weight of the knife, the texture of the ingredients. None of that exists when I swipe through apps or scroll on a website. It made me wonder: why do we settle for technology that ignores so much of what our hands can do?

This article really inspired me to think differently about the future of technology. I agree with the author that we need to aim higher—to create interfaces that match the richness of our human abilities. Our hands are capable of so much more than sliding on glass, and it’s exciting to imagine what might be possible if we started designing for that.

Responses: A Brief Rant on the Future of Interaction Design

I found this follow-up just as thought-provoking as the original rant. The author’s unapologetic tone and refusal to offer a neatly packaged solution make the piece feel refreshingly honest. It’s clear that their main goal is to provoke thought and inspire research, not to dictate a specific path forward. I really appreciated the comparison to early Kodak cameras; it’s a great reminder that revolutionary tools can still be stepping stones, not destinations.

The critique of voice and gesture-based interfaces resonated with me too. I hadn’t really considered how dependent voice commands are on language, or how indirect and disconnected waving hands in the air can feel. The section on brain interfaces was particularly interesting. I’ve always thought of brain-computer connections as a futuristic dream, but the author flipped that idea on its head. Instead of bypassing our bodies, why not design technology that embraces them? The image of a future where we’re immobile, relying entirely on computers, was unsettling but eye-opening.

I love how the author frames this whole discussion as a choice. It feels empowering, like we’re all part of shaping what’s next. It’s made me more curious about haptics and dynamic materials—fields I didn’t even know existed before reading this. I’m left thinking about how we can create tools that actually respect the complexity and richness of human interaction.

 

Week 10 Project: Echoes of Light

Concept
Our project, “Light and Distance Harmony,” emerged from a shared interest in using technology to create expressive, interactive experiences. Inspired by the way sound changes with distance, we aimed to build a musical instrument that would react naturally to light and proximity. By combining a photoresistor and distance sensor, we crafted an instrument that lets users shape sound through simple gestures, turning basic interactions into an engaging sound experience. This project was not only a creative exploration but also a chance for us to refine our Arduino skills together.

Materials Used
Arduino Uno R3
Photoresistor: Adjusts volume based on light levels.
Ultrasonic Distance Sensor (HC-SR04): Modifies pitch according to distance from an object.
Piezo Buzzer/Speaker: Outputs the sound with controlled pitch and volume.
LED: Provides an adjustable light source for the photoresistor.
Switch: Toggles the LED light on and off.
Resistors: For the photoresistor and LED setup.
Breadboard and Jumper Wires
Code
The code controls volume and pitch through the analog and digital inputs from the photoresistor and ultrasonic sensor. The complete code, shown below, includes clear mappings and debugging lines for easy tracking.

// Define pins for the components
const int trigPin = 5; // Trigger pin for distance sensor
const int echoPin = 6; // Echo pin for distance sensor
const int speakerPin = 10; // Speaker pin, driven by tone()
const int ledPin = 2; // LED pin
const int switchPin = 3; // Switch pin
const int photoResistorPin = A0; // Photoresistor analog pin

// Variables for storing sensor values
int photoResistorValue = 0;
long duration;
int distance;

void setup() {
  Serial.begin(9600);               // Initialize serial communication for debugging
  pinMode(trigPin, OUTPUT);         // Set trigger pin as output
  pinMode(echoPin, INPUT);          // Set echo pin as input
  pinMode(speakerPin, OUTPUT);      // Set speaker pin as output
  pinMode(ledPin, OUTPUT);          // Set LED pin as output
  pinMode(switchPin, INPUT_PULLUP); // Set switch pin as input with pull-up resistor
}

void loop() {
  // Check if switch is pressed to toggle LED
  if (digitalRead(switchPin) == LOW) {
    digitalWrite(ledPin, HIGH); // Turn LED on
  } else {
    digitalWrite(ledPin, LOW); // Turn LED off
  }

  // Read photoresistor value to adjust volume
  photoResistorValue = analogRead(photoResistorPin);

  // Map photoresistor value to a 0-255 volume range
  // Higher light level (LED on) -> lower photoresistor reading -> higher volume
  int volume = map(photoResistorValue, 1023, 0, 0, 255); // Adjust mapping for your setup

  // Measure distance using the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  duration = pulseIn(echoPin, HIGH);

  // Calculate distance in cm (sound travels ~0.034 cm/us; halve for the round trip)
  distance = duration * 0.034 / 2;

  // Set frequency based on distance in the range of 2-30 cm
  int frequency = 0;
  if (distance >= 2 && distance <= 30) {
    // Map the guarded 2-30 cm window onto an audible range: closer = higher pitch
    frequency = map(distance, 2, 30, 2000, 200);
    tone(speakerPin, frequency);
    // Note: tone() outputs a fixed-amplitude square wave, and calling
    // analogWrite() on the same pin conflicts with it, so the mapped volume
    // is logged below rather than applied; true amplitude control would
    // need extra hardware (e.g., a potentiometer on the speaker line).
  } else {
    noTone(speakerPin); // Silence the speaker if the distance is out of range
  }

  // Debugging output
  Serial.print("Photoresistor: ");
  Serial.print(photoResistorValue);
  Serial.print("\tVolume: ");
  Serial.print(volume);
  Serial.print("\tDistance: ");
  Serial.print(distance);
  Serial.print(" cm\tFrequency: ");
  Serial.println(frequency);

  delay(100); // Short delay for sensor readings
}

 

Video Demonstration
In our video demonstration, we showcase how the instrument responds to changes in light and proximity. We toggle the LED to adjust volume and move a hand closer or farther from the ultrasonic sensor to change pitch, demonstrating the instrument’s sensitivity and interactive potential.

Reflections
What Went Well: The project successfully combines multiple sensors to create a reactive sound device. The integration of volume and pitch control allows for intuitive, responsive sound modulation, achieving our goal of designing an engaging, interactive instrument.

Improvements: To improve this instrument, we would expand the melody range, creating a more refined and versatile sound experience. This could involve additional sensors or more sophisticated sound generation methods that provide a broader tonal range and a richer melody; one rough idea is sketched below.
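As an illustration of that idea rather than tested project code: instead of mapping distance to a raw frequency, we could quantize it to notes in a musical scale. The note table, scale choice, and helper function below are our own assumptions; the speaker pin matches the main sketch above.

// Sketch: snap the distance reading to a C-major pentatonic scale
const int speakerPin = 10; // same speaker pin as the main sketch

const int notes[] = {262, 294, 330, 392, 440, 523, 587, 659}; // C4 D4 E4 G4 A4 C5 D5 E5
const int numNotes = sizeof(notes) / sizeof(notes[0]);

void playQuantized(int distanceCm) {
  if (distanceCm < 2 || distanceCm > 30) {
    noTone(speakerPin); // out of range: stay silent, as in the main sketch
    return;
  }
  // Closer = higher note: map the 2-30 cm window onto a note index
  int index = map(distanceCm, 2, 30, numNotes - 1, 0);
  tone(speakerPin, notes[index]);
}

Calling playQuantized(distance) in place of the tone() call in the main loop would make the instrument step through discrete notes rather than sliding continuously between pitches.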

Assignment 13: Final Project User Testing

Progress

I finally finished the hardware side of my project, in terms of making the head, body, and arm of the cat (the ears are still missing, though). During the process, I encountered some issues, mainly with the pressure sensor. First, I made my own pressure sensor using velostat, and that worked, as shown in the final project proposal for Assignment 12. However, it was too small for what I actually need for my final piece, so I tried making it on a larger scale, but the bigger sensor did not give me a variety of readings; it only produced consistently high values (in the 1000s, near the top of the analog range) that rarely varied. I tried using a piezo sensor instead, but its readings did not vary either. So at the moment, the pressure sensor is not a functioning part of my project.
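One thing worth checking, as a guess rather than a confirmed diagnosis: readings pinned near the 1023 ceiling usually mean the fixed resistor in the sensor's voltage divider is much larger than the resistance of the (now larger, so lower-resistance) velostat sheet. A minimal test sketch, assuming the divider is wired 5V -> velostat -> analog pin -> fixed resistor -> GND (the pin choice here is a placeholder, not my actual wiring):

// Print raw readings while trying different fixed resistor values
// (e.g., 1k, 10k, 100k) to see which one spreads the readings out
const int pressurePin = A1; // assumed analog pin for the velostat divider

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(pressurePin); // 0-1023
  Serial.println(raw); // values stuck near 1023 suggest the fixed resistor is too large
  delay(100);
}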

Other than that, I have made progress with the interface on the p5 side, finished building the arm and tested that it moves with the servo motor, and implemented the LCD to reflect what is written in the input box in the p5 sketch.
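For reference, the Arduino side of the LCD mirroring can stay very small. This is a minimal sketch under assumptions of mine (a 16x2 HD44780 display on the standard LiquidCrystal example pins, with the p5 sketch sending each message terminated by a newline), not the exact wiring of my build:

#include <LiquidCrystal.h>

// Assumed wiring: RS=12, E=11, D4=5, D5=4, D6=3, D7=2 (library example pins)
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);

void setup() {
  Serial.begin(9600);
  lcd.begin(16, 2); // 16x2 character display
}

void loop() {
  if (Serial.available()) {
    String msg = Serial.readStringUntil('\n'); // one message from the p5 input box
    lcd.clear();
    lcd.print(msg.substring(0, 16)); // keep it within one 16-character row
  }
}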

User Testing Video

Feedback and Reflections

For this user testing, I asked one of my friends to test my project. He was able to figure it out because I was debugging right in front of him, so he saw the actions I took and knew what to do. But he said that if he hadn't seen me do it, he would have been confused, because I didn't give instructions on how to trigger the cat's hand to move.

There were also parts I couldn't finish implementing in time, namely the illustrations I want to pop up in the p5 sketch whenever the cat's hand moves. So he said the cat's hand moving right now seems a bit pointless (and I had to explain what was meant to happen).

After this user testing, I realized there are still a lot of improvements and edits I need to make to my project. One is to give clear instructions in the p5 sketch so that users know exactly what to do. Another is to find a way to get the pressure sensor working. And finally, to finish the illustrations so that I can implement them in my p5 sketch.

Week 13: User Testing

Overall Reaction 
My user testing experience went about how I expected it to. The person who participated was already familiar with my concept, so it was a bit challenging to get an authentic reaction, but we still had a productive conversation about what steps I need to take to improve. Although he was aware of my concept, I did not prompt him with any steps on how to run the software. I was therefore quite pleased that he was able to navigate it fairly seamlessly, but I am considering altering a few of the prompts just so users have a better idea of what to expect when they choose a location.

Areas for Improvement 

One thing he highlighted in our discussion afterwards was the quality of my mappings. Because the screen (with brief instructions) mirrors what you touch on the paper map, it is clear that you should press a button to generate that location's music. Right now my biggest concern in terms of improvements is finishing adding the switches to the map; a rough sketch of the switch side is below. This weekend I focused my time on finishing the software, but I now realize how that limited the user experience for testing. To help remedy this, I plan to conduct user testing again this Saturday to get a full critique of the design.
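As a rough sketch of the switch side (the pin choices and switch count are placeholders, not my final wiring): each location on the paper map gets a pushbutton to ground, and the Arduino reports which one was pressed so the p5 sketch can start that location's music.

const int switchPins[] = {2, 3, 4, 5}; // one pin per map location (placeholder pins)
const int numSwitches = sizeof(switchPins) / sizeof(switchPins[0]);

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < numSwitches; i++) {
    pinMode(switchPins[i], INPUT_PULLUP); // pressed = LOW
  }
}

void loop() {
  for (int i = 0; i < numSwitches; i++) {
    if (digitalRead(switchPins[i]) == LOW) {
      Serial.println(i); // the p5 sketch reads this index and plays that location
      delay(250);        // crude debounce so one press sends one message
    }
  }
}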

Another feature we discussed adding is a small screen where users can see themselves dancing, to help promote full interaction with the piece. Given that my user already knew the intended goal of my design, he danced for the sake of full participation, but I'm not sure it's clear to users that I also want them to dance on their own. Thus, in terms of next steps, my immediate goals are: finishing the hardware, adding the video capture component, and creating another tab with specific instructions on how to use the program.

Testing Video

Week 13: User Testing

Google Drive Video Link 

Through my user testing experience, I realized that I still have a lot of work to do before the IM Show. My tester had already played through the game on the web, but when it came to the physical components, there were a couple of mishaps.

One of the key issues was the user interface for shifting game states from the start menu to the tutorial, to the instructions, to the game, and so on. In setting up the serial port, I had the setup key bound to the SPACE bar. When the user tried to proceed to the next game state, they would sometimes instinctively reach for the SPACE bar, press it, and the program would get stuck and become unplayable. As such, this is a critical point I need to address: how the player will interact with the game.

As for the game experience, I had them play with a prototype, so there were no obvious signs of what each button does besides the small scribbles of “up,” “down,” “left,” and “right.” I hope to get it cleaned up and made more presentable in the near future, because the controls got a bit confusing as the game went on and more creatures appeared on the screen. However, I am very surprised that the prototype worked as well as it did, especially since it was made out of cardboard, straws, tape, and a dream. Some feedback I received was that the buttons were less sensitive toward the edges, so that is something worth changing in the coming future.

On the software side, while the game itself was a success, I would like to make the progression easier and less demanding on players. Watching others play the game, and having experienced it myself, I felt it became overwhelming very quickly. I want people to sit and enjoy the game with someone else, so I hope to make things easier and slow down the pacing of the game as well.