User Testing

Videos

I asked two of my friends to try the project without giving them any instructions at all. I basically stepped back and just watched how they interacted with it. I wanted to see what they touched first, what they ignored, what they were drawn to, and where they hesitated.

The first user started gently, almost testing the waters, but the moving colors grabbed his attention fast. He pressed the big button and immediately saw his face in the painterly camera view, which made him smile. After that, he began switching modes and taking pictures like it made sense right away. The potentiometer especially excited him; he kept turning it back and forth and saying how cool it felt that the strokes actually changed with the knob. The only time he paused was to suggest a quick three-second countdown before the picture is taken, so people can pose. Other than that, everything felt natural to him.

The second user had a totally different vibe. He came in pressing buttons really fast just to see what would happen. The system actually handled that well and he picked up the mapping between the controls and the changes on screen pretty quickly. He really liked how turning the knob changed the strokes and even said it felt more physical and satisfying than using a mouse. The only point where he seemed confused was after he saved a photo, because the interface didn’t really say anything. He wasn’t sure if the picture actually saved or how to get back to the live view. That moment definitely needs better feedback.

What felt great

    1. People figured things out by exploring, not overthinking.
    2. The hardware controls feel natural — especially the knob!
    3. The color modes make people react emotionally (in a good way!).

What needs to change

    1. Clear photo-save feedback (a flash, message, anything!).
    2. A countdown before capturing the image.
    3. A more obvious return-to-camera cue.

What I wanted to explain

I think just two things: which button saves, and how to return after saving. If the UI signals those moments better, the whole experience becomes effortlessly smooth.

Final Project Concept

Project Concept

For my final project, I’m creating a gesture-based vocabulary learning system (more like a game, basically). The idea came from noticing how flashcard apps never really stick for me because they’re so passive, and I wanted to create something where your body is actually involved in the learning process. Hover your left hand to see just the definition, challenging you to guess the word. When curiosity gets the better of you, hover your right hand to reveal the answer. A quick swipe over both sensors is like saying “got it” and moves to the next word, while holding both hands still gives you an example sentence to see the word in context. The interface tracks which words trip you up and brings them back more often, while words you nail consistently take a break. It’s spaced repetition, but you’re physically interacting with it rather than just clicking buttons.

Word sets can cover GRE prep, LSAT terms, fancy academic writing words, or a custom list of interesting words you’ve encountered and want to remember. On top of that, the system visualizes your progress over time so you can actually see your vocabulary growing.

Arduino Design

Both photoresistors constantly feed readings to the Arduino, which compares them against threshold values to detect when your hands are hovering. The key is making the gesture detection smooth rather than jumpy, so I’m building in some debouncing logic that prevents flickering when your hand is at the edge of the sensor range. The timing matters too because the Arduino needs to distinguish between a quick swipe that says “next word” and deliberately holding both hands there to request an example sentence. I’m planning to add more to this but this is it for now.

P5.js Program Design

Large, readable text dominates the screen. When you hover your hand over a sensor, there’s subtle visual feedback confirming the system detected you, so you’re never wondering if it’s working. The progress dashboard lives off to the side, quietly showing you how many words you’ve learned today and your overall mastery percentage. The “Word of the Day” feature adds a nice ritual to opening the system. The system saves your progress using local storage, so your learning history persists between sessions. Over time, you build up this visualization of your vocabulary growth that’s genuinely satisfying to watch. You can see which word sets you’ve conquered, which ones still need work, and how consistent you’ve been with your practice. It’s the kind of feedback that makes you want to keep going.

Implementation Progress

I’m starting with getting the Arduino sensors working reliably because everything else depends on that foundation. First step is just wiring up the photoresistors and watching the raw values in the Serial Monitor to understand what I’m working with. Once I know what “hand hovering” actually looks like in terms of sensor readings, I can write the gesture detection code with appropriate thresholds. After the gestures feel solid and responsive when I test them by hand, I’ll set up the serial connection to P5 and make sure the commands are flowing through correctly.  Then it’s on to building the P5 interface, starting simple with a hard-coded list of maybe ten words that cycle through in response to Arduino commands. Once that basic interaction loop works, I’ll layer in the vocabulary database, spaced repetition algorithm, and progress tracking features. The final polish phase is where I’ll add the Word of the Day, multiple vocabulary sets, and any visual refinements that make it feel complete. The goal is to have something functional quickly, then make it delightful.

Final Project Idea

For my final project, I want to create a vocabulary learning system that uses hand gestures and light sensors to make memorizing words more engaging and physical. I’ve been thinking about how passive flashcard apps never really stick with me, and I’m curious whether tying words to actual movements might help them stay in my memory better. I want to focus on making the interaction feel natural and intuitive.

The basic idea is that I’ll have 2-3 photoresistors connected to Arduino, and my hand movements over them will control what appears on the P5 display. I’m imagining hovering my left hand to see just a definition, then hovering my right hand to reveal the actual word. I was also thinking maybe a quick swipe over both sensors could mark a word as learned and move it into a review pile, while holding my hand still might show an example sentence. What interests me most is building in a spaced repetition algorithm on the P5 side. I could have it track which words I struggle with and surface them more frequently, visualizing my progress over time. I’m also thinking it could be fun to add a “word of the day” feature that appears when I first turn it on, or maybe even pull from different vocabulary sets like GRE or LSAT words, academic writing terms, or just interesting words I encounter and want to remember.

Week 11 Production (Ling and Abdelrahman)

Conceptualization:

The central idea was to build a simple connection between physical and digital worlds.

Exercise 1: Single-Sensor and p5.js Movement
Using only one analog sensor (a potentiometer), the Arduino continuously reads values and streams them to p5.js over serial. p5.js interprets those readings and moves an ellipse along the horizontal axis, keeping it vertically centered.

// Exercise 1: stream potentiometer readings to p5.js
void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  // Handshake: keep sending until p5 replies
  while (Serial.available() <= 0) {
    Serial.println("0,0");
    delay(300);
  }
}

void loop() {
  // Respond only when p5 has sent something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH);   // indicate an active exchange
    Serial.read();                     // consume the incoming byte
    int sensorValue = analogRead(A0);  // read the potentiometer
    Serial.println(sensorValue);       // send the reading to p5
  }
  digitalWrite(LED_BUILTIN, LOW);
}

P5js: https://editor.p5js.org/aka7951/sketches/alRawYdiF

Demo https://drive.google.com/file/d/1Morf2y7cxIAgYLHKVnitsadjr813cX4Z/view?usp=sharing


Exercise 2: LED Brightness Controlled by p5.js
Next, I reversed the flow. Instead of only reading from Arduino, I sent numerical values from p5.js back to the board so it could adjust LED brightness using PWM.

int ledPin = 9;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
  pinMode(LED_BUILTIN, OUTPUT);
  // Blink the built-in LED until p5 connects
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH);
    delay(300);
    digitalWrite(LED_BUILTIN, LOW);
    delay(300);
  }
}

void loop() {
  if (Serial.available() > 0) {
    int brightness = Serial.parseInt();  // brightness value from p5 (0-255)
    analogWrite(ledPin, brightness);     // set LED brightness via PWM

    if (Serial.read() == '\n') {         // message complete
      digitalWrite(LED_BUILTIN, LOW);
    }
  }
}

P5js: https://editor.p5js.org/aka7951/sketches/tQoRTseMj

Demo: https://drive.google.com/file/d/1rdLpd3JG87jDa9bJEWZWtLMIvA9Ub0Vg/view?usp=sharing

Exercise 3: Gravity + Wind Integration
Finally, I modified the p5.js gravity wind sketch. Each time the ball hits the “ground,” p5 sends a signal to Arduino, turning an LED on briefly before switching back off. Meanwhile, an analog sensor feeds continuous data to p5.js to influence the wind force acting on the falling ball.

int ledPin = 5;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
  pinMode(LED_BUILTIN, OUTPUT);
  // Handshake: blink and send until p5 connects
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH);
    Serial.println("0");
    delay(300);
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  while (Serial.available()) {          // data waiting from p5
    digitalWrite(LED_BUILTIN, HIGH);

    int ledValue = Serial.parseInt();   // LED state from p5 (0 or 1)
    if (Serial.read() == '\n') {        // complete message received
      digitalWrite(ledPin, ledValue);   // flash LED on ball bounce

      int sensorValue = analogRead(A0); // read potentiometer (wind force)
      Serial.println(sensorValue);      // send reading back to p5
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}

P5js: https://editor.p5js.org/aka7951/sketches/ze7W5LcOi

Demo: https://drive.google.com/file/d/1c0O6wrqE6aStFpg9cvYVOke0zxojdR4Z/view?usp=sharing

Reflection & Challenges:

This project helped me practice bidirectional serial communication between Arduino and p5.js. One difficulty came from synchronizing input and output flows. In Exercises 2 and 3, sending and receiving serial data simultaneously often led to mixed or incomplete values being parsed. I learned to structure the serial communication carefully — ensuring each transmission ended with a newline and that both ends only read or wrote when data was available.

Week 11 Reading Response

Reading Design Meets Disability made me think about how deeply design shapes the way we see ability and difference. I liked how the book challenged the traditional idea that design for disability should be purely functional. When I saw examples like the stylish hearing aids or the eyeglasses that evolved from medical tools to fashion accessories, I realized how design doesn’t just solve problems—it tells stories about people and identity. I found myself really appreciating how the author described the tension between discretion and expression, especially in how disability devices can either hide or celebrate difference. It made me think about how design always carries a message, even when it pretends to be neutral.

What I liked most was the way the book connected design with emotion and culture. I loved seeing how something as simple as a prosthetic leg could become a work of art, or how Apple’s minimalist products fit into the same conversation about accessibility and beauty. The idea that “good design on any terms” can come from understanding human diversity really stayed with me. I felt inspired by the thought that inclusive design isn’t about charity but creativity—it’s about expanding what we consider beautiful and functional at the same time. This reading made me want to look more closely at everyday objects and think about the values they quietly express.

Week 10 – Musical instrument (Group Project with Isabella)

Concept

Mini Piano

Given that we had to make a musical instrument, we were inspired by a project we found on YouTube of what seemed like a piano. From this, we aimed to make a smaller-scale version of that project following the requirements for the assignment. The circuit consists of one piezo speaker, a potentiometer as an analog sensor, and three buttons as digital sensors.

Circuit Illustration

Figure 2: Circuit design with code and simulation on “Magnificent Jaiks” by Abdelrahman

Final Results

VIDEO


What I’m Proud Of

One aspect of this project I’m particularly proud of is the octave multiplier implementation. Instead of having fixed notes, I used the potentiometer to create a continuous pitch control that multiplies the base frequencies by a factor between 0.5x and 2.0x. This simple line of code:

float octaveMultiplier = map(potValue, 0, 1023, 50, 200) / 100.0;

transforms a basic three-note instrument into something much more expressive and fun to play with. You can play the same melody in different registers, which makes the instrument feel more like a real musical tool rather than just a tech demo.

Challenges and Further Improvements

Despite the complexity of the circuit, Abdelrahman successfully managed to produce a design in Magnificent Jaiks, which I managed to follow without any issues. After arranging the jumper wires and all the other pieces, the final result was what we aimed for. One of the biggest challenges we faced was finding a way to divide the work because of the distance between us. Nevertheless, we made a plan after meeting on Zoom and finished our circuit on time. For future projects, one thing I would like to improve on is developing an instrument of this kind with more buttons, in order to build a larger version of the mini piano made for this assignment.

Week 10 Reading Response

What I found most interesting in this reading was how the author explained that the point of his rant was not to give answers but to make people notice a problem. He wanted readers to realize that the way we design technology often ignores the human body and how it naturally interacts with the world. I liked how he compared the iPad to early black-and-white photography, saying that while it was revolutionary, it was also incomplete. That comparison made sense to me because it showed that something can be both amazing and limited at the same time. The author’s honesty about not having a solution made the whole thing feel more genuine. It felt less like a complaint and more like a challenge for people to think differently.

The part that stayed with me most was when he questioned what kind of future we are choosing if we keep creating tools that bypass the body. He described how we already spend so much of our lives sitting still in front of screens and how that could become permanent if we are not careful. I thought that was a powerful warning, especially when he said that children might become “finger-blind” by using touchscreens instead of exploring the world with their hands. It made me think about how technology can quietly change what it means to grow, learn, and create. By the end, I understood that his real message was about balance. Innovation should not mean leaving behind what makes us human.

Week 10 Reading Response

What really stuck with me from this reading is how much we’ve lost touch with using our hands in meaningful ways. The author points out that most of our modern “future” tech ignores what our hands are actually capable of, and that made me pause. I never thought about how numb the experience of using a phone screen really is until it was compared to holding a book or a cup. When I read that part, I actually picked up the notebook on my desk and noticed how I could feel its weight shift as I moved it. I don’t think I’ve ever noticed something that small before.

It reminded me of when I used to cook during the leave of absence I took while living alone in Abu Dhabi. Chopping vegetables always felt natural, like my hands just knew what to do without thinking. It’s such a different kind of attention than what happens when I’m tapping on a touchscreen. That difference feels important now. I think the reading made me realize that technology doesn’t have to pull us away from the physical world. It can work with it. Maybe the future shouldn’t just look sleek or advanced, but should also let us feel connected to what we’re doing again.

Week 9 assignment

Concept

So basically, I made this little setup where I can control two LEDs using a knob and a button. One fades in and out smoothly when you twist the knob, and the other just pops on and off when you hit the button. It’s honestly pretty straightforward, but it was actually kind of fun to put together.

Demo
How It Works

The Knob: There’s a potentiometer; when you turn it, one of the LEDs gets brighter or dimmer. Twist one way and the light fades down, twist the other way and it gets brighter.

The Button: The other LED is even simpler. Press the button, light’s on. Let go, light’s off. That’s it.

Code

int pushButton = 2;      // digital sensor (switch)
int led1 = 4;            // LED toggled by the button
int potentiometer = A5;  // analog sensor
int led2 = 3;            // LED dimmed by the potentiometer (PWM pin)

void setup() {
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  pinMode(pushButton, INPUT);
  pinMode(potentiometer, INPUT);
}

void loop() {
  // Digital sensor (switch) controlling the ON/OFF state of led1
  int buttonState = digitalRead(pushButton);
  if (buttonState == HIGH) {
    digitalWrite(led1, HIGH);
  } else {
    digitalWrite(led1, LOW);
  }

  // Analog sensor (potentiometer) controlling the brightness of led2
  int potValue = analogRead(potentiometer);
  int brightness = map(potValue, 0, 1023, 0, 255);
  analogWrite(led2, brightness);
}

Future Improvements

Making It More Interactive: I could add more sensors to make it do cooler stuff. Like maybe a light sensor so the LEDs automatically adjust based on how bright the room is, or a temperature sensor that changes the colors based on how hot or cold it is.

Adding Colors: Right now it’s just basic LEDs, but I could swap them out for RGB LEDs. Then the potentiometer could control not just brightness but also what color shows up. Turn the knob and watch it cycle through the rainbow. That’d be way more visually interesting.

Week 9: Making Interactive Art: Set the Stage, Then Shut Up and Listen

Tom Igoe’s Making Interactive Art: Set the Stage, Then Shut Up and Listen made me think differently about what it really means to create something interactive. I liked how straightforward and almost blunt his advice was; don’t interpret your own work. That simple statement hit harder than I expected. I’ve noticed how easy it is to over-explain a project, especially when you’ve spent so much time building it. You want people to “get it,” so you tell them what it means, how to use it, and what to feel. But Igoe’s point is that doing that takes away the audience’s role completely. It turns interaction into instruction. That idea made me reflect on my own creative habits, especially how often I try to control an outcome rather than trust that people will find their own meaning through what I make.

I like how he compares interactive art to directing actors. A good director doesn’t feed emotions line by line; they set up the environment and let the actor discover the truth of the scene themselves. In the same way, an artist should set up a space or a system that invites exploration without dictating what’s supposed to happen. Igoe’s line about how the audience “completes the work” stayed with me because it reframes what success means in interactive art. It’s not about whether people interpret your piece exactly as you intended; it’s about whether they engage, react, and create their own story inside it. I think that takes a certain amount of humility and patience as an artist. You have to build something that’s open enough to allow surprise, even misunderstanding, and see that as part of the process instead of a failure.