Week 13 – Final Project User Testing

Overview

This project was inspired by the retro Etch-A-Sketch, with a modern twist: instead of knobs, users control the drawing with two potentiometers (one each for X and Y movement) and a switch for toggling between drawing states. The task was to explore interactive hardware-software integration by translating analog sensor input into graphical output. After hearing questions and feedback, I aim to further improve real-time drawing accuracy and the user interface by reducing latency, refining the toggle function, and finding a more intuitive way to represent drawing progress.

Video Link: https://drive.google.com/file/d/10zKjQpbPppzI0duzytp0_6KQYVFO1qMh/view?usp=sharing

Strengths

  1. The p5.webserial library made it easy to read serial data from the Arduino directly in the browser, with no desktop software needed.
  2. The potentiometers provided accurate analog input, and after calibration their values mapped cleanly to screen coordinates (sketched below).
  3. Intuitive buttons for starting, connecting/disconnecting the port, and resetting the canvas make the experience even more seamless.
  4. A new drawing color is created with each button push, creating a fun and dynamic drawing experience.
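As a rough illustration of points 1 and 2, here is a minimal p5.webserial sketch that reads one line per frame and maps two potentiometer readings to canvas coordinates. The comma-separated "xPot,yPot" message format and the full 0–1023 calibration range are assumptions for this example, not the project's exact code.

let port;          // p5.webserial port object
let x = 0, y = 0;  // previous pen position on the canvas

function setup() {
  createCanvas(600, 400);
  background(255);
  port = createSerial();                  // provided by p5.webserial
}

function mousePressed() {
  if (!port.opened()) port.open(9600);    // browser prompts for the Arduino port
}

function draw() {
  if (!port.opened()) return;

  // Read at most one complete line per frame to keep the draw loop light.
  let msg = port.readUntil("\n");
  if (msg.length > 0) {
    let parts = msg.trim().split(",");    // assumed format: "512,768"
    if (parts.length === 2) {
      // Map 10-bit analog readings (0–1023) to screen coordinates.
      let newX = map(int(parts[0]), 0, 1023, 0, width);
      let newY = map(int(parts[1]), 0, 1023, 0, height);
      stroke(0);
      line(x, y, newX, newY);             // connect to the previous position
      x = newX;
      y = newY;
    }
  }
}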


Additional Improvements

  1. Provide on-screen instructions on how to connect to the serial port.
  2. Add instructions and indicators on the physical model itself (labeling the potentiometers and the button).
  3. Make a small hole for the USB Type-B cable to pass through, to keep the wiring tidy.
  4. Add a small cover over the model to hide the wires on the Arduino board.
  5. Earlier versions exhibited clear lag when drawing. This was addressed by optimizing serial communication and removing the multiple readUntil() calls from the draw loop (see the sketch after this list).
  6. The red cursor drawn in early versions was removed because it distracted from the artwork; the final version instead relies on the user sensing movement through the potentiometers.
  7. There is still some minor jitter or lag when switching draw modes. Smoothing or filtering the sensor noise would likely improve accuracy.
  8. The transition from the welcome screen to the drawing view works, but it could be visually polished further for a better user experience.
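One way to remove that per-frame backlog, sketched here as an assumption about the fix rather than the project's exact code, is to drain the serial buffer each frame and keep only the most recent complete line, so a single call replaces the multiple readUntil() calls inside draw():

// Returns the newest complete line available this frame, or "" if none.
// Older queued lines are discarded so stale readings never build up lag.
function readLatestLine(port) {
  let latest = "";
  while (true) {
    let msg = port.readUntil("\n");   // p5.webserial: "" when no full line is buffered
    if (msg.length === 0) break;
    latest = msg;
  }
  return latest.trim();
}

In draw(), the pen position would then be updated from a single readLatestLine(port) call per frame.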

Future ideas

  • Add an on-screen cursor to show position without drawing.
  • Add an on-screen button to save the drawing as a PNG.

User Testing

Overview

Berry Blessings is an Arduino joystick-controlled p5.js game where a duck must collect 10 berries to unlock a magical portal to the real world. Once the player reaches the portal, an Arduino-powered robot is meant to activate and move toward a fairy setup, creating a double world experience.

After receiving feedback that the previous storyline was too abstract and difficult to understand, I redesigned the narrative to be more intuitive and enjoyable. The new version provides a clear goal: collect berries to unlock a portal, with a satisfying real-world reward at the end.

Two users participated in a no-instruction playtest of the p5.js portion of the game.

What Worked Well

  • Joystick Controls: Users figured out joystick movement after some experimentation. Movement felt smooth once they adjusted to it.

  • Berry Counter: The on-screen “Berries: X / 10” counter gave clear feedback and helped users understand progression.

  • Visual Design: The duck animations, bright environment, and sign reading “Collect 10 berries to pass” helped anchor the gameplay and made objectives more understandable.

  • Revised Storyline: The updated goal and real-life payoff made the experience feel more engaging and easier to follow.

Challenges & Observations

  • Joystick Direction Mapping: Both users were initially confused by which way the duck would move relative to joystick direction.

    Planned fix: Add visual direction indicators (e.g. arrows or a compass); see the sketch after this list.

  • Robot Frustration (During Earlier Testing): Friends testing the robot portion before the Arduino broke reported frustration with its slow, unstable movement.

    Plan: Replace with a more reliable wheeled robot for smoother performance.

  • Duck Centering Bug: One user pointed out that the duck appeared too far from the center when changing directions.

    Fix: Adjusted duck positioning to remain centered during direction changes.

  • Hardware Damage: The Arduino Uno was accidentally damaged due to adjusting connections while powered on, so it could not be used during this round of user testing.
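For the planned direction indicator, a small compass overlay could be drawn each frame from the raw joystick values. This is a hypothetical p5.js helper, not code from the game; joyX and joyY are assumed to be 0–1023 readings that rest near 512.

// Draw a compass ring whose needle points the way the duck will move.
function drawJoystickCompass(joyX, joyY) {
  const cx = 70, cy = 70, r = 40;         // compass position and radius
  const dx = map(joyX, 0, 1023, -1, 1);   // -1 (left) .. 1 (right)
  const dy = map(joyY, 0, 1023, -1, 1);   // -1 (up)   .. 1 (down)

  push();
  noFill();
  stroke(255);
  circle(cx, cy, r * 2);                  // compass ring
  // Only draw the needle outside a small dead zone around center.
  if (abs(dx) > 0.15 || abs(dy) > 0.15) {
    strokeWeight(3);
    line(cx, cy, cx + dx * r, cy + dy * r);
  }
  pop();
}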

Fixes & Next Steps

Already Implemented:

  • Rewrote the story to improve clarity and engagement.

  • Added a sign by the portal to explain the berry objective.

  • Included a working berry counter.

  • Fixed duck position offset when changing directions.

In Progress:

  • Sourcing a new Arduino Uno to replace the damaged one.

  • Planning to switch to a wheeled robot for improved reliability.

  • Will add visual joystick direction indicators to help users orient themselves.

I will post an update before next class once the hardware is replaced and re-integrated.

Final Project – Motion-Activated Mini Garden Visualizer

Project concept

This project was inspired by a real experience on my family’s farm. I saw foxes approaching our chickens at night, and my dad was struggling with how to deal with the situation, which made me think about how to create a simple, low-cost way to detect movement and respond immediately. I wanted something that could work without constant supervision, so I came up with the idea of using an ultrasonic sensor and lights to react when something gets close.

From there, I built a mini garden setup using individual RGB LEDs and connected it to p5.js for visual feedback. I started with a different idea, using sound and an LED strip, but changed direction after facing hardware limitations, and honestly I love how it turned out. That process helped shape the final concept. I also wanted to create something similar to a lava lamp, which is why the p5.js visuals resemble lava-lamp blobs. The result now works as both an interactive installation and a practical assistive device for animal and crop protection.

My final project is the Motion-Activated Mini Garden/Farm Visualizer, an interactive installation built into the miniature house and garden I made, which responds to movement and presence using an ultrasonic sensor. When someone approaches the garden, the LEDs light up based on proximity: closer movement produces brighter, more vibrant lights, while standing farther away results in dimmer, calmer effects.

  • Blue: far away

  • Green: middle range

  • Red: very close

The schematic diagram:

Arduino code:

// HC-SR04 Ultrasonic Sensor Pins
#define TRIG_PIN 7
#define ECHO_PIN 8

// RGB LED Pins
#define RED_LED_PIN 3
#define GREEN_LED_PIN 5
#define BLUE_LED_PIN 6

long duration;
int distance;

void setup() {
  Serial.begin(9600);

  // Sensor Pins
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);

  // RGB LED Pins
  pinMode(RED_LED_PIN, OUTPUT);
  pinMode(GREEN_LED_PIN, OUTPUT);
  pinMode(BLUE_LED_PIN, OUTPUT);

  turnOffAll(); // Ensure LEDs are off initially
}

void loop() {
  // Trigger the ultrasonic sensor
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Read echo and calculate distance (cm)
  duration = pulseIn(ECHO_PIN, HIGH);
  distance = duration * 0.034 / 2;

  // Print distance to serial
  // Serial.print("Distance: ");
  Serial.println(int(distance));
  // Serial.println(" cm");

  // LED logic based on distance
  if (distance < 10) {
    setColor(255, 0, 0);   // Close → Red
  }
  else if (distance >= 10 && distance <= 30) {
    setColor(0, 255, 0);   // Medium → Green
  }
  else {
    setColor(0, 0, 255);   // Far → Blue
  }
  delay(100);
}

// Control RGB LEDs using digital logic
void setColor(uint8_t r, uint8_t g, uint8_t b) {
  digitalWrite(RED_LED_PIN, r > 0 ? HIGH : LOW);
  digitalWrite(GREEN_LED_PIN, g > 0 ? HIGH : LOW);
  digitalWrite(BLUE_LED_PIN, b > 0 ? HIGH : LOW);
}

void turnOffAll() {
  digitalWrite(RED_LED_PIN, LOW);
  digitalWrite(GREEN_LED_PIN, LOW);
  digitalWrite(BLUE_LED_PIN, LOW);
}

p5.js code:

let serial;
let distance = 0;

function setup() {
  createCanvas(windowWidth, windowHeight);
  background(0);

  serial = new p5.SerialPort();
  serial.on('connected', () => console.log("Connected to Serial!"));
  serial.on('data', serialEvent);
  serial.open("/dev/tty.usbmodem101"); // Change to your actual COM port
}

function draw() {
  background(0, 30);

  // let size = map(distance, 0, 60, 10, 100); // optional: scale blob size with distance (cm)
  let size = 30;
  let col;
  if (distance < 10) {
    col = color(255, 0, 0); // Close → red (matches the Arduino thresholds)
  } else if (distance < 30) {
    col = color(0, 255, 0); // Medium distance → green
  } else {
    col = color(0, 0, 255); // Far → blue
  }

  fill(col);
  noStroke();

  for (let i = 0; i < 20; i++) {
    ellipse(random(width), random(height), size);
  }
}


function serialEvent() {
  let data = serial.readLine().trim();
  if (data.length > 0) {
    distance = int(data);
    console.log("mic:", distance);
  }
}

A short clip showing someone approaching the mini garden. As they move closer, the LED lights respond by changing color, and the screen displays animated, color-shifting blobs in sync with the movement:

https://drive.google.com/drive/u/0/folders/1Kk2lkQgoAyybXSYWVmY2Dog9uQVX_DMq

The process:

How the implementation works

An ultrasonic sensor detects how close a person or object is to the garden. This distance value is read by the Arduino and mapped to RGB LED colors. The same data is also sent over serial communication to P5.js, which animates abstract blobs on the screen. These blobs shift in speed, size, and color based on how close the user is, creating a consistent and engaging visual language that mirrors the physical lighting.
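The posted p5.js sketch keeps the blob size fixed; the sketch below is a minimal illustration (not the final project code) of how the incoming distance value, in centimetres as printed by the Arduino, could drive blob speed, size, and colour at the same time.

let blobs = [];
let distance = 100;   // in the real project this is updated from serial

function setup() {
  createCanvas(600, 400);
  noStroke();
  for (let i = 0; i < 15; i++) {
    blobs.push({ x: random(width), y: random(height), phase: random(TWO_PI) });
  }
}

function draw() {
  background(0, 30);   // translucent background leaves soft, lava-lamp trails

  // Closer => faster, bigger, warmer; farther => slower, smaller, cooler.
  let speed = map(distance, 0, 60, 3, 0.3, true);
  let size  = map(distance, 0, 60, 80, 25, true);
  let col   = distance < 10 ? color(255, 0, 0)
            : distance < 30 ? color(0, 255, 0)
            : color(0, 0, 255);

  fill(col);
  for (let b of blobs) {
    // Drift each blob along a wobbly path instead of re-randomising it.
    b.x += cos(b.phase + frameCount * 0.02) * speed;
    b.y += sin(b.phase + frameCount * 0.03) * speed;
    b.x = (b.x + width) % width;
    b.y = (b.y + height) % height;
    ellipse(b.x, b.y, size);
  }
}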

Interaction design:

The design of the project relies on intuitive interaction. Users are not given instructions—they are simply invited to explore. As they move closer to the garden, the changes in light and digital visuals guide them to understand the system’s behavior. This makes the experience playful and discoverable.

Distance from Sensor | LED Color | P5.js Visual Response
Less than 10 cm      | Red       | Fast motion, bright red blobs
10–30 cm             | Green     | Medium speed, green blobs
More than 30 cm      | Blue      | Slow, soft blue motion

Below is a clear explanation of how the device assists with animal and crop protection.

Assistive Use Cases: Protecting Animals and Crops:

This motion-activated system has strong potential as an assistive device in agricultural and animal care settings, where clear and reliable response is essential.

For Animals (Livestock Protection):

The system can be used to monitor the area around livestock enclosures such as sheep pens, chicken coops, or goat fields. When a predator like a fox, stray dog, or wild animal approaches, the ultrasonic sensor detects motion, triggering an immediate response—such as flashing lights, alarms, or future-connected alerts. This helps deter predators non-invasively and gives farmers real-time awareness of threats without being physically present.

For Crops (Field and Garden Monitoring):
In gardens, greenhouses, or open crop fields, this system can be used to detect intruders, trespassers, or large animals that may damage crops. The lights act as a deterrent, and with future improvements (like wireless communication), it could alert the farmer via phone or connected system. This is especially helpful at night or in remote locations, allowing for continuous, low-maintenance monitoring.

Assistive Use Case: Law Enforcement and Security

This motion-activated system can be effectively adapted for law enforcement and security by serving as a low-cost, responsive perimeter monitoring tool. Installed at property lines, remote checkpoints, or restricted access areas, the device detects unauthorized movement and can trigger lights, sirens, or silent alerts depending on the situation. With future enhancements, it could be linked to mobile devices or integrated with camera systems for real-time surveillance. Its compact, portable design makes it suitable for temporary deployments during investigations, search operations, or event monitoring, offering a clear and reliable response without requiring continuous human oversight.

What I'm proud of:

  • Creating a synchronized experience between physical light and digital visuals

  • Making the interaction intuitive and inviting, even without instructions

  • Learning how to connect Arduino to P5.js and achieve stable real-time communication

Areas of improvement:

    • Add wireless communication (e.g., Bluetooth or Wi-Fi) to trigger mobile alerts

    • Improve the physical build by embedding LEDs into real plants or creating a more polished enclosure

    • Include a reset or mode switch to allow the user to cycle through different animation types

The final outcome:

Last video:

https://drive.google.com/drive/u/0/folders/1Kk2lkQgoAyybXSYWVmY2Dog9uQVX_DMq

Week 13 – Final Project User Testing

Link to video : https://drive.google.com/file/d/1wrsx0LpX8-KULwrtQ7UhdwTtWr0vSpkc/view?usp=sharing

To evaluate the user experience of ExpressNotes, I conducted testing sessions where participants interacted with the system without receiving any instructions or guidance. The goal was to observe how intuitive the system is and identify any points of confusion. Users were introduced to the setup with the Arduino buttons, potentiometer, and the connected P5.js visual interface on a laptop. Most users hesitated at the beginning, unsure of how to begin the interaction, with the “Connect to Arduino” button being overlooked by several participants.

Once the connection was established and users began pressing buttons, they quickly recognized that each button triggered a unique piano note along with a visual effect on the screen. This immediate feedback created a sense of play and curiosity, and users began experimenting more confidently. The relationship between physical input and audiovisual output was generally clear, although the specific mappings between notes and visuals were not always understood without further exploration or explanation.

The potentiometer was another area where users expressed confusion. While some participants guessed that it controlled volume, others assumed it affected brightness or visual intensity. The lack of on-screen feedback for the potentiometer made its purpose harder to identify. Adding a visual indicator—such as a dynamic volume bar—could significantly improve clarity and reinforce the connection between the physical control and the system’s response.
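As a quick sketch of that suggestion (not part of the current ExpressNotes code), the bar below tracks a raw 0–1023 potentiometer value; here it is simulated with mouseX so the example runs on its own, but in the project the value would come from the serial data.

let potValue = 0;

function setup() {
  createCanvas(400, 120);
}

function draw() {
  background(20);
  potValue = int(map(mouseX, 0, width, 0, 1023));  // stand-in for the serial reading

  // The bar length follows the knob directly, so users can see what it controls.
  let barW = map(potValue, 0, 1023, 0, width - 40);
  noStroke();
  fill(80);
  rect(20, 50, width - 40, 20, 10);   // track
  fill(0, 200, 120);
  rect(20, 50, barW, 20, 10);         // filled portion = current volume
  fill(255);
  textAlign(LEFT, BOTTOM);
  text("Volume", 20, 45);
}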

One common suggestion from users was to enhance the physical design of the interface. Several participants mentioned that the Arduino board and buttons looked too technical or unfinished. They recommended covering the board with a piano-style overlay so the buttons resemble piano keys. This would not only make the device more visually appealing but also give immediate context to new users, helping them understand that the interaction is musically driven.

ExpressNotes was well received as an interactive and expressive experience. Users enjoyed the audio-visual feedback and were intrigued by how the visual patterns changed with each note. However, clearer onboarding, labels for controls, a visual volume indicator, and improved hardware presentation would help users engage more quickly and confidently. These observations will guide improvements to make the project more accessible, intuitive, and enjoyable for first-time users.

Week 13 – User Testing

Concept

For my final project, I developed an interactive flocking simulation that users can control through hand gestures captured via their webcam. The project uses computer vision and machine learning to detect and interpret hand positions, allowing users to manipulate a swarm of entities (called “boids”) by making specific hand gestures.

The core concept was to create an intuitive and embodied interaction between the user and a digital ecosystem. I was inspired by the natural behaviors of flocks of birds, schools of fish, and swarms of insects, and wanted to create a system where users could influence these behaviors through natural movements.


User Testing Insights

During user testing, I observed people interacting with the system without providing any instructions. Here’s what I discovered:
  • Most users initially waved at the screen, trying to understand how their movements affected the simulation
  • Users quickly discovered that specific hand gestures (pinching fingers) changed the shape of the swarming elements
  • Some confusion occurred about the mapping between specific gestures and shape outcomes
  • Users enjoyed creating new boids by dragging the mouse, which added an additional layer of interactivity
Areas where users got confused:
  • Initially, people weren’t sure if the system was tracking their whole body or just their hands
  • Some users attempted complex gestures that weren’t part of the system
  • The difference between the thumb-to-ring finger and thumb-to-pinkie gestures wasn’t immediately obvious
What worked well:
  • The fluid motion of the boids created an engaging visual experience
  • The responsiveness of the gesture detection felt immediate and satisfying
  • The changing shapes provided clear feedback that user input was working
  • The ability to add boids with mouse drag was intuitive

Interaction (p5 side for now):

The P5.js sketch handles the core simulation and multiple input streams:

  1. Flocking Algorithm (a stripped-down sketch of this core follows the list):
    • Three steering behaviors: separation (avoidance), alignment (velocity matching), cohesion (position averaging)
    • Adjustable weights for each behavior to change flock characteristics
    • Four visual representations: triangles (default), circles, squares, and stars
  2. Hand Gesture Recognition:
    • Uses ML5.js with HandPose model for real-time hand tracking
    • Left hand controls shape selection:
      • Index finger + thumb pinch: Triangle shape
      • Middle finger + thumb pinch: Circle shape
      • Ring finger + thumb pinch: Square shape
      • Pinky finger + thumb pinch: Star shape
    • Right hand controls flocking parameters:
      • Middle finger + thumb pinch: Increases separation force
      • Ring finger + thumb pinch: Increases cohesion force
      • Pinky finger + thumb pinch: Increases alignment force
  3. Serial Communication with Arduino:
    • Receives and processes three types of data:
      • Analog potentiometer values to control speed
      • “ADD” command to add boids
      • “REMOVE” command to remove boids
    • Provides visual indicator of connection status
  4. User Interface:
    • Visual feedback showing connection status, boid count, and potentiometer value
    • Dynamic gradient background that subtly responds to potentiometer input
    • Click-to-connect functionality for Arduino communication
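For reference, here is a stripped-down version of the flocking core described in item 1, with the three steering behaviours and their adjustable weights. It is a minimal sketch of the standard Reynolds boids update, not the project's full code (which adds gesture control, shape switching, and serial input on top).

let boids = [];
let sepWeight = 1.5, aliWeight = 1.0, cohWeight = 1.0;  // adjustable behaviour weights

function setup() {
  createCanvas(640, 480);
  for (let i = 0; i < 80; i++) {
    boids.push({
      pos: createVector(random(width), random(height)),
      vel: p5.Vector.random2D().mult(2),
    });
  }
}

function draw() {
  background(20);
  for (let b of boids) {
    let sep = createVector(), ali = createVector(), coh = createVector();
    let n = 0;
    for (let other of boids) {
      if (other === b) continue;
      let d = p5.Vector.dist(b.pos, other.pos);
      if (d > 0 && d < 50) {
        sep.add(p5.Vector.sub(b.pos, other.pos).div(d * d)); // separation: avoidance
        ali.add(other.vel);                                  // alignment: velocity matching
        coh.add(other.pos);                                  // cohesion: position averaging
        n++;
      }
    }
    if (n > 0) {
      sep.setMag(2).sub(b.vel);
      ali.div(n).setMag(2).sub(b.vel);
      coh.div(n).sub(b.pos).setMag(2).sub(b.vel);
      // Weighted sum of the three steering forces, capped like a max force.
      let steer = sep.mult(sepWeight).add(ali.mult(aliWeight)).add(coh.mult(cohWeight));
      b.vel.add(steer.limit(0.2));
    }
    b.vel.limit(3);
    b.pos.add(b.vel);
    b.pos.x = (b.pos.x + width) % width;   // wrap around the canvas edges
    b.pos.y = (b.pos.y + height) % height;
    fill(200);
    noStroke();
    circle(b.pos.x, b.pos.y, 6);           // drawn as circles; the project also uses triangles, squares, stars
  }
}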

Technical Implementation

The project uses several key technologies:
  • p5 for rendering and animation
  • ML5.js for hand pose detection
  • Flocking algorithm based on Craig Reynolds’ boids simulation
  • Arduino integration (work in progress) for additional physical controls

Demo:

Arduino Integration (Work in Progress)
The Arduino component of this project is currently under development. The planned functionality includes the following; a sketch of how the p5.js side might handle these inputs appears after the list:
  • Button Controls: Multiple buttons to add/remove boids for the simulation
  • Potentiometer: A rotary control to adjust the speed of the boids in real-time, giving users tactile control over the simulation’s pace and energy
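To show how the p5.js side might dispatch these messages, here is a small illustrative handler (the names and the plain-text message format are assumptions based on the description above, not the project's code). It takes one already-read line, so it works regardless of which serial library delivers it.

let boidSpeed = 2;    // global speed used by the flocking update
let boidCount = 80;   // stand-in for adding/removing boids from the flock

function handleSerialLine(msg) {
  msg = msg.trim();
  if (msg === "ADD") {
    boidCount += 1;                      // in the project: push a new boid
  } else if (msg === "REMOVE") {
    boidCount = max(0, boidCount - 1);   // in the project: remove one boid
  } else if (msg.length > 0 && !isNaN(Number(msg))) {
    // Raw potentiometer value (0–1023) mapped to a usable speed range.
    boidSpeed = map(Number(msg), 0, 1023, 0.5, 6);
  }
}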

Week 13 – Final Project User Testing

For the final project’s user testing, I asked my friend Sarfraz to play my game “Go Ichi-Go!” Based on his feedback, and the feedback of two other players, I got some insight and areas for improvement.

A common struggle they all had was understanding the jump and dive buttons. To jump or dive, you have to press and hold the button, but all of them only tapped it on their first try. This made me realise that I should probably mark these instructions either on the console itself or in the on-screen instructions at the start of the game.

Another suggestion my friend gave me was that the game was too easy and could be faster, so I decided to modify the game accordingly: either make the whole thing faster, or increase the speed with each obstacle.

These suggestions helped me understand what can make my game more engaging and enjoyable. It also helped me understand where the players might be confused or lost.

Because my fabrication isn’t complete yet, my friends didn’t get the sweet treat at the end of the game, but they all loved the idea of it!

Video :


Assignment 13: Final Project User Testing

As we neared the due date of our final project, we were asked to conduct user testing prior to our submission.

I had two of my friends play my game, after which I had them give me feedback on what they liked, what they disliked, and any other features.

User Testing 1

User Testing 2

From their experience, I was able to gain valuable insight into what I could improve in my game.

What I needed to add:

  • An Instructions Screen: the gameplay mechanic wasn’t immediately obvious to anyone who played, so I implemented an instructions screen that the user has to go through before they can start the game.
  • Faster turnaround time between rounds: each new “round” of the game was taking a long time to load, so I shortened the delay before the next pattern is displayed.
  • User Interactivity: I noticed a slight delay between when the user pressed a button connected to my Arduino and the button lighting up and playing a sound, so I went back to my Arduino code and integrated a debouncing delay so that the experience felt more seamless (the general pattern is sketched below).
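The fix itself lives in the Arduino sketch; shown below in JavaScript for consistency with the rest of these posts, the same idea applies with millis() on the Arduino side: accept a new button event only if enough time has passed since the last accepted one. The 50 ms threshold is an assumption and would be tuned to the actual buttons.

const DEBOUNCE_MS = 50;   // assumed threshold; tune to the hardware
let lastAccepted = 0;

// Call this with the current time (e.g. millis()) whenever a press arrives.
function onButtonEvent(now, playNoteAndLight) {
  if (now - lastAccepted >= DEBOUNCE_MS) {
    lastAccepted = now;
    playNoteAndLight();   // light the button and play its sound exactly once
  }
  // Presses arriving sooner than DEBOUNCE_MS are treated as switch bounce
  // and ignored, so each physical press registers once.
}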

I plan to integrate these changes into my game for a more polished and complete feel, taking their criticisms into account.

User Testing – Week 13

Final Concept: Motion-Activated Mini Garden Visualizer

My final project is the Motion-Activated Mini Garden Visualizer, an interactive installation that uses an ultrasonic sensor and individual LEDs embedded in a miniature garden that I built. The project simulates how a garden might come to life in response to presence and movement. Instead of using a microphone to detect sound, the system now detects distance and motion. As a person approaches, LEDs light up in various colors, mimicking the way plants might react to human presence.

This visual experience is enhanced by a P5.js sketch that features lava lamp-inspired blobs. These blobs move and shift color based on proximity—creating a unified physical-digital display that changes in mood depending on how close someone is to the garden.

Why These Changes Were Made

Originally, the project was a mood tree visualizer using an LED strip and a microphone sensor. However, due to hardware limitations (the Arduino could only power 5 LEDs reliably), I shifted to using individual LED lights. At the same time, I replaced the microphone sensor with an ultrasonic sensor to create a more responsive and stable interaction system. These changes also allowed me to design a mini garden setup that feels more visually integrated and conceptually clear.

Real-World Relevance & Future Use Cases

This concept has real potential for garden and farm environments:

In gardens, it could act as a calming, responsive lighting system. On farms, the same setup could help detect animals like foxes approaching livestock (e.g., sheep). In the future, it could be upgraded with:

  • Sirens or alerts when something comes too close.
  • Automatic light deterrents to scare animals away.
  • Wireless notifications to a user’s phone.

User Testing Reflection

I conducted user testing without providing any instructions, observing how people interacted with the installation:

What worked well:
Users naturally moved closer to the garden and quickly noticed that proximity activated the lights. The more they explored, the more they connected movement with the garden’s glowing response.

What caused confusion:
Users were unsure what each light colour meant and where exactly the sensor was located.

What I had to explain:
I explained that the system was using motion (distance) sensing. This was a bit confusing at first, but once it was clarified, users became more engaged.

How I’ll improve the experience:
To make it clearer, I plan to place a small instruction that says:
“Walk closer to bring the garden to life.”
This subtle cue will help guide new users and enhance intuitive interaction.

Final Thoughts

Changing from sound-based input to motion-based sensing not only solved technical challenges, but also made the experience smoother and more immersive. The use of single LEDs in a mini garden created a more grounded and intimate installation. These changes—while unplanned—ultimately led to a more thoughtful, future-facing project that bridges creative expression with real-world functionality.

The video:

https://drive.google.com/drive/u/0/folders/1Kk2lkQgoAyybXSYWVmY2Dog9uQVX_DMq


Week 11: Reading Response

This week’s reading stayed with me more than any other. It made me think about how disability is usually approached through design, not with pride, but with silence. So many assistive devices are made to blend in, to be as invisible as possible. But what if visibility is the point? What if difference is not something to hide? This prosthetic eye with a small butterfly detail says more than a thousand clean, neutral medical devices ever could. It does not pretend to be a real eye. It says, I am here. I am not hiding.

Prosthetic eye with butterfly detail

This prosthetic eye doesn’t try to hide. It turns a medical object into a moment of expression.

That reminded me of how often people are taught to minimize themselves. To shrink into what is considered normal. Even in design school, we are often told to prioritize simplicity and universality, which sometimes ends up erasing complexity and identity. The reading showed how glasses once carried shame, but now they are fashion. We pick them in colors, shapes, and moods. That change happened because someone dared to design with beauty in mind, not because they tried to make glasses disappear.

The part about Aimee Mullins was especially striking. She does not just wear prosthetic legs. She expresses herself through them. Some are made of carved wood. Some are translucent. Some are bold and sculptural. They are not about fitting in. They are about standing in her truth. That made me wonder why assistive design is still expected to be beige, flat, and purely functional. Why do we still act like blending in is better than standing out?

This reading helped me realize something personal. I have spent so much time trying to design things that are clean, minimal, and safe. But rarely have I asked myself if they help someone feel more like themselves. That is the kind of work I want to make going forward. Not just design that works, but design that empowers. Not just access, but expression.

Owning yourself is powerful. It means showing up as you are, even when the world wants you to stay small. And design should not be about helping people disappear. It should be about helping them be seen.

Week 10: Reading Response

Bret Victor’s rant made me realize how passive we’ve become about the future of design. It’s not just that we’re stuck with screens, but that we’ve stopped questioning them. Everyone seems satisfied with touchscreens and voice assistants, as if that’s all interaction could ever be. What bothered me most is the lack of imagination. For a field that’s supposed to be creative, interaction design has become weirdly repetitive.

What stood out in the responses is that Victor isn’t against progress — he’s against settling. He points out that the tech world keeps selling the same ideas with slightly updated hardware, but very little actual vision. That feels especially relevant now, when even “futuristic” designs are just smoother versions of old things. I found myself wondering how often I do the same in my own projects. Am I just remixing what already exists, or am I really thinking about what interaction could become?

This made me think more about risk. It’s easier to build something people already understand, even if it’s a little boring. But real design, the kind Victor is asking for, takes more risk. It asks what else we could be doing, not just what works today. I want to start asking those questions earlier in my process, not just at the end when everything feels finished.