Final Project

Concept

This is a self-love journey starring a little duckling who dreams of escaping its own pixel-art world for the “real world.” To unlock the portal, the duck must collect 10 berries scattered across the landscape while avoiding the cats that steal them. Once the portal opens and the duckling arrives in our world… it discovers that transformation comes at a cost. Its body literally changes and it loses control, a transformation brought to life by an Arduino-powered robot. Was the price worth it?

(One thing I was proud of is how self-sufficient my setup was: I made sure the experience was immersive and self-directed, with notes guiding the user and help buttons to get you unstuck.)

The storyline was massively tweaked last minute: the servos burnt out, and since the IM lab didn’t have the same servos available, the robot became unstable. “Every bug is an opportunity to learn something new” – yep, I turned it into a real-life learning objective. The tagline became: Don’t try to blend into a world you don’t belong in. Stay true to yourself.

Honestly, this storyline made a lot of people smile big, and I consider that a design success.

IM SHOWCASE

p5.js Form:


Real-Life Form:

This is when I tried the robot after switching to new servos that aren’t as powerful. (It got a muffler as a makeover later on, don’t worry.)

Schematic

Image 1

Image 2

Implementation Overview

This project brings together a browser-based p5.js game and a physical Arduino robot in four main stages:

  1. Player Input

    • Mouse clicks for UI controls (Start, Help, Retry).

    • Joystick module for duck movement.

    • Arcade button for jump/interact actions.

  2. p5.js Game Loop

    • The draw() function renders the environment (drawEnvironment()), duck (drawDuck()), berries (drawBerries()), and portal (drawPortalAndSign()).

    • Berry collisions are detected via simple bounding-box checks; when berriesCollected reaches 10, the portal animation kicks in and p5.js sends the line PORTAL\n over serial (see the sketch after this list).

  3. Serial Communication

    • Uses the Web Serial API at 9600 baud.

    • p5.js → Arduino: Sends the line PORTAL (terminated with \n).

    • Arduino → p5.js (optional): Could send back status messages like READY or DONE to sync up visuals.

  4. Arduino Reaction

    • Listens on Serial1 at 9600 baud in loop().

    • On receiving "PORTAL", it:

      1. Reads an HC-SR04 distance sensor to confirm the duckling’s “arrival” position.

      2. Commands four TowerPro SG90 servos (hipL, hipR, footL, footR) through a custom transformDuck() routine to animate the body-change sequence.
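To make stages 2 and 3 concrete, here is a minimal p5.js-side sketch of the berry check and the PORTAL handshake. It is an illustration under assumptions rather than my exact code: connectSerial(), checkBerries(), sendPortalSignal(), portalSent, and the duck/berries objects are example names I’m using here; berriesCollected and the PORTAL line match the real project.

// Illustrative p5.js sketch: berry collection + PORTAL handshake over Web Serial.
// duck, berries, connectSerial(), checkBerries(), portalSent are example names.
let duck = { x: 200, y: 200, size: 40 };
let berries = [];          // filled elsewhere with { x, y, size, collected } objects
let berriesCollected = 0;
let portalSent = false;
let serialPort, serialWriter;

async function connectSerial() {
  // Web Serial must be started from a user gesture, e.g. the Start button
  serialPort = await navigator.serial.requestPort();
  await serialPort.open({ baudRate: 9600 });
  serialWriter = serialPort.writable.getWriter();
}

function checkBerries() {
  // called from draw() every frame
  for (let b of berries) {
    if (b.collected) continue;
    // simple bounding-box overlap between duck and berry
    if (abs(duck.x - b.x) < (duck.size + b.size) / 2 &&
        abs(duck.y - b.y) < (duck.size + b.size) / 2) {
      b.collected = true;
      berriesCollected++;
    }
  }
  if (berriesCollected >= 10 && !portalSent) {
    portalSent = true;
    sendPortalSignal();
  }
}

async function sendPortalSignal() {
  if (!serialWriter) return;
  // the Arduino watches incoming serial for the line "PORTAL"
  await serialWriter.write(new TextEncoder().encode("PORTAL\n"));
}

On the other side, the Arduino simply reads incoming lines in loop() and, once it sees PORTAL, runs the distance check and the transformDuck() sequence described above.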

Interaction Design

You guide the duckling with the joystick, steering it through the world until it picks up the tenth berry. The moment that last berry is collected, the portal opens and the duckling automatically steps through. The game then shifts into a brief slideshow that reveals how the duckling’s body changes and what it pays for its journey. When the slideshow ends, the view fades out and your Arduino-powered biped robot springs into motion, reenacting that very transformation in the real world.
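Under the hood, that flow is basically a small state machine inside draw(). Here is a rough sketch of how I think about the transitions; the state names and the drawTransformationSlide(), slideshowFinished(), and drawFadeOut() helpers are placeholders for this example, while the four draw functions and berriesCollected come from the actual game.

// Illustrative game-state sketch; state names and slideshow helpers are placeholders.
let gameState = "PLAY";

function draw() {
  if (gameState === "PLAY") {
    drawEnvironment();
    drawBerries();
    drawDuck();
    drawPortalAndSign();
    // berriesCollected is updated by the berry-collision logic
    if (berriesCollected >= 10) {
      gameState = "SLIDESHOW";      // portal opens, duckling steps through
    }
  } else if (gameState === "SLIDESHOW") {
    drawTransformationSlide();      // the body-change reveal
    if (slideshowFinished()) {
      gameState = "END";            // fade out; the robot takes over in the real world
    }
  } else {
    drawFadeOut();
  }
}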

Arduino 2 Code

#include <Servo.h>

// Servo setup
Servo hipL, hipR, footL, footR;

// offsets
int hipLOffset  = -3;
int hipROffset  = 6;
int footLOffset = 0;
int footROffset = 4;

// Helper to apply offset
void writeCorrected(Servo &s, int angle, int offset) {
  s.write(angle + offset);
}

// Neutral standing pose
void standStill() {
  writeCorrected(hipL, 90, hipLOffset);
  writeCorrected(hipR, 90, hipROffset);
  writeCorrected(footL, 90, footLOffset);
  writeCorrected(footR, 90, footROffset);
}

void setup() {
  hipL.attach(3);
  hipR.attach(5);
  footL.attach(6);
  footR.attach(9);
  
  standStill();
  delay(500);
}

void loop() {
  // Left leg forward
  writeCorrected(hipL, 55, hipLOffset);     // swing left
  writeCorrected(hipR, 90, hipROffset);     // right still
  writeCorrected(footL, 90, footLOffset);
  writeCorrected(footR, 92, footROffset);   // small push
  delay(150);

  standStill();
  delay(80);

  // Right leg forward
  writeCorrected(hipL, 90, hipLOffset);     // reset left
  writeCorrected(hipR, 55, hipROffset);     // swing right
  writeCorrected(footL, 96, footLOffset);   // stronger push
  writeCorrected(footR, 90, footROffset);
  delay(150);

  standStill();
  delay(80);
}

Honestly, this project went through a bit of an identity crisis — just like the duck. What started as a simple p5.js game about collecting berries turned into a weirdly emotional tech-story about self-love, malfunctioning servos, and a robot duck with a muffler.

When the servos burnt out last minute and I had to rewrite the whole ending, I was this close to panicking. But instead, I leaned into the chaos and built a new storyline around it. And somehow… it worked better than the original. People loved it, and I made so many new contacts through the IM showcase because we got to talk about the project. Some even laughed at the transformation twist. That was enough for me.

The duckling wanted to escape its pixel world, but maybe it didn’t need to.
And maybe I didn’t need everything to go perfectly for the idea to land either.

So yeah — the duck glitched, the servos wobbled, but the message came through.
Stay weird. Stay you.

Enjoy some of the artworks used:
Duckling Game 4
Duckling Game 1
Duckling Game 2
Duckling Game 3
Duckling Game 5

User Testing

Overview

Berry Blessings is an Arduino joystick-controlled p5.js game where a duck must collect 10 berries to unlock a magical portal to the real world. Once the player reaches the portal, an Arduino-powered robot is meant to activate and move toward a fairy setup, creating a double-world experience.

After receiving feedback that the previous storyline was too abstract and difficult to understand, I redesigned the narrative to be more intuitive and enjoyable. The new version provides a clear goal, collecting berries to unlock a portal, and ends with a satisfying real-world reward.

Two users participated in a no-instruction playtest of the p5.js portion of the game.

What Worked Well

  • Joystick Controls: Users figured out joystick movement after some experimentation. Movement felt smooth once they adjusted to it.

  • Berry Counter: The on-screen “Berries: X / 10” counter gave clear feedback and helped users understand progression.

  • Visual Design: The duck animations, bright environment, and sign reading “Collect 10 berries to pass” helped anchor the gameplay and made objectives more understandable.

  • Revised Storyline: The updated goal and real-life payoff made the experience feel more engaging and easier to follow.

Challenges & Observations

  • Joystick Direction Mapping: Both users were initially confused by which way the duck would move relative to joystick direction.

    Planned fix: Add visual direction indicators (e.g. arrows or compass).

  • Robot Frustration (During Earlier Testing): Friends testing the robot portion before the Arduino broke reported frustration with its slow, unstable movement.

    Plan: Replace with a more reliable wheeled robot for smoother performance.

  • Duck Centering Bug: One user pointed out that the duck appeared too far from the center when changing directions.

    Fix: Adjusted duck positioning to remain centered during direction changes.

  • Hardware Damage: The Arduino Uno was accidentally damaged due to adjusting connections while powered on, so it could not be used during this round of user testing.

Fixes & Next Steps

Already Implemented:

  • Rewrote the story to improve clarity and engagement.

  • Added a sign by the portal to explain the berry objective.

  • Included a working berry counter.

  • Fixed duck position offset when changing directions.

In Progress:

  • Sourcing a new Arduino Uno to replace the damaged one.

  • Planning to switch to a wheeled robot for improved reliability.

  • Will add visual joystick direction indicators to help users orient themselves.

I will post an update before next class once the hardware is replaced and re-integrated.

Week 11: Reading Response

This week’s reading stayed with me more than any other. It made me think about how disability is usually approached through design, not with pride, but with silence. So many assistive devices are made to blend in, to be as invisible as possible. But what if visibility is the point? What if difference is not something to hide? This prosthetic eye with a small butterfly detail says more than a thousand clean, neutral medical devices ever could. It does not pretend to be a real eye. It says, I am here. I am not hiding.

Prosthetic eye with butterfly detail

This prosthetic eye doesn’t try to hide. It turns a medical object into a moment of expression.

That reminded me of how often people are taught to minimize themselves. To shrink into what is considered normal. Even in design school, we are often told to prioritize simplicity and universality, which sometimes ends up erasing complexity and identity. The reading showed how glasses once carried shame, but now they are fashion. We pick them in colors, shapes, and moods. That change happened because someone dared to design with beauty in mind, not because they tried to make glasses disappear.

The part about Aimee Mullins was especially striking. She does not just wear prosthetic legs. She expresses herself through them. Some are made of carved wood. Some are translucent. Some are bold and sculptural. They are not about fitting in. They are about standing in her truth. That made me wonder why assistive design is still expected to be beige, flat, and purely functional. Why do we still act like blending in is better than standing out?

This reading helped me realize something personal. I have spent so much time trying to design things that are clean, minimal, and safe. But rarely have I asked myself if they help someone feel more like themselves. That is the kind of work I want to make going forward. Not just design that works, but design that empowers. Not just access, but expression.

Owning yourself is powerful. It means showing up as you are, even when the world wants you to stay small. And design should not be about helping people disappear. It should be about helping them be seen.

Week 10: Reading Response

Bret Victor’s rant made me realize how passive we’ve become about the future of design. It’s not just that we’re stuck with screens, but that we’ve stopped questioning them. Everyone seems satisfied with touchscreens and voice assistants, as if that’s all interaction could ever be. What bothered me most is the lack of imagination. For a field that’s supposed to be creative, interaction design has become weirdly repetitive.

What stood out in the responses is that Victor isn’t against progress — he’s against settling. He points out that the tech world keeps selling the same ideas with slightly updated hardware, but very little actual vision. That feels especially relevant now, when even “futuristic” designs are just smoother versions of old things. I found myself wondering how often I do the same in my own projects. Am I just remixing what already exists, or am I really thinking about what interaction could become?

This made me think more about risk. It’s easier to build something people already understand, even if it’s a little boring. But real design, the kind Victor is asking for, takes more risk. It asks what else we could be doing, not just what works today. I want to start asking those questions earlier in my process, not just at the end when everything feels finished.

Week 9: Reading Response

Physical Computing’s Greatest Hits (and misses)

I often feel like nothing is original anymore. Every time I come up with an idea, I search it and find five people who have already done it, sometimes in more impressive ways. That can be discouraging. It makes me wonder what the point is if everything has already been made. But reading Tom Igoe’s piece helped shift that mindset. He talks about the “greatest hits” of physical computing — projects like musical gloves or motion-controlled sounds — not as clichés, but as classic forms that people keep coming back to. These ideas repeat because they are approachable, fun, and full of room for variation.

What I appreciated most was the reminder that repetition doesn’t cancel out creativity. A musical glove might not be new, but the way I make it, the story I tell through it, and how I design the experience can still feel personal. Igoe encouraged adding a twist, and that made me realize I do not have to be original in concept, but in execution.

I also liked his point about meaningful gestures. A motion that triggers a sound might technically work, but if the movement feels random or doesn’t make sense in the context, the interaction loses impact. That made me think more critically about how I design user input. I want people to feel like what they do matters, and that their actions are met with responses that feel natural and thoughtful. That, to me, is the real magic of interaction.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

Tom Igoe’s post made me realize I often over-explain my work. I worry people won’t get it unless I guide them, but he makes a strong case for stepping back. In interactive art, it’s not just about what I make. It’s about what the audience does with it.

I liked how he compared it to setting a stage. I provide the space and tools, but the audience brings it to life. That means accepting unexpected interpretations and trusting the piece to speak for itself. I think good design should be guidance enough. If the design is clear and intentional, it should naturally lead the audience through the experience without me having to explain everything.

Moving forward, I want to create work that invites exploration without over-directing. That kind of openness feels more honest and more meaningful.

Week 8: Reading Response

Emotion & Design: Attractive things work better

This week’s reading made me laugh a little because it called me out directly. Norman’s idea that attractive things work better really stuck with me, but not because I think they literally work better. It’s because they make us feel better about them. That feeling changes how we treat the object. Case in point: this pink-bow mug I saw online. I would buy it instantly just because it’s cute. But if I actually tried drinking from it, I know the bow handle would probably poke me in the eye. And yet, I still want it.

Cute pink bow mug

I would 100% buy this, and it would 100% poke my eye.

It reminded me of how we collect things like stickers or fancy stationery just to admire them and never actually use them. Sometimes, function becomes secondary when something looks good enough. Norman makes the case that beauty improves usability by creating positive emotions, but I think this also raises a bigger question. How far are we willing to let go of functionality just to have something pretty? And when does that stop being design and start becoming decoration? It’s something I want to think about more in my own work. I still want my interactive projects to function well, but maybe it’s okay to prioritize joy and visual pleasure too. Sometimes, looking at something beautiful is the function.

Her Code Got Humans on the Moon

Reading about Margaret Hamilton reminded me why I love creative tech in the first place. She wasn’t just writing code. She was building the foundation of what software could even be. What really stood out to me wasn’t only that she helped put people on the moon, but that she did it at a time when software engineering wasn’t even considered real engineering. She coined the term herself because nobody else was giving it the weight it deserved. That says a lot about the kind of vision she had. She wasn’t just part of the system. She was defining it.

What I found especially inspiring was her mindset around error handling. She didn’t assume the user would always follow instructions perfectly. She designed with failure in mind, and made sure the code could still function under pressure or human error. That’s a mindset I want to carry into my own work, especially when building interactive projects. Not everything needs to be perfect, but it should be ready for the unexpected. That’s not just smart coding, it’s thoughtful design. The user might not always know what to do, but the system should be kind enough to keep going.

Week 5: Reading Response

Seal and puppy lookalikes

 

When I look at this photo of a seal and a puppy that somehow look like long-lost twins, my human brain gets the joke instantly. I can tell they’re two different animals, and I also get why they’re being compared. There’s context, humor, and visual nuance involved. But for a computer, that kind of recognition isn’t simple. Computer vision doesn’t work like human vision. We interpret meaning and emotion, while a computer just sees pixels, shapes, and patterns. Golan Levin’s essay really drove that home. Computers are not seeing the world, they’re processing data through whatever narrow lens we’ve given them.

To help computers understand what we want them to track, we use things like face detection, color tracking, optical flow, and trained models. These tools help narrow the field and make the computer’s “guess” more accurate. But still, it’s guessing. A puppy that looks like a seal might completely throw it off if the system wasn’t trained on edge cases like this. That’s part of what makes working with computer vision both fascinating and fragile.

In interactive art, computer vision opens up exciting possibilities. We can create responsive environments, playful installations, and performances that react to motion and presence. But the same tools are also used in surveillance and monitoring. There’s a tension between creativity and control that we can’t ignore. As an artist, I think it’s important to design with awareness. Just because the system can track someone doesn’t mean it should. I want to create interactions that feel intentional and thoughtful, not invasive. At the end of the day, I want the system to respond with care, not just accuracy.

Week 4: Data Visualization

For this week’s production assignment on data visualization and generative artwork, I wanted to capture a very real and very dramatic part of my daily life: my cat Pieni’s obsession with food. Even when she’s been fed, she still manages to ask for more every single time, like she hasn’t eaten in centuries.

So I thought, why not visualize her lies?

Concept:

This project is a simple, animated bar chart that compares two things across a typical day:

  • How many times I actually fed Pieni

  • How many times she asked for food anyway

How It Works

Each time slot (from 8AM to 8PM) has two bars:

  • A light blue bar representing the single time I fed her (yep, I did my part).

  • A dark blue bar that pulses with animation to show how many times she pretended to be starving during that same period.

I added a slight pulsing animation to the begging bars to reflect how annoyingly persistent and dramatic her pleas are—even when her bowl is full.
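If you’re curious how the pulse is done, here is a stripped-down sketch of the idea; the counts, colors, and layout numbers are made up for this example rather than Pieni’s real data.

// Illustrative p5.js sketch of the pulsing bar chart (example data, not the real counts).
let labels = ["8AM", "12PM", "4PM", "8PM"];
let fedCounts = [1, 1, 1, 1];        // times I actually fed her per slot
let beggedCounts = [3, 5, 4, 6];     // times she "starved" anyway

function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(245);
  let slotWidth = width / labels.length;
  for (let i = 0; i < labels.length; i++) {
    let x = i * slotWidth + slotWidth / 2;

    // static light-blue bar: actual feedings
    fill(173, 216, 230);
    rect(x - 30, height - fedCounts[i] * 30 - 40, 25, fedCounts[i] * 30);

    // pulsing dark-blue bar: begging, scaled by a sin() wobble
    let pulse = 1 + 0.05 * sin(frameCount * 0.1 + i);
    fill(25, 60, 140);
    rect(x + 5, height - beggedCounts[i] * 30 * pulse - 40,
         25, beggedCounts[i] * 30 * pulse);

    // time label under each pair of bars
    fill(0);
    textAlign(CENTER);
    text(labels[i], x, height - 20);
  }
}

Driving the scale with sin(frameCount * 0.1) keeps the begging bars gently “breathing” without any extra timing logic.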

Challenges & Decisions

While this project wasn’t technically hard, the main challenge was design clarity. I didn’t want it to look like a boring spreadsheet. I wanted it to be:

  • Aesthetic and cat-meme friendly

  • Easy to understand at a glance

  • Somewhat interactive (through animation)

I spent a bit of time tweaking:

  • The color scheme (cool blues)

  • Bar spacing and layout

  • Centering and visual alignment

  • The legend—cleanly placed in the top-right instead of labeling every bar

What I Learned

This week helped reinforce how storytelling and humor can make even simple data visualizations fun and engaging. It also helped me practice:

  • Pulse animation with sin() for movement

  • Using clean design principles in p5.js

  • Balancing simplicity with personality

Week 4: Reading Response

After reading Norman, I kept thinking about how often I’ve felt genuinely embarrassed, not because I did something wrong, but because something was so badly designed that it made me look like I didn’t know what I was doing. I’ve blamed myself so many times for design failures, but Norman makes it clear that it’s not me, it’s the object. One thing that still annoys me is the sink setup at Dubai Airport. The soap, water, and dryer are all built into one sleek bar, with no sign telling you which part does what. You just keep waving your hands around and hope something responds. Sometimes the dryer blasts air when you’re trying to get soap, or nothing works at all. To make things worse, some mirrors have Dyson hand dryers built in, others have tissues hidden somewhere off to the side, and there’s no way to know without ducking and peeking like a crazy person. Norman’s point about discoverability and signifiers felt especially real here. One simple label could fix all of it.

In my interactive media work, I’m starting to think more about how people approach what I build. Norman’s ideas about system image and mental models stuck with me. If someone doesn’t know what they’re supposed to do when they see my sketch, I’ve already failed as a designer. In my work, I try to make interactive elements obvious and responsive. If something is clickable, it should look like it. If something changes, the feedback should be clear. The goal is to make users feel confident and in control, not confused or hesitant. Good design doesn’t need to explain itself. It should just make sense.

Week 3: Reading Response on What Makes Interaction Strong?

After going through this week’s reading, I realized how often the word “interactive” gets thrown around. Chris Crawford makes a sharp distinction between reaction and interaction. Just because something responds doesn’t mean it’s truly interactive. For him, strong interaction happens only when a system “listens, thinks, and responds thoughtfully,” like a real conversation. He emphasizes that true interaction requires both sides to be active participants. That stuck with me, especially since we often label anything that reacts to input as interactive, even when it’s really just one-sided. I liked how Crawford stripped the term down to something clear. It’s not about bells and whistles, it’s about actual communication.

Looking back at my own p5.js sketches, I was intentional about making them gamified because I wanted them to feel interactive, not just look interactive. I wanted them to look more like a game than a GIF because only by interacting with an artwork do you really get to sense what went behind it and what it stands for. I love the effect of something actually happening because of a user’s input. It gives users a sense of presence, like they’re not just observing but actively shaping what unfolds. That moment of response makes people feel like they’re part of the piece, not just pressing buttons on the outside. It’s rewarding for both the user and the creator, and it’s the kind of experience I want to keep building on. To me, interactivity is a back-and-forth communication.