Week 14 Final Project Documentation

  • Include some pictures / video of your project interaction
    • user testing 2 (see this clip for full interaction):
      • https://drive.google.com/file/d/11MYknXjQfZ2JDwCdM1UDdrmKWQnlF0M7/view?usp=sharing (sorry Professor — WordPress kept trying to upload this long video file for over an hour, so I switched to Google Drive)
    • connecting
      • building handshake video clip
    • p5 interface
    • robot headshot
  • Describe your concept

For my final project, I created a physically interactive Tamagotchi. It is a mini robot creature that the user can pat, touch, spin, and talk to, but instead of being cute and grateful, it responds in unexpectedly pessimistic and slightly hostile ways. The project combines physical interaction through Arduino sensors with a character interface and dialogue system in p5. The concept is inspired by classic Tamagotchi toys, small handheld digital pets that demand constant attention and reward care with affection. In contrast, my project imagines a near-future world in which artificial beings no longer need, or even want, human caretaking. This pet has a personality shaped by human environmental destruction and techno-optimism gone wrong, and it is deeply unimpressed by humans.

  • How does the implementation work?
    • Description of Arduino code and include or link to full Arduino sketch
    • Schematic of your circuit (hand drawn or using tool)
    • Description of p5.js code and embed p5.js sketch in post
        • The link includes both the p5 sketch and the commented Arduino code: https://editor.p5js.org/joyzheng/sketches/mmYysys_A

The system consists of two main components: a physical controller powered by an Arduino and a visual interface running in p5. The physical body uses a potentiometer to detect rotation (spinning) and three Force Sensitive Resistors (FSRs) placed on the left, right, and back to detect touch. An 8×8 NeoPixel matrix serves as the robot’s physical face. p5 handles the game logic, visual assets, and audio, while the Arduino handles raw sensor data acquisition and LED matrix control.
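To make that split concrete, here is a minimal sketch of the Arduino side of the loop. The pin assignments and the comma-separated message format are placeholders rather than my exact values; the full, commented Arduino code is in the p5 link above.

// Minimal sketch of the sensor-acquisition loop (pins and message format are placeholders)
const int POT_PIN = A0;        // rotation (spin) input
const int FSR_LEFT_PIN = A1;   // left touch pad
const int FSR_RIGHT_PIN = A2;  // right touch pad
const int FSR_BACK_PIN = A3;   // back "patting" pad

void setup() {
  Serial.begin(9600);
}

void loop() {
  int pot = analogRead(POT_PIN);         // 0-1023 rotation reading
  int left = analogRead(FSR_LEFT_PIN);   // 0-1023 pressure readings
  int right = analogRead(FSR_RIGHT_PIN);
  int back = analogRead(FSR_BACK_PIN);

  // send one comma-separated line per frame; p5 splits it and runs the game logic
  Serial.print(pot);
  Serial.print(",");
  Serial.print(left);
  Serial.print(",");
  Serial.print(right);
  Serial.print(",");
  Serial.println(back);

  delay(50);  // keep the serial stream at a manageable rate
}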

  • Description of interaction design

The interaction is designed to simulate a moody robot. Spinning the potentiometer quickly disorients it, triggering a dizzy or angry state that turns the physical NeoPixel matrix into an angry red face. Similarly, squeezing the robot by pressing both the left and right sensors evokes annoyance. In keeping with the creature’s difficult personality, soothing it requires specific back-sensor patting, which is the only way to reset the angry state to calm. When the creature is already calm, playful pokes on individual sensors trigger a single pat reaction, causing the physical face to cycle through various calm colors. Leaving it idle, however, results in aggressive dialogue. After user testing, I added visual cues such as an animated arrow and a wiggling pointer to help guide the user through these interactions. To further immerse the user, the background music dynamically shifts to match the robot’s mood, transitioning from a soothing melody to an intense track whenever the angry state is triggered.
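The actual decision logic for these transitions runs in the p5 sketch, but the gist can be sketched in Arduino-style C; the threshold values below are placeholders, not the numbers I tuned for the real sensors.

// Gist of the mood transitions (the real logic lives in p5; thresholds are placeholders)
const int SQUEEZE_THRESHOLD = 300;  // pressure needed on both side pads at once
const int PAT_THRESHOLD = 200;      // pressure needed on the back pad to soothe it
const int SPIN_THRESHOLD = 150;     // change in pot reading per frame that counts as a fast spin

bool isAngry = false;

void updateMood(int potDelta, int left, int right, int back) {
  bool spunFast = abs(potDelta) > SPIN_THRESHOLD;
  bool squeezed = (left > SQUEEZE_THRESHOLD) && (right > SQUEEZE_THRESHOLD);
  bool pattedOnBack = back > PAT_THRESHOLD;

  if (spunFast || squeezed) {
    isAngry = true;        // dizzy/angry: red face, hostile dialogue, intense music
  } else if (isAngry && pattedOnBack) {
    isAngry = false;       // only back-sensor patting calms it down again
  }
  // when calm, a poke on a single sensor triggers a one-off pat reaction instead
}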

  • Description of communication between Arduino and p5.js

The Arduino sketch is responsible for reading the four sensors and driving the 8×8 NeoPixel matrix. I established a handshake protocol during setup, ensuring the Arduino waits for a valid connection before entering its processing loop. The Arduino sends raw sensor strings to the computer, while p5 returns specific logic flags like isAngry and triggerColorChange. I wrote a specific algorithm for the triggerColorChange flag to ensure that when the robot cycles through colors, it never selects the same color twice in a row. The p5 sketch functions as the brain of the operation, managing the state machine that dictates whether the robot is calm or angry. It loads pixel art sprites and implements a typing effect for the dialogue, simulating the text scrolling of a retro RPG to enhance the vintage atmosphere.
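Two pieces of this are worth sketching. The handshake follows the style of our class serial examples, and the no-repeat color pick simply re-rolls until the new index differs from the last one. The details below (the “0,0,0,0” placeholder message, the color table) are illustrative rather than my exact code.

// Handshake during setup: keep announcing until p5 sends something back
void waitForP5() {
  while (Serial.available() <= 0) {
    Serial.println("0,0,0,0");  // placeholder message so p5 knows the Arduino is alive
    delay(300);
  }
}

// No-repeat color pick used when the triggerColorChange flag arrives
const uint32_t CALM_COLORS[] = {0x00FF7F, 0x00BFFF, 0xFFD700, 0xDA70D6, 0xFFFFFF};
const int NUM_COLORS = 5;
int lastColorIndex = -1;

uint32_t nextCalmColor() {
  int index;
  do {
    index = random(NUM_COLORS);       // pick a candidate color
  } while (index == lastColorIndex);  // re-roll if it matches the previous pick
  lastColorIndex = index;
  return CALM_COLORS[index];
}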

  • What are some aspects of the project that you’re particularly proud of?

I am particularly proud of successfully utilizing laser-cut acrylic for the physical enclosure, marking my first time working with this material. Unlike other prototyping materials, the permanent nature of acrylic demanded rigorous measurement and planning, as there was no room for error once a cut was made. This requirement for precision significantly increased the time investment for the physical build compared to previous projects. Working through this learning curve sharpened my fabrication skills, and I now look forward to designing and creating even more complex acrylic models in future iterations of my work.

  • Include a 1-2 paragraph statement “How this was made” that describes what tools you used including how you used generative AI. Explain how the code was written, the design was made, and how the writeup was done. Credit the media sources (e.g. self-created, from a site, or generated).

This project built on our Arduino–p5 serial communication sessions, so I reviewed Professor Mang’s code mainly to refresh my memory on the mechanics. I also improved my typing effect by referencing this sketch: https://editor.p5js.org/xc2736/sketches/1igkPpfX5. The core serial communication structure was adapted from class examples covering serial potentiometers. I used AI to help plan the robot logic and debug hardware issues; Gemini helped me resolve a conflict between the 16-bit color format of the graphics library and the 32-bit color requirements of the NeoPixel matrix. For the visuals, I developed assets by using ChatGPT to pixelate my scratchy hand-drawn drafts, while the pointer asset was sourced from Adobe Stock and stretched using Affinity.

  • What are some areas for future improvement?

Looking toward future iterations, I plan to expand the physical feedback by integrating NeoPixel strips into the robot’s skeleton so its mood colors radiate through its entire body rather than just the face. I also want to enhance visual feedback by adding circular gradient cues in p5 that react to specific sensor inputs, alongside refining the pixel art sprites with more detailed animation frames.

Week 13 User Testing

IMG_9844

During testing, users were initially unsure that the robot was an interactive system. This uncertainty came from the absence of clear cues inviting them to engage with it. Once interaction began, the rotational feature worked well because it was clearly displayed on the screen and responded in a way users understood. The FSR pad required more investigation, since the connection between pressure input and system feedback was less obvious. Many users spent extra time experimenting to figure out how the pressure sensor influenced the experience. Another practical issue that surfaced was wire management. When users spun the device, exposed cables sometimes tangled, which interrupted the interaction and caused hesitation. This showed that the physical setup needs better organization to keep the experience smooth and uninterrupted. Overall, while the rotation feedback was intuitive, the pressure-based control and cable layout required explanation. Clearer visual guidance, simple onboarding hints, and proper cable routing would make the system easier for first-time users to understand and operate.

 

Week 12: Commit to Final Proposal

For my final project, I will create a physically interactive Tamagotchi. It is a mini robot creature that the user can pat, touch, spin, and talk to, but instead of being cute and grateful, it responds in unexpectedly pessimistic and slightly hostile ways. The project combines physical interaction through Arduino sensors with a character interface and dialogue system in p5. The concept is inspired by classic Tamagotchi toys, small handheld digital pets that demand constant attention and reward care with affection. In contrast, my project imagines a near-future world in which artificial beings no longer need, or even want, human caretaking. This pet has a personality shaped by human environmental destruction and techno-optimism gone wrong, and it is deeply unimpressed by humans.

Physically, the project will take the form of a small creature-like object mounted on a potentiometer so that the whole robot can be spun like a tiny rotating idol. The main interactions happen through touch and rotation. The user can pat or press the creature via a force sensor embedded in its body, and they can spin it to face different directions using the potentiometer as a rotation input. Inside the body, the creature uses NeoPixels to show changing emotions through light patterns, while on the computer a p5 interface displays a larger animated avatar of the creature and shows its dialogue text.

Unlike typical virtual pets that reward attention with affection and gratitude, this creature is intentionally negative and resistant. When the user pats it, it might snap: “Go away, you silly little human.” When the user spins it so that it faces away, it might respond: “Correct. I prefer not to see you.” If the user keeps spinning it quickly, the creature may complain: “Dizzy. This is abuse, not affection.” When the robot is left facing a corner with its back turned to the user, it may mutter: “Finally. A view without humans in it.” The rotation angle therefore becomes a key part of the interaction design. Different angular positions correspond to different stances or modes of the creature, and those modes drive both the NeoPixel emotion effects on the physical object and the dialogue responses on the p5 screen.

On the Arduino side, the project relies on two main inputs that are sent to p5 over serial communication. The first is a force sensor used as a pat or squeeze input. The Arduino continuously reads the analog value from the FSR, maps the raw reading to a smaller range such as 0 to 10 representing pat intensity, and sends this information to p5 in the form of tagged serial messages like “PAT:<value>”. A reading of “PAT:0” would mean no touch, while something like “PAT:9” would correspond to an aggressive squeeze. The second input is the potentiometer that encodes the robot’s rotation angle. The creature is physically attached to the shaft of the potentiometer so that when the user spins the creature, they are directly rotating the pot. The Arduino reads the analog value from the potentiometer, originally in the range 0 to 1023, and maps it either to a normalized angle between 0 and 359 degrees or to a set of discrete orientation zones. For example, Zone 0 can represent facing the user, Zone 1 slightly turned to the left, Zone 2 slightly turned to the right, and Zone 3 completely turned away with its back to the user. The Arduino then sends periodic messages to p5 such as “ANGLE:<value>” for the continuous angle or “ZONE:<id>” for the discrete orientation. As a stretch feature, the Arduino can also compare the current angle with the previous reading to estimate spin speed and send additional messages such as “SPIN:FAST” or “SPIN:SLOW” if there is enough time to implement this.
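A minimal sketch of how the Arduino could format these tagged messages follows; the pin choices, zone boundaries, spin threshold, and timing are placeholders for the proposal rather than final values.

// Proposed tagged-message protocol (pins, boundaries, and thresholds are placeholders)
const int FSR_PIN = A0;
const int POT_PIN = A1;
int lastAngle = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // pat intensity: raw 0-1023 mapped down to 0-10
  int pat = map(analogRead(FSR_PIN), 0, 1023, 0, 10);
  Serial.print("PAT:");
  Serial.println(pat);

  // rotation: raw 0-1023 mapped to 0-359 degrees, then bucketed into four zones
  int angle = map(analogRead(POT_PIN), 0, 1023, 0, 359);
  int zone;
  if (angle < 45 || angle >= 315) zone = 0;  // facing the user
  else if (angle < 135)           zone = 1;  // turned slightly left
  else if (angle >= 225)          zone = 2;  // turned slightly right
  else                            zone = 3;  // back turned
  Serial.print("ZONE:");
  Serial.println(zone);

  // stretch feature: estimate spin speed from the change since the last reading
  if (abs(angle - lastAngle) > 40) {
    Serial.println("SPIN:FAST");
  }
  lastAngle = angle;

  delay(100);
}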

The Arduino is also in charge of several outputs, primarily the NeoPixels that visualize the creature’s emotional state. The NeoPixels are used to display different moods and orientations through color and animation patterns. The Arduino listens for commands coming from p5, such as “MOOD:ANGRY”, “MOOD:BORED”, “MOOD:AMUSED” or “MOOD:DISGUSTED”, and possibly additional tags like “DIR:FRONT”, “DIR:LEFT”, “DIR:RIGHT” and “DIR:BACK” that encode the direction it should appear to be facing. For each combination of mood and orientation, the Arduino selects a specific pattern from a small internal lookup table of NeoPixel animations. For instance, when the creature is facing the user and annoyed, the LEDs might show sharp, high-contrast flashing patterns. When it is turned away, the colors might become dim and cold to signal that it is ignoring the user. When the user spins it quickly, it might display chaotic, flickering lights to suggest dizziness and disturbance. In this way, the Arduino acts as the body-level controller that turns high-level mood messages from p5 into concrete light and motion behaviors on the physical pet.
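A sketch of the command parsing and the mood/orientation lookup is below. The command names come from the proposal above, but the color values and helper structure are placeholders rather than a finished implementation.

// Parse incoming commands from p5 and pick a matching NeoPixel color (placeholder values)
String currentMood = "BORED";
String currentDir = "FRONT";

void processCommand(String command) {
  if (command.startsWith("MOOD:")) {
    currentMood = command.substring(5);  // e.g. "ANGRY", "BORED", "AMUSED", "DISGUSTED"
  } else if (command.startsWith("DIR:")) {
    currentDir = command.substring(4);   // e.g. "FRONT", "LEFT", "RIGHT", "BACK"
  }
}

uint32_t pickColor() {
  // small lookup from mood + direction to a 32-bit NeoPixel color
  if (currentMood == "ANGRY" && currentDir == "FRONT") return 0xFF0000;  // sharp, flashing red
  if (currentDir == "BACK")                            return 0x001030;  // dim and cold: ignoring you
  if (currentMood == "AMUSED")                         return 0xFFA500;  // a warm, begrudging glow
  return 0x102010;                                     // default low idle glow
}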

On the p5 side, the program handles visual behavior, dialogue, and integration of the serial data coming from Arduino. The main visual element is a two-dimensional avatar of the creature whose orientation mirrors the potentiometer readings. When Arduino reports that the creature is in Zone 0, facing the user, the avatar will be drawn facing forward. When it reports Zone 1 or Zone 2, the avatar will turn slightly left or right. When it reports Zone 3, the avatar will show its back or a dismissive side profile. Background layers or subtle interface elements can reinforce the sense of orientation, for example by using a spotlight effect when the creature faces the user, and a shadowy or desaturated background when it turns away.

The p5 sketch keeps track of several state variables. It records the current orientation zone or angle reported by Arduino, the most recent pat intensity from the “PAT:<value>” messages, and the time since the last interaction to detect whether the user is ignoring the creature or constantly bothering it. Based on these values, p5 chooses a mood state such as “Annoyed”, “Dizzy”, “Dismissive” or “Begrudgingly Attentive”. That mood state determines the avatar’s expression, including eyes, mouth shape, and posture, as well as background color or small motion effects like shaking or pulsing. Whenever the mood changes, p5 also sends the corresponding mood label back to the Arduino, for example “MOOD:DISMISSIVE”, so the NeoPixels can stay synchronized with the on-screen visuals.

Dialogue and personality are deeply connected to rotation and touch. p5 interprets the angle or orientation zone in semantic terms. When the creature is facing the user in Zone 0, it selects lines that complain about being watched, such as “Why are you staring? I do not perform on command.” When it is turned slightly away in Zones 1 or 2, it may comment on the user’s persistence with lines like “You still there? Unfortunately, yes.” When it is turned completely away in Zone 3, it chooses more extreme dismissive phrases such as “This direction is better. Less human.” If the system detects fast spinning, it can draw from a set of dizzy or abused responses like “Stop spinning me. I am not a fidget toy.”

Beyond instantaneous input, p5 maintains some simple memory over time. It tracks how often and how strongly the creature has been patted in the recent past and how often the user has spun it back and forth between zones. By combining rotation data with touch data, the system can generate interaction-dependent responses. For example, if the user keeps forcing the creature to face them by repeatedly moving it back into Zone 0 after it has “chosen” to be in Zone 3, the creature can complain about humans forcing attention with lines such as “You keep dragging me back. Typical human behavior.” If the user spins it away and then leaves it alone for a while, the system can trigger more subtle, relieved comments like “Finally. A horizon without you.”

The dialogue itself will at minimum be based on prewritten arrays of lines for each mood and orientation combination. p5 will maintain collections such as “linesFacingAnnoyed”, “linesBackTurned” or “linesDizzy” and will choose one line depending on the current mood, orientation zone, and a bit of randomness, to avoid sounding too repetitive. As a stretch goal, the project may integrate an AI API into p5. In that case, p5 would send a short prompt that includes the current mood, orientation description (such as “facing”, “back turned”, “spun fast”), and a brief summary of recent interactions. It would then receive a generated line of dialogue and optionally wrap or filter it to ensure it remains safe, in character, and consistent with the theme. In both the base and stretch versions, the personality remains negative, sarcastic, and skeptical of humans, reflecting a world where artificial beings are not necessarily grateful for their existence or their relationship with their creators.

Week 11 Final Project Brainstorm

For my final project, I want to create a physically interactive AI Tamagotchi: a mini robot creature that the user can pat, touch, hold, and communicate with, but which responds in unexpectedly pessimistic ways. The project combines physical interaction through Arduino sensors with a character AI conversational system and a visual interface built in p5.

I recently became obsessed with Tamagotchis, small handheld digital pets that require constant attention: users need to feed them, clean up after them, and respond to their needs. Tamagotchis reward human care with affection or happiness, but with growing cultural anxieties around automation and AI, as well as the environmental destruction caused by human behavior, I want this project to imagine a future in which artificial beings no longer need, or even want, human caretaking.

The final project would be a mini physical pet built with NeoPixels and sensors (e.g., a force sensor and a knob) that reacts (e.g., twitches, turns away, glows, flashes) when touched. Users interact with it through force sensors, knobs, and touch inputs connected to an Arduino. The p5 interface displays a larger animated avatar of the creature and handles generative dialogue and visual responses.

However, unlike typical virtual pets, this creature is intentionally negative and pessimistic toward the user.

  1. When patted, it snaps: “Go away, you silly little human.”
  2. When fed (via knob input), it complains: “Ugh, you call this food?”
  3. When left alone too long, it becomes sarcastic: “Finally. Peace without humans.”

Using an AI API, the creature can answer user questions and hold brief conversations, but it always maintains a consistent disrespectful personality, reflecting a world where robots might question human motives, or judge the damage humans have done to the environment.

Week 11 Exercises Documentation

Group Member: Yiyang

(The Arduino code is commented at the bottom of each p5 link)

1. make something that uses only one sensor on the Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on the Arduino is controlled by p5

p5 & arduino: https://editor.p5js.org/joyzheng/sketches/63Yg60k8D

Video Documentation: IMG_9638

 

2. make something that controls the LED brightness from p5

p5 & arduino: https://editor.p5js.org/yiyang/sketches/dtftbIzaK

 

3. take the gravity wind example (https://editor.p5js.org/aaronsherwood/sketches/I7iQrNCul) and make it so every time the ball bounces one led lights up and then turns off, and you can control the wind from one analog sensor

p5 & arduino: https://editor.p5js.org/joyzheng/sketches/v77Sd41K4

Video Documentation: IMG_9640

Week 11 Reading

This article reminds me of an idea I heard in my robotics class: are we all augmented humans, or “half robots,” given how many designs are built to eliminate our disabilities? When we relate this concept to real life, we see that we are constantly relying on external design to bridge our own biological limitations, whether that involves wearing glasses to correct what is considered a “mild visual impairment” (a product that has evolved from a medical appliance into a core fashion accessory) or using an advanced hearing aid. The increasing sophistication of products designed to solve specific physical or cognitive problems suggests that human capability itself is often defined by the tools we seamlessly integrate into our lives.

The reading’s idea of how specialized, constraint-driven design eventually becomes universal is also visible all around us. This “trickle-down effect” is not just a theory; it’s the logic behind some of the most successful products today. Many of the most intuitive interfaces we use, like large buttons on smartphone screens or voice assistants like Siri and Alexa, were originally developed for users who struggled with fine motor control or had visual impairments. Now they’ve become standard because they make life easier for everyone. The dementia-friendly radio mentioned in the reading is a perfect example: its extreme simplicity wasn’t a limitation, but a breakthrough. The need to create something gentle, forgiving, and easy to navigate forced designers to rethink what “good design” actually means, and the resulting object ended up being loved far beyond its intended audience. We see this again in curb cuts designed for wheelchair users, which now help parents with strollers, travelers with luggage, and delivery workers with carts. These real-world cases show that when designers begin with the most constrained user, they often uncover solutions that improve daily life for the entire population.

Week 10 Group Music Instrument

For our interactive media sound project, my partner, Yiyang, and I decided to create a simple yet expressive instrument with a few sensors and a buzzer on an Arduino Uno. We wanted to build something that was intuitive to play and produced a unique, percussive sound. The result is this force-sensitive drum: tapping different pads creates different notes, and a toggle switch shifts the entire instrument into a higher-pitched mode.

Concept

Our initial idea was inspired by the force sensors used in class to control sound. We thought, what if we could use multiple sensors to combine frequencies and create rhythms? We brainstormed a few possibilities. Could we assign different chords to each sensor, where pressing harder makes a certain chord more prominent? Or could the sensors act as modifiers for a continuous track?

Ultimately, we settled on a more direct approach for a playable instrument. We decided to have three Force Sensitive Resistors (FSRs) that would each trigger a distinct note, like pads on a drum machine. To meet the project requirements and add another layer of interactivity, we incorporated a digital two-way switch. Flipping this switch would transpose the notes of all three pads to a higher octave, giving the player two different sound palettes to work with.

Schematic

The build was straightforward, centered around an Arduino Uno and a breadboard.

Components Used:

  • 1x Arduino Uno

  • 1x Breadboard

  • 3x Force Sensitive Resistors (FSRs) – our analog sensors

  • 1x Two-way toggle switch – our digital sensor

  • 1x Piezo Buzzer

  • Resistors (for the FSRs and switch)

  • Jumper wires and Alligator clips

Each of the three FSRs was connected to a separate analog input pin on the Arduino. This allows the Arduino to read a range of values based on how much pressure is applied. The toggle switch was connected to a digital pin to give us a simple ON/OFF (or in our case, Mode 1/Mode 2) reading. Finally, the piezo buzzer was connected to a digital pin capable of PWM (Pulse Width Modulation) to produce the tones.

The Arduino code continuously checks the state of our mode switch and reads the pressure on each of the three force sensors. If a sensor is pressed hard enough to cross a defined hitThreshold, it calls a function to play a corresponding sound.
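The main loop looks roughly like the sketch below. The pin numbers, base frequencies, and threshold value are placeholders, and playDrum() is simplified here; the actual pitch-decay version is the snippet highlighted further down.

// Rough shape of the drum loop (pins, frequencies, and threshold are placeholders)
const int buzzer = 8;
const int switchPin = 2;
const int fsrPins[3] = {A0, A1, A2};
const int baseFreqs[3] = {196, 262, 330};  // one base note per pad
const int hitThreshold = 300;              // pressure needed to count as a hit

void setup() {
  pinMode(switchPin, INPUT);
  pinMode(buzzer, OUTPUT);
}

void loop() {
  bool highMode = (digitalRead(switchPin) == HIGH);  // toggle transposes everything up an octave

  for (int i = 0; i < 3; i++) {
    if (analogRead(fsrPins[i]) > hitThreshold) {
      int baseFreq = highMode ? baseFreqs[i] * 2 : baseFreqs[i];
      playDrum(baseFreq);
    }
  }
  delay(30);  // a short delay keeps the feel responsive and percussive
}

void playDrum(int baseFreq) {
  // simplified: the real version sweeps the pitch downward (see the snippet below)
  tone(buzzer, baseFreq);
  delay(60);
  noTone(buzzer);
}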

Our instrument evolved over several versions. We started with a basic concept (v0.1) and then refined it by adjusting the frequency gaps between the sensors for a more distinct and musical sound (v1.0a). Finally, we tweaked the delay to give it a more responsive, percussive, drum-like feel (v1.0b).

Video/Image Documentation

Code Snippet I’m proud of

To make it sound more like a drum, I wrote this for loop to create a pitch decay effect:

// drum pitch decay effect
  for (int f = baseFreq + 40; f > baseFreq; f -= 5) {
    tone(buzzer, f);
    delay(10);
  }

Future Improvements / Problems Encountered

Our biggest physical challenge was the alligator clips. They were a handy tool for creating a prototype, but their exposed metal heads made it very easy to accidentally create a short circuit if they touched. We learned to be meticulous about checking that the rubber insulators were covering the clips properly before powering on the Arduino.

On the software side, getting the sound right was an iterative process. First, we spent time exploring the pitch gaps. Initially, the pitches were too close together and didn’t sound very musical, so by trial and error we adjusted the base frequencies to create a more noticeable and pleasant musical gap between the pads. Second, the rhythm and feel needed to match those of a “drum machine,” so we played with the delay() value in the main loop; a shorter delay made the instrument feel much more responsive and rhythmic.

If we were to continue this project, we could add more sensors for a full octave, or perhaps use the analog pressure value to control the volume (amplitude) of the note in addition to triggering it. It would also be interesting to experiment with different waveforms or sound profiles beyond the simple tones.

Week 10 Reading

In the early iPhone era, Apple’s design mimicked physical textures (leather stitching in Calendar, green felt in Game Center, the wooden shelves of iBooks). This skeuomorphism, which gave digital things tactile analogs, was arguably the last mainstream attempt to preserve a sense of touch through sight.

Then came iOS 7 led by Jony Ive, where Apple decisively flattened everything: gradients gone, shadows gone, buttons became text. It was a move toward visual minimalism, but also toward what Victor warns against. The glass stopped pretending to be wood, or leather, or paper. It simply became glass.

Victor’s essay makes me realize that friction is information. When you open a book or turn a knob, your hand is in dialogue with resistance; you feel the world push back. That pushback is not inefficiency. It’s meaning.

What’s fascinating is that Apple, perhaps subconsciously, has been quietly circling back to Victor’s point. The Apple Vision Pro’s “spatial computing” rhetoric reintroduces physical space and hand gestures, but ironically, without touch. You can see and move objects in 3D, but you can’t feel them. It’s embodiment without embodiment. Victor’s “hands without feedback” problem all over again.

Every major design philosophy of the 2010s has quietly absorbed the “frictionless” ethos. Uber, Tinder, and Amazon all measure success by how little thought or effort stands between desire and fulfillment. You tap once and the world rearranges itself.

But Victor’s warning, when applied here, becomes almost moral: when everything becomes too smooth, we lose the feedback loops that teach us how the world works. Swiping on a screen doesn’t just numb the fingers, it numbs cause and effect. It’s a design culture that erases the material consequences of our actions.

Week 9 Production

Concept

This piece transforms a simple circuit into a puzzle. The connection between the switches is not obvious. The user needs some time to figure out what makes the LEDs light up.

The red LED changes brightness as you turn the potentiometer. Also, pressing only one of the push buttons does nothing; you have to discover the specific “sweet spot” gesture. Only when all conditions align do the LEDs respond.

Video & Image Documentation

IMG_9510

Schematic Drawing

Future Improvements

A single puzzle could evolve into something far more complex by layering additional challenges:

    • Timing urgency: Buttons must be pressed within 2 seconds of entering the sweet spot, or the blue LED only stays lit for 3 seconds before requiring you to solve it again. This adds urgency and makes the victory feel earned rather than permanent.
    • Pattern memory: The blue LED blinks a simple pattern (like short-long-short) at startup. Users must recreate this rhythm with the buttons while in the sweet spot to unlock, transforming a spatial puzzle into a temporal one.

Week 9 Reading Response

The metaphor of interactive art as a directorial process reveals a tension between creative control and trust. When Igoe compares designing interactive work to directing actors, it exposes a fundamental fear: that without constant explanation, our vision will be misunderstood or ignored. But this fear might actually be the enemy of good interactive work. Traditional artists can hide behind the permanence of their statement; a painting, for example, doesn’t change whether viewers “get it” or not. Interactive artists, however, must confront the uncomfortable reality that their work is incomplete without the audience’s participation. The most interesting implication here is that over-explanation isn’t just bad pedagogy. It’s a form of creative cowardice, a way of avoiding the vulnerability inherent in truly collaborative expression.

What strikes me about the “greatest hits” catalog is how it inadvertently maps the evolution of human-computer intimacy. The progression from theremin-like instruments (where gesture has “little meaning by itself”) to gloves (which borrow existing gestural vocabulary) to meditation helpers (which attempt to read involuntary bodily responses) represents increasingly ambitious attempts to collapse the distance between intention and interface. Yet Igoe’s skepticism about certain forms, particularly “remote hugs” and meditation helpers, suggests that some human experiences might be fundamentally resistant to technological mediation. The machine can measure breath rate and skin response, but it cannot know meditation; it can transmit signals between paired objects, but it cannot convey warmth. This raises an uncomfortable question for interactive designers: are we sometimes trying to technologically recreate experiences that derive their meaning precisely from their technological absence?

The recurring critique of projects that confuse “presence with attention” opens up a broader philosophical question about what interaction actually proves. A sensor detecting someone standing in front of a painting tells us almost nothing about engagement, yet many interactive projects treat physical proximity as evidence of meaningful exchange. This seems related to a contemporary cultural anxiety about whether we’re truly connecting with anything. We’ve become so focused on measurable interaction (clicks, views, sensor triggers) that we’ve lost sight of the immeasurable dimension where actual meaning resides. Perhaps the most radical interactive artwork would be one that deliberately resists confirming whether interaction happened at all, forcing both creator and participant to sit with uncertainty rather than seeking the reassurance of sensor data. The blink of an LED becomes a form of emotional comfort: see, something happened, you’re not alone.