Week 14 – Final Project (PawPortion)

🐾 PawPortion 🐾 Smart Pet Feeding Assistant

Concept

I started this project from a very simple everyday problem: feeding a pet on time. It sounds like something that should be easy to remember, but in real life, routines can get busy, and small responsibilities can be delayed or forgotten. I did not want to make a fully automatic feeder because that would remove the user from the process completely. Instead, I wanted to create a system that supports the user while still keeping them involved in the act of care.

PawPortion is an interactive pet feeding assistant that combines an Arduino-controlled physical feeder with a p5.js digital dashboard. The system keeps track of feeding time, reminds the user when it is time to feed, and responds when the feeding action is completed. The project is not just about dispensing food. It is about creating a clear relationship between the user, the interface, and the physical machine.

The main idea behind PawPortion is assisted responsibility. The system does not take over the task entirely, but it helps make the task more visible, structured, and responsive. The user can see when the pet was last fed, when the next feeding should happen, and whether the feeding has been missed. The pet’s mood on the screen also changes depending on the state of the system, which makes the interaction feel more emotional and less robotic.

Video of Project

Images of Project


Schematic



Assembly Instructions 


User Testing Videos

During user testing, I wanted to see whether people could understand the system without me explaining every step. The main interaction was mostly clear. Users understood that the p5 dashboard was giving them feeding information, and they were able to recognize the “Feed Now” button as the main action. Once the servo moved and food was released, the connection between the digital interface and the physical feeder became much clearer.

One important issue appeared during testing with the IR sensor. I had placed a printed image near the sensor to guide the user, but the image made some people interpret the IR sensor as a physical button. Instead of triggering it by placing something near it, they tried to press it directly. The problem was not that the sensor failed. The problem was that my visual instruction gave the wrong impression.

This taught me that physical interaction is not only about whether the circuit works. It is also about how the user reads the object. A small label, icon, or printed sign can completely change how someone understands what they are supposed to do. After seeing this confusion, I changed the printed label so it was clearer that the sensor detects presence rather than pressure. This made the interaction easier to understand and reduced the need for verbal explanation.

How Does the Implementation Work?

Description of Interaction Design

The interaction is built around a cycle between the user, p5, and Arduino. First, the p5 dashboard tracks time and displays the feeding schedule. The user can see when the pet was last fed, when the next feeding is due, and how much time is left. This makes the system readable before anything physical happens.

When feeding time arrives, the dashboard changes state. The pet becomes hungry, and the status message tells the user that it is feeding time. This is meant to guide the user instead of forcing the system to act automatically. The user then presses the “Feed Now” button on the p5 interface, which sends a command to Arduino.

Arduino receives the command and moves the servo motor. The servo acts like a small door or gate that opens to release food, then closes again. When Arduino finishes dispensing, it sends a message back to p5. After p5 receives that message, the dashboard updates the last feeding time, resets the countdown, and changes the pet’s mood to happy.

If the user does not feed within the grace period, the system marks the feeding as missed. The pet becomes sad, which makes the missed action more visible. I wanted this to feel gentle, not dramatic, but still noticeable enough that the user understands the consequence of ignoring the schedule, and that they should probably feed their pet.

Description of Arduino Code

GitHub Full Code

The Arduino code controls the physical side of the project. It is responsible for reading the IR sensor, controlling the servo motor, and communicating back to p5 when feeding is complete. I used named constants for the pins instead of hard-coded numbers, which makes the code easier to understand and edit later.

Snippet: defining the sensor pin, servo pin, and servo object.

#include <Servo.h>

const int SENSOR_PIN = 7;
const int SERVO_PIN  = 6;

Servo tap_servo;

The main function in the Arduino code is dispenseFood(). This function opens the servo, waits while food is released, closes the servo, and then sends “DONE” to p5. This message is important because it tells the dashboard that the physical feeding action has finished.

Snippet: the feeding function that opens the servo, closes it, and sends confirmation to p5.

void dispenseFood() {
  isDispensing = true;
  tap_servo.write(110);    // open the gate
  delay(2000);             // let food pour out
  tap_servo.write(0);      // close the gate
  delay(300);
  lastFedTime = millis();
  isDispensing = false;
  Serial.println("DONE");  // confirm to p5 that feeding finished
}

The Arduino also listens for serial messages from p5. When p5 sends “FEED”, Arduino immediately calls the feeding function. This is what connects the digital button to the physical movement.

Snippet: receiving the FEED command from p5.

if (Serial.available() > 0) {
  String cmd = Serial.readStringUntil('\n');
  cmd.trim();
  if (cmd == "FEED") {
    dispenseFood();
  }
}

I also added a cooldown so the IR sensor does not trigger repeatedly if something stays in front of it. Without this, the servo could keep dispensing again and again. The cooldown makes the physical interaction more controlled.
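The cooldown itself lives in the Arduino sketch, but the pattern is simple enough to sketch in JavaScript. This is a minimal illustration under my own assumptions (the names COOLDOWN_MS, lastTriggerTime, and shouldDispense are hypothetical, not the actual variable names in the sketch):

```javascript
// Hypothetical sketch of the cooldown pattern. The real version runs on
// Arduino using millis(); this mirrors the same logic in plain JavaScript.
const COOLDOWN_MS = 5000; // ignore repeat triggers for 5 seconds (assumed value)

let lastTriggerTime = -Infinity;

function shouldDispense(nowMs, sensorActive) {
  // Only dispense if the sensor fired AND the cooldown has elapsed.
  if (sensorActive && nowMs - lastTriggerTime >= COOLDOWN_MS) {
    lastTriggerTime = nowMs;
    return true;
  }
  return false;
}
```

The key idea is that the sensor being continuously blocked does not retrigger the servo; only the first detection inside each cooldown window counts.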

Description of p5.js Code

 

The p5.js code controls the digital dashboard. It handles the schedule, the pet mood, the buttons, and the communication with Arduino. The dashboard shows the last feeding time, the next feeding time, and a countdown. It also shows the pet’s mood, which changes depending on what is happening.

Snippet: timing values for the feeding schedule.

const FEED_INTERVAL_SEC = 60;
const MISSED_GRACE_SEC = 15;

The main schedule logic checks whether feeding time has arrived. If there is still time left, the pet stays neutral. If feeding time has arrived but is still within the grace period, the pet becomes hungry. If the grace period passes, the pet becomes sad.

Snippet: schedule logic that updates the pet mood.

if (secUntilFeed > 0) {
  petMood = "neutral";
  if (serialConnected) {
    statusMsg = "Waiting for next feeding";
  }
} else if (secUntilFeed > -MISSED_GRACE_SEC) {
  petMood = "hungry";
  statusMsg = "Feeding time!";
} else {
  petMood = "sad";
  statusMsg = "Feeding missed!";
}
```
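The logic above assumes secUntilFeed has already been computed. One way to derive it, sketched here with assumed names (lastFedMs and the helper secondsUntilFeed are my illustration, not the exact code), is to subtract the elapsed time since the last feeding from the interval:

```javascript
const FEED_INTERVAL_SEC = 60;

// Seconds remaining until the next feeding, given the timestamp (in ms)
// of the last feeding and the current time (e.g. from p5's millis()).
function secondsUntilFeed(lastFedMs, nowMs) {
  const elapsedSec = (nowMs - lastFedMs) / 1000;
  return FEED_INTERVAL_SEC - elapsedSec;
}
```

A positive result means the pet is still fine, a small negative result falls inside the grace period, and anything past the grace period counts as missed.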

When the user clicks “Feed Now,” p5 sends the word “FEED” to Arduino through serial communication. I used a newline at the end because it helps Arduino read the command as one complete message.

Snippet: sending the FEED command to Arduino.

const encoder = new TextEncoder();
const writer = port.writable.getWriter();
await writer.write(encoder.encode("FEED\n"));
writer.releaseLock();

The p5 code also listens for messages from Arduino. When it receives “DONE”, it updates the dashboard. This is what makes the interface respond only after the physical action finishes.

Snippet: checking for DONE from Arduino.

if (line === "DONE") {
  onFeedingDone();
}
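The line variable in that check comes from the serial read loop. Serial data arrives in arbitrary chunks, so a common approach (sketched here with assumed names; feedChunk is my illustration, not my exact reader code) is to buffer incoming text and only act on complete newline-terminated lines:

```javascript
let serialBuffer = "";

// Append a decoded chunk of serial text and return any complete lines.
// Partial lines stay in the buffer until the rest of them arrives.
function feedChunk(chunk) {
  serialBuffer += chunk;
  const parts = serialBuffer.split("\n");
  serialBuffer = parts.pop(); // keep the trailing partial line
  return parts.map((p) => p.trim()).filter((p) => p.length > 0);
}
```

This is why Arduino ends its message with Serial.println: the newline marks where one message stops and the next begins.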

I also added fullscreen functionality using the F key. This makes the dashboard easier to present during the final demo and makes it feel more like a complete interface rather than just a small sketch window. The course documentation specifically mentions fullscreen and responsive resizing as part of final project programming considerations.

Description of Communication Between Arduino and p5.js

The communication between Arduino and p5.js is one of the most important parts of the project. The system works through a two-way serial communication loop. p5 sends “FEED” to Arduino, Arduino moves the servo to dispense food, and then Arduino sends “DONE” back to p5.

This means the dashboard does not simply assume that feeding happened. It waits for Arduino to confirm that the physical action was completed. This made the project feel more reliable because the screen and the machine were connected through actual feedback.

This was also one of the hardest parts of the project because if the serial communication failed, the entire interaction felt broken. The dashboard could look fine, and the Arduino could work alone, but the project only felt complete when both sides were speaking to each other correctly.
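The handshake can be summarized as a tiny state machine: p5 marks a feed as pending when it sends "FEED" and clears it only when "DONE" arrives. This is a simplified sketch under my own assumptions (my actual code spreads this logic across the button handler and the read loop, and the names here are illustrative):

```javascript
// Minimal model of the FEED/DONE handshake between p5 and Arduino.
const feeder = {
  awaitingDone: false, // true between sending "FEED" and receiving "DONE"
  lastFedMs: null,     // updated only after Arduino confirms

  // Returns the command to write to the serial port,
  // or null if a feed is already in progress.
  sendFeed() {
    if (this.awaitingDone) return null;
    this.awaitingDone = true;
    return "FEED\n";
  },

  // Called for each complete line received from Arduino.
  onMessage(line, nowMs) {
    if (line === "DONE" && this.awaitingDone) {
      this.awaitingDone = false;
      this.lastFedMs = nowMs; // resets the countdown on the dashboard
    }
  },
};
```

Modeling it this way also prevents double-sending: while a feed is pending, pressing the button again does nothing until Arduino confirms.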

Aspects Of The Project I’m Proud Of 

One of the strongest parts of this project is how far it goes compared to where I started at the beginning of the course. If I had seen this project in week one, I genuinely would not have believed that I could build it. It combines physical computing, serial communication, a digital dashboard, user interaction, timing logic, an animated interface, and a mechanical feeding system. That feels ambitious for me, and I am proud that I was able to execute it in a way that actually works.

I am also proud of how complete the project feels as an experience. It is not just Arduino moving a servo, and it is not just a p5 interface on a screen. The two parts depend on each other. The dashboard guides the user, the Arduino performs the physical action, and the screen updates after receiving confirmation. That connection makes the project feel like a full interactive system.

Another part that I think worked well is the personality of the interface. The pet mood makes the system feel more alive, and it helps communicate the feeding state without needing complicated instructions. The sad, hungry, happy, and resting states make the dashboard easier to understand and more engaging.

I am also proud that I improved the project through user testing. The IR sensor confusion could have been ignored, but I used it as a design lesson. Changing the printed label made the interaction clearer, which showed me that the physical design and the code are equally important.

Resources Used

p5.js

https://p5js.org/reference/
https://p5js.org/tutorials/get-started/
https://www.youtube.com/watch?v=c3TeLi6Ns1E

Serial Communication

https://itp.nyu.edu/physcomp/labs/labs-serial-communication/lab-webserial-input-to-p5-js/

https://itp.nyu.edu/physcomp/labs/labs-serial-communication/lab-webserial-output-from-p5-js/

https://makeabilitylab.github.io/physcomp/communication/p5js-serial.html

https://medium.com/@yyyyyyyuan/tutorial-serial-communication-with-arduino-and-p5-js-cd39b3ac10ce

Servo Motor

https://docs.arduino.cc/tutorials/generic/basic-servo-control/

https://learn.adafruit.com/adafruit-arduino-lesson-14-servo-motors/overview

https://www.youtube.com/watch?v=1mDnaiEytAI

IR Sensor

https://arduinogetstarted.com/tutorials/arduino-infrared-obstacle-avoidance-sensor

https://projecthub.arduino.cc/aboda243/obstacle-detector-using-ir-module-tutorial-101320

https://www.youtube.com/watch?v=vi4hkrrkwkY

https://www.youtube.com/watch?v=ESqhOKgKt5c

AI Usage

I used AI as a support tool during the development of this project. The most important use was debugging my p5.js code, especially because this was one of my first larger coding projects involving serial communication between Arduino and p5. Debugging was difficult because problems could come from the Arduino code, the p5 code, the serial connection, or the browser. AI helped me break down the problem and understand where the issue might be coming from.

AI also helped me understand how to structure parts of the p5 dashboard, especially the schedule logic and the serial communication. I still tested, adjusted, and integrated the code myself to make sure it worked with my actual Arduino setup.

if (secUntilFeed > 0) {
  petMood = "neutral";

  if (serialConnected) {
    statusMsg = "Waiting for next feeding";
  }

} else if (secUntilFeed > -MISSED_GRACE_SEC) {
  petMood = "hungry";
  statusMsg = "Feeding time!";

} else {
  petMood = "sad";
  statusMsg = "Feeding missed!";
}

I also used AI to generate visual design elements for the project, including the PawPortion logo and printable signs for the physical machine. These visuals helped make the project feel more polished and easier for users to understand.

Challenges Faced and How I Tried to Overcome Them

The biggest challenge was serial communication between Arduino and p5. This was one of the first times I worked on a larger project where hardware and software had to communicate continuously. When something did not work, it was hard to know whether the problem was in the Arduino code, the p5 code, the USB connection, or the browser’s serial port.

I overcame this by testing each part separately. First, I tested the servo on Arduino by itself. Then I tested whether p5 could connect to the Arduino. Then I tested sending one simple command. After that, I tested receiving “DONE” back from Arduino. Breaking it into smaller steps made the project less overwhelming.

Another challenge was making the interaction clear to users. The IR sensor confusion showed me that a working sensor does not automatically mean a clear interaction. Users interpreted the printed image as a button because that was the visual language I accidentally created. I fixed this by changing the label and making the physical instruction clearer.

A third challenge was making the project feel polished. Since the project includes both physical and digital parts, it needed to look intentional from both sides. I worked on the dashboard design, printable labels, and project logo so the final setup felt like one system rather than separate pieces.

Future Improvements

If I had more time, I would improve the physical build of the feeder. I would make the container more stable, the food release cleaner, and the overall structure more durable. Right now, the prototype communicates the idea, but a more refined version could look and function more like a real product.

I would also improve the interface by adding sound feedback. For example, a small sound could play when feeding time arrives, when food is dispensed, or when feeding is missed. This would make the system more noticeable and more satisfying to use.

Another future improvement would be adding more customization to the schedule. Instead of using a short demo interval, the user could choose real feeding times, like morning and evening. This would make the system more practical outside of the class demo.

I would also conduct more user testing with people who have not seen the project before. Watching how people understand the physical setup, the IR sensor, and the dashboard would help me refine the interaction even more.

Overall, I am very proud of how far I’ve come, and I can confidently say that this class was a formative step in my university journey and the first step in pursuing my passion for Interactive Media.

Stipend Breakdown ($50):

Total Spend: $47.92

Week 14 – PawPortion User Testing

Some users were able to understand how the system worked overall, but there was confusion around the IR sensor. A few people tried to press it like a physical button, and they were unsure how hard to press, how close their hand should be, or how many fingers to use. I think this confusion happened because they interpreted it as a button rather than a proximity sensor, so the mapping between the control and the action wasn’t immediately clear.

Despite that, most users understood the general flow of the experience and were able to move between interacting with the machine and observing the outcome. One part that worked especially well was the portion size. The amount of pet food dispensed felt accurate, and the servo motor’s opening and closing motion was smooth and consistent. However, there was a small issue where some pieces of pet food would fly out from the sides, which could be improved with better containment.

The main part I had to explain was how to use the IR sensor properly. For one user, I also had to point out that they needed to place the bowl under the dispenser, even though there was a visual instruction. This might have been specific to that user, but it suggests that the instructions could be made more noticeable or clearer for first-time users.

Week 12 – Final Project Proposal: PawPortion Smart Pet Feeding Assistant

For my final project, I am creating PawPortion, a smart pet feeding assistant that combines a physical feeding system with a digital interface. The goal of this project is not to fully automate feeding, but to support the user in a way that feels clear and intentional. Instead of the system doing everything on its own, it tracks feeding time, prompts the user when it is time to feed, and gives immediate feedback when the feeding happens. I wanted the interaction to feel simple but still meaningful, where the user understands exactly what is happening at every step.

The system uses both Arduino and p5.js, connected through serial communication. On the Arduino side, the program controls a servo motor, an LED, and an IR sensor. The IR sensor detects when something is placed in front of it, which acts as the trigger for feeding. When the sensor is activated, the servo rotates to open the container and allow the food to pour out into a bowl, and the LED turns on at the same time to make the action visible and obvious. The Arduino also receives a “FEED” command from p5.js, performs the same feeding action, and then sends back a “DONE” signal so the interface can update.

On the p5.js side, I designed a full-screen interface that works as a feeding dashboard. It shows the next feeding time, the last time the pet was fed, and the current system status. There is also a “Feed Now” button that allows the user to trigger the feeding manually. The interface tracks time and changes the pet’s mood depending on the situation. The pet appears neutral while waiting, hungry when it is time to feed, happy after feeding, and sad if the feeding is missed. This makes the system feel more interactive and easier to understand visually. The interface sends the “FEED” command to Arduino and waits for the “DONE” response before updating the state.

So far, I have completed a draft of both the Arduino code and the p5.js interface. The communication between them is set up, and the overall system logic is working. At this stage, my main focus is building the physical structure and refining the aesthetic design. I will be constructing the feeder using cardboard and assembling the Arduino setup based on the wiring I have already mapped out. I want the final result to look clean and intentional rather than just a collection of components, so I will focus on hiding wires and making the structure feel complete.

For materials, I used my stipend to buy items that mainly support the physical design and presentation of the project. I purchased cardboard sheets to build the main feeder structure, colored paper to add visual layers and make the design more cohesive, and a small pet bowl to clearly show where the food is dispensed. I also bought paw print stickers to match the theme and make the project feel more playful, along with wooden sticks and decorative tape to help with building and structuring the feeder. These materials will help me create a clean and visually engaging final product without overcomplicating the technical side. I will also need to purchase an IR sensor.

Overall, PawPortion focuses on creating a clear interaction between the user, the interface, and the physical system. The sensor or user input triggers the feeding, the Arduino performs the action, and the interface reflects it in real time. By combining a simple mechanism with a responsive interface and thoughtful design, the project turns a basic task into a more engaging and understandable experience.

Amazon Links: 

Week 11 – Arduino and p5 Exercises

Exercise 1

Arduino GitHub File

Arduino Set-up:

Arduino Illustration:

Project Demo:

Exercise 2

Arduino GitHub File

Arduino Set-up:

 

Arduino Illustration:

Project Demo:

 

Exercise 3

Arduino GitHub File

Arduino Set-up:

Arduino Illustration:

Project Demo:

Concept

This week was all about making things communicate. Before this, everything we made stayed inside Arduino. This time, we connected Arduino to p5 so the physical and digital sides could talk to each other.

At first it sounded simple, but it ended up being one of the hardest things we’ve done so far. Not because of the ideas, but because getting everything to connect properly took a lot of trial and error.

Exercises

Across the three exercises, we explored different ways of communication.

In the first exercise, we sent data from Arduino to p5. We used the potentiometer to move a circle across the screen. This helped us understand how sensor values can control something visually.

In the second exercise, we reversed the direction. p5 controlled the LED on Arduino. This made us realize that communication goes both ways, and that sending data is just as important as receiving it.

In the third exercise, we combined both directions. The potentiometer controlled movement in p5, and when something happened on the screen, it sent a signal back to Arduino to turn on the LED. This was the most interesting part because everything was connected and reacting together.

Code Highlight

One part that really helped us was reading the sensor data correctly in p5:

let str = port.readUntil("\n");

if (str.length > 0) {
  str = trim(str);
  let sensorValue = int(str);

  if (!isNaN(sensorValue)) {
    x = map(sensorValue, 0, 255, 0, width);
  }
}

 

This made sure we only used valid data and mapped it properly to the screen. Once this worked, everything became much smoother.

Problems Encountered

We had a lot of issues this week, mostly with the connection. The serial port wouldn’t open, or p5 wouldn’t receive any values. Sometimes things worked and then suddenly stopped working again.

We realized most of these problems were small things, like forgetting to close the Serial Monitor or not formatting the data correctly.

Reflection

This week helped us understand that interaction is not just about building something, but about connecting things together. Once the connection worked, everything felt more interactive and responsive.

It also made us more patient with debugging. We learned to check things step by step instead of assuming something bigger was wrong.

Collaboration

We worked on this together, which made a big difference. When one of us got stuck, the other could help figure out what went wrong. It also made debugging less frustrating because we weren’t trying to solve everything alone.

Working together helped us understand the system better and move forward faster.

References

p5.js Web Editor – https://editor.p5js.org

p5 to Arduino

p5.js and Arduino serial communication – Send a digital sensor to a p5.js sketch (YouTube, Scott Fitzgerald, 28 Mar 2020)

Week 11 – Final Project Preliminary Concept

Driving Game with a Physical Steering Wheel (Steering Beyond the Keyboard)

For my final project, I want to create a simple driving game that is controlled using a real steering wheel instead of a keyboard. The idea is to change how we normally interact with games. Instead of pressing keys, the user will physically turn a steering wheel to control the car on the screen. I am interested in how this makes the experience feel more natural and more connected to the body.

I will be using an actual steering wheel that I can buy online and then connecting it to my Arduino setup. The steering wheel will be attached to a potentiometer so that when it rotates, the Arduino can read the change in position. This value will then be sent to p5.js using serial communication. I will also include a button that can be used to start or reset the game.

On the p5.js side, I will build a simple driving game. The player will control a car that can move left and right across the screen while the road moves forward. Obstacles will appear, and the player has to steer to avoid them. The position of the car will be directly linked to how much the steering wheel is turned, so the movement should feel smooth and responsive. I plan to include basic features like a score counter, a crash state, and a reset option.

The main focus of this project is not just the game itself, but the interaction. I want the system to feel clear and easy to understand. When the user turns the wheel, the car should immediately respond on the screen. This direct connection between input and output is important. It should feel like the user is actually controlling the movement in a physical way, not just pressing buttons.

I also want to make the setup look clean and intentional. I will build a simple base to hold the steering wheel and hide the wires. This will help the project feel more like a complete system rather than just loose components. I may also add small details like labels or lights to make the interaction clearer.

For inspiration, I looked at projects that use unconventional controllers and turn the interface into part of the experience. I also looked at previous student projects where the interaction was very clear and immediate. Those projects showed that even simple inputs can feel impressive if the response is well designed and easy to understand.

To build this, I will use basic Arduino tutorials for reading a potentiometer and sending data through serial communication. For p5.js, I will use examples that show how to read serial data and map values to movement on the screen. These will help me create a stable system before I focus on improving the visuals and overall experience.

Overall, this project is about making a simple but effective interactive system. By using a real steering wheel, I want to create a more engaging and physical way to play a digital game. The final result should feel responsive, intuitive, and easy for anyone to try without needing instructions.

Reading Reflection – Week 11

The Design Meets Disability reading made me rethink how I usually see design, because I realized I tend to treat accessibility as something separate from “good design,” like something added later to fix a problem instead of something considered from the start. Pullin challenges that idea by showing how assistive devices often aim to hide disability or make someone appear more “normal,” and I found myself questioning why that feels like the default goal. When I think about it, that approach says more about what society is comfortable with than what people need or want, and I had not really noticed how strong that assumption is until now.

What stayed with me is his focus on expression, because he suggests that assistive design does not have to disappear or blend in, it can reflect identity in a visible way, and that shift feels important. When I think about prosthetics designed to look exactly like a natural limb, I see how design tries to erase difference, but when I picture a prosthetic designed with color or shape or style, it changes the meaning of the object completely. It becomes something personal instead of something corrective. That made me realize how much design controls what gets seen as normal, and how easy it is to follow that without questioning it.

After reading this, I feel more aware of how I approach design decisions, even in small ways, because I see how choices about form, appearance, and function are never neutral. If you design for one type of user, you set a standard that excludes others without saying it directly, and I think that is what I will carry forward from this reading. I need to think about who is included from the beginning, not after, and I need to question why certain designs try so hard to hide difference instead of allowing it to exist openly.

Reading Reflection – Week 10

In A Brief Rant on the Future of Interaction Design, it honestly feels like Bret Victor is calling everyone out for settling. He’s basically saying that what we think is “advanced” interaction is actually kind of limited, and the line about an interface being “less expressive than a sandwich” made it sound funny at first but also a bit embarrassing once you think about it. Like our hands can do so much in real life, but then we go to screens and everything becomes flat, just tapping and swiping on what he calls “pictures under glass.” It made me realize how normal that feels even though it’s actually such a reduced version of interaction.

Then in Responses to A Brief Rant on the Future of Interaction Design, it feels more grounded because people are basically asking, okay but what now? And he admits that “the solution isn’t known,” which I actually liked because it didn’t feel like he was pretending to have everything figured out. It made the whole thing feel less like complaining and more like pushing people to think further instead of just accepting what already works. When he says things like the iPad is “good! For now!” it kind of shows that he’s not rejecting current tech, just refusing to treat it like the final version.

Putting both together, it feels less like he’s saying everything is wrong and more like we got comfortable too fast. The idea that “the future is a choice” stuck with me because it makes it feel like if interaction stays limited, it’s not by accident, it’s because people stopped questioning it.

Week 10 – Musical Instrument

Arduino GitHub File

(pitches.h) file

Arduino Set-up:

Arduino Illustration:

Project Demo:

 

Concept

This project is a simple musical instrument that uses both digital and analog input at the same time.

The button acts as the digital sensor because it only has two states, either pressed or not pressed. When we press it, the sound plays, and when we release it, the sound stops.

The potentiometer acts as the analog sensor because it gives a continuous range of values instead of just two states. We used that range to select different musical notes, so turning the knob changes the pitch.

What we liked about this setup is that both inputs have completely different roles. The button controls when the instrument is played, while the potentiometer controls what sound is produced. It made the difference between digital and analog feel really clear and actually useful.

Code Snippet We’re Proud Of

int index = map(potValue, 0, 1023, 0, 7);

if (buttonState == LOW) {
  tone(buzzerPin, notes[index]);
} else {
  noTone(buzzerPin);
}

This part is where everything comes together. The potentiometer gives a value from 0 to 1023, and we use map() to convert that into a smaller range that matches the number of notes we have. Then we use that number to pick a note from the array.

At the same time, the button decides whether the note should actually play. So one input controls the pitch, and the other controls when the sound happens, which made it feel more like a real instrument instead of just a buzzer making random noise.
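Arduino's map() is just integer linear rescaling. A JavaScript sketch of the same conversion the snippet performs (turning a 0–1023 potentiometer reading into a note index 0–7; the function name arduinoMap is my own) looks like this:

```javascript
// JavaScript equivalent of Arduino's integer map(): rescale x from the
// input range [inMin, inMax] to the output range [outMin, outMax],
// truncating like Arduino's long-integer division does.
function arduinoMap(x, inMin, inMax, outMin, outMax) {
  return Math.trunc(((x - inMin) * (outMax - outMin)) / (inMax - inMin) + outMin);
}
```

So a knob at its midpoint (around 512) lands on a middle note, and the extremes of the knob reach the first and last notes of the array.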

Problems Encountered

The biggest challenge was honestly the wiring. Even when everything looked right, one wire in the wrong row would break the whole circuit. We had to be really precise with the breadboard and double check every connection.

The button also gave us trouble at first. It either didn’t work or stayed on all the time, and we realized it was because of how it was placed across the gap and which rows we were using. Once we fixed that, it started behaving correctly.

Another challenge was understanding how the potentiometer connects to the sound. At first it felt random, but once we understood that the Arduino reads values from 0 to 1023 and that we needed to map that to our notes, it made a lot more sense.

Reflection

This project helped us actually understand the difference between digital and analog input instead of just memorizing it. The button made sense as something binary, while the potentiometer showed how values can change continuously.

It also made us more comfortable working with sound. Before this, the buzzer just felt like something that makes noise, but now we understand how pitch is controlled and how different inputs can affect it.

If we were to improve this project, we would probably expand the number of notes or organize them into a more structured scale so it feels more like a playable instrument.

Overall, it was a really successful assignment, and working in pairs made it a lot easier to think through and refine ideas, and to support each other through the trial-and-error process of the entire project!

References 

https://projecthub.arduino.cc/SURYATEJA/use-a-buzzer-module-piezo-speaker-using-arduino-uno-cf4191

IoT Frontier. "Arduino Project | Play Melody with Passive Buzzer Using Arduino Uno." YouTube, 3 Jul 2023.

Tech Explorations. "[240] Arduino Getting Started: Make Noise and Beeps with the Passive Buzzer." YouTube.

Reading Reflection – Week 9

The first reading made me rethink what computing even is, because I feel like I’ve always thought of it as something very screen-based, like typing, clicking, code on a laptop. But the way Tigoe describes it, starting from the body instead of the machine, kind of flips my idea completely. He says physical computing begins with “how humans express themselves physically,” and that idea stayed in my head because it makes computing feel less technical and more human, which I really like. It’s not about controlling a computer; it’s about translating things like movement, light, or touch into something the computer can understand. I liked that shift, but at the same time it made me realize how limited most of our interactions with technology actually are. We’re so used to keyboards and screens, and we accept that as normal, even though they cover such a small part of how we actually exist in the world, especially alongside technology.

The second reading pushed my idea even further, especially when it talks about how computers usually only involve small, controlled movements, like just sitting and using your fingers, even though “that is not your whole life.” That line felt weirdly personal, because it made me think about how passive most digital interactions are. You’re just there, barely moving, and everything is basically just clicks. Physical computing, on the other hand, is more about interactions like sensing and responding, almost like a conversation between the body and the system. What I liked about this reading is that it doesn’t treat technology as something separate from us, but as something that extends what we already do every day anyway. At the same time, it also made me question whether we’ve gotten too used to limited forms of interaction, like we’ve accepted a very small version of what technology could actually be.

Putting both readings together, I felt like they were both kind of criticizing the way we already use technology without saying it directly. Both of them keep coming back to the idea that computers should connect more to the body, more to the real world, instead of staying stuck behind screens. And I think what made it interesting for me is that it made something like physical computing feel less like a special, hard field and more like what computing should have been all along. It kind of made everything else feel a bit limited in a way, like we’ve been interacting with technology in the smallest way possible when there was always, always more potential, which is kind of sad if you think about it.

Week 9 – Analog & Digital Sensors

Arduino file on GitHub

Arduino Set-up:

Arduino Illustration:

Project Demo:

Concept

In this project, I combined both digital and analog inputs in one system, and the difference between them becomes really clear through how each LED behaves.

The button is a digital input, so it only has two states: it’s either pressed or not pressed. Because of that, the red LED is either fully on or fully off, with no in-between state, which makes it feel binary and very direct.

The potentiometer works differently because it’s an analog input, so instead of just two states, it produces a whole range of values. That’s why the yellow LED doesn’t just turn on or off; it gradually changes brightness depending on how much I turn the knob.

Seeing both of these side by side made the difference between digital and analog feel a lot clearer. Basically, one is fixed, and the other is adjustable.

Code Snippet I’m Proud Of

int potValue = analogRead(potPin);
int brightness = map(potValue, 0, 1023, 0, 255);
analogWrite(yellowLed, brightness);

This part looks simple, but it took me a second to actually understand what was happening. The potentiometer gives a value from 0 to 1023, which is way bigger than what the LED can use. The map function basically translates that into a range the LED understands, from 0 to 255.

Once I understood that, the analog part finally clicked in my head.

Problems Encountered

Honestly, most of my problems came from the breadboard. Everything can look right, but if a wire is even one row off, nothing works. That was probably the most frustrating part because it’s such a small detail but it affects everything.

The button was also confusing at first. It didn’t respond at all, and I eventually realized it was because of how it was placed. The rotation and position of the button actually matters, and once I adjusted which rows I was using, it started working immediately.

The analog side was also a bit tricky. At one point the LED was just always bright, which made it seem like nothing was changing. That ended up being a problem with a mix of wiring and how the values were being read and mapped.

Reflection

This project was frustrating at times, but it definitely helped me understand what I’m doing instead of just following steps. The biggest thing I learned is how sensitive circuits are to small details.

I also feel like I understand the difference between digital and analog input way more now. Before, it was just a definition, but now I’ve actually seen how they behave differently in the same system.

If I were to improve this, I would make the wiring cleaner and more organized, because that would make debugging way easier, and also maybe try a more creative approach as I get the hang of the Arduino board.

Overall, I feel like I moved from just trying to get it to work to actually understanding why it works, which I’m really proud of.

References

https://docs.arduino.cc/tutorials/uno-rev3/AnalogReadSerial/

https://docs.arduino.cc/learn/electronics/potentiometer-basics/

AI Usage

ChatGPT was used to help identify what was going wrong when my LEDs wouldn’t turn on, and to address any confusion or debugging needed in my code.