Week 13- User Testing

Are they able to figure it out? Where do they get confused and why? Do they understand the mapping between the controls and what happens in the experience?

The one thing I did not make very clear is where they should point the light, which is at the light sensors. Testing this with people who have taken IM classes before was not ideal because they already knew what each sensor looked like and therefore knew how to trigger it. The light sensors and water sensors should be integrated in a way that allows the user to understand what they should do to trigger each sensor.

  • What parts of the experience are working well? What areas could be improved?

I think the overall functionalities of the sensors work well, and the thresholds are defined well. I want to make all the wires connected to the Arduino and breadboard sit more neatly because they do get tangled at times. I also want to make sure that the wooden case around my project has a small hole or opening to let the cable from the laptop reach the Arduino.

  • What parts of your project did you feel the need to explain? How could you make these areas more clear to someone that is experiencing your project for the first time?

One part of my project that I needed to explain more clearly was how users should interact with the sensors—specifically, where to point the light for the light sensors and how to activate the water sensor. Because many of the people I tested with had prior IM experience, they already recognized what the sensors looked like and intuitively knew how to trigger them, which made it harder for me to notice confusion. For a first-time user, however, the mapping between their actions and the project’s responses is not immediately obvious. In future iterations, I want to better integrate or label the sensors so that users can naturally understand where to shine the light or where to place the water without requiring external explanation.

DEMO of User Testing:

IMG_8795

 

Week 12- Final Project Proposal

Finalised Concept for the Project

Plant Care Station is an interactive, game-like system that combines physical sensors read by an Arduino with a browser-based visual interface built in p5.js.
The project teaches users about plant care (light, water, soil, and touch) through playful interactions, animations, glowing suns, and confetti celebrations.

My project uses sensor data from the Arduino (four light sensors, capacitive touch sensors made with foil, and a soil moisture sensor), which is visualized in p5.js as a growing, living plant environment. As the user helps the plant receive enough light, fertilizer, and moisture, the interface responds through movement, glowing suns, page transitions, and celebratory effects.

Arduino to P5:

  • 4 Light Sensors: We use the p5 interface to show the user which sensors still need more light. Once the light threshold is met, confetti is thrown and the user can move to the next page.
  • Capacitive touch sensor: I used aluminium foil to act as a capacitive touch sensor. The foil is stuck to the scissors, so every time the reading goes from NO-CONTACT to CONTACT I count that as one leaf cut.
  • Capacitive touch/pressure sensor: Once the fertilizer is placed in the soil, we confirm that on the p5 screen and allow the user to move to the next page.
  • Soil Moisture Sensor: Once the plant is watered past a specific threshold, we notify the user on the p5 screen.

 P5 to Arduino:

  • Once all the steps are complete, p5 will send a signal to the Arduino and the colourful LEDs will start flashing to celebrate the completion (a rough sketch of this serial exchange is shown below).
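For illustration, here is a minimal p5.js sketch of how the browser side of this exchange could work, assuming the Arduino prints one comma-separated line per loop in the order light1,light2,light3,light4,touch,moisture and listens for a "1" to start the LED celebration. The line format, thresholds, and helper names are placeholders rather than the final implementation; the serial helpers (createSerial, usedSerialPorts, readUntil) are the same ones used in the class examples further down this page.

let port;
let baudrate = 9600;
let light = [0, 0, 0, 0]; // four light sensor readings
let moisture = 0;
let leafCuts = 0;         // counted from the touch sensor edge
let prevTouch = 0;
let celebrated = false;

function setup() {
  createCanvas(400, 400);
  port = createSerial(); // p5.webserial, as in the class examples
  let used = usedSerialPorts();
  if (used.length > 0) port.open(used[0], baudrate);
}

function draw() {
  background(255);

  let line = port.readUntil("\n"); // non-blocking read of one full line
  if (line.length > 0) {
    let parts = split(line.trim(), ",");
    if (parts.length === 6) {
      for (let i = 0; i < 4; i++) light[i] = int(parts[i]);
      let touch = int(parts[4]);
      moisture = int(parts[5]);
      // count a leaf cut on every NO-CONTACT -> CONTACT transition
      if (touch === 1 && prevTouch === 0) leafCuts++;
      prevTouch = touch;
    }
  }

  // once every step is complete, tell the Arduino to flash the LEDs
  if (!celebrated && allStepsComplete()) {
    port.write("1\n");
    celebrated = true;
  }
}

function allStepsComplete() {
  // placeholder thresholds; the real values come from calibration
  return light.every((v) => v > 800) && moisture > 500 && leafCuts >= 3;
}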

 

Final Project Proposal: Plant Care Station

The final project idea I have in mind is to make a connection between the Arduino and p5.js to take care of plants. Through buttons, the user can fill the water meter (blue button), add fertilizer (yellow button), trim the plant (red button), and move the sunlight closer or further away (two yellow buttons, one for arrow up and one for arrow down).

You can even get a score or a rating based on how well you took care of the plant, checking each metric. The Arduino part would be multiple buttons and a wooden box that covers these buttons, with larger buttons on top of the box (the design is inspired by the Beethoven project displayed in class).

ARDUINO

p5

Week 12: Serial Communication

For this week’s assignment, Zeina and I worked on three different exercises that focused on serial communication.

NOTE: We mostly used the exact copy of the Arduino and p5 code that was demoed in class, with just an addition of 2-3 lines for each program. For these 2-3 lines we added our own comments; for the rest of the lines we kept the comments already included in the code.

1- Make something that uses only one sensor on the Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on the Arduino is controlled by p5.

ARDUINO CODE

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensorValue = analogRead(A1); // 0–1023
  Serial.println(sensorValue);             // send to p5.js
  delay(50);
}

P5 CODE

let port;
let connectBtn;
let baudrate = 9600;
let lastMessage = "";
let sensorValue = 0;

function setup() {
  createCanvas(400, 400);
  background(220);

  //Setting the global variable port to a new serial port instance inside setup:
  port = createSerial();

  // we can open ports we have used previously without user interaction
  let usedPorts = usedSerialPorts(); //array of used ports
  if (usedPorts.length > 0) {  
    port.open(usedPorts[0], baudrate); //if any used port is in the array, open that port with 9600 baudrate
  }

  // any other ports (new ones) can be opened via a dialog after user interaction (see connectBtnClick below)
  connectBtn = createButton("Connect to Arduino");
  connectBtn.position(width/2, 270);
  connectBtn.mousePressed(connectBtnClick);

}

function draw() {
  background("white");
 
  // Read from the serial port. This is a non-blocking function. If a full line has come in (ending in \n), it returns that text. If the full line is not yet complete, it returns an empty string "" instead.
  let str = port.readUntil("\n");
  if (str.length > 0) {   // if str -a string- has any characters
    // print(str);
    lastMessage = str;
    sensorValue = int(lastMessage);  
  }
 
  //draw ellipse mapped to horizontal axis
  let x = map(sensorValue, 0, 1023, 0, width);  
  ellipse(x, height / 2, 40, 40);
 
  // Display the most recent message
  text("Last message: " + lastMessage, 10, height - 20);

  // change button label based on connection status
  if (!port.opened()) {
    connectBtn.html("Connect to Arduino");
  } else {
    connectBtn.html("Disconnect");
  }
}

function connectBtnClick() {
  if (!port.opened()) {
    port.open("Arduino", baudrate);
  } else {
    port.close();
  }
}

 

 

2-Make something that controls the LED brightness from p5

DEMO

IMG_8392 (2)

ARDUINO CODE

// Week 12 Example of bidirectional serial communication

int leftLedPin = 2;
int rightLedPin = 5;

void setup() {
  Serial.begin(9600);

  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(leftLedPin, OUTPUT);
  pinMode(rightLedPin, OUTPUT);

  digitalWrite(leftLedPin, HIGH);
  digitalWrite(rightLedPin, HIGH);
  delay(200);
  digitalWrite(leftLedPin, LOW);
  digitalWrite(rightLedPin, LOW);

  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH);
    Serial.println("0,0");
    delay(300);
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH);

    int left = Serial.parseInt();
    int right = Serial.parseInt();

    if (Serial.read() == '\n') {

      // -----------------------
      // ONLY CHANGE IS HERE:
      // -----------------------
      digitalWrite(leftLedPin, left);     // left stays ON/OFF
      analogWrite(rightLedPin, right);    // right is now BRIGHTNESS (0–255)
      // -----------------------

      int sensor = analogRead(A0);
      delay(5);
      int sensor2 = analogRead(A1);
      delay(5);
      Serial.print(sensor);
      Serial.print(',');
      Serial.println(sensor2);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}

p5 CODE

let port; // making a var to hold the serial port
let baudrate = 9600; // speed for talking to arduino
let brightnessSlider; // slider to pick brightness
let smoothBrightness = 0; //transition into the brightness instead of jumping

function setup() {
  createCanvas(400, 200); // just making a small canvas for ui 
  textSize(18); // bigger text

  brightnessSlider = createSlider(0, 255, 0); // slider from 0 to full bright
  brightnessSlider.position(20, 80); // where it shows up on screen
  brightnessSlider.style('width', '200px'); // make it a bit wider

  port = createSerial(); // create a serial object so we can connect to arduino

  let used = usedSerialPorts(); // check if we already used a port before
  if (used.length > 0) {
    port.open(used[0], baudrate); // auto connect to the last used port
  }
}

function setupSerial() {
  if (!port.opened()) { // if no connection yet
    port.open("Arduino", baudrate); // try to open one 
  } else {
    port.close(); // if already open then close it (toggle)
  }
}

function draw() {
  background(240); // light grey 

  if (!port.opened()) {
    text("Press SPACE to connect", 20, 30); // tell the user what to do
  } else {
    text("Connected!", 20, 30); // connection message
  }

  let target = brightnessSlider.value(); // get the slider value

  // do transitional brightness
  smoothBrightness = lerp(smoothBrightness, target, 0.07);
 

  text("Brightness: " + int(smoothBrightness), 20, 70); // show the number

  // actually send the brightness to the arduino
  if (port.opened()) {
    let sendString = "0," + int(smoothBrightness) + "\n"; // left=0 right=smooth
    port.write(sendString); // send it over serial
  }
}

function keyPressed() {
  if (key === " ") {
    setupSerial(); // hitting space toggles the port connection
  }
}

3-Take the gravity wind example (https://editor.p5js.org/aaronsherwood/sketches/I7iQrNCul) and make it so every time the ball bounces one led lights up and then turns off, and you can control the wind from one analog sensor

ARDUINO CODE

void setup() {
  pinMode(2, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int sensorValue = analogRead(A1);
  Serial.println(sensorValue);

  //check for bounce 
  if (Serial.available() > 0) {
    //if 1, light up led
    if (Serial.parseInt() == 1) {             
      digitalWrite(2, HIGH);
      delay(100);                          
      digitalWrite(2, LOW);
    }
  }
  delay(10);
}

P5.JS CODE

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;
let port;
let connectBtn;
let baudrate = 9600;
let lastMessage = "";
let sensorValue = 0;

function setup() {
  createCanvas(640, 360);
  noFill();
  position = createVector(width / 2, 0);
  velocity = createVector(0, 0);
  acceleration = createVector(0, 0);
  gravity = createVector(0, 0.5 * mass);
  wind = createVector(0, 0);

  //Setting the global variable port to a new serial port instance inside setup:
  port = createSerial();

 
  // we can open ports we have used previously without user interaction
  let usedPorts = usedSerialPorts(); //array of used ports
  if (usedPorts.length > 0) {  
    port.open(usedPorts[0], baudrate); //if any used port is in the array, open that port with 9600 baudrate
  }
// any other ports (new ones) can be opened via a dialog after user interaction (see connectBtnClick below)
  connectBtn = createButton("Connect to Arduino");
  connectBtn.position(width/2, 270);
  connectBtn.mousePressed(connectBtnClick);
}

function draw() {
  background(255);
 
  // Read from the serial port. This is a non-blocking function. If a full line has come in (ending in \n), it returns that text. If the full line is not yet complete, it returns an empty string "" instead.
  let str = port.readUntil("\n");
  if (str.length > 0) {   // if str -a string- has any characters
    // print(str);
    lastMessage = str.trim();
    sensorValue = int(lastMessage);    
  }
 

  //wind controlled by analog sensor
  wind.x = map(sensorValue, 0, 1023, -1, 1);
  console.log("Sensor value: " + sensorValue);
 

  applyForce(wind);
  applyForce(gravity);

  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);

  ellipse(position.x, position.y, mass, mass);

  // Bounce detection
  if (position.y > height - mass / 2) {
    velocity.y *= -0.9; // Dampening
    position.y = height - mass / 2;

    //send bounce signal to Arduino
    port.write("1\n");
  }
// Display the most recent message
  text("Last message: " + lastMessage, 10, height - 20);

  // change button label based on connection status
  if (!port.opened()) {
    connectBtn.html("Connect to Arduino");
  } else {
    connectBtn.html("Disconnect");
  }
}

function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed(){
  if (keyCode==LEFT_ARROW){
    wind.x=-1;
  }
  if (keyCode==RIGHT_ARROW){
    wind.x=1;
  }
  if (key==' '){
    mass=random(15,80);
    position.y=-mass;
    velocity.mult(0);
  }
}

function connectBtnClick() {
  if (!port.opened()) {
    port.openDialog(); // user selects port
  } else {
    port.close();
  }
}

For this week’s reading assignment:

This reading made me really rethink the relationship between disability, design, and visibility. The author’s point about discretion was especially interesting: for example, how so many medical devices are intentionally designed to disappear, as if invisibility equals good design. But the reading also made me question what that invisibility actually communicates. If a product is meant to blend into the skin or look like it’s barely there, does that subtly imply that disability should be hidden? What is the further message?

The contrast with fashion added another interesting layer in my opinion. Fashion openly accepts the idea of being seen, of expressing identity, which is almost the opposite of traditional medical design. I liked the example of eyewear, which shows that a product can address a disability without carrying social stigma, and also can become something expressive and desirable. That overlap suggests disability-related products don’t need to be trapped between being “invisible” or “medical-looking,” and that they can have personality without becoming sensationalized.

 

Week 10 Sound, Servo motor, Mapping

Inspiration:

For this week’s project, the main inspiration for our instrument was Stromae’s song “Alors on Danse”. We were mainly inspired by the way the song’s main notes are split into three notes of varying pitches, with one sound constantly playing in the background. For that reason, we varied the pitches of the three sounds our project produces, with a fourth note that plays constantly once the button is pressed.

Concept:

For this week’s project, we used three light sensors to play sounds on the piezo speaker, with one note playing constantly when a button is clicked. With the light sensors, a note is played once the user’s finger covers a sensor. Each of the three sensors plays a different pitch on the piezo speaker; the condition that allows this is comparing each sensor’s reading to the threshold we defined. An additional component we added is a button that starts a sound playing on the piezo speaker and stops it once the button is pressed again.

Code:

int photoPins[3] = {A0, A1, A2};// first we define a list of integers holding the analog pins
int buttonPin = 2; // digi pin 2 for the buttons
int piezzoPin = 8; //digi pin 8 for the piezzo speaker 

int threshold = 700; //this is the threshold for the light/no-light intensity that worked with our light sensors in our environment/lighting

bool prevPhoto[3] = {false, false, false}; //keeping track of whether each light sensor was interacted with or not, false initially
bool prevButton = false; //initially false
bool buttonState = false;//initially false

void setup() {
  pinMode(buttonPin, INPUT_PULLUP); //setting the button pin as an input with the internal pull-up
  pinMode(piezzoPin, OUTPUT); //setting the buzzer pin as output so the arduino sends the sound signal
  Serial.begin(9600); // serial monitor for debugging
}

void loop() {
  for (int i = 0; i < 3; i++) { //looping over the 3 sensors to read their analog value
    int value = analogRead(photoPins[i]); 
    bool tapped = value < threshold; //comparing the value captured by the sensor and the defined threshold
    if (tapped && !prevPhoto[i]) { //checking for tap in the current state compared to prev
      if (i == 0) tone(piezzoPin, 440, 200); // translates to A0
      if (i == 1) tone(piezzoPin, 523, 200); // translates to A1
      if (i == 2) tone(piezzoPin, 659, 200); // translates to A2
    }
    prevPhoto[i] = tapped; //MAKING SURE TO NOTE it as tapped to have a singular tap rather than looping

    Serial.print(value); //serial print
    Serial.print(",");
  }

  bool pressed = digitalRead(buttonPin) == LOW; //setting the reading of the button to low meaning the button is pressed
  if (pressed && !prevButton) { //when the button is pressed, the state changes from not pressed (false) to pressed (true)
    buttonState = !buttonState;
    if (buttonState) tone(piezzoPin, 784); // if toggled on, play a continuous G5 tone
    else noTone(piezzoPin); //otherwise stop the buzzer
  }
  prevButton = pressed; 

  Serial.println(pressed ? "1" : "0"); //for monitoring purposes

  delay(50);//short delay 
}

Disclaimer: Some AI/ChatGPT was used to help with debugging and allowing multiple elements to work cohesively.

More Specifically:

1- When trying to debug and understand why the button was not working, we used ChatGPT’s recommendation to print whether the button reads as pressed to the serial monitor (this line: Serial.println(pressed ? "1" : "0"); // for monitoring purposes).
2- ChatGPT also recommended the frequency values in hertz (the second parameter of tone()) to mimic Alors on Danse: 440 for the sensor on A0, 523 for A1, and 659 for A2.

Schematic:

Demo:

IMG_8360

Future Improvements:

As for future improvements, one main thing we wanted to capture in this project is being able to overlap the sounds, but since we were working with one piezo speaker we were not able to do that. To address this, we aim to learn more about how we can play the sounds from our laptops instead of the physical speaker connected to the Arduino. Other improvements could be exploring how we can incorporate different instrument sounds and create an orchestra-like instrument.
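One hedged sketch of that direction uses p5.sound oscillators in the browser so that tones can overlap; the frequencies mirror the ones we used on the Arduino side, and the key bindings are just for illustration.

let oscs = [];
let freqs = [440, 523, 659]; // same pitches as the Arduino version

function setup() {
  createCanvas(300, 100);
  for (let f of freqs) {
    let osc = new p5.Oscillator('sine');
    osc.freq(f);
    osc.amp(0);   // start silent
    osc.start();
    oscs.push(osc);
  }
  text("Press 1, 2 or 3 to layer tones", 10, 50);
}

function keyPressed() {
  userStartAudio(); // browsers need a user gesture before audio can play
  let i = int(key) - 1;
  if (i >= 0 && i < oscs.length) oscs[i].amp(0.3, 0.05); // fade the tone in
}

function keyReleased() {
  let i = int(key) - 1;
  if (i >= 0 && i < oscs.length) oscs[i].amp(0, 0.2); // fade the tone out
}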

 

Reading Reflection:

Reading A Brief Rant on the Future of Interaction Design really made me rethink how we use technology and what we’ve lost in the process. The author points out that what we often label as “futuristic” design, like touchscreens and flat devices, isn’t really that visionary or futuristic. It’s actually very far from how we connect with the world. When he describes our current technology as “Pictures Under Glass,” it clicked for me how minimal that kind of interaction feels. We’re constantly sliding our fingers across flat screens, but we’ve given up the physical feedback that makes using our hands so intuitive and satisfying. I never thought about how much information we get through touch until he brought up examples like turning the pages of a book or feeling the weight of a glass of water.

What stood out most to me is his idea that our hands, and really our whole bodies, are tools for understanding and shaping the world. Technology should amplify that, not suppress it. I realized how much design today caters to convenience and simplicity, but in doing so, it often strips away depth and engagement. His point about how touchscreens make us “finger-blind” hit hard—it’s true that we spend so much time tapping and swiping that we forget what it feels like to truly handle something.

Arduino: analog input & output- Elyazia Abbas

Concept:

For this week’s assignment, I made the Plant Care Assistant. It helps users monitor their plant’s light needs through simple visual feedback. A green LED lights up when sun exposure is high (simulated with a phone flashlight). A red LED lights up when sun exposure is low (simulated by moving the flashlight away). When users click the yellow button, if it flashes yellow the plant is in a good state, and if it does not flash at all the plant needs to be taken care of.

Code:

int ledFlashOn = 11;   // led on pin 11 lights up green if the plant has enough sunlight
int ledFlashOff = 12;  // led on pin 12 lights up red if the plant doesn't have enough sunlight
int sensorPin = A2; //sensor pin for the light sensor 

void setup() {
  Serial.begin(9600);
  pinMode(ledFlashOn, OUTPUT);
  pinMode(ledFlashOff, OUTPUT);
}

void loop() {
  int sensorValue = analogRead(sensorPin); //getting the value of the pin
  Serial.println(sensorValue); //printing it in the serial monitor

  if (sensorValue > 950) { //if sensor value is greater than 950 
    digitalWrite(ledFlashOn, HIGH); //light up green dont light up red
    digitalWrite(ledFlashOff, LOW);
  } else {
    //light up red, don't light up green
    digitalWrite(ledFlashOn, LOW);
    digitalWrite(ledFlashOff, HIGH);
  }

  delay(50); // small delay for stability
}

In my Arduino code we continuously read analog values from a light sensor connected to pin A2, where higher values indicate more light. When the sensor reading exceeds 950 (bright sunlight), the code turns on the green LED on pin 11 while keeping the red LED on pin 12 off, signaling that the plant is receiving adequate sunlight. If the reading falls below 950, it switches to illuminate the red LED while turning off the green one, warning that the plant needs more light. The serial monitor displays real-time sensor readings for debugging, and a 50ms delay ensures stable readings without flickering.

DEMO:

IMG_8353

Hand-Drawn Schematic:

Reflections & Future Improvements:

For future improvements, I’d like to add PWM control to make the LEDs fade in and out for smoother transitions between states, and implement multiple thresholds to show “perfect,” “okay,” and “needs more light” zones instead of just binary feedback. Adding a data logging feature to track light exposure over time would help understand the plant’s daily light patterns.

Reading Reflection:

One example that stood out to me was the drum glove project. It transforms the familiar act of tapping one’s fingers into a musical interface, blending intuitive human gesture with technology. I found this especially interesting because it captures the essence of Igoe’s argument: the glove doesn’t dictate what the user should feel or do, it just invites them to explore sound through motion. Through this glove, creativity emerges from the user’s spontaneous actions rather than the artist’s fixed script. Both readings remind me that successful interactive art balances structure and openness, designing just enough to spark imagination, but never enough to silence it.

Week 8- Unusual Switches + Reading Reflection

Concept:

For this week’s assignment to create an unusual switch, I built a jaw-activated LED circuit that lights up a red LED when the user opens or closes their mouth. The switch consists of two contact plates that complete the circuit when pressed together.

To construct this, I repurposed old highlighter tubes and wood pieces to create a frame that holds the contact plates. I attached this frame to a headband, which positions the mechanism alongside the user’s jaw when worn. When the headband is in place and the wearer moves their jaw (opening or closing their mouth), the jaw movement pushes the contact plates together. This completes the circuit and activates the red LED.

The design essentially turns jaw movement into a hands-free switch – the mechanical motion of opening and closing your mouth becomes the trigger for the electronic circuit. The highlighter tubes and wood pieces work as the structural components that translate the jaw’s movement into the pressing motion needed to activate the switch.

Implementation:

My Arduino circuit uses two pushbuttons to control an LED. Each button is connected to digital pins 2 and 3 on the Arduino, with 10 kΩ pull-down resistors ensuring the inputs stay LOW when not pressed. When a button is pressed, it sends a HIGH signal to the Arduino, which then turns the LED connected to pin 13 ON through a 330 Ω resistor that limits current. The LED’s short leg is connected to ground, and the red and blue rails on the breadboard distribute 5 V and GND from the Arduino. The setup allows the LED to light up when either button is pressed, demonstrating how digital inputs and outputs can be used together to build simple interactive circuits.

Code:

const int button1Pin = 2;
const int button2Pin = 3;
const int ledPin = 13;

void setup() {
  pinMode(button1Pin, INPUT);
  pinMode(button2Pin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int button1State = digitalRead(button1Pin);
  int button2State = digitalRead(button2Pin);

  if (button1State == LOW || button2State == LOW) {
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
  }
}

This Arduino code controls an LED using two pushbuttons. It first defines constants for each pin: button 1 on pin 2, button 2 on pin 3, and the LED on pin 13. In the setup() function, pinMode() sets the two button pins as inputs so they can read signals, and the LED pin as an output so it can turn ON or OFF. In the loop(), the program continuously reads each button’s state using digitalRead(), storing the results in button1State and button2State. The if statement checks whether either button reads LOW, which counts as a press here (with pull-up style wiring a press reads LOW, while with the pull-down wiring described above a press would read HIGH, so the comparison has to match the actual circuit). If one or both buttons are pressed, the LED turns ON with digitalWrite(ledPin, HIGH); if neither is pressed, it turns OFF with digitalWrite(ledPin, LOW). This creates a simple interactive system where pressing any button lights the LED.

DEMO:

IMG_8130

 

Reading Reflection:

Both readings tell a story about why good design literally saves lives. Margaret Hamilton was programming the Apollo missions in the 1960s, juggling motherhood with rocket science – bringing her 4-year-old daughter Lauren to MIT on weekends while she coded. One day, Lauren was playing with the simulator and accidentally crashed it by hitting P01, a program that wasn’t supposed to run during flight. Hamilton immediately thought “we need to protect against this,” but NASA shut her down: “We had been told many times that astronauts would not make any mistakes… They were trained to be perfect”. Because stressed-out humans hurtling through space in a tin can are definitely going to be perfect. Sure enough, during Apollo 8, astronaut Jim Lovell accidentally selected P01 and wiped out all their navigation data. Hamilton’s backup procedures saved the day, but the whole mess could’ve been avoided if NASA had listened to her in the first place about building in safeguards.

This is exactly what Don Norman talks about in his design article – you can’t just assume people won’t make mistakes, especially under pressure. He points out that “products designed for use under stress follow good human-centered design, for stress makes people less able to cope with difficulties and less flexible in their approach to problem solving.” Hamilton understood this intuitively. She knew that even brilliant, highly trained astronauts are still human. Norman also argues that “attractive things work better” because when we’re in a good mental state, we’re actually better at problem-solving – “positive affect makes people more tolerant of minor difficulties and more flexible and creative in finding solutions.” The flip side is that badly designed systems become even worse under stress. The Apollo story shows how one woman’s insistence on human-centered design – despite everyone telling her the users would be “perfect” – literally got humans to the moon and back safely.

 

Midterm Project- Panda Math Stack- Elyazia Abbas

Concept: Learning Through Play with Panda Math Stack

For my midterm project, I chose to integrate a simple math operation game into my p5 sketch. The Panda Math Stack transforms simple arithmetic into an interactive adventure where learning meets play. Designed with cheerful visuals and smooth animation, the game uses the task of a running panda collecting pancakes to make math practice fun and rewarding. Players solve addition problems by catching and stacking pancakes that match the correct answer. With every correct answer, the panda celebrates as cheerful sound effects and background music enhance the sense of accomplishment.

The game’s 30-second timer adds excitement, encouraging quick thinking while keeping learners engaged. Beyond its playful surface, Panda Math Stack promotes cognitive growth, especially for kids, through repetition and visual reinforcement, showing that learning can be both joyful and challenging in the right balance of design, music, and motion.
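As a side note on the timer mechanic, a countdown like this is typically done in p5.js with millis(); the sketch below is illustrative only and not the game’s actual code.

let roundStart;
const ROUND_LENGTH = 30 * 1000; // 30 seconds, in milliseconds

function setup() {
  createCanvas(400, 200);
  roundStart = millis(); // remember when the round began
}

function draw() {
  background(255);
  let remaining = ROUND_LENGTH - (millis() - roundStart);
  textSize(24);
  if (remaining <= 0) {
    text("Time is up!", 20, 100);
    return; // stop counting once the round is over
  }
  text("Time left: " + ceil(remaining / 1000) + "s", 20, 100);
}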

Explanation of The Main Components:

1. Use of Sounds
Sound plays an important emotional and feedback role in Panda Math Stack. The game includes three primary sound effects: background music (bgMusic), a stacking sound (boinkSound), and a game-over sound (gameOverSound). The looping background track establishes an upbeat rhythm that keeps players engaged throughout gameplay. The short “boink” effect provides instant positive feedback every time a pancake successfully lands on the tray, reinforcing the satisfaction of correct stacking. Finally, the game-over sound signals the end of a round, helping players transition between play sessions. Together, these sounds create a responsive and immersive auditory experience that strengthens player focus and motivation.
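For reference, a minimal p5.sound sketch of this three-sound setup might look like the following; the file names are placeholders, and the start-on-click is there because browsers require a user gesture before audio can play.

let bgMusic, boinkSound, gameOverSound;

function preload() {
  bgMusic = loadSound("bg-music.mp3");
  boinkSound = loadSound("boink.mp3");
  gameOverSound = loadSound("game-over.mp3");
}

function setup() {
  createCanvas(600, 400);
  textSize(18);
  text("Click to start", 20, 40);
}

function mousePressed() {
  // audio can only start after a user gesture
  if (!bgMusic.isPlaying()) {
    bgMusic.setVolume(0.4); // keep the background track quieter than the effects
    bgMusic.loop();
  }
}

// called wherever a pancake lands on the tray
function pancakeLanded() {
  boinkSound.play();
}

// called when the 30-second timer runs out
function gameOver() {
  bgMusic.stop();
  gameOverSound.play();
}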

2. Use of Shapes
Shapes in this project are used to render most visual elements directly in p5.js without relying on external images. Circles and ellipses form pancakes, clouds, and decorative symbols, while rectangles and triangles are used for UI buttons and the grassy ground. By layering and coloring these basic shapes, the design achieves a friendly, cartoon-like appearance. The pancakes themselves use multiple overlapping ellipses in different brown tones, giving them dimension and warmth. This approach demonstrates how simple geometric forms can produce visually appealing and cohesive game elements that maintain a light, playful aesthetic.
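A tiny illustration of the layered-ellipse idea (not the project’s actual drawing code) could look like this:

function setup() {
  createCanvas(200, 200);
  noLoop();
}

function draw() {
  background(235);
  drawPancake(width / 2, height / 2, 120);
}

function drawPancake(x, y, w) {
  noStroke();
  fill(150, 95, 40);              // darker edge underneath for depth
  ellipse(x, y + 5, w, w * 0.4);
  fill(205, 140, 70);             // main pancake body
  ellipse(x, y, w, w * 0.4);
  fill(235, 180, 110);            // lighter top layer
  ellipse(x, y - 4, w * 0.8, w * 0.3);
}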

3. Use of Images for Sprite Sheet
The panda character’s animation is handled using a sprite sheet (panda.png or panda2.png), divided into multiple frames organized in rows and columns. Each frame represents a different pose of the panda walking or idle. During gameplay, the code cycles through these frames (currentFrame) to simulate movement. This sprite-based animation technique allows smooth transitions without heavy computation, making the character appear lively and expressive as it moves and interacts with falling pancakes. The use of images here contrasts with the drawn shapes, introducing a visually rich and character-focused element that enhances personality and storytelling in the game.
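A rough sketch of this frame-cycling approach is shown below; the sheet layout (four walking frames in a single row) and the drawing size are assumptions for illustration, not the actual sprite sheet dimensions.

let pandaSheet;
let frames = [];
let currentFrame = 0;

function preload() {
  pandaSheet = loadImage("panda.png");
}

function setup() {
  createCanvas(600, 400);
  let fw = pandaSheet.width / 4; // frame width, assuming 4 columns
  let fh = pandaSheet.height;    // assuming a single row
  for (let i = 0; i < 4; i++) {
    frames.push(pandaSheet.get(i * fw, 0, fw, fh)); // cut out each pose
  }
}

function draw() {
  background(220);
  // advance the walk cycle every few draw() calls so it isn't too fast
  if (frameCount % 8 === 0) {
    currentFrame = (currentFrame + 1) % frames.length;
  }
  image(frames[currentFrame], width / 2 - 40, height - 120, 80, 80);
}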

 

4. On-Screen Text Evolution
Text evolves dynamically to communicate progress and emotions to the player. At the start, large pixel-style fonts announce the title and instructions, creating a retro arcade feel. During gameplay, text displays math problems, timers, and feedback messages that change in color and size depending on correctness (✅ for success, ❌ for errors). In the end screen and notebook view, typography shifts toward clarity and encouragement, summarizing scores or reviewing mistakes. This continuous adaptation of on-screen text keeps information readable while reinforcing the game’s educational purpose — transforming numbers and feedback into part of the visual storytelling.

5. Object-Oriented Programming in notebook.js and pancake.js
Both notebook.js and pancake.js apply object-oriented programming (OOP) principles to organize behavior and state efficiently. The Pancake class defines how each pancake behaves — from falling physics (update) to visual rendering (show) — encapsulating movement, collisions, and display logic into self-contained objects. Similarly, the Notebook class manages data storage and visualization of wrong answers, using methods like addWrong(), display(), and drawStack() to handle user progress and draw interactive visual feedback. This modular, class-based approach makes the code easier to scale, reuse, and maintain, as each object represents a distinct component of the game world with its own properties and responsibilities.
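To make the structure concrete, here is an illustrative skeleton of what the Pancake class might look like, reduced to the update/show/collision roles described above; it is a sketch under those assumptions, not the project’s actual pancake.js.

let pancakes = [];

function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(220);
  // spawn a new pancake roughly once a second
  if (frameCount % 60 === 0) pancakes.push(new Pancake(random(50, width - 50)));
  for (let p of pancakes) {
    p.update();
    p.show();
  }
}

class Pancake {
  constructor(x) {
    this.x = x;       // horizontal spawn position
    this.y = -20;     // start just above the canvas
    this.speed = 3;   // falling speed
    this.landed = false;
  }

  update() {
    if (!this.landed) this.y += this.speed; // simple falling motion
  }

  show() {
    noStroke();
    fill(205, 140, 70);
    ellipse(this.x, this.y, 60, 25);
  }

  // landed if it overlaps the tray horizontally at tray height
  hitsTray(trayX, trayY, trayW) {
    return !this.landed && this.y >= trayY &&
           this.x > trayX - trayW / 2 && this.x < trayX + trayW / 2;
  }
}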

Code I am Proud of:

function checkAnswer() {
  if (!roundActive) return;

  if (stack.length === targetStack) {
    // Correct number of pancakes
    correctRounds++;
    message = `✅ Correct! (${num1} + ${num2} = ${targetStack})`;
    messageTimer = millis();
    roundActive = false;
    setTimeout(() => newMathProblem(), 1500);
    if (boinkSound) boinkSound.play();
  } else {
    // Incorrect answer
    message = "❌ Try Again!";
    messageTimer = millis();
    let question = `${num1} + ${num2}`;
    notebook.addWrong?.(question, targetStack, num1, num2);
    stack = [];
    pancakes = [];
  }
}

The checkAnswer() function is the core of the game’s math logic — it checks whether the number of pancakes the player stacked matches the correct answer to the math problem shown on screen. When the player presses “Check,” the function compares the length of the stack array (how many pancakes were caught) to targetStack (the sum of num1 + num2). If they’re equal, the player’s answer is correct — their score (correctRounds) increases, a success message and sound play, and a new math problem appears after a short delay. If the count is wrong, an error message is displayed, the pancakes reset, and the player can try again, making this function the key link between the math challenge and interactive gameplay.

Through Beta Testing:

Through beta testing, I received valuable feedback from my friends that helped shape the final version of Panda Math Stack. They suggested transforming the concept into a math-based problem-solving game rather than just a stacking challenge, as this would make it more educational and purposeful.

 

They also recommended developing the notebook feature to track incorrect answers, allowing players to review and learn from their mistakes after each round. Incorporating these suggestions not only improved the gameplay experience but also strengthened the project’s learning component, making it both fun and pedagogically meaningful.

Embedded Sketch:

Future Improvements:

For future improvements, I plan to expand Panda Math Stack by adding more playable characters with unique animations and personalities to make the experience more engaging and customizable. I also aim to introduce multiple levels of difficulty, where each stage presents new visual themes and progressively challenging math problems. In addition to basic addition, future versions will include other arithmetic operations such as subtraction, multiplication, and division, allowing players to practice a wider range of skills. These updates will transform the game into a more dynamic and educational experience that adapts to different ages and learning levels.

Week 5 – Midterm Progress and Reading Response

Initial Concept:

The core concept of my project is to create a game where a panda chef stacks pancakes on its tray while pancakes fall from random positions in the sky. The gameplay is intentionally simple—pancakes fall from the sky, and the player moves the panda left and right to catch them before they hit the ground. I wanted the concept to be both whimsical and approachable, something that could appeal to all ages while still having the potential for engaging mechanics like timers, scoring, and fun visuals.

Design

The panda sprite sheet is animated using a frame-based system that cycles through images depending on movement, while background elements like clouds and grass are generated with simple loops and p5 shapes for efficiency. Pancakes are handled as objects using a dedicated class, which keeps the code modular and easy to expand. I also separated core functions—like drawing the welcome screen, updating the game state, and spawning pancakes—so the program remains organized and readable. This approach makes the design not only playful on-screen but also manageable under the hood.

Potential Struggles:

The most frightening part of the project was integrating sounds into the game. I know that audio is essential to making the experience immersive, but I was unsure about the technical steps required to implement it smoothly, especially since I have had issues in the past with overlapping sounds and starting and stopping them accurately. Questions like when to trigger sounds, how to loop background music, or how to balance audio levels without them being distracting added to the challenge.

How I Plan to Tackle:

To reduce this risk, I turned to existing resources and examples, especially by watching tutorials and breakdowns on YouTube. Seeing other creators demonstrate how to load and trigger sounds in p5.js gave me both practical code snippets and creative inspiration. By learning step by step through videos, I am hoping to be able to gradually integrate audio without it feeling overwhelming.

Embedded Sketch:

Reading Response:

Computer vision, I would say, is different from how humans see. We naturally understand depth, context, and meaning, but computers just see a grid of pixels with no built-in knowledge or ability to infer and make connections the way humans do. They need strict rules, so even small changes in lighting or background can throw a computer off.

To help computers “see” better, we often set up the environment in their favor, meaning that we cater to their capabilities. Simple techniques like frame differencing (spotting motion), background subtraction (comparing to an empty scene), and brightness thresholding (using contrast) go a long way. Artists and designers also use tricks like infrared light, reflective markers, or special camera lenses to make tracking more reliable.
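As a concrete example of the simplest of these techniques, a minimal p5.js frame-differencing sketch might look like this; the threshold and canvas size are arbitrary choices for illustration.

let video;
let prevFrame;

function setup() {
  createCanvas(320, 240);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
}

function draw() {
  video.loadPixels();
  let motion = 0;

  if (prevFrame) {
    prevFrame.loadPixels();
    for (let i = 0; i < video.pixels.length; i += 4) {
      // red-channel difference as a rough brightness proxy
      let diff = abs(video.pixels[i] - prevFrame.pixels[i]);
      if (diff > 40) motion++; // threshold chosen by eye
    }
  }

  image(video, 0, 0);
  fill(255, 0, 0);
  textSize(16);
  text("moving pixels: " + motion, 10, 20);

  prevFrame = video.get(); // store a copy of this frame for next time
}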

What’s interesting in art is how this power of tracking plays out. Some projects use computer vision to make playful, interactive experiences where people’s bodies become the controller. Others use it to critique surveillance, showing how uncomfortable or invasive constant tracking can be. So, in interactive art, computer vision can both entertain and provoke — it depends on how it’s used.

Week 4- Loading Data, Displaying Text- Elyazia Abbas

Concept:

The goal of this sketch is to generate daily affirmations in the form of “I am …” statements. Each click on the canvas re-runs the draw function to bring a new affirmation assembled from a CSV file of positive words. The affirmations are placed on top of a sunset-inspired Perlin noise background, using snippets of code from the Decoding Nature class.

Daily Doses of Positive are the Best Prevention for the Blues!

Code:

//here i am splitting these global variables to make each map to a spot or place in the phrase 
let SUBJECT = 0;
let QUALITY = 1;
let VERB = 2;
let ACTION = 3;
let PLACE = 4;

let strings = []; //this array holds all the lines from the csv file
let zoff = 0; // this is the third dimension we are using for the perlin noise

let sunsetPalettes = [ // this is the color palette I use to make a sunset theme
  ["#FF9E80", "#FF6E40", "#FF3D00", "#DD2C00"], 
  ["#FFB74D", "#FF8A65", "#F06292", "#BA68C8"], 
  ["#FFD180", "#FFAB40", "#FF7043", "#8E24AA"], 
  ["#FFE082", "#FFB74D", "#F48FB1", "#9575CD"]  
];

let currentPalette;

function preload(){ //preloading the csv file holding the words before setup
  strings = loadStrings("words.csv"); 
}

function setup() {//setting the canvas size and the font for the writing 
  createCanvas(600, 400);
  textFont("Georgia");
  textAlign(CENTER, CENTER);
  noLoop();
  pickPalette();
}

function draw() {
  background(255);
  noStroke();
  // this nested for loop will step between every 20 pixels +- horizontally and vertically
  for (let y = 0; y < height; y += 20) {
    for (let x = 0; x < width; x += 20) {
      let n = noise(x * 0.01, y * 0.01, zoff); // at that position we call the noise function and store the result in n
      let c = color(random(currentPalette)); // get a  random sunset color
      c.setAlpha(90); // adding transparency using Alpha
      fill(c); // fill with the random color
      ellipse(x, y, n * 40, n * 40); //draw the ellipse here 
    }
  }
  zoff += 0.02; //editing the noise dimension after the draw

  //here we are randomly choosing a line to work with
  let line = "";
  do {
    line = strings[int(random(strings.length))]; // once we get a line that is not empty 
  } while (line.trim().length === 0); 

  let row = split(line, ','); // split the chosen line into an array of 5 tokens

  let subject = row[SUBJECT];
  let quality = row[QUALITY];
  let verb = row[VERB];
  let action = row[ACTION];
  let place = row[PLACE];

  fill(30);
  textSize(32);
  text(subject + " " + quality, width / 2, height / 2 - 40); // here we put together the first line of the affirmation

  textSize(22); // smaller text for the second line of the affirmation
  text(subject.replace("I am","I") + " " + verb + " to " + action + " in the " + place, width / 2, height / 2 + 20);


  textSize(14); // this is the footer that just asks the user to click
  fill(60);
  text("Click anywhere on the screen for a new affirmation", width / 2, height - 30);
}

function mouseClicked() {
  pickPalette(); // picking a new color palette
  redraw(); // call the draw function again
}

function pickPalette() {
  currentPalette = random(sunsetPalettes); // pick a random sunset palette
}

Embedded Sketch:

Conclusion: 

In the future, I’d like to refine grammar for more natural phrases. Maybe even make the array that holds the sentences longer so that we can get more complex phrases. I also want to let the Perlin noise flow continuously, and change colors for future sketches possibly.

 

Creative Reading Response:

  • What’s something (not mentioned in the reading) that drives you crazy and how could it be improved?

One thing that always drives me crazy is traditional TV remotes. There are so many small buttons that all look the same, and most of them I never even touch. When I just want to change the volume or switch channels, I end up pressing the wrong thing and I’m stuck in some random settings menu. It feels like the design makes everything equally important, when really most people only use a few basic functions. If remotes had bigger, clearly marked buttons for the essentials and maybe hid the less-used ones, plus some simple feedback like a backlight, they’d be so much easier to use.

  • How can you apply some of the author’s principles of design to interactive media?

Norman’s design principles fit really naturally into interactive media because the whole field is about making technology feel intuitive and meaningful. Take affordances and signifiers, for example—these are really important when we design an interface. If a button actually looks like it can be clicked, or an arrow or sign shows you that you should swipe, users don’t have trouble guessing what to do next. In projects like games, apps, or interactive installations, these little cues make the experience smooth instead of frustrating. It’s basically about letting the design speak to the user so they can focus on enjoying the content rather than fighting with the controls. When people don’t have to think too hard about how to use something, they can actually connect with the creative side of the project.