Week 13 – Final Project Progress

User Testing Video with only p5 (using the keyboard instead of the Arduino):

IMG_7745

User Testing Video of One Push Button:

IMG_7748

User Testing Video of Progress with more buttons and design:

IMG_7754

For my user testing session, I asked my friend to try out my project without giving her any instructions. I wanted to see if she could figure everything out on her own just from the way the interface and the game are designed. The first thing I noticed was that she read the instructions on the page and understood the goal immediately, which was honestly a relief. She didn’t ask any questions, so the basic mapping between the buttons and the falling emojis made sense to her. That part of the experience seems to be working exactly how I intended.

Once she started playing, her feedback was really helpful. She enjoyed the game a lot; in fact, she kept wanting to play even after I stopped recording, and she tried beating her own high score multiple times. That told me that the core idea is fun and engaging. But she also pointed out a few things that I definitely need to improve. The main issue she mentioned was responsiveness. Sometimes the game felt a little slow to respond even when she hit the button at the right moment, and when multiple emojis fell quickly, pressing two buttons in a short time made the game slightly laggy. This helped me realize that the input handling needs to be optimized so that the game processes button presses more instantly, especially when things get fast.
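
One idea I want to try for this is to stop handling only one serial message per frame on the p5 side and instead buffer everything that arrives and process all of it every frame. This is only a rough sketch of that approach, not my actual game code; onSerialData and handlePress are placeholder names for where my real serial reading and scoring logic would plug in.

let serialBuffer = '';
let pendingPresses = []; // emotion labels waiting to be applied to the game

// Called from the serial read loop with each chunk of incoming text
function onSerialData(chunk) {
  serialBuffer += chunk;
  let lines = serialBuffer.split('\n');
  serialBuffer = lines.pop(); // keep any incomplete line for the next chunk
  for (let line of lines) {
    line = line.trim();
    if (line !== '') pendingPresses.push(line); // e.g. "happy", "angry"
  }
}

function draw() {
  // ...update and draw the falling emojis here...

  // Handle every press that arrived since the last frame, not just one,
  // so two quick presses both register even when the game speeds up
  while (pendingPresses.length > 0) {
    handlePress(pendingPresses.shift());
  }
}

// Placeholder for my real hit-detection and scoring logic
function handlePress(emotion) {
  console.log('pressed:', emotion);
}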

She also said that adding sound effects for correct hits would make the experience more satisfying, which I completely agree with. That’s something I want to implement next. Another important improvement is creating a proper end screen with the final score and a high score tracker, because right now the game just keeps going without a clear ending moment. An end page would make the game feel more complete and polished.
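
For the end screen and high score, here is a minimal sketch of how it could work in p5.js, assuming I keep a simple game-state variable and save the high score in the browser's localStorage. The storage key and the "press E to end" trigger are just stand-ins for my real game-over condition.

let gameState = 'playing'; // 'playing' or 'over'
let score = 0;
let highScore = 0;

function setup() {
  createCanvas(600, 400);
  textAlign(CENTER, CENTER);
  // load the saved high score, if there is one
  highScore = int(localStorage.getItem('emotionSprintHighScore') || 0);
}

function draw() {
  background(240);
  fill(0);
  if (gameState === 'playing') {
    textSize(18);
    text('Score: ' + score, width / 2, 30);
    // ...the actual gameplay would run here; press E to simulate the game ending...
  } else {
    // end screen: final score, high score, restart prompt
    textSize(32);
    text('Game Over', width / 2, height / 2 - 40);
    textSize(18);
    text('Final score: ' + score, width / 2, height / 2);
    text('High score: ' + highScore, width / 2, height / 2 + 30);
    text('Press R to play again', width / 2, height / 2 + 60);
  }
}

function endGame() {
  gameState = 'over';
  if (score > highScore) {
    highScore = score;
    localStorage.setItem('emotionSprintHighScore', highScore);
  }
}

function keyPressed() {
  if (key === 'e' || key === 'E') endGame();
  if ((key === 'r' || key === 'R') && gameState === 'over') {
    score = 0;
    gameState = 'playing';
  }
}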

On the physical side of the project, I had a few challenges. I wasn’t on campus for most of the week, so I only had Wednesday to work on the hardware. I tried soldering the buttons onto the board, but when I tested them, the connections kept failing. Because of that, I had to temporarily tape them with conductive tape just to get the buttons working for testing. This also means the buttons aren’t fully seated in their holes yet, and two of the LEDs weren’t working at all, so I had to test without them. Once I solder everything properly, the inside of the box will look much neater, and the wires will be able to pass through from underneath the LED buttons. After that, I’m planning to close the back of the box, but I’ll leave a small opening so I can still access the components whenever I need to fix or update something.

Overall, this user testing session helped me see what’s working and what still needs attention. The concept, gameplay, and instructions are all clear, and the interaction makes sense even without any explanation. But I also realized that I still have a lot to improve in terms of responsiveness, speed, physical build quality, and overall design. The good thing is that my friend actually enjoyed playing, which reassures me that I’m on the right track; the project just needs more refinement to reach its full potential.

Final Project Proposal – Week 12

For my original concept, I planned to build an Emotion Communicator Keyboard designed mainly for babies and people with speech difficulties. Although the idea was meaningful, I realized it was too simple, too narrowly targeted at one group, and not interactive enough for a final project. After discussing the importance of two-way interaction and creating something that pulls reactions out of the user, I shifted my approach. I wanted something that anyone (children, teenagers, and adults) would find fun, challenging, and engaging. From that brainstorming, I developed my finalized concept: Emotion Sprint, a fast-paced reflex game that uses a custom Arduino emotion keyboard and a p5.js game interface.

Emotion Sprint is an interactive reaction-based game where emojis, expressions, or short emotional prompts appear on the p5 screen, and the user must press the matching physical emotion button at the right moment. The game mixes fast visuals, timing accuracy, audio feedback, and emotional recognition. As the game progresses, objects fall faster, the reactions get more complex, and the player must instantly choose the correct emotion.

This concept is accessible, replayable, competitive, and genuinely exciting for all ages. The physical interface (Arduino keyboard) and digital interface (p5.js animations and timing) depend on each other to create a fast, tight feedback loop.

Physical Sketch

Hardware Elements

  • 10–12 buttons for the Emotion Keyboard
  • Cardboard
  • LED Pixels (feedback indicators)
  • Piezo buzzer (tone feedback)
  • Arduino Uno
  • Resistors + wiring

Arduino Design and Description

Inputs (Reading User Reactions): Arduino acts as the physical senses + physical voice of the game

  • Continuously read digital pins for button presses
  • Debounce inputs to ensure clean, precise timing
  • Detect when a player presses multiple buttons quickly

Outputs (Immediate Physical Feedback)

  • Trigger a short tone on the buzzer based on how accurate the timing was
    • Perfect → Higher pitch
    • Good → Mid pitch
    • Miss → Low “error” tone
  • Light up the LED under the pressed emotion button

Communication With P5
Arduino → p5 (Serial):

  • Sends emotion labels (ex: “happy”, “angry”, “surprised”)
  • Sends timestamps of button presses
  • Sends “double reaction” when two buttons are pressed fast
  • Sends “applause button” activation (for positive reinforcement moments)

p5 → Arduino (Serial):

  • Sends feedback signals such as:
    • “correct” → Arduino lights LED green
    • “incorrect” → LED flashes red
    • “bonus” → special buzzer melody
  • Sends timing difficulty changes (game speed up/down)
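
Putting the two lists above together, a hypothetical sketch of the p5 side of this protocol could look like the following. The message format ("emotion,timestamp" as a text line) and the checkAccuracy helper are assumptions I still need to finalize, not the actual implementation.

let currentTargetEmotion = 'happy'; // assumed global set by the game engine

// Hypothetical message from Arduino: "emotion,timestampMs", e.g. "happy,132050"
function handleArduinoLine(line, writer) {
  const parts = line.trim().split(',');
  if (parts.length < 2) return;

  const emotion = parts[0];        // e.g. "happy"
  const pressTime = int(parts[1]); // Arduino millis() at the moment of the press

  const hit = checkAccuracy(emotion, pressTime);

  // reply so the Arduino can light the LED green/red or play the bonus melody
  const reply = hit ? 'correct\n' : 'incorrect\n';
  if (writer) {
    writer.write(new TextEncoder().encode(reply));
  }
}

// Placeholder accuracy check; the real version would also use the falling
// emoji's target timing, not just its label
function checkAccuracy(emotion, pressTime) {
  return emotion === currentTargetEmotion;
}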

P5 Design and Description

Game Engine

  • Generates falling emojis/emotions with target timing
  • Tracks the position and movement of each object
  • Calculates accuracy of each press using timestamps
  • Adjusts game difficulty (speed increases over time)
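
As a starting point for the accuracy and difficulty logic, here is a rough sketch; the timing windows and the speed-up rate are guesses that I would tune through playtesting.

const PERFECT_WINDOW = 80; // ms, assumed value to be tuned
const GOOD_WINDOW = 200;   // ms, assumed value to be tuned

let fallSpeed = 2; // pixels per frame; increases as the game goes on

// Rate a press by how far it was from the emoji's ideal hit time
function ratePress(pressTimeMs, targetTimeMs) {
  const diff = Math.abs(pressTimeMs - targetTimeMs);
  if (diff <= PERFECT_WINDOW) return 'perfect';
  if (diff <= GOOD_WINDOW) return 'good';
  return 'miss';
}

// Called at the end of each round to make things harder
function increaseDifficulty() {
  fallSpeed = Math.min(fallSpeed * 1.1, 10); // cap it so the game stays playable
}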

Visual Output

  • Display falling emojis
  • Flash screen colors based on success/failure
  • Show score, multiplier, streak
  • Show special video/image/meme prompts during reaction rounds
  • End-of-game summary screen

Audio Output

  • Play digital sound effects for:
    • Correct timing
    • Misses
    • Combos
    • Bonus rounds
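
For these sound effects, a minimal sketch using the p5.sound library might look like this; the file names are placeholders for clips I still have to choose or record.

let sounds = {};

function preload() {
  // placeholder file names, to be replaced with the real clips
  sounds.correct = loadSound('correct.mp3');
  sounds.miss = loadSound('miss.mp3');
  sounds.combo = loadSound('combo.mp3');
  sounds.bonus = loadSound('bonus.mp3');
}

// result is one of 'correct', 'miss', 'combo', 'bonus'
function playFeedback(result) {
  if (sounds[result] && sounds[result].isLoaded()) {
    sounds[result].play();
  }
}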

Communication With Arduino
p5 → Arduino:

  • Sends green/red flashing commands
  • Sends buzzer melody for bonus streak
  • Sends “new round” or “speed increase” signals

Arduino → p5:

  • Sends emotion pressed
  • Sends timestamp of press
  • Sends sequence or “combo input”
  • Sends “applause button” activation

I’ll be developing the p5 game first, focusing on the visuals, falling emojis, scoring, and special reaction rounds. Once the game mechanics are solid, I will integrate the Arduino keyboard to connect the physical inputs, buzzer, and LED feedback. As I continue working on the project, I will keep modifying and improving the system if I come up with new ideas or ways to make the gameplay more interactive and engaging. I will document all progress here on the blog, showing updates, refinements, and the evolving design of both the software and hardware components.

Final Project Idea – Week 11

Concept 

For my final project, I will create an Emotion Communicator Keyboard: a simple interactive system designed mainly for babies and people with speech difficulties. The device will include several large, colorful buttons, each representing a basic emotion or need (such as happy, sad, hungry, sleepy, or scared). When a user presses a button, the system will communicate that emotion through both physical sound (from Arduino) and visual and digital sound feedback (from P5). This is a preliminary concept, and it is still subject to change as I continue developing and testing the project.

Purpose & Audience

Babies and people who have speech difficulties often struggle to communicate how they feel. This device gives them a clear, physical way to express emotions using big, easy-to-press buttons. I got the idea after seeing videos on social media of mothers with children who have speech difficulties. Many of them talked about how challenging it can be to understand what their kids want or how they feel. I noticed that some families use simple communication apps that display basic words or emotions, and even though the tools look very simple, they make a huge difference. When the children learned how to express their feelings better through these tools, you could literally see their entire faces change. Their eyes would brighten, their expression would relax, and they looked relieved and proud that someone understood them. It made me realize how powerful and meaningful even a very simple communication tool can be. It’s not just about sending a message; it’s about giving the child a way to feel seen, heard, and understood.
The current version is intentionally simple for accessibility, but for adults it would be too basic. In the future, I could expand this into a more advanced assistive communication tool with more emotions, more complex sensors, or personalized options.

Arduino Components 

  • 10-12 large push buttons (each representing one emotion)
  • LEDs
  • Piezo buzzer (to play emotion-specific tones)
  • 10kΩ resistors (pull-downs for button circuits)
  • USB connection (for serial communication with P5)
  • Cardboard keyboard body: I can build the keyboard out of cardboard, cutting holes for the buttons and designing it to look like a keyboard.

Arduino will:

  • sense which button is pressed
  • send messages like “happy” or “sad” to P5
  • play an immediate tone or sound pattern on the buzzer depending on the emotion

P5 receives the emotion label from Arduino and triggers:

  • an animated visual (color changes, shapes, movement)
  • a matching digital sound or short audio clip
  • Maybe text labels for clarity

Interaction Flow

  1. User presses a button
  2. Arduino detects it instantly
  3. Arduino plays a corresponding tone on the buzzer
  4. Arduino sends the emotion name to P5
  5. P5 displays a visual animation, plays a sound, and maybe an LED lights up
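
A very rough p5 sketch of step 5, assuming the Arduino sends plain-text labels like "happy" over serial; the colors are placeholders, and the sound playback is only indicated in a comment.

// assumed mapping from emotion label to a display color
const emotionColors = {
  happy: [255, 200, 0],
  sad: [60, 100, 200],
  hungry: [240, 120, 60],
};

let currentEmotion = null;

function setup() {
  createCanvas(600, 400);
  textAlign(CENTER, CENTER);
  textSize(48);
}

function draw() {
  if (currentEmotion && emotionColors[currentEmotion]) {
    background(...emotionColors[currentEmotion]);
    fill(0);
    text(currentEmotion.toUpperCase(), width / 2, height / 2); // text label for clarity
  } else {
    background(240);
  }
}

// would be called from the serial read loop with each label the Arduino sends
function showEmotion(label) {
  currentEmotion = label.trim().toLowerCase();
  // a matching sound clip could be played here with p5.sound,
  // e.g. sounds[currentEmotion].play()
}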

Ideas I can add:
1. Users can press two or more buttons in sequence to express combined emotions (e.g., Happy + Hungry → “I’m happy and hungry.”)

2. Since positive reinforcement plays a big role in this, an extra-large button can trigger a celebratory animation and sound when the user successfully expresses an emotion or sequence, to motivate them to keep using the tool to express their feelings.

Future Expansion

Later, this system could be expanded with:

  • more advanced sensors
  • a larger emotion vocabulary
  • more meaningful sound design for adults
  • different modes of communication (LEDs, vibration feedback)

3 exercises – Week 11

The three exercises we worked on were practice in making Arduino and p5.js work together.

Group: Deema Al Zoubi and Rawan Al Ali 

Exercise 1:

let connectButton;
let port;
let reader;
let sensorValue = 0;
let keepReading = false;

function setup() {
  createCanvas(600, 400);

  // Create a connect button 
  connectButton = createButton('Connect to Arduino');
  connectButton.position(10, 10);
  connectButton.mousePressed(connectSerial);

  textAlign(CENTER, CENTER);
  textSize(14);
}

async function connectSerial() {
  // If already connected, close first
  if (port && port.readable) {
    await closeSerial();
    connectButton.html('Connect to Arduino');
    return;
  }

  try {
  
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });

    console.log('Port opened', port);
    connectButton.html('Disconnect');

    const decoder = new TextDecoderStream();
    // Pipe the readable stream from the port to the decoder
    port.readable.pipeTo(decoder.writable);
    reader = decoder.readable.getReader();

    keepReading = true;
    readLoop(); // start read loop
  } catch (err) {
    console.error('Error opening serial port:', err);
    alert('Could not open serial port. Make sure your device is connected and try again.');
  }
}

async function closeSerial() {
  keepReading = false;
  try {
    if (reader) {
      await reader.cancel();
      reader.releaseLock();
      reader = null;
    }
    if (port && port.close) {
      await port.close();
      console.log('Port closed');
    }
    port = null;
  } catch (err) {
    console.warn('Error closing port:', err);
  }
}

async function readLoop() {
  let partial = '';
  try {
    while (keepReading && reader) {
      const { value, done } = await reader.read();
      if (done) {
        console.log('Reader closed');
        break;
      }
      if (!value) continue;

      // split by newline
      partial += value;
      let lines = partial.split(/\r?\n/);
      // Keep the last partial line in buffer
      partial = lines.pop();

      for (let line of lines) {
        line = line.trim();
        if (line === '') continue;        
        const num = parseInt(line, 10);
        if (!Number.isNaN(num)) {
          // clamp to expected range in case of weird data
          sensorValue = Math.max(0, Math.min(1023, num));
          console.log('sensorValue:', sensorValue);
        } else {
          console.log('non-numeric line ignored:', line);
        }
      }
    }
  } catch (err) {
    console.error('Read loop error:', err);
  }
}

function draw() {
  background(240);

  let x = map(sensorValue, 0, 1023, 25, width - 25);
  fill(100, 150, 240);
  ellipse(x, height / 2, 50, 50);
}
window.addEventListener('beforeunload', async (e) => {
  if (port && port.readable) {
    await closeSerial();
  }
});
Arduino code:

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensorValue = analogRead(A0);
  Serial.println(sensorValue);
  delay(50); // ~20Hz
}

We used one sensor (a potentiometer) on the Arduino to control the horizontal position of an ellipse in p5.js. The ellipse stayed in the middle of the screen vertically, and all movement was controlled by the Arduino sensor; p5 didn’t send any commands back to the Arduino.

Video 1: IMG_1012

Schematic 1:

Exercise 2:

let connectButton;
let port;
let writer;
let slider;

let targetBrightness = 0;
let currentBrightness = 0;

function setup() {
  createCanvas(400, 200);

  connectButton = createButton("Connect to Arduino");
  connectButton.position(10, 10);
  connectButton.mousePressed(connectSerial);

  slider = createSlider(0, 255, 0);
  slider.position(10, 60);
  slider.style('width', '300px');

  textSize(16);
}

async function connectSerial() {
  try {
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });
    writer = port.writable.getWriter();
    connectButton.html("Connected");
  } catch (err) {
    console.error("Connection error:", err);
  }
}

function draw() {
  background(230);

  // Slider sets the target brightness
  targetBrightness = slider.value();

  // Gradually move toward target (both up and down)
  currentBrightness = lerp(currentBrightness, targetBrightness, 0.15);

  // Snap to 0/255 when very close so it truly turns off/on
  if (abs(currentBrightness - targetBrightness) < 1) {
    currentBrightness = targetBrightness;
  }

  fill(0);
  text("LED Brightness: " + int(currentBrightness), 10, 110);

  // Send brightness as a text line so the Arduino can parse it with readStringUntil('\n')
  if (writer) {
    writer.write(new TextEncoder().encode(int(currentBrightness) + "\n"));
  }
}
Arduino code:

int ledPin = 9; // LED connected to PWM pin 9
int value = 0;  // variable to store incoming brightness

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600); // match the baud rate in p5
}

void loop() {
  if (Serial.available() > 0) {
    // read incoming string until newline
    String data = Serial.readStringUntil('\n');
    data.trim(); // remove whitespace
    if (data.length() > 0) {
      value = data.toInt();          // convert to integer
      value = constrain(value, 0, 255); // keep within PWM range
      analogWrite(ledPin, value);   // set LED brightness
    }
  }
}

We created a slider in p5.js to control the brightness of an LED connected to the Arduino. Moving the slider to the right gradually increased the LED brightness, and moving it back to the left gradually turned it off.

Video 2: https://intro.nyuadim.com/wp-content/uploads/2025/11/IMG_1030.mov

Schematic 2:

Exercise 3:

let velocity;
let gravity;
let position;
let acceleration;
let drag = 0.99;
let mass = 50;

let brightnessValue = 512; // potentiometer value (0–1023)
let ballDropped = false;
let ledOn = false;

let port, writer, reader;
let serialActive = false;
let serialBuffer = '';

function setup() {
  createCanvas(640, 360);
  textSize(18);

  position = createVector(width / 2, 0);
  velocity = createVector(0, 0);
  acceleration = createVector(0, 0);
  gravity = 0.05 * mass; // gravity as a scalar, consistent with dropBall() and how it's added to velocity.y

  // Connect button
  let btn = createButton("Connect to Arduino");
  btn.position(10, 10);
  btn.mousePressed(connectAndStart);
}

function draw() {
  background(255);

  // Show instructions before dropping ball
  fill(0);
  if (!ballDropped) {
    text("Press B to drop the ball", 20, 30);
    return; // stop here until ball is dropped
  }

  // Show serial status
  if (serialActive) {
    text("Connected", 20, 30);
    text(`Potentiometer: ${brightnessValue}`, 20, 50);
  } else {
    text("Serial Port Not Connected", 20, 30);
  }

  // Apply gravity
  velocity.y += gravity;
  velocity.y *= drag;

  // Horizontal control from potentiometer
  let windX = map(brightnessValue, 0, 1023, -2, 2);
  velocity.x = windX;

  position.add(velocity);

  // Bounce on floor
  if (position.y >= height - mass / 2) {
    position.y = height - mass / 2;
    velocity.y *= -0.9;

    // Flash LED
    if (serialActive && !ledOn) {
      writeSerial("F");
      ledOn = true;
    }
  } else if (ledOn) {
    ledOn = false;
  }

  // Bounce on top
  if (position.y - mass / 2 < 0) {
    position.y = mass / 2;
    velocity.y *= -0.9;
  }

  // Keep ball inside canvas horizontally
  position.x = constrain(position.x, mass / 2, width - mass / 2);

  // Draw ball
  fill(100, 150, 240);
  ellipse(position.x, position.y, mass, mass);
}

function applyForce(force) {
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed() {
  if (key == "B" || key == "b") dropBall();
}

function dropBall() {
  position.set(width / 2, 0);
  velocity.set(0, 0);
  mass = 50;
  gravity = 0.05 * mass;
  ballDropped = true;
}

// Serial functions
async function connectAndStart() {
  try {
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });

    writer = port.writable.getWriter();

    const decoder = new TextDecoderStream();
    port.readable.pipeTo(decoder.writable);
    reader = decoder.readable.getReader();

    serialActive = true;

    // Start reading
    readLoop();
  } catch (err) {
    console.error("Serial error:", err);
  }
}

async function readLoop() {
  while (serialActive && reader) {
    const { value, done } = await reader.read();
    if (done) break;
    if (value) parseSerial(value);
  }
}

function writeSerial(msg) {
  if (writer) writer.write(new TextEncoder().encode(msg + "\n"));
}

function parseSerial(data) {
  serialBuffer += data;
  let lines = serialBuffer.split('\n');
  serialBuffer = lines.pop();

  for (let line of lines) {
    let val = parseInt(line.trim());
    if (!isNaN(val)) brightnessValue = val;
  }
}
Arduino code:

int ledPin = 9; // LED connected to pin 9
int potPin = A0; // potentiometer connected to A0
int potValue = 0; 

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600); // match baud rate with p5.js
}

void loop() {
  // Read potentiometer (optional if p5 reads directly)
  potValue = analogRead(potPin);
  Serial.println(potValue); // send to p5 for horizontal control

  // Check for serial input from p5
  if (Serial.available() > 0) {
    char command = Serial.read(); // read single character
    if (command == 'F') {        // 'F' = flash LED
      digitalWrite(ledPin, HIGH);
      delay(100);                // keep LED on briefly
      digitalWrite(ledPin, LOW);
    }
  }
  
  delay(10); // small delay for stability
}

We made the ball bounce continuously up and down in p5.js, and connected an LED on the Arduino that lights up briefly every time the ball hits the floor. A potentiometer on the Arduino was used to control the horizontal movement of the ball: turning the knob to higher values moves the ball to the right, and lower values move it to the left. The potentiometer’s outer pins were connected to 5V and GND, and the middle pin to A0. Using Web Serial, we read the potentiometer values in p5 and mapped them onto the ball’s horizontal velocity while it keeps bouncing.

Video 3: IMG_1030

Schematic 3:

Reflection: 

These exercises helped us understand how p5.js and Arduino can work together. We saw that Arduino can send real-world sensor data, like potentiometer values, to p5 to control visuals or simulations, and that p5 can also send commands back to Arduino, like turning an LED on or off. This gave us a clear idea of how much influence p5 can have on Arduino outputs, and how Arduino inputs can drive digital interactions in p5. Practicing this will be really useful for our final project, as it shows how to combine physical sensors, real-time controls, and visual feedback in a simple interactive system.

Week 11 – Reading Reflection

Design Meets Disability 

In this reading, the author argues that simplicity can be just as important as universality in design. While universal design aims to include every possible user, trying to add too many features can actually make a product harder to understand and use. The author emphasizes that cognitive accessibility, how easy something is to learn and navigate, is often overlooked because it is harder to measure than physical accessibility. By valuing simplicity over complexity, designers can sometimes create products that are more inclusive in practice, even if they do not meet every theoretical definition of universal design. The author also suggests that good designers learn to balance the ideal design brief with what will actually work best for users, sometimes choosing to remove features for a better overall experience.

I agree with the author’s argument that simplicity can be a powerful form of accessibility. In my experience, many products become confusing or overwhelming when they try to offer every possible option to every type of user. However, I also think that sometimes having more features can be beneficial for users. In many cases, a product with extra options feels like a better deal because it allows for more customization or more advanced use when needed. While simplicity is appealing, I don’t always want to sacrifice functionality for the sake of minimalism. I believe the best approach depends on the context: some products should stay simple, but others can genuinely improve the user experience, and even accessibility, by offering richer features as long as they remain intuitive to use.

Week 10 – Reading Reflection

Bret Victor’s A Brief Rant on the Future of Interaction Design argues that most futuristic designs, like touchscreens and “smart” glass devices, aren’t really futuristic. They still limit how humans can use their bodies. He points out that we interact with the world using our hands, senses, and movement, yet technology reduces all that to swiping and tapping. He calls this “Pictures Under Glass,” where we just touch flat screens instead of truly feeling or manipulating things.

In the follow-up, Victor explains that he wasn’t trying to give solutions but to spark awareness. He doesn’t hate touchscreens, he just wants people to imagine beyond them. He warns that if we keep improving only on what already exists, like adding small features to tablets, we’ll miss the chance to design something that fully connects with our physical abilities. True interaction design, he says, should use our hands’ full potential, our sense of touch, and our natural way of exploring and learning through movement.

What stood out to me most is how Victor connects design to human capability. It made me realize how much of what we call “interaction” today actually leaves our bodies out. I immediately thought about how different it feels to press a real piano key or strike a drum versus tapping a flat screen. When I play piano, I’m not just using my fingers, I’m using my arms, posture, timing, and even breath. There’s weight, resistance, and texture. You feel the sound before you even hear it. That’s something a touchscreen can’t replicate. It’s the same in sports, when you shoot a basketball, your body memorizes the angle, force, and balance, your muscles learn the rhythm. Victor’s idea reminded me that our body is a part of thinking and learning, not separate from it.

I also really liked how he said that tools should amplify human ability, not narrow it. Imagine if technology worked like that: instead of us adapting to it, it adapted to how we move and feel. A “future” instrument, for example, could let you physically mold sound with your hands, or a learning app could respond to your gestures, rhythm, or even posture, not just clicks. Victor’s message isn’t just about design; it’s about reimagining creativity, learning, and expression in a more human way. It’s like he’s saying the real future of technology isn’t in shinier screens, it’s in rediscovering how alive and capable we already are.

Week 10 – The Diet Drum (Deema and Rawan)

Our Concept: 

Our project drew inspiration from last week’s readings on human-computer interaction, particularly the ways in which technology can respond to subtle human behaviors. We explored how interactive systems often mediate our engagement with the environment and even with ourselves, creating experiences that feel responsive, social, or even playful.

With this perspective, we asked ourselves: what if an instrument didn’t just make sound, but responded directly to human behavior? Instead of rewarding interaction, it could intervene. Instead of passive engagement, it could create a performative, almost social response.

From this idea, the Diet Drum emerged — a device that reacts whenever someone reaches for a snack. The system is both humorous and relatable, externalizing the human struggle of self-control. When a hand approaches the snack bowl, a servo-powered drumstick strikes, accompanied by a short melody from a passive buzzer. The result is a playful, judgmental interaction that transforms a familiar, internal tension into an amusing and performative experience.

How It Works

  • Photoresistor (LDR): Detects hand movements by monitoring changes in light. As a hand blocks the sensor, the reading drops.

  • Servo motor: Moves a drumstick to perform a percussive strike, physically reinforcing the “warning” aspect of the interaction.

  • Passive buzzer: Plays a short melody as a playful, auditory cue.

  • Arduino Uno: Continuously monitors the sensor and triggers both motion and sound.

When the LDR senses that a hand has blocked the light, the Arduino plays the melody on the buzzer and makes the servo hit the drum. This creates a clear, immediate connection between what a person does and how the system responds, reflecting ideas from our readings about how devices can react to gestures and sensor input.

Video Demonstration

assignment10

Challenges

Throughout development, we encountered several challenges that required both technical problem-solving and design adjustments:

  • System reliability: While the setup initially worked smoothly, leaving it for some time caused it to fail. Figuring out the problem took a while because we didn’t know whether the issue was in the wiring or the code. We used AI to help us narrow it down, and once we knew the problem was in the wiring, we had to partially rebuild and retune the system to restore functionality.

  • Mechanical stability: Keeping the drumstick steady during strikes was more difficult than anticipated. Any slight movement or misalignment affected the accuracy and consistency of the strikes, requiring several adjustments.

  • Audio timing: The melody initially played too long, delaying servo motion and disrupting the intended interaction. Shortening the audio ensured that the strike and sound remained synchronized, preserving the playful effect.

  • We also used AI to help with some code difficulties so the behavior would fit our original idea.

Code Highlights

One part of the code we’re especially proud of is how the sensor input is mapped to the servo’s movement.

float d = constrain(drop, MIN_DROP, MAX_DROP);          // how far the light level dropped (how close the hand is)
float k = (d - MIN_DROP) / float(MAX_DROP - MIN_DROP);  // normalize the drop to a 0..1 range
int hitAngle = SERVO_HIT_MIN + int((SERVO_HIT_MAX - SERVO_HIT_MIN) * k);  // closer hand -> bigger swing
unsigned long downMs = STRIKE_DOWN_MS_MAX - (unsigned long)((STRIKE_DOWN_MS_MAX - STRIKE_DOWN_MS_MIN) * k);  // closer hand -> faster strike

strikeOnce(hitAngle, downMs);

This makes the drumstick respond based on how close the hand is, so each action feels deliberate rather than just an on/off hit. It lets the system capture subtle gestures, supporting our goal of reflecting nuanced human behavior. AI helped us work out exactly when, and how hard, the strike should hit.

Future Improvements

Looking forward, we see several ways to expand and refine the Diet Drum:

  • Adaptive audio: Varying the melody or warning tone based on how close the hand is could enhance the playfulness and expressiveness.

  • Mechanical refinement: Improving the stability of the drumstick and optimizing servo speed could create smoother strikes and more consistent feedback.

  • Compact design: Reducing the size of the device for easier placement would make it more practical for everyday use.

  • Visual cues: Adding optional LEDs or visual signals could enhance the feedback, making the system even more engaging.

Github Link:

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/assignment10.ino

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/pitches.h

Reading Reflection – Week 9

Physical Computing’s Greatest Hits (and misses) 

Reading Tom Igoe’s article on “Physical Computing’s Greatest Hits (and Misses)” gave me a new appreciation for how simple ideas can be both educational and engaging in physical computing. I found it interesting that recurring themes, like theremin-like instruments, gloves, floor pads, and video mirrors, aren’t just repeated because they’re easy, but because they allow for creativity and experimentation. Even projects that seem simple, like LED displays or mechanical pixels, can produce surprising, beautiful, or playful results when combined with unique designs or gestures. I also liked how the article emphasized that physical computing focuses on human input and experience rather than just the machine’s output, which makes the interaction more meaningful and enjoyable.

I was especially inspired by projects that blend physical interaction with emotional or playful elements, such as remote hugs, interactive dolls, or meditation helpers. These projects show how technology can respond to people in subtle ways, creating experiences that feel alive or personal. I can see how many of these themes, like body-as-cursor or multitouch surfaces, could be adapted to new ideas, highlighting the importance of creativity over originality. Reading this made me think about how I might design my own physical computing projects, focusing on the human experience, interaction, and the joy of discovery rather than trying to invent something completely new from scratch.

Making Interactive Art: Set the Stage, Then Shut Up and Listen 

This reading helped me understand that interactive art is fundamentally different from traditional art. It’s not about presenting a fixed statement; it’s about creating a space or instrument where the audience can explore and participate. I found it interesting that the artist’s role is to suggest actions and provide context without telling people what to think or do. The comparison to directing actors made this clear: just as a director provides props and intentions, interactive artists set up the environment and let participants discover meaning through their actions. I realized that this approach makes the audience an essential co-creator of the artwork.

I was inspired by the idea that interactive art is a conversation between the creator and the audience. It made me think about how designing for discovery and participation can lead to more meaningful and engaging experiences. I liked how the reading emphasized listening to the audience, observing their reactions, and letting the work evolve based on their interaction. This approach feels very open and collaborative, encouraging creativity both from the artist and the participants. It made me consider how I could apply this perspective to projects or experiences I create, focusing on engagement and exploration rather than fixed outcomes.

 

Week 9 – Arduino: analog input & output Assignment

My Concept:

For this project, I wanted to control two LEDs using sensors in a creative way. One LED would respond to a potentiometer, changing its brightness based on how much I turned it. The other LED would respond to a pushbutton, turning on or off each time I pressed it. The idea was to combine analog input (potentiometer) and digital input (pushbutton) to control outputs in two different ways. I also wanted to make sure everything was wired properly on the breadboard with the Arduino so it all worked together. At first, I wanted to make a breathing light, where the LED gradually fades on and off like a person breathing, but it took me hours of trying and it still wasn’t working, which was a big challenge because I couldn’t figure it out. I tried a couple of times and it wasn’t giving me what I wanted, so I decided to stick to the simple foundation, because I realized I needed more time to get comfortable with using the potentiometer and switch.

Video Demonstration: 

assignment_week9.mov

Hand-Drawn Schematic: 

Code I’m most proud of:

// Pin Definitions 
const int potPin = A0;       // Potentiometer middle pin connected to A0
const int buttonPin = 2;     // Pushbutton connected to digital pin 2
const int greenLedPin = 8;   // Green LED connected to digital pin 8 
const int blueLedPin = 9;    // Blue LED connected to digital pin 9 

// Variables 
int potValue = 0;           // Stores analog reading from potentiometer
int ledBrightness = 0;      // Stores mapped PWM value for blue LED
int buttonState = HIGH;     // Current reading of the button
int lastButtonState = HIGH; // Previous reading of the button
bool greenLedOn = false;    // Green LED toggle state

void setup() {
  pinMode(buttonPin, INPUT_PULLUP); // Enable internal pull-up resistor
  pinMode(greenLedPin, OUTPUT);
  pinMode(blueLedPin, OUTPUT);
  
  Serial.begin(9600); 
}

void loop() {
  //Read potentiometer and set blue LED brightness
  potValue = analogRead(potPin);                   // Read 0-1023
  ledBrightness = map(potValue, 0, 1023, 0, 255); // Map to PWM 0-255
  analogWrite(blueLedPin, ledBrightness);         // Set blue LED brightness

  // Read pushbutton and toggle green LED 
  buttonState = digitalRead(buttonPin);

  // Detect button press (LOW) that just changed from HIGH
  if (buttonState == LOW && lastButtonState == HIGH) {
    greenLedOn = !greenLedOn;                       // Toggle green LED state
    digitalWrite(greenLedPin, greenLedOn ? HIGH : LOW);
    delay(50);                                     // Debounce delay
  }

  lastButtonState = buttonState;
}

The code I’m most proud of is basically the overall logic where the green LED toggles on and off with the pushbutton while the blue LED changes brightness with the potentiometer. It was tricky at first because the button kept flickering, but I solved it with a debounce, and I asked ChatGPT to help me figure out what I had wrong and why it might have been happening.

I’m proud of this code because it solves the switch problem elegantly; the green LED reliably turns on and off without flickering, while the blue LED smoothly responds to the potentiometer. It made me feel like I had full control over both analog and digital inputs in the same circuit.

Reflection and Improvements:
This project taught me a lot about wiring and sensor behavior. One big challenge was the pushbutton: at first, it would flicker, or the LED would turn on or off unexpectedly. Also, when I tried pressing the button with my finger, the breadboard wires kept moving because there were so many of them. Eventually, I realized using a pen to press the button made it much more stable and consistent.

For future improvements, I’d like to:

  • Use more organized jumper wires to make the breadboard less messy
  • Maybe add more sensors or another LED with creative behaviors
  • Explore smoother debouncing techniques in software to make the switch even more responsive

Even with the challenges, seeing both LEDs working together exactly how I wanted was really satisfying. It showed me how analog and digital inputs can interact with outputs, and how careful wiring and coding logic are equally important.

Github Link: 

https://github.com/deemaalzoubi/Intro-to-IM/blob/c9dc75423cd71b8ea21d50a5756c4d5e7f420ba5/assignment_week9.ino

Week 8 Reading – Her Code Got Humans on the Moon

While reading “Her Code Got Humans on the Moon“, I could learn and see how Margaret Hamilton is such an inspiring person. She was a computer scientist and software engineer who played a huge role in NASA’s Apollo missions back in the 1960s. What’s so amazing about her is that she didn’t just write code; she created entire systems for software reliability, which was a completely new idea back then. For the Apollo missions, she developed error-detection routines, priority scheduling, and backup systems so that the spacecraft could continue operating even if something went wrong. Basically, she designed software that could think ahead and handle mistakes, which was crucial when human lives were literally on the line. Her work laid the foundation for modern software engineering practices. What’s also incredible is how determined she was: she even brought her daughter to work, proving that her focus and dedication didn’t let anything stop her from solving these insanely complex problems.

What really fascinates me is how her technical contributions changed the way people think about software. Before her, software was often seen as just “a bunch of instructions,” but Margaret Hamilton treated it as an engineered system that could prevent disaster and adapt to unpredictable situations. This mindset completely shifted the tech world; today, every piece of software we use, from smartphones to cars, benefits from principles she pioneered. Learning about her makes me realize that software isn’t just about writing code; it’s about designing systems that are safe, reliable, and resilient. Her work shows that combining technical skill with persistence and creativity can literally launch humans to the moon and reshape the way we build technology forever.