User Testing – Foot Loose

User Testing Video:

IMG_1108

1. Have people try your project without giving them any prompts/instructions and see how they use it.

I let my friend step onto the dance pad and try Foot Loose completely blind. No instructions, no hint about the pads, nothing. I just pressed Start and watched. Right away, she understood the basic idea: arrows fall down their lanes, you step on a matching pad. She wasn’t lost or confused about the “goal” of the game, which was a huge win.

But the moment she tried to play seriously, the weaknesses started showing. She hesitated, second-guessed pads, and kept glancing down because she wasn’t fully sure what counted as a “hit.” The center pad especially caused chaos; it triggered randomly and confused her more than it helped.

2. Are they able to figure it out? Where do they get confused and why? Do they understand the mapping between the controls and the experience?

What she figured out:

  • Arrows = instructions.
  • Colors = lanes.
  • Pads = controls.

Where she got confused (and why):

  • Instructions were too vague.
    She didn’t know exactly when to step. She asked me later:
    “Do I hit when it touches the line? Or when it’s between the two lines? Or when it’s dead center?”
    That hesitation slowed her down.
  • The center pad was unclear.
    It kept triggering even when she didn’t mean to step on it, and she didn’t know what it was for. The “●” symbol made sense in theory, but in practice it caused more accidental hits than intentional ones.
  • The game felt too fast.
    She understood the mapping, but the speed didn’t give her time to react. On easy mode, she still felt rushed.

So yes, she understood the mapping, but the timing window + speed made the experience harder to grasp on her first try.

3. What parts of the experience are working well?

  • The directional pads (L, D, U, R) worked great.
    They triggered reliably and matched the arrows perfectly.
  • Visual clarity:
    The colored lanes + falling arrows made sense instantly. She said, “Oh, okay, I step where it matches.”
  • The core mechanic:
    Hit arrow → get GOOD / MISS → see score go up.
    She understood the flow without me narrating anything.
  • Pacing of feedback:
    The “GOOD” and “MISS” flashes were readable and rewarding.

In short, the skeleton of the game works extremely well. The player can understand the entire concept just by watching it for 5 seconds.

4. What areas could be improved?

  • Instruction clarity:
    I need to explicitly say:
    “Step when the arrow is inside the dashed zone.”
  • Remove the center pad from gameplay.
    It’s physically useful as a standing spot, but as a sensor it creates noise, misfires, and confusion. It’s not worth the chaos.
  • Slow the game down on Easy mode.
    Beginners need space to understand the rhythm before it speeds up.
  • Broaden the hit zone.
    The current timing window is too strict. Expanding the dashed lines will make the game fairer and easier to enjoy.

All of these changes directly help first-time players “get it” without needing me to explain anything.

5. What parts did you feel the need to explain? How could you make these areas more clear?

What I felt the urge to explain while she played:

  • “Step ONLY in the dashed zone.”
  • “Ignore the center pad; it’s just a place to stand.”
  • “It’s fast, don’t worry, the game is supposed to slow down.”
  • “You’ll get a GOOD only if you step in the exact timing window.”

Basically, anything I felt the need to explain was a UI failure, not a player failure.

How I will make it clearer:

  • Rewrite instructions to be exact, not general.
  • Remove ●/center input entirely.
  • Increase the hit zone size.
  • Slow the spawn rate on easy mode.
  • Maybe add a small mini-tutorial or animated demo before the game starts (optional).

Final Takeaway

The user testing confirmed that the concept is strong and intuitive, but the details such as timing clarity, center pad behavior, and pacing need refinement. Once those are fixed, the game will be fully understandable on its own, which is the whole point of this assignment.

Final Project Proposal & Progress Update

Concept

For my final project, I am creating Foot Loose, a physically interactive rhythm game where players step on large acrylic pads to hit falling arrows displayed on screen. The game draws inspiration from Dance Dance Revolution but is redesigned for the IM Fair using Arduino-based sensing, p5.js animation, and a layered LED feedback system.

The interaction is simple and energetic:

  • If the player steps at the correct moment, the system flashes a green LED beside the screen.

  • If the player’s timing is off, a red LED flashes instead.

This creates clear, readable, immediate feedback that works even in a crowded fair environment.

Each stepping pad will also have its own small, colorful LED taped beside it, lighting up the moment the player presses down. This pad-level LED provides instant, physical confirmation of the step before p5 even evaluates the timing. Together, the pad LEDs + the red/green judgment LEDs create a layered, intuitive communication system:

  • Pad LEDs → confirm the user physically stepped

  • Screen & red/green LEDs → confirm whether the timing was good or missed

The design makes the interaction active, performative, and easy for anyone to understand. Players use their whole body—not just their hands—to interact with the system.

System Overview

Foot Loose is built around a tight, responsive interaction loop:

Arduino LISTENS

Reads piezo sensors under the pads and detects steps.

p5.js THINKS

Controls the falling arrows, checks timing, displays visuals, and decides GOOD or MISS.

Arduino SPEAKS BACK

Flashes LEDs based on p5’s decision, creating physical feedback.

This two-way communication keeps players grounded in both the physical dance pads and the on-screen rhythm.

 

Arduino Design (Inputs, Outputs, Behavior)

Inputs: Piezo Sensors Under Each Pad

Each stepping pad will contain a piezo sensor mounted underneath the acrylic.
To stabilize the signal, each piezo is paired with a 1 MΩ resistor.

This week, I successfully prototyped the sensing system using:

  • A single piezo

  • Folded tissue padding

  • A hardcover book as a temporary “stepping pad”

Idle readings were stable, and step readings spiked clearly.
Arduino sends a simple message for every detected step:

“L”

Later on, each pad will have its own letter (e.g., L, R, U, D, C).
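To make the detection concrete, here is a minimal sketch of the logic I'm working toward; the threshold and cooldown are placeholder values I'll tune on the real pads, not final numbers:

const int PIEZO_PIN = A0;            // piezo (with its 1 MΩ resistor) on A0
const int THRESHOLD = 100;           // spike level that counts as a step (to be tuned)
const unsigned long COOLDOWN = 200;  // ms to ignore after a hit, so one step = one message

unsigned long lastStep = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(PIEZO_PIN);
  if (reading > THRESHOLD && millis() - lastStep > COOLDOWN) {
    Serial.println("L");             // this pad's letter; other pads would send R, U, D, C
    lastStep = millis();
  }
}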

Outputs: LED Feedback (Two Types)

1. Pad-Level LEDs (instant physical confirmation)

Each pad has a small LED that lights up the moment the user steps on it.
These LEDs make the pads feel “alive” and give instant tactile feedback.

2. Main Feedback LEDs (green = GOOD, red = MISS)

Two LEDs sit next to the screen:

  • Green = correct timing

  • Red = missed timing

Arduino listens for commands from p5:

  • “G” → flash green

  • “M” → flash red

This separates physical detection from timing judgment, creating a polished interaction.
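A sketch of how the Arduino side might parse those commands (the pin numbers are placeholders, not my final wiring):

const int GREEN_PIN = 5;   // GOOD LED
const int RED_PIN = 6;     // MISS LED

void setup() {
  Serial.begin(9600);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(RED_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'G') flash(GREEN_PIN);       // good timing → green
    else if (cmd == 'M') flash(RED_PIN);    // missed timing → red
  }
}

void flash(int pin) {
  digitalWrite(pin, HIGH);
  delay(100);              // a short blocking flash is fine for this simple case
  digitalWrite(pin, LOW);
}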

p5.js Design (Visuals, Logic, Timing)

p5 handles:

  • Falling arrows

  • Hit zone

  • Animation timing

  • Displaying GOOD/MISS

  • On-screen feedback color changes

  • Sending “G” or “M” back to Arduino

When p5 receives “L”, it records the timestamp.
When an arrow reaches the hit zone, p5 checks if the hit was within the timing window.
The logic is simple and clean:

  • Inside timing window → GOOD → send “G”

  • Any other case → MISS → send “M”

The combination of movement, visuals, and LED feedback creates the feel of a real arcade rhythm game.
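Here is a rough, position-based sketch of that check; the zone position, window size, and helpers like nearestArrowInLane() and showJudgment() are placeholders for the real implementation:

const HIT_ZONE_Y = 500;   // y of the dashed zone (placeholder)
const HIT_WINDOW = 40;    // pixels of tolerance above/below it (placeholder)

function handleStep(letter) {                // called when "L", "R", etc. arrives
  let arrow = nearestArrowInLane(letter);    // placeholder helper: next arrow in that lane
  if (arrow && abs(arrow.y - HIT_ZONE_Y) < HIT_WINDOW) {
    showJudgment("GOOD");                    // placeholder display helper
    writeSerial("G");                        // tell Arduino to flash green
  } else {
    showJudgment("MISS");
    writeSerial("M");                        // tell Arduino to flash red
  }
}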

System Interaction Flow

1. Player steps

Piezo spike → Arduino detects → sends “L” to p5

2. p5 evaluates timing

Compares arrow position with the timestamp

3. p5 sends result back to Arduino

GOOD → “G”
MISS → “M”

4. Arduino flashes LEDs

Green LED = good timing
Red LED = miss

5. Pad LEDs

Always flash instantly on physical step, independent of timing

 

Physical Design (Stepping Pads)

There will be five pads:

  • Left

  • Right

  • Up

  • Down

  • Center (jump)

Each pad will:

  • Be made from transparent acrylic

  • Sit above a piezo sensor

  • Have a small LED attached for instant confirmation

  • Be outlined with LED strips for visual flair

  • Use soft padding underneath for stability and shock absorption

Piezo Sensor Test

I tested the piezo sensor by creating a demo foot pad to see whether it would pick up the pressure of someone stepping on it.

IMG_1073

Next Steps

  • Build all five acrylic pads

  • Add LED strips for styling

  • Integrate WebSerial

  • Create multiple arrow lanes

  • Add music

  • Final polish for IM Fair

 

Closing Reflection

This week I finalized the concept and successfully tested the core sensing system. I now have reliable step detection, a complete feedback design, and a two-way communication structure between Arduino and p5.

Foot Loose combines physical movement, sensory feedback, and visual interaction in a way that is fun, readable, and exciting for players. I’m confident in the direction and excited to build the full installation next.

Preliminary Concepts for Final Project

Concept (1)

For my final project, I am creating Doom Drive, an interactive driving game where the player navigates a chaotic road using a physical steering wheel while the environment itself actively sabotages the world. The idea behind Doom Drive is to turn sensory interruption into a playable mechanic. Instead of relying solely on player input, the game listens to real-time changes in the environment, particularly light, and transforms those changes into unpredictable events on the road.

The goal is to explore how external stimuli can impact focus, control, and experience. By letting the IM Fair audience unintentionally (or intentionally) alter the driving conditions, the installation becomes social, performative, and slightly overwhelming in a comedic way. Doom Drive exaggerates the feeling of “everything is happening all at once” and turns overstimulation into an entertaining challenge.

Arduino Inputs (Listening)

The physical interaction relies on two sensors connected to the Arduino:

Potentiometer (Mounted Inside the Steering Wheel)
Controls the player’s steering in real time.
Turning the wheel left or right rotates the potentiometer.
Arduino sends smooth analog values to p5.js to control car movement.

Light Sensor (Photoresistor)
Reads ambient light levels in the environment.
Flashes, shadows, or sudden brightness changes trigger in-game chaos.
Different light thresholds activate different game modes, such as:
  • Glare Mode (screen washout and reduced visibility)

  • Night Mode (darkness and narrow road)

  • Tunnel Mode (sharp directional constraints)

  • Panic Mode (screen shake and heavy obstacles)

Arduino continuously sends both sensor readings to the computer through serial communication. Together, these inputs give the game a mixture of user control and environmental unpredictability.

p5.js Output (Thinking and Speaking)

The p5.js sketch interprets the incoming sensor data and constructs the game world dynamically.

Steering values from the potentiometer directly move the player’s car left and right.
Light sensor fluctuations trigger visual effects, obstacle frequency, color palette shifts, and difficulty spikes.
p5.js uses thresholds and cooldowns so the game remains challenging but still playable in a busy environment like the IM Fair.
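A hypothetical version of that threshold-plus-cooldown logic (the cutoff values and mode names here are placeholders I would tune in the actual room):

let mode = "normal";
let lastModeChange = 0;
const MODE_COOLDOWN = 3000;   // hold each mode a few seconds so the game stays playable

function updateMode(light) {              // light: 0–1023 from the photoresistor
  if (millis() - lastModeChange < MODE_COOLDOWN) return;
  let next;
  if (light > 900)      next = "glare";   // sudden brightness → Glare Mode
  else if (light < 100) next = "night";   // covered sensor → Night Mode
  else if (light < 300) next = "tunnel";  // dim room → Tunnel Mode
  else                  next = "normal";  // Panic Mode could key off rapid changes instead
  if (next !== mode) {
    mode = next;
    lastModeChange = millis();
  }
}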

The screen speaks back to the player through animation, color changes, distortion, and motion. For example, covering the light sensor might plunge the game into sudden darkness, while a bright flash could create a blinding glare effect.

What I Want to Explore

I chose this project because it blends physical and digital interaction in a way that feels both intuitive and chaotic. Driving is a familiar, easy action for players to understand, but the environment’s influence transforms it into something playful and unpredictable. By allowing the room, its lighting, its people, and its energy to directly affect the game world, Doom Drive becomes a shared experience rather than an isolated one.

This project also lets me explore sensory overload as a mechanic rather than a narrative. Instead of explaining it, the game shows it: the road changes, the screen shifts, and the player has to adapt to external forces beyond their control.

 

Concept (2)

For my final project, I am creating an interactive dance game that combines body tracking with physical floor pads. The player follows on-screen prompts and uses both their feet and upper body to complete dance moves. The game uses p5.js with a pose detection library (similar to what we saw in class) to track body gestures through the webcam, while Arduino handles input from floor pads and other physical controls.

The goal of the project is to explore how digital choreography can be shaped by both full-body movement and tactile interaction. Instead of limiting the experience to either camera tracking or button pressing, this project blends both. The player has to move their body in space and also step on specific pads at the right time, which makes the game feel more physical and performance-like. I want the experience to be playful and encouraging, with visual feedback such as “Good,” “Great,” or “Miss” appearing on the screen depending on timing and accuracy.

Arduino Inputs (Listening)

The Arduino focuses on physical interaction through the feet and optional controls:

Floor Pads

I will create two or three simple dance pads on the floor, each acting as a digital input to the Arduino. These can be built using large buttons or DIY pressure pads. Each pad corresponds to a direction or beat:

  • Left pad (for left-step moves)

  • Right pad (for right-step moves)

  • Optional center pad (for jumps or special moves)

When the player steps on a pad, the Arduino registers a press and sends a message to p5.js over serial communication indicating which pad was activated.

Additional Controls (not sure yet)

  • Potentiometer: used as a physical knob to select difficulty level or song choice before the game starts.

  • Light Sensor: can be used to trigger “disco mode” or visual effects when the environment changes, for example when someone shines a light or covers the sensor.

Arduino continuously sends the state of the pads (and any extra sensors) to the computer. This gives the system a clear sense of when and where the player is stepping.

p5.js Output (Thinking and Speaking)

The p5.js sketch is responsible for three main elements:

  1. Dance Prompts and Timing
    p5 shows prompts on the screen, such as arrows, circles, or icons that represent foot placement and upper body moves. These prompts appear in time with a rhythm or song. For example, a left arrow might indicate “step on left pad,” while an icon near the top of the screen might indicate “raise your hands.”

  2. Body Tracking
    Using a body tracking library with p5 (such as PoseNet), the program detects key points on the player’s body, like hands, arms, and head. It checks whether the player’s pose matches the required gesture at the correct moment. This allows the game to recognize upper body moves, such as raising both hands, leaning to one side, or performing a simple arm gesture (a small sketch of this check appears below).

  3. Feedback and Scoring
    When Arduino sends a pad press and p5 confirms that the player also matched the required pose (or at least the timing), the game displays feedback like “Good,” “Great,” or “Miss.” It also keeps track of score or combo streaks. Visual feedback can include color changes, particle bursts, or short animations to make the experience feel rewarding and energetic.

The screen functions as the “instructor” and “judge,” guiding the player and showing how well they are doing.
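To illustrate the body-tracking check from point 2, here is a hedged sketch using ml5's PoseNet wrapper; the "hands raised" rule is just one example gesture, not the final move set:

let video, poseNet, pose;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video);             // load the model (ml5 library)
  poseNet.on('pose', (results) => {         // fires on every detection
    if (results.length > 0) pose = results[0].pose;
  });
}

function handsRaised() {
  if (!pose) return false;
  // "raise your hands" = both wrists higher on screen than the nose
  return pose.leftWrist.y < pose.nose.y && pose.rightWrist.y < pose.nose.y;
}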

What I Want to Explore

I chose this project because it brings together several aspects of interaction that I find interesting: full-body movement, tactile input, and real-time visual feedback. Dance games are usually either camera-based or pad-based, but rarely both at once. By combining floor pads (through Arduino) with body tracking (through p5.js), I want to design a system that feels more embodied and theatrical.

I am also interested in how feedback can encourage people rather than intimidate them. Instead of focusing on strict accuracy, I want the game to celebrate participation and motion, with clear but playful responses like short messages, bursts of color, and simple scores. At the IM Fair, the goal is for people to feel comfortable stepping up, moving around, and laughing at their attempts, while still experiencing a technically structured interactive system that listens, thinks, and responds across both hardware and software.

Week 11 – Reading Reflection

This reading made me rethink how much design pretends to care about “everyone” while actually designing for some imaginary default person. The whole debate about hiding versus showing assistive devices hit me the most. The hearing aid example frustrated me. It reminded me of how often people, including me at times, feel pressured to tone themselves down just to blend in. Seeing designers hide assistive devices made that pressure feel even more obvious. If a device supports someone’s life, why shouldn’t it be allowed to exist proudly?

I liked how the reading returned to the idea that objects carry identity. Even in my IM projects, I can feel that tension between making something perfectly sleek or letting it look like something I actually touched. My work always ends up somewhere in the middle: functional, but still a little sentimental, a little messy, a little me.

The idea of bringing artists and fashion designers into accessibility design made complete sense. It made assistive tech feel less like “equipment” and more like something that can match someone’s personality. A prosthetic can be a tool, but it can also be a statement. A hearing aid can be medical, but it can also be stylish. That’s the kind of design I actually care about: things that work, but also let people feel like themselves.

Week 11 – 3 Exercises

The three tasks we worked on were practice in making Arduino and p5.js work together.

Group: Deema Al Zoubi and Rawan Al Ali 

Exercise 1 :

let connectButton;
let port;
let reader;
let sensorValue = 0;
let keepReading = false;

function setup() {
  createCanvas(600, 400);

  // Create a connect button 
  connectButton = createButton('Connect to Arduino');
  connectButton.position(10, 10);
  connectButton.mousePressed(connectSerial);

  textAlign(CENTER, CENTER);
  textSize(14);
}

async function connectSerial() {
  // If already connected, close first
  if (port && port.readable) {
    await closeSerial();
    connectButton.html('Connect to Arduino');
    return;
  }

  try {
  
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });

    console.log('Port opened', port);
    connectButton.html('Disconnect');

    const decoder = new TextDecoderStream();
    // Pipe the readable stream from the port to the decoder
    port.readable.pipeTo(decoder.writable).catch(() => {}); // this pipe is expected to abort when we cancel the reader
    reader = decoder.readable.getReader();

    keepReading = true;
    readLoop(); // start read loop
  } catch (err) {
    console.error('Error opening serial port:', err);
    alert('Could not open serial port. Make sure your device is connected and try again.');
  }
}

async function closeSerial() {
  keepReading = false;
  try {
    if (reader) {
      await reader.cancel();
      reader.releaseLock();
      reader = null;
    }
    if (port && port.close) {
      await port.close();
      console.log('Port closed');
    }
    port = null;
  } catch (err) {
    console.warn('Error closing port:', err);
  }
}

async function readLoop() {
  let partial = '';
  try {
    while (keepReading && reader) {
      const { value, done } = await reader.read();
      if (done) {
        console.log('Reader closed');
        break;
      }
      if (!value) continue;

      // split by newline
      partial += value;
      let lines = partial.split(/\r?\n/);
      // Keep the last partial line in buffer
      partial = lines.pop();

      for (let line of lines) {
        line = line.trim();
        if (line === '') continue;        
        const num = parseInt(line, 10);
        if (!Number.isNaN(num)) {
          // clamp to expected range in case of weird data
          sensorValue = Math.max(0, Math.min(1023, num));
          console.log('sensorValue:', sensorValue);
        } else {
          console.log('non-numeric line ignored:', line);
        }
      }
    }
  } catch (err) {
    console.error('Read loop error:', err);
  }
}

function draw() {
  background(240);

  let x = map(sensorValue, 0, 1023, 25, width - 25);
  fill(100, 150, 240);
  ellipse(x, height / 2, 50, 50);
}
window.addEventListener('beforeunload', async (e) => {
  if (port && port.readable) {
    await closeSerial();
  }
});

We used one sensor (a potentiometer) on the Arduino to control the horizontal position of an ellipse in p5.js. The ellipse stayed vertically centered on the screen, and all movement was controlled by the Arduino sensor; p5 didn’t send any commands back to Arduino.
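For reference, the Arduino side of this exercise only needs a few lines; a minimal version, assuming the potentiometer wiper is on A0:

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(analogRead(A0)); // 0–1023, one value per line, as the p5 read loop expects
  delay(20);                      // ~50 updates per second is plenty
}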

Video 1: IMG_1012

Schematic 1:

 

Exercise 2:

let connectButton;
let port;
let writer;
let slider;

let targetBrightness = 0;
let currentBrightness = 0;

function setup() {
  createCanvas(400, 200);

  connectButton = createButton("Connect to Arduino");
  connectButton.position(10, 10);
  connectButton.mousePressed(connectSerial);

  slider = createSlider(0, 255, 0);
  slider.position(10, 60);
  slider.style('width', '300px');

  textSize(16);
}

async function connectSerial() {
  try {
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });
    writer = port.writable.getWriter();
    connectButton.html("Connected");
  } catch (err) {
    console.error("Connection error:", err);
  }
}

function draw() {
  background(230);

  // Slider sets the target brightness
  targetBrightness = slider.value();

  // Gradually move toward target (both up and down)
  currentBrightness = lerp(currentBrightness, targetBrightness, 0.15);

  // Snap to 0/255 when very close so it truly turns off/on
  if (abs(currentBrightness - targetBrightness) < 1) {
    currentBrightness = targetBrightness;
  }

  fill(0);
  text("LED Brightness: " + int(currentBrightness), 10, 110);

  // Send brightness
  if (writer) {
    writer.write(new Uint8Array([int(currentBrightness)]));
  }
}

We created a slider in p5.js to control the brightness of an LED connected to the Arduino. Moving the slider to the right gradually increased the LED brightness, and moving it back to the left gradually turned it off.
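The matching Arduino side can stay tiny; a minimal sketch, assuming the LED is on PWM pin 9:

const int LED_PIN = 9;   // any PWM pin

void setup() {
  Serial.begin(9600);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    int b = Serial.read();    // p5 sends the brightness as one raw byte (0–255)
    analogWrite(LED_PIN, b);  // use it directly as the PWM duty cycle
  }
}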

Video 2: https://intro.nyuadim.com/wp-content/uploads/2025/11/IMG_1030.mov

Schematic 2:

 

Exercise 3:

let velocity;
let gravity;
let position;
let acceleration;
let drag = 0.99;
let mass = 50;

let brightnessValue = 512; // potentiometer value (0–1023)
let ballDropped = false;
let ledOn = false;

let port, writer, reader;
let serialActive = false;
let serialBuffer = '';

function setup() {
  createCanvas(640, 360);
  textSize(18);

  position = createVector(width / 2, 0);
  velocity = createVector(0, 0);
  acceleration = createVector(0, 0);
  gravity = 0.05 * mass;   // scalar gravity per frame (dropBall() sets the same value)

  // Connect button
  let btn = createButton("Connect to Arduino");
  btn.position(10, 10);
  btn.mousePressed(connectAndStart);
}

function draw() {
  background(255);

  // Show instructions before dropping ball
  fill(0);
  if (!ballDropped) {
    text("Press B to drop the ball", 20, 30);
    return; // stop here until ball is dropped
  }

  // Show serial status
  if (serialActive) {
    text("Connected", 20, 30);
    text(`Potentiometer: ${brightnessValue}`, 20, 50);
  } else {
    text("Serial Port Not Connected", 20, 30);
  }

  // Apply gravity
  velocity.y += gravity;
  velocity.y *= drag;

  // Horizontal control from potentiometer
  let windX = map(brightnessValue, 0, 1023, -2, 2);
  velocity.x = windX;

  position.add(velocity);

  // Bounce on floor
  if (position.y >= height - mass / 2) {
    position.y = height - mass / 2;
    velocity.y *= -0.9;

    // Flash LED
    if (serialActive && !ledOn) {
      writeSerial("F");
      ledOn = true;
    }
  } else if (ledOn) {
    ledOn = false;
  }

  // Bounce on top
  if (position.y - mass / 2 < 0) {
    position.y = mass / 2;
    velocity.y *= -0.9;
  }

  // Keep ball inside canvas horizontally
  position.x = constrain(position.x, mass / 2, width - mass / 2);

  // Draw ball
  fill(100, 150, 240);
  ellipse(position.x, position.y, mass, mass);
}

function applyForce(force) {
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed() {
  if (key == "B" || key == "b") dropBall();
}

function dropBall() {
  position.set(width / 2, 0);
  velocity.set(0, 0);
  mass = 50;
  gravity = 0.05 * mass;
  ballDropped = true;
}

// Serial functions
async function connectAndStart() {
  try {
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });

    writer = port.writable.getWriter();

    const decoder = new TextDecoderStream();
    port.readable.pipeTo(decoder.writable);
    reader = decoder.readable.getReader();

    serialActive = true;

    // Start reading
    readLoop();
  } catch (err) {
    console.error("Serial error:", err);
  }
}

async function readLoop() {
  while (serialActive && reader) {
    const { value, done } = await reader.read();
    if (done) break;
    if (value) parseSerial(value);
  }
}

function writeSerial(msg) {
  if (writer) writer.write(new TextEncoder().encode(msg + "\n"));
}

function parseSerial(data) {
  serialBuffer += data;
  let lines = serialBuffer.split('\n');
  serialBuffer = lines.pop();

  for (let line of lines) {
    let val = parseInt(line.trim());
    if (!isNaN(val)) brightnessValue = val;
  }
}

We made the ball bounce continuously up and down in p5.js, and connected an LED on the Arduino that lights up briefly every time the ball hits the floor. A potentiometer on the Arduino was used to control the horizontal movement of the ball: turning the knob to higher values moves the ball to the right, and lower values move it to the left. The potentiometer’s outer pins were connected to 5V and GND, and the middle pin to A0. Using Web Serial, we read the potentiometer values in p5 and mapped them to the horizontal position of the ball while it keeps bouncing.
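A sketch of the Arduino side of this exchange, with assumed pins (pot wiper on A0, LED on 9): it streams the potentiometer out and flashes the LED whenever p5 sends "F":

const int POT_PIN = A0;
const int LED_PIN = 9;

void setup() {
  Serial.begin(9600);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  Serial.println(analogRead(POT_PIN));  // newline-separated values for p5's parseSerial()
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == 'F') {                     // ball hit the floor
      digitalWrite(LED_PIN, HIGH);
      delay(80);
      digitalWrite(LED_PIN, LOW);
    }
  }
  delay(20);
}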

Video 3: IMG_1030

Schematic 3:

Reflection: 

These exercises helped us understand how p5.js and Arduino can work together. We saw that Arduino can send real-world sensor data, like potentiometer values, to p5 to control visuals or simulations, and that p5 can also send commands back to Arduino, like turning an LED on or off. This gave us a clear idea of how much influence p5 can have on Arduino outputs, and how Arduino inputs can drive digital interactions in p5. Practicing this will be really useful for our final project, as it shows how to combine physical sensors, real-time controls, and visual feedback in a simple interactive system.

 

Week 10 – The Diet Drum (Deema and Rawan)

 

Our Concept: 

Our project drew inspiration from last week’s readings on human-computer interaction, particularly the ways in which technology can respond to subtle human behaviors. We explored how interactive systems often mediate our engagement with the environment and even with ourselves, creating experiences that feel responsive, social, or even playful.

With this perspective, we asked ourselves: what if an instrument didn’t just make sound, but responded directly to human behavior? Instead of rewarding interaction, it could intervene. Instead of passive engagement, it could create a performative, almost social response.

From this idea, the Diet Drum emerged — a device that reacts whenever someone reaches for a snack. The system is both humorous and relatable, externalizing the human struggle of self-control. When a hand approaches the snack bowl, a servo-powered drumstick strikes, accompanied by a short melody from a passive buzzer. The result is a playful, judgmental interaction that transforms a familiar, internal tension into an amusing and performative experience.

How It Works

  • Photoresistor (LDR): Detects hand movements by monitoring changes in light. As a hand blocks the sensor, the reading drops.
  • Servo motor: Moves a drumstick to perform a percussive strike, physically reinforcing the “warning” aspect of the interaction.
  • Passive buzzer: Plays a short melody as a playful, auditory cue.

Arduino Uno: Continuously monitors the sensor and triggers both motion and sound.

When the LDR senses that a hand has blocked the light, the Arduino plays the melody on the buzzer and makes the servo hit the drum. This creates a clear, immediate connection between what a person does and how the system responds, reflecting ideas from our readings about how devices can react to gestures and sensor input.
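A simplified, hypothetical version of that trigger loop (the pin, angles, and drop threshold are stand-ins, and the melody is omitted; the full code is linked below):

#include <Servo.h>

Servo stick;
const int LDR_PIN = A0;
int baseline;                  // idle light level, sampled at startup

void setup() {
  stick.attach(9);             // assumed servo pin
  baseline = analogRead(LDR_PIN);
}

void loop() {
  int drop = baseline - analogRead(LDR_PIN); // how much a hand darkens the sensor
  if (drop > 150) {            // hand is over the bowl (threshold to be tuned)
    stick.write(120);          // strike
    delay(150);
    stick.write(60);           // return
    delay(300);
  }
}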

Video Demonstration

assignment10

Challenges

Throughout development, we encountered several challenges that required both technical problem-solving and design refinement:

  • System reliability: While the setup initially worked smoothly, leaving it for some time caused it to fail. Figuring out the problem took us some time because we didn’t know what went wrong and whether it was the setup or the code. So we had to partially rebuild and retune the system to restore functionality.
  • Mechanical stability: Keeping the drumstick steady during strikes was more difficult than anticipated. Any slight movement or misalignment affected the accuracy and consistency of the strikes, requiring several adjustments.
  • Audio timing: The melody initially played too long, delaying servo motion and disrupting the intended interaction. Shortening the audio ensured that the strike and sound remained synchronized, preserving the playful effect.
  • We used AI to help with some code difficulties, so the result could fit our original idea.

Code Highlights

One part of the code we’re especially proud of is how the sensor input is mapped to the servo’s movement.

float d = constrain(drop, MIN_DROP, MAX_DROP);          // clamp the light drop to the useful range
float k = (d - MIN_DROP) / float(MAX_DROP - MIN_DROP);  // normalize to 0–1
int hitAngle = SERVO_HIT_MIN + int((SERVO_HIT_MAX - SERVO_HIT_MIN) * k);  // bigger drop → wider swing
unsigned long downMs = STRIKE_DOWN_MS_MAX - (unsigned long)((STRIKE_DOWN_MS_MAX - STRIKE_DOWN_MS_MIN) * k);  // bigger drop → faster strike

strikeOnce(hitAngle, downMs);

This makes the drumstick respond based on how close the hand is, so each action feels deliberate rather than just an on/off hit. It lets the system capture subtle gestures, supporting our goal of reflecting nuanced human behavior.

Future Improvements

Looking forward, we see several ways to expand and refine the Diet Drum:

  • Adaptive audio: Varying the melody or warning tone based on how close the hand is could enhance the playfulness and expressiveness.
  • Mechanical refinement: Improving the stability of the drumstick and optimizing servo speed could create smoother strikes and more consistent feedback.
  • Compact design: Reducing the size of the device for easier placement would make it more practical for everyday use.
  • Visual cues: Adding optional LEDs or visual signals could enhance the feedback, making the system even more engaging.

Github Link:

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/assignment10.ino

https://github.com/deemaalzoubi/Intro-to-IM/blob/b321f2a0c4ebf566082f1ca0e0067e33c098537f/pitches.h

Week 10 – Reading Reflection

Bret Victor’s rant made me rethink what we even mean when we call something “the future.” He argues that touchscreens, gesture controls, and all these “advanced” interfaces are actually making us less connected to our own abilities. Our hands are one of the deepest ways we understand the world. They know tension, pressure, texture. They think with us. But we’ve decided progress means tapping around on cold glass. When I read that, the first thing I thought of was LEGO. There is this unspoken language when you build: the way your fingers already know which brick fits, the tiny resistance before a perfect click. That sound. That feeling. It’s not just play; it is intelligence happening through the body. No screen has ever replicated that.

I’ve tried the digital LEGO builders before, and they always feel wrong. You can assemble something on the screen, sure, but there is no weight, no friction, no small ritual of digging through pieces and recognizing one by touch alone. Same with crocheting. The yarn runs differently through your fingers depending on tension, mood, the hook, your posture. You feel progress. You feel mistakes. Your hands correct before your mind catches up. Victor’s point clicked for me here: creativity is not just in the mind. It is in the wrists, fingertips, joints, and muscle memory. When interfaces ignore the body, they are not futuristic. They are incomplete.

The responses page made it clear he is not saying we need to go backwards. He is saying we should refuse a future that flattens our senses. There are richer, more human possibilities if we let our full selves participate in the interaction. For me, the future I want is textured, clickable, tuggable, threaded, snapped together. A future that feels like LEGO: discovery through touch, play, accident, correction, and joy. Innovation that doesn’t just live on a screen, but lives in your hands.

Week 9 – Reading Reflection

These readings made me think about how much pressure I put on myself to be “original.” Tom Igoe’s point that most interactive ideas have already been done was strangely comforting. It made me realize that creativity isn’t about inventing something entirely new; it’s about doing something familiar in a way that feels personal. My projects don’t have to be revolutionary; they just have to feel like mine.

What stood out most to me was his idea of stepping back and letting the audience take over. I tend to over-explain my work, wanting people to understand what I meant. But maybe it’s more powerful to just let them interact and form their own meaning. Igoe’s “set the stage, then shut up and listen” line hit hard; it’s something I need to apply not only to my projects but to how I share them.

These readings reminded me that physical computing is not just about sensors or LEDs. It’s about trust: trusting that the user will understand, trusting the materials to behave, and trusting myself to stop editing and just let the work breathe.

Week 9 – Brain on Break

Concept & Inspiration

This project started from the chaos of late-night studying and running on caffeine. I wanted to make a circuit that visualizes the mental shift between being focused and completely done: the moment when my brain decides, without warning, “we’re taking a break now.”

The concept connects light, motion, and emotion. The light sensor represents my surroundings and productivity level: the brighter the space, the more alert I am. The foil switch stands for that physical collapse when I lean my head onto the desk. Together, they create a system that reads like a tiny, glowing version of my attention span.

How It Works

The project combines one analog sensor and one digital sensor, controlling two LEDs in different ways.

  • Light sensor (Analog Input): Reads the brightness of the environment. The green LED glows brighter in bright light, symbolizing focus and clarity.

  • Foil Switch (Digital Input): Made of two pieces of aluminum foil connected to D2 and GND. When my elbow or cheek touches them together, it signals “brain on break.”

  • Red LED (Digital Output): Turns on when the foil pads touch — representing mental shutdown.

  • Green LED (Analog Output): Fades according to light level but turns completely off when the foil switch is activated.

This mix of analog and digital behavior mirrors how people work — not everything in us is gradual or logical. Sometimes focus fades; sometimes it just stops.

Circuit Design

When the light changes, the green LED fades smoothly using analogWrite(). When the foils touch, the red LED turns on and the green one shuts off completely.

Coding:

const int LDR_PIN = A0;     // Analog input from photoresistor
const int SWITCH_PIN = 2;   // Digital input from foil switch
const int GREEN_LED = 9;    // Analog (PWM) output LED
const int RED_LED = 8;      // Digital ON/OFF LED

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);  
  pinMode(GREEN_LED, OUTPUT);
  pinMode(RED_LED, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int lightValue = analogRead(LDR_PIN);       // Light level
  int switchState = digitalRead(SWITCH_PIN);  // Foil contact: LOW when touched

  // Map light value to LED brightness (bright light = brighter LED)
  int brightness = map(lightValue, 600, 880, 0, 255);
  brightness = constrain(brightness, 0, 255);

  if (switchState == LOW) {
    // Foil pads touching → brain on break
    digitalWrite(RED_LED, HIGH);        // burnout light ON
    analogWrite(GREEN_LED, 0);          // focus light OFF
  } else {
    // Normal state → focused
    digitalWrite(RED_LED, LOW);         // burnout light OFF
    analogWrite(GREEN_LED, brightness); // focus light fades with light
  }

  // Serial monitor
  Serial.print("Light Value: ");
  Serial.print(lightValue);
  Serial.print(" | Brightness: ");
  Serial.print(brightness);
  Serial.print(" | Switch: ");
  Serial.println(switchState == LOW ? "Touched" : "Not touched");

  delay(50);
}

Testing & Results

In bright light, the green LED glows intensely — that’s focus mode. When the room dims, the light softens, mimicking a fading mind. But the real moment comes when I touch the foil pads. This happens when my head touches the desk, indicating that I fell asleep. The red LED flashes alive, and the green one shuts off instantly. It’s like watching my brain say, “enough.”

The light sensor worked better once I narrowed the range (600–880) so the fade became more dramatic. The foil switch needed tighter contact, but once secured, it triggered perfectly. 

Video demo:

8088D1BA-7146-49C3-97C8-CCAB51934422

Challenges

  • Calibrating the light sensor so the fading felt visible but not jumpy.

  • Making sure the foil switch responded to soft touches without staying on permanently.

Future Improvements

If I develop it further, I’d like to include a buzzer or heartbeat sound to mark the switch between focus and burnout.

Reflection

This assignment pushed me to merge function with symbolism. The project isn’t just about inputs and outputs — it’s about mood as circuitry. The light patterns represent focus, fatigue, and the strange middle space between both.

It reminded me that even in electronics, balance matters. Circuits need both current and resistance. Brains need both light and rest.

Week 8 – Unusual Switch Assignment

Concept & Inspiration

This project began with a video example shown in class where a mustache prop was used as a switch. I was fascinated by the idea that something worn on the face could become part of an electronic interaction. It reminded me that the body itself can be the input device and that playful design can still be technically meaningful. That influenced my first idea. I wanted to place aluminum foil pads above my eyebrows so that every time I scrunched them together, the circuit would close and the LED would react. It felt like a fun and expressive interaction because eyebrows are a natural part of communication.

As I started building, I realized a limitation. The wires available were not long enough to comfortably reach my face while plugged into the Arduino. The setup became impractical and would not stay connected. Instead of forcing the idea, I adapted it while keeping the core concept: using a body gesture that does not involve hands. I moved the conductive pads from my face to my elbows, which allowed me to keep the same interaction logic without fighting the hardware constraints.

The result is a simple but playful design. When the user touches their elbows together, their body closes the circuit which becomes a digital input to the Arduino that changes the LEDs. This transforms a physical gesture into a clear visual response and reinforces the connection between the human body and digital behavior.

How It Works

Two small pieces of aluminum foil are taped to the elbows. Each foil pad is connected to the Arduino:

  • Left elbow foil → Digital Pin 2 (input)

  • Right elbow foil → GND

When the elbows are apart, the circuit is open, and the Arduino reads a HIGH signal using an internal pull-up resistor. The red LED turns on to indicate no contact. When the elbows touch, the conductive path through the body closes the circuit, pulling the input LOW. The green LED turns on, signaling that contact is detected. This simple interaction demonstrates digital input detection, human conductivity, and conditional output control.

Circuit Diagram:

I included a labeled schematic showing the Arduino Uno, foil pads, and LED wiring. The red LED connects to Pin 9 through a 330 Ω resistor, the green LED to Pin 10 through a 330 Ω resistor, and all components share the same GND reference.

Arduino Code:

const int SWITCH_PIN = 2;
const int RED_LED = 9;
const int GREEN_LED = 10;

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);
  pinMode(RED_LED, OUTPUT);
  pinMode(GREEN_LED, OUTPUT);
}

void loop() {
  int state = digitalRead(SWITCH_PIN);

  if (state == LOW) {
    digitalWrite(GREEN_LED, HIGH);
    digitalWrite(RED_LED, LOW);
  } else {
    digitalWrite(GREEN_LED, LOW);
    digitalWrite(RED_LED, HIGH);
  }
}

The INPUT_PULLUP keeps the signal stable when the body is not closing the circuit.

Here’s the video demonstration (my little sister did the demonstrating):

C18BC5EF-F946-4D9D-9A64-42D32D1BC5B3

Challenges:

Ensuring the elbow foil stayed in place during arm movement was a big challenge, since the jumper wires are pretty short.

This was resolved by running one wire from the Arduino to the breadboard and connecting a second wire on the same row, which gave me more length to work with.

Future Improvements:

  • More inputs, using additional body contact points.

  • Other outputs, such as sound.

  • A way to extend the wires, so I can build more ambitious projects without length limitations.