Week 12 – Documentation on Final Project

The final project is a Mini DJ Booth, a flat cardboard controller styled like a DJ console with painted “records” and four colored buttons on top. The cardboard box houses the Arduino and wiring, while the laptop screen becomes the virtual stage using p5.js. Each physical button represents a different beat or sound layer (for example: kick, snare, bass, drums). When a user presses combinations of buttons, they can live-mix these layers while watching synchronized visuals, mimicking the feel of a compact DJ setup.

Generated with ChatGPT

I haven’t started the actual coding yet, but right now I’m more focused on the physical build, especially because this will be my first time soldering and I want to make sure I do it cleanly and safely. During our lab tour, we saw the LED push buttons, and they’re honestly perfect for this project: they’re big, colorful, satisfying to press, and will make the DJ booth feel much more realistic and professional. My plan is to use scrap cardboard from the back of the lab to construct the box itself, painting it to look like a real DJ setup with turntables and controls. Once the physical build is stable, I’ll move on to connecting the buttons to the Arduino and then writing the p5.js code to bring the whole system to life.

On the Arduino side, the program will continuously read the state of the four input buttons using `digitalRead()`. Each button is wired with a pull-down (or pull-up) resistor so the Arduino can reliably detect presses. In the main loop, the Arduino will encode the button states into a simple serial message and send it to the computer via USB using `Serial.println()`. The Arduino will not generate audio; its main role is to sense button presses quickly and transmit clean, structured data to p5.js so there is minimal latency between touch and feedback.

Here is a schematic example that I could potentially adapt for my final project:

Source: https://www.the-diy-life.com/multiple-push-buttons-on-one-arduino-input/

In the p5.js program, the sketch will receive the serial data, log the four button values, and use them to control both sound playback and visuals. For each button, p5 will toggle the corresponding beat loop on or off (or optionally stop the sound, depending on the final interaction choice). In the p5 sketch I will load four different audio tracks and generate distinct color schemes and animations for each active button, for example: different color blocks, pulsing circles over each painted “record,” or waveform-style bars that move with the beat. At this stage, communication is mostly one-way (Arduino → p5), but there is room to extend it later (e.g., p5 sending messages back to Arduino to drive each button’s LED). I just want to note that I haven’t started working on the p5 visuals yet in terms of code because I’m still planning the overall mood and ambiance I want to set.
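Since the serial format isn’t final yet, here is a rough sketch (plain JavaScript) of how the p5 side could parse each incoming line into button states. The comma-separated message format (e.g. "1,0,0,1") and the layer names are working assumptions, not the final protocol:

```javascript
// Hypothetical layer names and message format ("1,0,0,1" per line) —
// these are working assumptions, not the final protocol.
const LAYERS = ["kick", "snare", "bass", "drums"];

function parseButtonLine(line) {
  const parts = line.trim().split(",").map(Number);
  // Reject lines that don't have exactly four numeric fields
  if (parts.length !== LAYERS.length || parts.some(Number.isNaN)) {
    return null;
  }
  const states = {};
  LAYERS.forEach((name, i) => {
    states[name] = parts[i] === 1; // 1 = pressed, 0 = released
  });
  return states;
}
```

In the real sketch this would run once per line inside the serial read loop, and the returned object would drive both the sound toggles and the visuals.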

I found royalty-free beats I could download for the audio:

Week 11 – 3 Exercises (group work)

For week 11, the task was to work in groups to finish 3 exercises. Our group was me (Asma), Hajar, and Mukhlisa 🙂

Exercise 1: Ball moving horizontally with a potentiometer

Group Member: Asma (Me)

Video:


Schematic:

Arduino Code:

const int potPin = A0;

void setup() {
  Serial.begin(9600);  
}

void loop() {
  int val = analogRead(potPin);   // 0..1023
  Serial.println(val);            
  delay(10);                     
}

 

P5js Code:

// === Arduino + p5.js WebSerial (directional movement, fixed) ===
// Pot controls direction/speed by how its value changes. p5 does NOT control Arduino.

let port;
let reader;
let connectButton;
let isConnected = false;

let latestData = null; // last parsed int from serial (0..1023)
let prevData   = null; // previous sample to compute delta
let lineBuffer = '';   // accumulate serial chunks until '\n'

let posX = 0;          // ellipse position
let speed = 0;         // horizontal velocity

function setup() {
  createCanvas(windowWidth, windowHeight);
  background(240);
  textFont('monospace');

  connectButton = createButton('Connect to Arduino');
  connectButton.position(20, 20);
  connectButton.mousePressed(connectToSerial);

  // start centered; we'll keep it centered until first data arrives
  posX = width / 2;
}

async function connectToSerial() {
  try {
    // Request and open port
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });
    isConnected = true;
    console.log(' Port opened');

    // Create a text decoder stream and reader for clean line-by-line reads
    const textDecoder = new TextDecoderStream();
    const readableClosed = port.readable.pipeTo(textDecoder.writable);
    reader = textDecoder.readable.getReader();

    // Kick off read loop
    readSerialLines();
  } catch (err) {
    console.error(' Connection failed:', err);
    isConnected = false;
  }
}

async function readSerialLines() {
  try {
    while (true) {
      const { value, done } = await reader.read();
      if (done) break; // reader released
      if (!value) continue;

      // Accumulate and split by newline
      lineBuffer += value;
      let lines = lineBuffer.split(/\r?\n/);
      lineBuffer = lines.pop(); // save incomplete tail

      for (let line of lines) {
        line = line.trim();
        if (!line) continue;
        const v = parseInt(line, 10);
        if (!Number.isNaN(v)) {
          // Clamp to expected 10-bit range
          latestData = Math.min(Math.max(v, 0), 1023);
          // Initialize prevData on first valid sample
          if (prevData === null) prevData = latestData;
        }
      }
    }
  } catch (err) {
    console.error(' Read error:', err);
  } finally {
    try { reader && reader.releaseLock(); } catch {}
  }
}

function draw() {
  background(240);

  if (!isConnected) {
    fill(200, 0, 0);
    noStroke();
    textAlign(CENTER, CENTER);
    textSize(20);
    text("Click 'Connect to Arduino' to begin", width / 2, height / 2);
    return;
  }

  // If we haven't received any valid data yet, show waiting status
  if (latestData === null || prevData === null) {
    fill(0);
    textSize(16);
    textAlign(LEFT, TOP);
    text('Waiting for data...', 20, 60);
    // Keep ellipse centered until first data arrives
  } else {
    // Change in pot reading determines direction and speed bump
    const delta = latestData - prevData;

    // Deadband to ignore small noise
    const deadband = 4;
    if (delta > deadband) {
      speed = constrain(speed + 0.6, -12, 12); // turn right -> move right
    } else if (delta < -deadband) {
      speed = constrain(speed - 0.6, -12, 12); // turn left -> move left
    } else {
      // friction when knob still
      speed *= 0.90;
    }

    // Integrate position and clamp
    posX += speed;
    posX = constrain(posX, 0, width);

    // Update prev for next frame
    prevData = latestData;
  }

  // Draw ellipse at vertical center
  noStroke();
  fill(50, 100, 255);
  ellipse(posX, height / 2, 80, 80);

  // HUD
  fill(0);
  textSize(14);
  textAlign(LEFT, TOP);
  const shown = latestData === null ? '—' : latestData;
  text(`Sensor: ${shown}`, 20, 60);
  text(`Speed:  ${nf(speed, 1, 2)}`, 20, 80);
}

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
  // Keep position on-screen if you resize smaller
  posX = constrain(posX, 0, width);
}

Reflection:

I built a simple circuit using the Arduino and a 10kΩ potentiometer to control an on-screen ellipse in p5.js. I connected the potentiometer’s outer legs to 5V and GND and the middle leg to the analog pin A0, allowing it to act as a variable voltage divider. After uploading the Arduino code and checking the serial monitor, I could see how turning the knob changed the analog readings from 0 to 1023. This helped me understand how analog sensors translate physical movement into numerical data that can be visualized digitally. It was satisfying to see the system work after troubleshooting my wiring and realizing how the order of connections affects the readings.
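The voltage-divider relationship described above is easy to check numerically. This is just a quick sketch of the arithmetic (plain JavaScript, not part of the exercise code), assuming the Uno’s default 5 V reference and 10-bit ADC:

```javascript
// Convert a 10-bit ADC reading (0..1023) to the voltage at the
// potentiometer's wiper, assuming a 5 V reference (Arduino Uno default).
function adcToVolts(reading, vref = 5.0) {
  return (reading / 1023) * vref;
}
```

So a serial-monitor reading of 512 corresponds to roughly 2.5 V at pin A0, and 1023 to the full 5 V.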

Exercise 2: Controlling the LED

Group Member: Hajar

Video:

Arduino Code:

//  Arduino: LED brightness from p5.js 

const int ledPin = 10;  // LED connected to pin 10

void setup() {
  Serial.begin(9600);   // must match p5.js baud rate
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    int brightness = Serial.read();      // read 0–255
    brightness = constrain(brightness, 0, 255);
    analogWrite(ledPin, brightness);     // control LED brightness
  }
}

P5js Code:

let port;
let writer;

async function setup() {
  createCanvas(512, 512);
  background(0);
  textSize(16);
  textAlign(CENTER, CENTER);
  text('Click to connect to Arduino', width/2, height/2);
}

async function mousePressed() {
  if (!port) {
    port = await navigator.serial.requestPort();
    await port.open({ baudRate: 9600 });
    writer = port.writable.getWriter();
    console.log('Connected!');
  }
}

function mouseDragged() {
  if (writer) {
    let brightness = floor(map(mouseY, 0, height, 255, 0));
    brightness = constrain(brightness, 0, 255);
    writer.write(new Uint8Array([brightness]));
  }
}

Schematic: 

Reflection:

(Hajar) For this project, I created both the schematic and the circuit, and I kept them very simple. My setup only included one LED and one resistor connected to the Arduino and grounded. The main purpose of the assignment was to use serial communication with p5.js to control the brightness of the LED, so I focused more on the coding rather than making the hardware complex. Before starting the p5 part, I tested my circuit using a simple Arduino code just to make sure that the LED was lighting up correctly and everything was connected properly. Once I confirmed that it worked, I added the schematic and moved on to the serial communication part. The schematic itself was very basic, something I’ve done before, so it wasn’t hard to figure out. I liked that I could keep the circuit minimal but still meet the goal of the exercise, which was to control the LED’s brightness through p5. It showed me how even a simple circuit can become interactive and meaningful when combined with code.

The coding part was definitely the hardest and most time-consuming part of this project. I’ve never connected p5.js and Arduino together before, so figuring out how to make them communicate took a lot of trial and error. At first, I kept trying to make it work on Safari without realizing that the serial connection doesn’t actually work there; it only works in Chromium-based browsers like Google Chrome. So, I kept rewriting and rechecking my code, thinking there was something wrong with it, even though the logic itself was fine. My professor had already shown us the structure for serial communication, so I kept following it, creating and recreating the same code over and over again, but it just wouldn’t connect.

It got really frustrating at one point because I had everything wired correctly, and my code looked right, but the Arduino and p5 still weren’t talking to each other. I spent a lot of time trying to figure out what was wrong. Once I finally switched to the right browser and saw the serial connection actually working, it was such a relief. The LED started responding, and it felt like everything finally came together. After so many attempts, seeing both the Arduino and p5.js working together perfectly was honestly so rewarding. I was really proud of how it turned out in the end; it looked simple but worked exactly how I wanted it to. All that frustration was worth it because the final design turned out really good, and it felt amazing to watch it finally come to life.

Exercise 3: Make the LED light up when the ball bounces

Group Member: Mukhlisa

Video:


Arduino Code:

int potPin = A5;   // Potentiometer
int ledPin = 3;    // LED on PWM pin

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);

  // Blink LED to confirm setup
  analogWrite(ledPin, 255);
  delay(200);
  analogWrite(ledPin, 0);
}

void loop() {
  // Read potentiometer (0–1023)
  int raw = analogRead(potPin);

  // Map to PWM range (0–255)
  int brightness = map(raw, 0, 1023, 0, 255);

  // Continuously send brightness value to p5 if you need it
  Serial.println(brightness);

  // Check for serial commands from p5
  if (Serial.available()) {
    String data = Serial.readStringUntil('\n');

    // When ball hits ground → p5 sends "1,0"
    if (data == "1,0") {
      analogWrite(ledPin, brightness);  // flash with pot brightness
      delay(100);
      analogWrite(ledPin, 0);           // turn off after flash
    }
  }

  delay(30); // loop stability
}

P5js Code:

let velocity;
let gravity;
let position;
let acceleration;
let drag = 0.99;
let mass = 50;

let brightnessValue = 0; // Mapped potentiometer value from Arduino (0–255)
let ballDropped = false;
let ledOn = false;

function setup() {
  createCanvas(640, 360);
  noFill();
  textSize(18);

  position = createVector(width / 2, 0);
  velocity = createVector(0, 0);
  acceleration = createVector(0, 0);
  gravity = createVector(0, 0.5 * mass);
}

function draw() {
  background(255);

  fill(0);
  if (!ballDropped) {
    text("Press D to drop the ball", 20, 30);
    text("Press Space Bar to select Serial Port", 20, 50);
    return;
  }

  if (serialActive) {
    text("Connected", 20, 30);
    text(`Potentiometer: ${brightnessValue}`, 20, 50);
  } else {
    text("Serial Port Not Connected", 20, 30);
  }

  // Gravity only (no wind)
  applyForce(gravity);

  // Update ball
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);

  // Draw ball
  ellipse(position.x, position.y, mass, mass);

  // Bounce
  if (position.y >= height - mass / 2) {
    velocity.y *= -0.9;
    position.y = height - mass / 2;

    // Tell Arduino: turn LED on briefly
    if (serialActive && !ledOn) {
      writeSerial("1,0\n");
      ledOn = true;
    }
  } else if (ledOn) {
    // Tell Arduino: turn LED off
    writeSerial("0,0\n");
    ledOn = false;
  }
}

function applyForce(force) {
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

// Serial setup and drop ball
function keyPressed() {
  if (key == " ") setUpSerial();
  if (key == "D" || key == "d") dropBall();
}

function dropBall() {
  position.set(width / 2, 0);
  velocity.set(0, 0);
  mass = 50;
  gravity = createVector(0, 0.5 * mass);
  ballDropped = true;
}

// Read data from Arduino
function readSerial(data) {
  if (data != null) {
    let fromArduino = split(trim(data), ",");
    if (fromArduino.length === 1) {
      brightnessValue = int(fromArduino[0]); // Potentiometer value
    }
  }
}

Schematic:

Reflection:

(Mukhlisa) For this project, I combined both physical computing and digital simulation by connecting an Arduino circuit to a p5.js sketch. My setup included a potentiometer to control the brightness of the LED flash and an LED that lit up every time the falling ball hit the ground. I built a simple circuit using one LED, a resistor, and a 10kΩ potentiometer, and then connected it to my computer through serial communication. Even though the hardware was straightforward, the real challenge came from getting the Arduino and p5.js to communicate properly. I spent a lot of time testing the potentiometer readings, debugging the serial connection, and making sure the LED responded at the right moment in the animation.

Week 11 – Final Project Idea

Mini DJ Booth: Interactive Sound Console

Concept Overview

The Mini DJ Booth is a physically interactive sound system that combines Arduino and p5.js to simulate the experience of a digital DJ console. The project uses four physical buttons connected to an Arduino board, each mapped to a unique beat and corresponding visual effect displayed through p5.js. When a button is pressed, a signal is sent from the Arduino to the computer via serial communication, triggering both a sound loop and a colorful visual animation on screen.

The physical setup will be built using cardboard, designed to resemble a miniature DJ booth with labeled, color-coded buttons—each representing a different sound or track. The interface invites users to experiment with rhythm and visual design, mimicking the creative flow of live mixing.

Visual Prototype (Generated on ChatGPT)

Interaction Design

  • Input: Arduino detects button presses.
  • Processing (Thinking): The signal is sent to p5.js, which identifies which button was activated.
  • Output: p5.js responds by playing a corresponding beat and generating a synchronized visual (color and shape animation) on screen.

Each of the four buttons triggers:

  1. A unique sound (e.g., drum, bass).
  2. A distinct color palette and animation style that matches the mood of the beat.

As users press the buttons one after another, they layer different beats and sounds, mimicking a real DJ booth.
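The toggling described above could be sketched like this (an assumed structure only: `sounds[i]` stands in for p5.SoundFile objects that would be loaded in preload(), and the final interaction may differ):

```javascript
// Each button press flips its layer between looping and silent.
// `sounds` is assumed to be an array of objects with loop()/stop()
// methods, e.g. p5.SoundFile instances — an assumption, not final code.
const layerOn = [false, false, false, false];

function toggleLayer(i, sounds) {
  layerOn[i] = !layerOn[i];
  if (layerOn[i]) {
    sounds[i].loop(); // start looping this beat layer
  } else {
    sounds[i].stop(); // silence the layer
  }
  return layerOn[i]; // new state, useful for driving the visuals
}
```

Returning the new state makes it easy for the same press to drive both the audio and the matching color/animation change.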

Materials

  • Arduino
  • 4 push buttons (I want to use the bigger ones that we saw during the lab tour as they feel more tactile and look better overall)
  • 4 resistors (10kΩ)
  • Jumper wires
  • Breadboard
  • USB cable
  • Laptop running p5.js
  • Cardboard and paint/decor for the booth design

User Experience

Users interact with the booth by pressing different buttons to layer and remix beats. The immediate audio-visual feedback creates a playful and performative experience, encouraging rhythm exploration and creative expression. The physicality of pressing real buttons, combined with the digital response on screen, merges tactile and visual engagement, much like an actual DJ setup.

Goal

To explore how physical input devices (Arduino buttons) can enhance digital multimedia experiences (p5.js visuals and sounds), creating an accessible, low-cost prototype that bridges sound, motion, and design.

Reading Response Week 11 – When Disability Meets Design

When doing this week’s reading, what actually surprised me most was how the Eames leg splint became a symbol of empathy turned into form. I was fascinated by how something born from wartime necessity, an object designed for injured soldiers, evolved into a design philosophy that shaped everyday furniture. It reminded me that innovation often begins in moments of constraint. Charles Eames’s belief that “design depends largely on constraints” reframes disability not as limitation, but as a source of creativity. Reading this, I thought about how many of our most elegant designs emerge not from freedom, but from friction.

The later sections on fashion and prosthetics complicated my idea of good design. I was moved by how eyewear, once a medical device, has been transformed into a fashion statement, while hearing aids remained confined by the medical model of concealment. That difference says a lot about visibility, shame, and what we consider socially acceptable. When the text described pink plastic hearing aids as a form of “white Western bias,” it made me reflect on how aesthetics can either humanize or marginalize. Why is it that invisibility is seen as dignity, while expression is seen as vanity?

Apple’s iPod and the Muji CD player added another layer to this question. Both suggest that simplicity can be radical, that good design isn’t about adding features but removing noise. The iPod’s “1,000 songs in your pocket” (which now reading this in 2025 is so funny to me because I genuinely can’t imagine a world in which every song I could ever want isn’t three taps away on my phone) echoed the Eameses’ plywood splint: a single elegant solution born from constraint. Yet, the reading also warns that universality and simplicity can’t always coexist. It made me rethink whether inclusive design should aim to be for everyone, or instead embrace difference with honesty.

In the end, I felt the book wasn’t just about disability, it was about humility in creation. Whether in a leg splint, a pair of glasses, or a music player, design becomes an ethical act: one that balances visibility and dignity, simplicity and inclusion, beauty and necessity.

Week 10 – A DIY Musical Instrument (The Ultrasonic Air Piano)

Concept:

The idea behind my instrument was to create a hands-free musical device that transforms invisible gestures into sound. My goal was to design something playful yet technical; a device that reacts to both motion and touch. By combining distance and pressure, the instrument invites intuitive exploration: the closer your hand gets, the higher the note, while pressing the sensor triggers the sound. It merges tactile interaction with sound, making music a physical experience.

 

Method & Materials:

This project was my first time working with two types of sensors on the Arduino Uno:

  • Distance sensor: Ultrasonic sensor (HC-SR04), read via timed digital pulses, to measure distance.
  • Pressure switch: Force Sensitive Resistor (FSR) on analog pin A0 to detect presses.
  • Output: Piezo buzzer to produce sound.

I connected the ultrasonic sensor to pins 10 (trig) and 11 (echo), the FSR to analog pin A0, and the buzzer to pin 12.
Each note from the C major scale (C–D–E–F–G–A–B) was assigned to a specific distance range, coded in an array:

int notes[7] = {261, 294, 329, 349, 392, 440, 494};

The system reads distance in real time:

  • When the FSR is pressed and your hand is within 0–50 cm of the sensor, the buzzer plays a tone corresponding to that range.
  • If no pressure is detected or the hand moves out of range, the sound stops.

Process:

At first, it took time to understand how analog vs. digital inputs work and how to read them simultaneously. I researched how to use pulseIn() for the ultrasonic sensor and experimented with mapping values using the map() function.
To visualize the notes, I placed colored paper strips at different distances (each representing one note of the scale).
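For reference, the arithmetic behind converting the pulseIn() echo time into centimeters looks like this (shown in plain JavaScript just to illustrate the math; the real conversion happens in the Arduino sketch):

```javascript
// Sound travels ~0.0343 cm per microsecond at room temperature, and the
// echo covers the round trip (out and back), so the duration is halved.
function echoToCm(durationMicros) {
  return (durationMicros * 0.0343) / 2;
}
```

An echo of about 2915 µs therefore corresponds to roughly 50 cm, the outer edge of the playing range.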

Throughout the process, I learned:

  • The importance of wiring correctly (e.g., ensuring the FSR forms a voltage divider).
  • How combining two sensors can create more expressive interaction.

Schematic:

Code:

This code combines input from two sensors, an ultrasonic sensor and a force-sensitive resistor (FSR), to generate musical notes through a piezo buzzer. The ultrasonic sensor continuously measures the distance of my hand, while the FSR detects when pressure is applied. When both conditions are met (hand within 50 cm and FSR pressed), the code maps the distance value to a specific note in the C major scale (C, D, E, F, G, A, B). Each distance range corresponds to a different pitch, allowing me to “play” melodies in the air. The code I’m most proud of is the single line that transforms the project from a simple sensor experiment into a musical instrument. It defines the C major scale, turning numerical frequency values into recognizable notes. I love that such a short line of code gives the device its expressive character: it bridges logic and creativity, translating distance data into melody. It’s the heart of the project, where sound and interaction truly come together.

// --- Define musical notes (C major scale) ---
int notes[7] = {261, 294, 329, 349, 392, 440, 494}; // C D E F G A B
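One possible way to turn a measured distance into one of those seven notes is to split the 0–50 cm playing range into equal bands. The exact band boundaries here are my assumption, sketched in plain JavaScript rather than the Arduino code, and not necessarily what the final sketch used:

```javascript
// Map a distance (cm) in the 0–50 cm playing range onto the C major
// scale by dividing the range into seven equal bands.
const notes = [261, 294, 329, 349, 392, 440, 494]; // C D E F G A B

function distanceToNote(cm) {
  if (cm < 0 || cm > 50) return null; // out of playing range: silence
  const bandWidth = 50 / notes.length;
  const band = Math.min(Math.floor(cm / bandWidth), notes.length - 1);
  return notes[band];
}
```

With this banding, a hand at 25 cm lands in the fourth band and would play F (349 Hz).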

Result:

The final prototype acts like an invisible piano: you play by waving your hand in front of the sensor and pressing lightly on the FSR. Each distance triggers a different musical note. The colored papers made it easier to perform intentionally and visually mark pitch changes.

In the demonstration video, the tones respond smoothly to my gestures, transforming simple components into an expressive interface.

Challenges:

One of the main challenges I faced was understanding how each pin on the ultrasonic sensor worked. At first, I didn’t realize that every pin had a specific purpose, like trig for sending signals and echo for receiving them, so it took me a while to fully grasp how data was actually being measured. I also struggled with wiring the circuit, often making small mistakes that stopped the whole setup from working. Drawing out the schematic was another time-consuming part since there were many components to connect and label correctly. Finally, the coding process was challenging because it was my first time using several of these elements, and I had to learn through trial and error how to make the sensors and buzzer communicate smoothly.

Inspiration + Tools that helped me:

https://projecthub.arduino.cc/theriveroars/simple-hand-controlled-instrument-372bfc

https://learn.adafruit.com/force-sensitive-resistor-fsr/using-an-fsr

Reflection:

This project taught me how code, sensors, and sound can merge into interactive art. The challenge was balancing sensitivity: sometimes the ultrasonic readings fluctuated, and it took some fine tuning. But once it worked, it felt rewarding to hear the instrument. It also made me realize how music can be built from logic, how creative coding allows emotional expression through electronics. I see this as the beginning of exploring computational instruments that combine art and technology.

Week 10 – Reading Response (A Brief Rant on the Future of Interaction Design)

One passage that really stayed with me from Bret Victor’s A Brief Rant on the Future of Interaction Design is his statement that screens are “pictures under glass.” That phrase hit me because it’s so ordinary yet so revealing; every day I touch my phone dozens of times, yet I never actually feel anything back. Victor’s argument that we’ve limited human interaction to tapping on cold glass made me realize how passive our so-called “interactive” technologies have become. I started thinking about how my creativity, whether sketching or coding, always feels richer when my hands are physically involved; pressing, folding, shaping. It made me question: why did we let convenience replace tactility? What would technology look like if it honored the intelligence of our hands instead of reducing them to cursors?

In the Responses section, I was fascinated by how defensive many readers became, as if Victor’s critique was anti-progress. But what I sensed in his tone was care, not nostalgia; a desire to expand our sense of what interaction can mean. This really reminded me of Refik Anadol’s Machine Hallucinations, a piece I’m analyzing for another course, where data transforms into movement, color, and emotion. Anadol’s work feels like the future Victor imagines: one where technology engages the body and senses, not just the eyes.

These readings challenged my old assumption that the “best” design is the smoothest and most frictionless. Victor helped me see friction as meaningful; it’s how we feel our way through the world. I now think of design less as creating perfect efficiency and more as crafting moments of connection between body, mind, and machine. The essay left me wondering whether the future of interaction design depends not on faster touchscreens, but on rediscovering touch itself; real, textured, imperfect, human touch.

Ultimately, I completely agree with Victor’s message. His critique felt refreshing, almost like a wake-up call to slow down and rethink what “innovation” actually means. I liked how he exposed the emptiness behind shiny new interfaces and instead celebrated the physical, human side of design. Even though his tone was mainly critical, I didn’t find it negative; I found it hopeful. It made me appreciate the kind of design that makes people feel connected, not just technologically advanced.

Week 9 – Analog + Digital

Assignment Overview:

The goal of this task was to create a simple interactive circuit using one digital sensor (a pushbutton) and one analog sensor (an LDR) to control two LEDs on the Arduino Uno. The digital LED turns on and off when the button is pressed, while the analog LED smoothly changes brightness depending on the amount of light detected by the sensor. This task helped me understand how digital and analog inputs behave differently and how they can work together in one circuit.

My Circuit:

Planning the Schematic:

I started by sketching a schematic on my iPad in Procreate to visualize how each component would connect. The schematic represents the button on pin D2, the LDR with a 330 kΩ resistor forming a voltage divider to A0, and two LEDs connected to D8 and D9, each with their own 330 Ω resistor to ground. Planning it out first helped me avoid confusion later when placing components on the breadboard and made the whole process smoother. Of course, when I actually built the board a few elements moved around, but the schematic mainly shows my thought process and “map” while planning this assignment.

Building the Circuit

Next, I built the circuit on the breadboard, carefully following the schematic. I color-coded my jumper wires to stay organized: red for power, black for ground, green for digital signals, and yellow for analog. One small challenge was understanding that the breadboard’s left and right halves are not connected across the middle gap. Once I fixed that mistake, the circuit started to behave exactly as expected.

(It’s kinda hard to see, but when I cover the sensor the blue LED shifts in brightness, and the button turns the yellow LED on)

Coding Elements:

This Arduino code connects both a digital and an analog input to control two LEDs in different ways. The process starts with defining the pin connections: the pushbutton on pin 2, the LDR sensor on analog pin A0, and two LEDs on pins 8 and 9. In the setup function, the button uses INPUT_PULLUP so it naturally stays HIGH until pressed (no external resistor needed). The digital LED on pin 8 simply turns on when the button is pressed and off when it’s released. The LDR, wired as part of a voltage divider, continuously sends changing light readings to A0. In the loop, these readings (ranging from 0 to 1023) are mapped to 0–255 with the map() function so they can control the brightness of the second LED through analogWrite() on pin 9. The small delay at the end makes the light-level changes smooth and stable. Overall, the code demonstrates how digital and analog signals can be read simultaneously to control different outputs in real time.

// Pin setup 
const int buttonPin = 2;   // Button connected to D2 
const int ledDigital = 8;  // Digital LED (on/off)
const int ledAnalog = 9;   // Analog LED (brightness control)
const int ldrPin = A0;     // LDR sensor connected to A0

void setup() {
  pinMode(buttonPin, INPUT_PULLUP); 
  pinMode(ledDigital, OUTPUT);
  pinMode(ledAnalog, OUTPUT);
}

void loop() {
  // Digital LED controlled by button 
  bool pressed = (digitalRead(buttonPin) == LOW); // LOW = pressed
  digitalWrite(ledDigital, pressed ? HIGH : LOW);

  // Analog LED controlled by LDR light level 
  int lightValue = analogRead(ldrPin);        // Reads LDR 
  int brightness = map(lightValue, 0, 1023, 0, 255); 
  analogWrite(ledAnalog, brightness);         // Set LED brightness

  delay(10); 
}

Troubleshooting and Adjustments 🙁

The most challenging part was honestly just starting this task. It was very overwhelming since I had never used Arduino before and had no experience coding in C++. It took me a few hours to look through videos and articles just to get a better understanding of how the board works, and I even gave myself a few warm-up tasks, like a simple blink test, which I will show later in this documentation. In this particular assignment, one of my main challenges was getting the circuit itself right. The first few times I tried to set up the button, the LED wouldn’t turn on, and I felt like I had so many wires that it was hard to focus on what the problem was. To combat this, I took the time to properly think about my schematic as a map, then followed that instead of blindly building my board first and drawing my schematic after. This really helped me when I got frustrated, and I also took many breaks in between so I could come back to the task with fresh eyes.

Random Blink Test Experiment:

Before starting the full circuit, I experimented with a simple blink test using just one LED. I uploaded the basic Blink example to the Arduino IDE to see the LED turn on and off every second. This small test helped me understand how the digital pins, resistors, and ground connections actually worked together in a real circuit. Since it was my first time using Arduino, doing this warm-up made me more confident in reading pin numbers, placing components correctly, and recognizing how code controls physical outputs on the board.

Reflection 🙂

Overall, I really want to emphasize that I’m very proud of myself for this assignment. It showed me how coding and wiring work hand in hand: the schematic guided the hardware setup, and the code brought it to life. This was a rewarding first experience combining both analog and digital sensors in one interactive circuit.

Week 8 – Unusual Switch (Foot-Activated Circuit)

Step-On Switch: An Unusual Arduino Foot-Activated Circuit

Video:

Overview:

For my first Arduino assignment, I created an unusual switch that does not require the use of hands. My switch is activated by pressing it with the foot, completing the circuit to light up two LEDs (red and blue). The idea was to create a physical interface that feels natural, intuitive, and accessible, something that could fit into everyday gestures like stepping, rather than pressing buttons with fingers.

Inspiration:

I was inspired to use the feet after thinking about how often we interact with objects using our hands (keyboards, phones, remotes) and how limiting that can be. I wanted to explore another part of the body to interact with technology in a more grounded and physical way.
The foot felt like an interesting choice because stepping on something gives clear and satisfying feedback; it’s an action that already feels like a switch. It also connected to the idea of accessibility: making a device that could be triggered without hand movement.

Building Process

Since this was my first time ever using Arduino, I learned a lot through trial and error and by asking questions in the IM Lab. I gathered all the materials from the lab, mainly cardboard scraps, metal sheets, and wires.

Here’s how I built it:

  1. Base:
    I used a folded piece of scrap cardboard to create a “step pad.” The bottom layer held the metal conductor (two rectangular aluminum sheets), while the top layer acted as a flexible pressure surface.
  2. Wiring and Circuit:
    I made my own extended wires since the ones in the kit were too short. Professor Mang showed me how to safely cut and strip the wires. I then taped the two wires onto the cardboard so that when the top layer was pressed, the exposed wire ends touched the metal conductor, closing the circuit and turning on the LEDs.
  3. Circuit Connection:
    The switch was connected to an Arduino Uno and a breadboard. I programmed it so that when the circuit closed (when someone stepped on it), both red and blue LEDs would light up.
  4. Interface Design:
    To make it more intuitive and visually clear, I designed a quick cover on Canva that said “STEP ON ME,” giving the switch a fun and inviting personality. It also made the prototype feel more like a finished (polished) interactive product.
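The code side of step 3 can be sketched roughly like this. The pin numbers here are assumptions for illustration; the foot pad is read as a digital input using the Arduino’s internal pull-up resistor, so the pin reads LOW when the metal plates touch and close the circuit to ground:

```cpp
const int switchPin = 2;  // foot-pad switch (assumed pin)
const int redLed = 8;     // red LED (assumed pin)
const int blueLed = 9;    // blue LED (assumed pin)

void setup() {
  pinMode(switchPin, INPUT_PULLUP); // internal pull-up: pin reads LOW when pressed
  pinMode(redLed, OUTPUT);
  pinMode(blueLed, OUTPUT);
}

void loop() {
  bool stepped = (digitalRead(switchPin) == LOW); // foot closes circuit to GND
  digitalWrite(redLed, stepped ? HIGH : LOW);     // both LEDs follow the switch
  digitalWrite(blueLed, stepped ? HIGH : LOW);
}
```

Using `INPUT_PULLUP` avoids needing an external resistor for the switch; a pull-down wiring would work equally well with the logic inverted.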

Overall Reflection:

I really enjoyed how hands-on this project was, literally using my feet instead of my hands. It helped me see how electronics can interact with the body in playful, unexpected ways. I liked that the result felt tactile and responsive, and it gave a sense of satisfaction when the LEDs lit up under my foot.

What I could improve next time:

  • I would make the structure more durable using thicker cardboard or a sturdier base, since repeated stepping eventually weakened it.
  • I wish I had more time to figure out how to add all four different colored LEDs, to make the project more visually appealing.
  • The wiring could be cleaner and more hidden, maybe integrated into the design itself for a more polished look.

Schematic:

This schematic illustrates how my foot-activated switch completes the circuit to power the red and blue LEDs. The 5V output from the Arduino flows through the LEDs and a 330Ω resistor to limit current. The circuit remains open until the metal conductor plates make contact, which happens when I press the cardboard switch with my foot. Once pressed, the metal pieces touch, closing the circuit and allowing current to flow, which lights up both LEDs. This simple setup shows the bare bones of how my Arduino circuit works and was the plan I followed, which made building this project much easier.

Conclusion

This project taught me the basics of Arduino, digital input/output, and how creative thinking can shape how we interact with technology. Building a switch with my foot instead of my hand made me realize that interfaces don’t always need to follow convention, they can be playful, personal, and unexpected.

Week 9 Reading Response

This week’s readings were definitely refreshing to analyze, as I agreed with the author’s notion that physical computing is less about the devices themselves and more about the relationships they create between people and machines. In “Physical Computing’s Greatest Hits (and misses),” the author’s critique of overly complicated projects made me question how often creators mistake complexity for creativity. I found his idea that “the simpler interaction is the more meaningful one” especially relatable. It reminded me of minimalist interactive artworks like Rafael Lozano-Hemmer’s Pulse Room, where a simple heartbeat sensor becomes a profound collective experience. The author’s argument made me reflect on my own tendency to prioritize aesthetic or technical sophistication over intuitive engagement with the audience.

In “Making Interactive Art: Set the Stage, Then Shut Up,” the author’s metaphor of the artist as a stage-setter really reframed how I think about authorship. I used to believe that creators should guide the audience toward a specific emotional reaction, but this reading’s insistence on letting the user finish the work through participation challenged that assumption. It raises the question: where does authorship end in interactive media? Is the true art in the design, or in the unpredictability of human interaction?

Both readings pushed me to see interactivity as a dialogue rather than a display. They align with theories I’ve encountered in my Interactive Media classes, especially discussions around user agency and co-creation. Ultimately, the author’s perspective helped me realize that successful interactive work doesn’t shout, it listens. These readings made me rethink what it actually means to design something interactive. I used to believe that making an interactive project meant using as much technology as possible to impress people. But the idea these readings assert, that the simpler interaction is often the more meaningful one, really clicked with me. It made me realize that interaction isn’t about showing off sensors or screens; it’s about designing moments that feel natural. I thought about projects I’ve made where the tech took over the experience, and how maybe, the more invisible it becomes, the more powerful the interaction actually is.

Week 8 Reading Response

What I immediately noticed in the readings is how both Don Norman and Robert McMillan challenge how we define functionality: Norman through the psychology of aesthetics, and McMillan through the ingenuity of software engineering. Reading “Emotion and Design: Attractive Things Work Better” made me question something simple yet profound: why do I find certain interfaces or objects “trustworthy”? Norman’s claim that “attractive things work better” stayed with me because it connects emotion to cognition, arguing that beauty is not decoration but an active force in usability. His description of positive affect broadening creative thought resonated with me, especially when I considered my own design projects in other Interactive Media courses I have taken. When a prototype looks cohesive and inviting, I find myself more patient while debugging it; frustration fades faster. Norman’s teapot metaphor illustrates this perfectly: the emotional experience of interacting with a design changes how we perceive its flaws.

In contrast, McMillan’s “Her Code Got Humans on the Moon” celebrates the emotional labor and intellectual rigor behind Margaret Hamilton’s software for Apollo 11. I was surprised by how Hamilton’s story complicates the idea that engineering is purely rational. Her insistence on accounting for human error, writing software that could correct an astronaut’s mistake, echoes Norman’s belief that design must accommodate emotion and imperfection. When Hamilton’s code prevented a lunar crash due to a data overload, it wasn’t just logic at work but empathy, the foresight to design for failure.

Together, these texts made me rethink the separation between “soft” and “hard” skills in design. Emotion and logic, art and code, are not opposites but co-creators of reliability. I’m left wondering: in a future dominated by AI systems, can machines be designed to “care” the way Hamilton’s software did, to anticipate human error with grace?