Concept

ExpressNotes is an interactive audiovisual art experience that blends music and generative visuals to foster expressive play. The project allows users to press buttons on a physical interface (Arduino with pushbuttons) to play piano notes, while dynamic visuals are generated on-screen in real-time. Each note corresponds to a unique visual form and color, turning a simple musical interaction into a creative multimedia composition. The project invites users to explore the relationship between sound and visuals, while also giving them the ability to control the visual environment through canvas color selection and a volume knob for audio modulation.

Implementation Overview

The system is composed of three core components: the Arduino microcontroller, which handles hardware input; the P5.js interface, which handles real-time visuals and audio playback; and a communication bridge between the two using the Web Serial API. The user first lands on a welcome screen featuring a soft background image and the title ExpressNotes, along with instructions and canvas customization. Upon connecting to the Arduino, users select either a black or white canvas before launching into the live performance mode. From there, pressing a button triggers both a piano note and a visual form, while a potentiometer allows for fine volume control of all audio feedback.

Interaction Design

The project emphasizes minimalism and clarity in its interaction model. The welcome screen gently guides users to make creative choices from the start by allowing them to select a canvas color, helping them set the tone for their audiovisual artwork. Once the canvas is active, each button press corresponds to a distinct musical note and is visually reflected through shape, color, and animation. Users can reset the artwork with a “Clear Canvas” button or return to the welcome screen with an “Exit to Intro” button. Additionally, users can press the ‘C’ key on their keyboard to instantly clear the screen. These layered controls enhance the sense of flow and control throughout the interaction.

Arduino Code Description

The Arduino handles seven pushbuttons and a potentiometer. Each button is mapped to a musical note—A through G—and each time a button is pressed, the Arduino sends a serial message like note:C to the connected computer. The potentiometer is used to adjust volume dynamically. Its analog value is read on every loop and sent as a message like volume:873. To avoid repeated messages while a button is held down, the code tracks the previous state of each button to only send data when a new press is detected. The complete Arduino sketch is included below:

const int potPin = A0;
const int buttonPins[] = {2, 3, 4, 5, 6, 7, 8}; // Buttons for A, B, C, D, E, F, G
const char* notes[] = {"A", "B", "C", "D", "E", "F", "G"};
bool buttonStates[7] = {false, false, false, false, false, false, false};

void setup() {
  Serial.begin(57600);
  for (int i = 0; i < 7; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
}

void loop() {
  int volume = analogRead(potPin);
  Serial.print("volume:");
  Serial.println(volume);

  for (int i = 0; i < 7; i++) {
    bool isPressed = digitalRead(buttonPins[i]) == LOW;
    if (isPressed && !buttonStates[i]) {
      Serial.print("note:");
      Serial.println(notes[i]);
    }
    buttonStates[i] = isPressed;
  }

  delay(100);
}

P5.js Code

let port;
let connectBtn;
let soundA, soundB, soundC, soundD, soundE, soundF, soundG;
let volume = 0.5;
let bgImage;
let isConnected = false;
let showIntro = false;
let canvasColor = 0; // 0 for black, 255 for white
let colorChoiceMade = false;
let blackBtn, whiteBtn, clearBtn, exitBtn;
let firstDraw = true;

function preload() {
  soundFormats('wav');
  soundA = loadSound('A.wav');
  soundB = loadSound('B.wav');
  soundC = loadSound('C.wav');
  soundD = loadSound('D.wav');
  soundE = loadSound('E.wav');
  soundF = loadSound('F.wav');
  soundG = loadSound('A.wav'); // Adjust if different from A
  bgImage = loadImage('background.jpg');
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  background(0);
  textAlign(CENTER, CENTER);
  port = createSerial();

  connectBtn = createButton("Connect to Arduino");
  connectBtn.position(width / 2 - 100, height / 2);
  connectBtn.style('font-size', '20px');
  connectBtn.style('padding', '15px 30px');
  connectBtn.mousePressed(connectToArduino);
}

function draw() {
  if (firstDraw) {
    background(0);
    firstDraw = false;
  }

  if (!isConnected) {
    fill(255);
    textSize(32);
    text("ExpressNotes", width / 2, height / 2 - 100);
    return;
  }

  if (isConnected && !showIntro) {
    showIntro = true;
  }

  if (showIntro && !colorChoiceMade) {
    displayIntro();
    return;
  }

  // Only handle serial data, don't clear the canvas
  let str = port.readUntil("\n");
  if (str.length > 0) {
    handleSerial(str.trim());
  }
}

function connectToArduino() {
  if (!port.opened()) {
    let used = usedSerialPorts();
    if (used.length > 0) {
      port.open(used[0], 57600);
    } else {
      port.open('Arduino', 57600);
    }
    isConnected = true;
    connectBtn.hide();
  }
}

function displayIntro() {
  tint(255, 100);
  image(bgImage, 0, 0, width, height);
  noTint();

  fill(0, 0, 0, 200);
  rect(width / 4, height / 4, width / 2, height / 2, 20);

  fill(255);
  textSize(32);
  text("ExpressNotes", width / 2, height / 4 + 50);
  textSize(16);
  text(
    "Welcome to ExpressNotes!\n\nThis interactive application lets you create\nvisual art while playing musical notes.\n\nChoose your canvas\nand start creating!",
    width / 2,
    height / 2 - 40
  );

  // Create canvas color selection buttons
  if (!blackBtn && !whiteBtn) {
    blackBtn = createButton("Black Canvas");
    blackBtn.position(width / 2 - 150, height / 2 + 80);
    blackBtn.mousePressed(() => chooseCanvasColor(0));

    whiteBtn = createButton("White Canvas");
    whiteBtn.position(width / 2 + 50, height / 2 + 80);
    whiteBtn.mousePressed(() => chooseCanvasColor(255));
  }
}

function chooseCanvasColor(colorValue) {
  canvasColor = colorValue;
  colorChoiceMade = true;
  if (blackBtn) blackBtn.remove();
  if (whiteBtn) whiteBtn.remove();
  background(canvasColor); // Only clear when changing colors

  // Show the Clear and Exit buttons after canvas selection
  showCanvasControls();
}

function showCanvasControls() {
  clearBtn = createButton("Clear Canvas");
  clearBtn.position(10, 10);
  clearBtn.mousePressed(clearCanvas);

  exitBtn = createButton("Exit to Intro");
  exitBtn.position(10, 50);
  exitBtn.mousePressed(exitToIntro);
}

function clearCanvas() {
  background(canvasColor); // Clear canvas with current background color
}

function exitToIntro() {
  background(0);
  colorChoiceMade = false;
  clearBtn.remove();
  exitBtn.remove();
  blackBtn = null;
  whiteBtn = null;
  // Return to the intro page; draw() keeps it visible on subsequent frames
  showIntro = true;
  displayIntro();
}

function handleSerial(data) {
  if (data.startsWith("note:")) {
    let note = data.substring(5);
    playNote(note);
    showVisual(note);
  } else if (data.startsWith("volume:")) {
    let val = parseInt(data.substring(7));
    volume = map(val, 0, 1023, 0, 1);
    setVolume(volume);
  }
}

function playNote(note) {
  if (note === "A") soundA.play();
  else if (note === "B") soundB.play();
  else if (note === "C") soundC.play();
  else if (note === "D") soundD.play();
  else if (note === "E") soundE.play();
  else if (note === "F") soundF.play();
  else if (note === "G") soundG.play();
}

function setVolume(vol) {
  [soundA, soundB, soundC, soundD, soundE, soundF, soundG].forEach(s => s.setVolume(vol));
}

function showVisual(note) {
  push(); // Save current drawing state
  if (note === "A") {
    fill(0, 0, 255, 150);
    noStroke();
    ellipse(random(width), random(height), 50);
  } else if (note === "B") {
    fill(255, 215, 0, 150);
    noStroke();
    triangle(random(width), random(height), random(width), random(height), random(width), random(height));
  } else if (note === "C") {
    fill(255, 0, 0, 150);
    noStroke();
    rect(random(width), random(height), 60, 60);
  } else if (note === "D") {
    stroke(0, 255, 255);
    noFill();
    line(random(width), 0, random(width), height);
  } else if (note === "E") {
    fill(0, 255, 0, 150);
    noStroke();
    ellipse(random(width), random(height), 80);
  } else if (note === "F") {
    fill(255, 105, 180, 150);
    noStroke();
    rect(random(width), random(height), 30, 90);
  } else if (note === "G") {
    stroke(255, 255, 0);
    noFill();
    line(0, random(height), width, random(height));
  }
  pop(); // Restore original drawing state
}

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
  if (!isConnected) {
    connectBtn.position(width / 2 - 100, height / 2);
  }
}


function keyPressed() {
  if (key === 'c' || key === 'C') {
    background(canvasColor); // Clear canvas with current background color
  }
}



Circuit Schematic

The circuit includes seven pushbuttons connected to Arduino digital pins 2 through 8, with the internal pull-up resistors enabled in code. Each button connects one leg to ground and the other to its digital pin. A potentiometer is connected with its middle (wiper) pin going to A0 on the Arduino, while the outer pins go to 5V and GND. Audio playback is handled on the computer side by the P5.js sketch, through the laptop's own or an external speaker, so the Arduino circuit itself does not drive a speaker or piezo buzzer.

P5.js Code Description

The P5.js sketch handles all audio-visual feedback and interaction. Upon launching, the user is greeted with a title screen and two buttons to choose between a black or white canvas. Once the selection is made, the sketch draws continuously without clearing the background, allowing visuals from each button press to layer and evolve over time. Each button triggers a different note (using .wav files preloaded into the project) and spawns a unique visual form—such as colored circles, triangles, rectangles, and lines—with transparency effects for layering. Volume is dynamically updated from the serial input and mapped to the 0–1 range for sound control. The code uses the p5.webserial library to read serial messages from the Arduino, interpret them, and respond accordingly.

Communication Between Arduino and P5.js

Communication is established using the Web Serial API integrated into P5.js via the p5.webserial library. The Arduino sends simple serial strings indicating either the current volume or the note pressed. The P5.js sketch listens for these messages using port.readUntil("\n"), parses them, and then calls appropriate functions to play sounds or update the interface. For example, a message like note:E will trigger the playNote("E") function and then create the matching shape with showVisual("E"). This streamlined, human-readable message protocol keeps the interaction fluid and easy to debug.

Project Highlights

One of the most rewarding aspects of ExpressNotes is the tight integration between sound, visuals, and user interaction. The use of simple hardware elements to trigger complex audiovisual responses creates an accessible yet expressive digital instrument. The welcome screen and canvas selection elevate the experience beyond just utility and into a more artistic, curated space. The project has also succeeded in demonstrating how hardware and software can communicate fluidly in the browser using modern tools like Web Serial, eliminating the need for extra software installations or complex drivers.

Future Improvements

For future iterations, several enhancements could expand the expressive range of the project. First, including different instrument samples or letting users upload their own would personalize the sound experience. Second, adding real-time animation or particle effects tied to note velocity or duration could create richer visual compositions. Additionally, saving and exporting the canvas as an image or even a short video clip would let users archive and share their creations. Improving responsiveness by removing the delay in the Arduino loop and supporting multiple simultaneous button presses are also key technical upgrades to consider.
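As a sketch of that last point, the loop below shows one way the fixed delay(100) could be replaced with millis()-based timing, so volume messages are still throttled to roughly ten per second while the buttons are scanned on every pass. It reuses the pin assignments and message format from the sketch above, but it is an untested illustration rather than the project's actual code.

// Hypothetical revision of the ExpressNotes loop: no blocking delay(),
// volume messages throttled with millis(), buttons checked on every pass.
const int potPin = A0;
const int buttonPins[] = {2, 3, 4, 5, 6, 7, 8};
const char* notes[] = {"A", "B", "C", "D", "E", "F", "G"};
bool buttonStates[7] = {false, false, false, false, false, false, false};

const unsigned long volumeInterval = 100;  // ms between volume messages
unsigned long lastVolumeSend = 0;

void setup() {
  Serial.begin(57600);
  for (int i = 0; i < 7; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
}

void loop() {
  // Send the potentiometer value at most every 100 ms instead of delaying the whole loop.
  if (millis() - lastVolumeSend >= volumeInterval) {
    lastVolumeSend = millis();
    Serial.print("volume:");
    Serial.println(analogRead(potPin));
  }

  // Buttons are scanned continuously, so near-simultaneous presses are all reported.
  for (int i = 0; i < 7; i++) {
    bool isPressed = digitalRead(buttonPins[i]) == LOW;
    if (isPressed && !buttonStates[i]) {
      Serial.print("note:");
      Serial.println(notes[i]);
    }
    buttonStates[i] = isPressed;
  }
}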

Link to video demo : https://drive.google.com/drive/folders/1so48ZlyaFx0JvT3NU65Ju6wzncWNYsBa

Week 13 – Final Project User Testing

Link to video : https://drive.google.com/file/d/1wrsx0LpX8-KULwrtQ7UhdwTtWr0vSpkc/view?usp=sharing

To evaluate the user experience of ExpressNotes, I conducted testing sessions where participants interacted with the system without receiving any instructions or guidance. The goal was to observe how intuitive the system is and identify any points of confusion. Users were introduced to the setup with the Arduino buttons, potentiometer, and the connected P5.js visual interface on a laptop. Most users hesitated at the beginning, unsure of how to begin the interaction, with the “Connect to Arduino” button being overlooked by several participants.

Once the connection was established and users began pressing buttons, they quickly recognized that each button triggered a unique piano note along with a visual effect on the screen. This immediate feedback created a sense of play and curiosity, and users began experimenting more confidently. The relationship between physical input and audiovisual output was generally clear, although the specific mappings between notes and visuals were not always understood without further exploration or explanation.

The potentiometer was another area where users expressed confusion. While some participants guessed that it controlled volume, others assumed it affected brightness or visual intensity. The lack of on-screen feedback for the potentiometer made its purpose harder to identify. Adding a visual indicator—such as a dynamic volume bar—could significantly improve clarity and reinforce the connection between the physical control and the system’s response.

One common suggestion from users was to enhance the physical design of the interface. Several participants mentioned that the Arduino board and buttons looked too technical or unfinished. They recommended covering the board with a piano-style overlay so the buttons resemble piano keys. This would not only make the device more visually appealing but also give immediate context to new users, helping them understand that the interaction is musically driven.

ExpressNotes was well received as an interactive and expressive experience. Users enjoyed the audio-visual feedback and were intrigued by how the visual patterns changed with each note. However, clearer onboarding, labels for controls, a visual volume indicator, and improved hardware presentation would help users engage more quickly and confidently. These observations will guide improvements to make the project more accessible, intuitive, and enjoyable for first-time users.

Week 12 – Finalised Concept

Final Project Proposal: ExpressNotes

For my final project, I am creating ExpressNotes, an interactive audiovisual system that combines sound, visuals, and physical input to inspire real-time artistic expression. The system uses an Arduino connected to seven push buttons and a potentiometer. Each button represents a musical note in the A-G scale. When the user presses a button, the Arduino detects the interaction and sends the corresponding note information to a P5.js sketch running on a computer. The potentiometer allows the user to control the volume of the sounds being played. In response to these inputs, P5.js generates synchronized piano sounds and matching visual effects, forming a dynamic and engaging artistic experience that evolves with each interaction.

ExpressNotes is designed to turn the user into both a musician and visual artist. Each button press is more than a sound—it becomes a trigger for a burst of visual energy. The system fosters expressive exploration by offering real-time feedback. For example, pressing the “C” button might create a ripple of soft blue circles along with a gentle piano tone, while pressing “F” could unleash rotating magenta shapes accompanied by a brighter sound. Over time, the visual canvas builds up in complexity and motion, reflecting the rhythm and emotion of the user’s choices.

The Arduino is programmed to listen for button presses and read the current position of the potentiometer. When a button is pressed, it sends a message like “note:C” to the P5.js program. It also continuously reads the potentiometer value, which ranges from 0 to 1023, and sends volume updates in the form of messages like “volume:750.” The Arduino serves solely as a sender—it does not receive any data back from P5.js.

On the software side, P5.js is responsible for listening to the serial port and interpreting the messages from the Arduino. When it receives a note message, it plays the corresponding piano note and triggers a unique visual effect associated with that note. When it receives a volume message, it maps the value to a usable range and updates the volume of the audio accordingly. Each note is associated with a different visual response to create a richer and more immersive experience. For instance, the “A” note may create blue ripples, “B” may release golden triangle bursts, “C” may trigger expanding red squares, “D” might produce spiraling cyan patterns, “E” could generate flowing green waves, “F” may show rotating magenta shapes, and “G” could spark star-like white particles.

When the user interacts with the buttons and the volume knob, they create a personal audiovisual composition. The experience is fluid, immediate, and emotionally engaging. Each session is unique, and the visual display reflects the rhythm, timing, and feeling of the musical input. The evolving artwork becomes a living canvas of user expression.

In a nutshell, ExpressNotes demonstrates the creative potential of combining physical sensors with generative audio-visual programming. It allows users to explore sound and visual art intuitively, transforming simple button presses into an expressive performance. The project encourages artistic freedom, emotional engagement, and a playful connection between sound and image, making it a compelling example of interactive media design.

Week 11 – Final Project Concept

For my final project, I am creating an interactive audiovisual system called ExpressNotes, which combines sound, visuals, and physical input to encourage real-time artistic expression. The system uses an Arduino with four push buttons and a potentiometer. Each button press plays a different piano note, while the potentiometer adjusts the volume. These physical actions are detected by the Arduino and sent to a P5.js sketch running on a multimedia computer. The P5 environment generates visual effects on screen that correspond to each note, creating a dynamic and evolving piece of generative art based on the user’s input.

This project is designed to facilitate an engaging, expressive loop between the user and the system. The Arduino listens to physical interactions—button presses and volume control—and transmits that information instantly. The P5.js sketch interprets the input by producing synchronized audio and visuals. For example, pressing a specific button might trigger a ripple of blue circles with a soft C note, while another might launch animated triangles with a brighter tone. The visual and auditory elements work in tandem, offering immediate and meaningful feedback that makes the experience immersive and rewarding.

ExpressNotes emphasizes creativity and emotional engagement over precision or performance. The user is both musician and painter, creating a personal composition each time they interact with the system. The careful timing of responses ensures the interaction feels fluid and intuitive, and the unique art generated can be saved or shared. This project demonstrates the power of combining physical sensors with real-time multimedia output to create a space for individual expression that is playful, responsive, and visually compelling.

Week 11 – Exercises

1: Ellipse

/*
 * Week 11 Production (1)
 *
 * Inputs:
 *   - A1 - 10k potentiometer connected to 5V and GND
 *
 */

int interval = 100;
int lastMessageTime = 0;

int potPin = A1;

void setup() {
  Serial.begin(9600); // initialize serial communications
}
 
void loop() {
  // read the input pin:
  int potentiometer = analogRead(potPin);                  
  // remap the pot value to 0-255:
  int mappedPotValue = map(potentiometer, 0, 1023, 0, 255); 
  // print the value to the serial port.
  Serial.println(mappedPotValue);
  // slight delay to stabilize the ADC:
  delay(1);                                            
  
  delay(100);
}
P5.js code:

let port;
let connectBtn;
let baudrate = 9600;
let lastMessage = "";
let currX;

function setup() {
  createCanvas(400, 400);
  background(220);

  port = createSerial();

  let usedPorts = usedSerialPorts();
  if (usedPorts.length > 0) {
    port.open(usedPorts[0], baudrate);
  }

  connectBtn = createButton("Connect to Arduino");
  connectBtn.position(80, height-60);
  connectBtn.mousePressed(connectBtnClick);

  currX = width/2;
}

function draw() {
  background("white");
  fill('grey');
  circle(currX, height/2, 100);
  
  let str = port.readUntil("\n");
  if (str.length > 0) {
    // console.log(str);
    lastMessage = str;
  }
  
  // Display the most recent message
  text("Last message: " + lastMessage, 10, height - 20);

  // change button label based on connection status
  if (!port.opened()) {
    connectBtn.html("Connect to Arduino");
  } else {
    connectBtn.html("Disconnect");
  }
  
  // // Move shape based on received value
  if (!lastMessage) {lastMessage = "127"}
  currX = map(int(lastMessage), 0, 255, 0, width);
  currX = floor(currX);
  // console.log(currX);
}

function connectBtnClick() {
  if (!port.opened()) {
    port.open("Arduino", baudrate);
  } else {
    port.close();
  }
}

2: LED Brightness

/* 
 * Week 11 Production (2)
 * 
 * Outputs:
 * - 5 - LED
 * 
*/

int ledPin = 5;

void setup() {
  Serial.begin(9600);

  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(ledPin, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(ledPin, HIGH);
  delay(200);
  digitalWrite(ledPin, LOW);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int brightness = Serial.parseInt();
    if (Serial.read() == '\n') {
      analogWrite(ledPin, brightness);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}
P5.js code:

let port;
let baudrate = 9600;

// Show button to connect / disconnect
let showConnectButton = false;

function setup() {
  createCanvas(640, 480);
  textSize(20);

  // Create the serial port
  port = createSerial();

  // If the user previously connected, reopen the same port  
  let usedPorts = usedSerialPorts();
  if (usedPorts.length > 0) {
    port.open(usedPorts[0], baudrate);
  }

  // any other ports can be opened via a dialog
  if (showConnectButton) {
    connectBtn = createButton('Connect to Arduino');
    connectBtn.position(80, 350);
    connectBtn.mousePressed(setupSerial);
  }
  
}

// Show serial port connection dialog in response to action
function setupSerial() {
  if (!port.opened()) {
    port.open('Arduino', baudrate);
  } else {
    port.close();
  }
}

function draw() {
  background('white');
  fill('black');
  
  if (showConnectButton) {
    // changes button label based on connection status
    if (!port.opened()) {
      connectBtn.html('Connect to Arduino');
    } else {
      connectBtn.html('Disconnect');
    }
  }

  if (!port.opened()) {
    text("Disconnected - press space to connect", 20, 30);
  } else {
    text("Connected - press space to disconnect", 20, 30);
    
    // // Transmit brightness based on mouse position
    let mappedX = floor(map(mouseX, 0, width, 0, 255));
    console.log(mappedX);
    let sendToArduino = mappedX + "\n";
    port.write(sendToArduino);
  }
}

function keyPressed() {
  if (key == " ") {
    setupSerial();
  }
}

3: Wind Gravity

/* 
 * Week 11 Production (3)
 * 
 * Inputs:
 *   - A1 - 10k potentiometer connected to 5V and GND
 * 
 * Outputs:
 * - 5 - LED
 * 
*/

int potPin = A1;
int ledPin = 5;

int interval = 100;
int lastMessageTime = 0;

void setup() {
  Serial.begin(9600);

  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(potPin, INPUT);
  pinMode(ledPin, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(ledPin, HIGH);
  delay(200);
  digitalWrite(ledPin, LOW);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("127"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    // blink LED based on p5 data
    int status = Serial.parseInt();
    if (Serial.read() == '\n') {
      digitalWrite(ledPin, status);
      delay(10);
      digitalWrite(ledPin, LOW);
    }

    if (lastMessageTime > interval) {
      lastMessageTime = 0;
      // send mapped potentiometer reading to p5
      int potentiometer = analogRead(potPin);                  
      int mappedPotValue = map(potentiometer, 0, 1023, 0, 255); 
      Serial.println(mappedPotValue);
      // slight delay to stabilize the ADC:
      delay(1);
    }
    else {
      lastMessageTime++;
    }

  }
  digitalWrite(LED_BUILTIN, LOW);
}
P5.js code:

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;

// // Arduino
let port;
let connectBtn;
let baudrate = 9600;
let lastMessage = "";
let showConnectButton = false;

function setup() {
  createCanvas(620, 400);
  noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  wind = createVector(0,0);
  
  // // Arduino
  port = createSerial();
  let usedPorts = usedSerialPorts();
  if (usedPorts.length > 0) {
    port.open(usedPorts[0], baudrate);
  }
  if (showConnectButton) {
    connectBtn = createButton('Connect to Arduino');
    connectBtn.position(80, 300);
    connectBtn.mousePressed(setupSerial);
  }
}

function draw() {
  background(255);
  applyForce(wind);
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x,position.y,mass,mass);
  if (position.y > height-mass/2) {
      velocity.y *= -0.9;  // A little dampening when hitting the bottom
      position.y = height-mass/2;
      flashLight();
    }
  
  // // Arduino
  if (showConnectButton) {
    if (!port.opened()) {
      connectBtn.html('Connect to Arduino');
    } else {
      connectBtn.html('Disconnect');
    }
  }
  fill('black');
  if (!port.opened()) {
    text("Disconnected", 20, 30);
  } else {
    text("Connected", 20, 30);
  }
  let str = port.readUntil("\n");
  if (str.length > 0) {
    // console.log(str);
    lastMessage = str;
  }
  
  // Display the most recent message
  text("Last message: " + lastMessage, 10, height - 20);
  
  // // Convert received value to wind.x value
  let mappedPot = map(int(lastMessage), 0, 255, -1, 1);
  wind.x = mappedPot;
  let windSpeed = "Wind speed: " + wind.x
  text(windSpeed.substring(0,20), 10, height - 5);
  
  fill('white');
}

function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed(){
  if (keyCode==LEFT_ARROW){
    wind.x=-1;
  }
  if (keyCode==RIGHT_ARROW){
    wind.x=1;
  }
  if (key==' '){
    mass=random(15,80);
    position.y=-mass;
    velocity.mult(0);
    port.write("0\n"); // reset light
  }
}

// // Arduino
function setupSerial() {
  if (!port.opened()) {
    port.open('Arduino', baudrate);
  } else {
    port.close();
  }
}

function flashLight() {
  if (port.opened()) {
    port.write("1\n");
    // port.write("0\n");
  }
}

 

Link to video : https://drive.google.com/file/d/1DTy7Y1JiQVhqhlrblGajL4Inp4co38Vx/view?usp=sharing

 

Week 11 Reading Response

Reading Design Meets Disability made me rethink how I’ve traditionally understood assistive devices—as purely functional tools. Pullin challenges that limited view by showing how design and disability can intersect in creative, expressive, and even fashionable ways. What stood out most to me was the idea that disability devices, like hearing aids or prosthetics, shouldn’t have to be hidden or neutral—they can be bold, beautiful, and part of someone’s personal identity. The example of Aimee Mullins using prosthetic legs designed by Alexander McQueen was especially powerful. It showed how design can shift perceptions of disability from something to be fixed or minimized to something that can be celebrated and uniquely expressed.

This reading made me reflect on how design influences the way we feel about ourselves and how others see us. It made me realize how much design has to do with dignity, pride, and empowerment—not just function. I found myself thinking about how many products I use daily that are designed to be sleek or stylish, and how unfair it is that many people with disabilities are given tools that feel like medical equipment instead. Pullin’s emphasis on co-design really resonated with me; involving disabled people directly in the design process isn’t just practical, it’s respectful. This reading left me inspired to think more inclusively about design and more critically about who gets to have choice, beauty, and individuality in the products they use.

Week 10: Musical Instrument

Link to video demo : https://drive.google.com/file/d/1KGj_M7xq6IdsS2Qwq-zbjjspCPPgcaj4/view?usp=sharing

For this assignment, I decided to build a digital trumpet using an Arduino Uno, three push buttons, a potentiometer, and a speaker. My goal was to simulate the behavior of a real trumpet in a fun and creative way, even though I knew the sound would be more electronic than acoustic. It was a great opportunity for me to explore how hardware and code can come together to create music, and I ended up learning a lot about sound generation and analog input in the process.

The concept was simple: each of the three buttons acts like a trumpet valve, and each one triggers a different note — specifically G4, A4, and B4. These are represented in the code as fixed frequencies (392 Hz, 440 Hz, and 494 Hz). When I press one of the buttons, the Arduino sends a signal to the speaker to play the corresponding note. The potentiometer is connected to analog pin A0 and is used to control the volume. This was a really cool addition because it gave the instrument a bit of expressive control — just like how a real musician might vary their breath to change the loudness of a note.

To make the sound a bit more interesting and less robotic, I added a little “vibrato” effect by randomly adjusting the pitch slightly while the note is playing. This gives the tone a subtle wobble that sounds more natural — kind of like the way a real trumpet player might shape a note with their lips. It’s still a square wave, and it’s definitely digital-sounding, but it gives it more character than just playing a flat, unchanging frequency.

If I were to continue developing this project, I have a few ideas for improvements. One would be to add more buttons or allow combinations of the three to create more notes — like a real trumpet with multiple valve positions. I’d also love to add some kind of envelope shaping, so the notes could have a smoother fade-in or fade-out instead of sounding flat and abrupt. It might also be fun to hook the project up to MIDI so it could control a software synthesizer and produce higher quality trumpet sounds. And for an extra visual touch, I could add LEDs that light up in sync with the music.

Arduino Code

const int potPin = A0;          // Potentiometer for volume
const int speakerPin = 8;       // Speaker pin (note: 8 is not a PWM pin on the Uno)
const int buttonPins[] = {2, 3, 4}; // 3 buttons = 3 different notes

// Trumpet-like frequencies (roughly G4, A4, B4)
const int trumpetNotes[] = {392, 440, 494}; 

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(buttonPins[i], INPUT); // External pull-down resistors
  }
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int volume = analogRead(potPin) / 4;

  for (int i = 0; i < 3; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {
      playTrumpetNote(trumpetNotes[i], volume);
    }
  }

  delay(10); 
}

void playTrumpetNote(int baseFreq, int volume) {
  unsigned long duration = 10000; // play the note for 10 ms (10,000 microseconds) per call
  unsigned long startTime = micros();

  while (micros() - startTime < duration) {
    // Slight pitch wobble
    int vibrato = random(-3, 3);
    int currentFreq = baseFreq + vibrato;
    int halfPeriod = 1000000 / currentFreq / 2;

    analogWrite(speakerPin, volume);
    delayMicroseconds(halfPeriod);
    analogWrite(speakerPin, 0);
    delayMicroseconds(halfPeriod);
  }
}
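As a rough illustration of the valve-combination idea mentioned in the improvements above, the sketch below reads the three buttons as a 3-bit pattern and looks the pitch up in a small table. The frequencies are placeholders rather than real trumpet fingerings, and it uses Arduino's built-in tone()/noTone() instead of the manual square wave, so the potentiometer volume control is left out.

// Hypothetical extension: treat the three buttons as a 3-bit valve pattern (0-7)
// and map each non-zero combination to its own frequency.
const int speakerPin = 8;
const int buttonPins[] = {2, 3, 4};
const int comboFreqs[] = {0, 392, 440, 466, 494, 523, 554, 587}; // index 0 = no valves pressed

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(buttonPins[i], INPUT); // external pull-down resistors, as in the sketch above
  }
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int combo = 0;
  for (int i = 0; i < 3; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {
      combo |= (1 << i); // set bit i when button i is held
    }
  }

  if (combo > 0) {
    tone(speakerPin, comboFreqs[combo]); // play the note for this combination
  } else {
    noTone(speakerPin);                  // silence when no valves are pressed
  }

  delay(10);
}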

 

Week 10 – Reading Response

Reading A Brief Rant on the Future of Interaction Design felt less like a critique of current technology and more like a reminder of how disconnected we’ve become from our own bodies when interacting with digital tools. What stood out to me wasn’t just the fixation on touch or screens, but the larger idea that we’ve built a digital world that doesn’t physically involve us much at all. We sit, swipe, and speak to our devices, but rarely do we engage with them in a way that feels natural or satisfying. That idea made me reflect on how strange it is that we’ve accepted this passive interaction style as normal, even though it barely scratches the surface of what our senses and motor skills are capable of. The rant made me question whether convenience has quietly replaced depth in our relationship with technology.

What also struck me was the underlying urgency — not just to change what we build, but to change how we think about building. It challenged the assumption that progress is purely about making things smaller, faster, or more responsive. Instead, it asked: what if we measured progress by how much it involves us — our movement, our perception, our ability to explore ideas physically, not just conceptually? It reminded me that interaction design isn’t only about the interface; it’s about the experience and how deeply it aligns with our human nature. This reading didn’t just shift my thinking about interfaces — it made me realize that future design needs to be less about controlling machines and more about collaborating with them through a fuller range of expression. That’s a future I’d like to be part of.

Week 9 – Reading Response

Reading “Physical Computing’s Greatest Hits (and Misses)” was strangely comforting and inspiring at the same time. It helped me realize that a lot of the ideas I thought were already “done” or cliché—like using sensors to make interactive gloves or LED-based projects—are actually important milestones in learning and creativity. I used to feel discouraged when I saw someone else had already made something similar to what I had in mind, but this piece reframed that completely. It emphasized how each version of a repeated idea can still be fresh and meaningful when approached from a personal or unique angle. I found myself especially drawn to the “Fields of Grass” and “Things You Yell At” themes—they really match how I want people to feel something tactile or emotional when they interact with my work. This gave me permission to play, iterate, and remix existing concepts without feeling like I have to reinvent the wheel just to be valid.

That sense of permission and openness carried over into “Making Interactive Art: Set the Stage, Then Shut Up and Listen,” which really shifted how I think about authorship and control in interactive work. I’ve always felt the need to guide viewers to the “correct” meaning, but Tigoe’s argument for stepping back and letting the audience complete the piece through their own actions hit me hard. It reminded me that the most memorable interactions I’ve had with art happened when I could explore it freely, without being told what to think. The comparison to directing actors—offering intentions instead of rigid instructions—really reframed how I might approach building experiences. I’m beginning to see interactive art less like a fixed statement and more like a space for dialogue, where the audience brings the work to life and creates meaning in real time.

Week 9: Analog input & output

Video demo link : https://drive.google.com/file/d/1KHeKfNwfINI-l48dOEf03Td8BbLoIRbe/view?usp=drive_link

Hand-drawn schematic:

For this assignment, I set out to build a simple system using both a digital and an analog sensor to control two separate LEDs. My chosen components were an LDR (light-dependent resistor) for the analog input and a push button for the digital one. I liked the idea of mixing natural input (like ambient light) with a more deliberate human interaction (pressing a button) to affect two types of light output.

To keep things manageable, I started with just the LDR and one LED. The goal was to make the LED change brightness based on how much light the sensor picked up. I wired the LDR as part of a voltage divider, connected it to an analog input pin, and ran a basic analogRead() loop to see the values coming in. From there, I used the map() function to translate the light readings (from 0 to 1023) into PWM values (0 to 255) for controlling LED brightness. I also inverted the mapping, so the LED would get brighter when it was darker — which just felt more intuitive, like a night light.

Once that was working, I added the digital side of things with a simple push button. Pressing the button would turn a second LED on, and releasing it would turn it off. Simple on/off logic with digitalRead() and digitalWrite() did the trick. It was satisfying to see both LEDs responding to such different kinds of input — one gradual and ambient, the other instant and tactile.

One of the challenges I ran into was just getting reliable readings from the button. At first, it was behaving erratically because I forgot to use a pull-down resistor. Once I added that in, everything stabilized. I also realized pretty quickly that without any sort of filtering or delay, the analog LED could flicker a bit depending on the room lighting, so I added a small delay(100) to smooth things out a little.

const int ldrPin = A0;              // LDR connected to analog pin A0
const int buttonPin = 2;            // Push button connected to digital pin 2
const int ledAnalogPin = 9;         // PWM LED pin
const int ledDigitalPin = 12;       // On/off LED pin

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(ledAnalogPin, OUTPUT);
  pinMode(ledDigitalPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int ldrValue = analogRead(ldrPin);        
  int brightness = map(ldrValue, 0, 1023, 255, 0);  // Brighter in darkness

  analogWrite(ledAnalogPin, brightness);

  int buttonState = digitalRead(buttonPin);

  if (buttonState == HIGH) {
    digitalWrite(ledDigitalPin, HIGH);
  } else {
    digitalWrite(ledDigitalPin, LOW);
  }

  delay(100);
}
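The filtering mentioned earlier could also be done in software rather than with a delay. The sketch below is a minimal, untested example that averages the last eight LDR readings before mapping them to LED brightness, which smooths the flicker; it assumes the same wiring as the code above and leaves out the button for brevity.

// Running-average filter for the LDR so the PWM LED fades smoothly instead of flickering.
const int ldrPin = A0;
const int ledAnalogPin = 9;
const int numSamples = 8;

int samples[numSamples];
int sampleIndex = 0;

void setup() {
  pinMode(ledAnalogPin, OUTPUT);
  for (int i = 0; i < numSamples; i++) {
    samples[i] = analogRead(ldrPin); // pre-fill the buffer so the average starts sensibly
  }
}

void loop() {
  samples[sampleIndex] = analogRead(ldrPin);      // overwrite the oldest reading
  sampleIndex = (sampleIndex + 1) % numSamples;

  long sum = 0;
  for (int i = 0; i < numSamples; i++) {
    sum += samples[i];
  }
  int average = sum / numSamples;

  int brightness = map(average, 0, 1023, 255, 0); // brighter in darkness, as before
  analogWrite(ledAnalogPin, brightness);

  delay(10);
}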

Looking back, the most satisfying part was seeing the analog LED respond in real-time to light changes. Just waving my hand over the sensor and watching the LED slowly brighten gave a cool sense of control — like making a light react to shadows. It reminded me of automatic lights in stairways or hotel lobbies, except mine was sitting on a breadboard.

In the future, I’d like to swap the button for a capacitive touch sensor or maybe even a motion detector, so the LED lights up when someone walks by. Or take the LDR further by combining it with an RGB LED that changes color depending on how dark it is — there’s a lot of room to build from this foundation.
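As a starting point for the RGB idea, the sketch below assumes a common-cathode RGB LED on PWM pins 9, 10, and 11 (hypothetical wiring) and crossfades from red in a bright room to blue in a dark one. It is an untested illustration of the direction rather than a finished design.

// Hypothetical follow-up: crossfade a common-cathode RGB LED from red (bright room)
// to blue (dark room) based on the LDR reading. Pins 9/10/11 are PWM pins on an Uno.
const int ldrPin = A0;
const int redPin = 9;
const int greenPin = 10;
const int bluePin = 11;

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  int ldrValue = analogRead(ldrPin);
  int darkness = map(ldrValue, 0, 1023, 255, 0); // higher value = darker room

  analogWrite(redPin, 255 - darkness); // red dominates when the room is bright
  analogWrite(greenPin, 0);            // green left off for a simple two-color fade
  analogWrite(bluePin, darkness);      // blue dominates when the room is dark

  delay(50);
}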