Week 2 : Reading Reflection on Casey Reas’ Talk

When I first watched Casey Reas’ talk on chance operations at the beginning of the semester, I wasn’t entirely sure how to feel. Honestly, I was just ready to absorb whatever perspectives were being offered through different narratives. I’ve always been someone who likes control when I’m creating: knowing what’s going to happen, having a plan, getting things “right.” So the idea of building randomness into a project felt a little chaotic, maybe even risky. But Reas broke that tension down. He walked us through how he uses systems, chance, and instruction-based art to remove the artist’s ego from the process and let the artwork evolve in ways he couldn’t fully predict. The way he referenced John Cage and connected those ideas to computational art made it all click: randomness doesn’t mean lack of intent, it just shifts where the intent is. Reas isn’t just throwing things into the void and hoping for the best; he’s setting up a structure where randomness can still move freely. That resonated with me. It’s not about losing control entirely, it’s about creating a space where unexpected things can happen without everything falling apart. That made the idea of randomness feel a lot less intimidating and a lot more useful.

Since I’m writing this reflection a bit later in the semester, I’ve had more time to think about my own relationship to randomness—and honestly, I like randomness I can control. Total chaos just ends up looking like a muddy mess. I prefer when randomness happens within boundaries, where the outcome is still cohesive and intentional, even if it’s unpredictable. That’s the balance I’m drawn to: letting go a little, but not so much that the work loses meaning or direction. It’s about creating space for surprise, but still being able to call the final result your own.

Week 12 – Finalized Idea

Concept:

My project explores the fascinating intersection between physical interaction and emergent systems through a digital flocking simulation. Inspired by Craig Reynolds’ “Boids” algorithm, I’m creating an interactive experience where users can manipulate a flock of virtual entities using both hand gestures and physical controls. The goal is to create an intuitive interface that allows people to “conduct” the movement of the flock, experiencing how simple rules create complex, mesmerizing patterns.

The simulation displays a collection of geometric shapes (triangles, circles, squares, and stars) that move according to three core flocking behaviors: separation, alignment, and cohesion. Users can influence these behaviors through hand gestures detected by a webcam and physical controls connected to an Arduino.

Arduino Integration Design

The Arduino component of my project will create a tangible interface for controlling specific aspects of the flocking simulation:

  1. Potentiometer Input:
    • Function: Controls the movement speed of all entities in the flock
    • Implementation: Analog reading from potentiometer (0-1023)
    • Communication: Raw values sent to P5 via serial communication
    • P5 Action: Values mapped to speed multiplier (0.5x to 5x normal speed)
  2. Button 1 – “Add” Button:
    • Function: Adds new entities to the simulation
    • Implementation: Digital input with debouncing
    • Communication: Sends “ADD” text command when pressed
    • P5 Action: Creates 5 new boids at random positions
  3. Button 2 – “Remove” Button:
    • Function: Removes entities from the simulation
    • Implementation: Digital input with debouncing
    • Communication: Sends “REMOVE” text command when pressed
    • P5 Action: Removes 5 random boids from the simulation

The Arduino code will continuously monitor these inputs and send the appropriate data through serial communication at 9600 baud. I plan to implement debouncing for the buttons to ensure clean signals and reliable operation.
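On the p5 side, the potentiometer mapping described above (raw 0–1023 readings to a 0.5x–5x speed multiplier) could be sketched as a small helper. `potToSpeedMultiplier` is a hypothetical name, not code from the project:

```javascript
// Hypothetical helper: map a raw potentiometer reading (0-1023)
// to the flock speed multiplier (0.5x to 5x), as described above.
function potToSpeedMultiplier(raw) {
  // Clamp to the valid analogRead range first.
  const clamped = Math.min(1023, Math.max(0, raw));
  // Linear map, equivalent to p5's map(clamped, 0, 1023, 0.5, 5).
  return 0.5 + (clamped / 1023) * 4.5;
}
```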

P5.js Implementation Design

The P5.js sketch handles the core simulation and multiple input streams:

  1. Flocking Algorithm:
    • Three steering behaviors: separation (avoidance), alignment (velocity matching), cohesion (position averaging)
    • Adjustable weights for each behavior to change flock characteristics
    • Four visual representations: triangles (default), circles, squares, and stars
  2. Hand Gesture Recognition:
    • Uses ML5.js with HandPose model for real-time hand tracking
    • Left hand controls shape selection:
      • Index finger + thumb pinch: Triangle shape
      • Middle finger + thumb pinch: Circle shape
      • Ring finger + thumb pinch: Square shape
      • Pinky finger + thumb pinch: Star shape
    • Right hand controls flocking parameters:
      • Middle finger + thumb pinch: Increases separation force
      • Ring finger + thumb pinch: Increases cohesion force
      • Pinky finger + thumb pinch: Increases alignment force
  3. Serial Communication with Arduino:
    • Receives and processes three types of data:
      • Analog potentiometer values to control speed
      • “ADD” command to add boids
      • “REMOVE” command to remove boids
    • Provides visual indicator of connection status
  4. User Interface:
    • Visual feedback showing connection status, boid count, and potentiometer value
    • Dynamic gradient background that subtly responds to potentiometer input
    • Click-to-connect functionality for Arduino communication

Current Progress

So far, I’ve implemented the core flocking algorithm in P5.js and set up the hand tracking system using ML5.js. The boids respond correctly to the three steering behaviors, and I can now switch between different visual representations.

I’ve also established the serial communication framework between P5.js and Arduino using the p5.webserial.js library. The system can detect previously used serial ports and automatically reconnect when the page loads.

For the hand gesture recognition, I’ve successfully implemented the basic detection of pinch gestures between the thumb and different fingers. The system can now identify which hand is which (left vs. right) and apply different actions accordingly.

Next steps include:

  1. Finalizing the Arduino circuit with the potentiometer and two buttons
  2. Implementing proper debouncing for the buttons
  3. Refining the hand gesture detection to be more reliable
  4. Adjusting the flocking parameters for a more visually pleasing result
  5. Adding more visual feedback and possibly sound responses

The most challenging aspect so far has been getting the hand detection to work reliably, especially distinguishing between left and right hands consistently. I’m still working on improving this aspect of the project.

I believe this project has exciting potential not just as a technical demonstration, but as an exploration of how we can create intuitive interfaces for interacting with complex systems. By bridging physical controls and gesture recognition, I hope to create an engaging experience that allows users to develop an intuitive feel for how emergent behaviors arise from simple rules.

Week 12: Finalized Concept

Finalized concept for the project:

My final project concept is inspired by a popular game called Piano Tiles. My idea is to create a sturdy, fully functional four-key piano connected to an Arduino. Users will be able to physically play the game using this piano, while the gameplay will be displayed on a laptop screen and recreated in p5js, with some differences like a life powerup.

Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5

My Arduino program will be in charge of sending the input from the push buttons to p5.js whenever a player presses a key on the piano. This will be similar to the musical instrument assignment we did in class, except the speaker will not be on the Arduino; instead, p5 will output the sound from the computer and check whether the player pressed the right key in the right time frame.

Design and description of what P5 program will do and what it will send to and/or receive from Arduino:

Speaking of which, my p5js program will run the graphics for the game itself, with users seeing the tiles they have to hit. It will receive input from the Arduino whenever the user presses a piano key and use game logic to check that the press was correct according to the game rules. If not, the game will end, unless the player has an extra life, which can be earned in the game by pressing all 4 tiles 3 times at a certain point in the song.
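A minimal sketch of the correctness check p5 might run when a key press arrives from the Arduino, assuming each falling tile stores its lane and the time it should be hit; `checkHit`, the tile fields, and the window size are all hypothetical, not final code:

```javascript
// Hypothetical check: did the key press in `lane` land inside the
// hit window (in ms) of the earliest unhit tile in that lane?
function checkHit(tiles, lane, now, windowMs) {
  // Find the earliest unhit tile in this lane.
  const candidates = tiles.filter(t => t.lane === lane && !t.hit);
  if (candidates.length === 0) return false; // pressed an empty lane
  const tile = candidates.reduce((a, b) => (a.hitTime < b.hitTime ? a : b));
  if (Math.abs(now - tile.hitTime) <= windowMs) {
    tile.hit = true; // mark so it cannot be scored twice
    return true;
  }
  return false; // wrong timing: game over unless an extra life remains
}
```

Keeping the window generous should also help absorb some of the Arduino-to-p5 serial delay mentioned below.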

I’m currently working on the 3d design for the piano. Right now I found a file online that looks like this:

But, I need it to have 4 keys, so I am going to see if I can alter the design to use 4 keys instead of these 7, and also add a hole for the wires to come out from. I also anticipate that timing the keys with p5 will be hard. Firstly, there is often a delay between the Arduino and p5, which I noticed when using the potentiometer in one of our assignments, and that could mess up my game. Secondly, making the tiles fall in sync with an audio track will be difficult and time consuming. I may just make the tiles fall randomly, even if it is not in sync with the music. The game mechanics will still work as usual, though.

Week 12 – Project Proposal

This requires some explanation:

Originally, I had a well-fleshed-out idea for using motion sensors to play a volleyball game. While conceptually sound, in practice the motion sensors just could not cooperate. After a lot of testing with the Arduino and its sensors, I realized that the ball would move too quickly for the sensors to process properly. Instead, I decided to make a violin.

The main mechanism in charge of producing sound will be a potentiometer, mounted so that when the bow is pulled back and forth, the potentiometer dial turns. Its analog output will be sent to p5, where detecting the bow’s movement will drive a synthesizer to play sound. Next, the violin will have digital input buttons. Holding down a button gives the Arduino a digital input that is also sent to p5. In p5, detecting which button is held down selects a specific note in the scale; each of the 8 buttons represents one note, forming a full scale. This gives us a functional violin.
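Under those assumptions, the p5 logic could be sketched like this; the movement threshold and the C-major pitch set are placeholders to be tuned, and both function names are mine:

```javascript
// Hypothetical: the bow is "moving" when the potentiometer reading
// changed by more than a small threshold since the last frame.
function bowIsMoving(prevPot, currentPot, threshold = 5) {
  return Math.abs(currentPot - prevPot) > threshold;
}

// Hypothetical: map the index of the held button (0-7) to a frequency
// in one octave of a C major scale (Hz, equal temperament, rounded).
const SCALE_HZ = [262, 294, 330, 349, 392, 440, 494, 523];
function noteForButton(buttonIndex) {
  return SCALE_HZ[buttonIndex] ?? null; // null if no valid button held
}
```

The synth would then sound only while `bowIsMoving` is true and a button is held, which is what makes it feel like bowing rather than a keyboard.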

Week 12 – Finalised Concept

 Concept

It’s Crossy Road gone fairy-tale: guide an on-screen duck across busy roads and small streams in p5.js. Reach the final, wide river, and a real-world duck glides straight across a tabletop “river,” lights pulsing as a victory lap. Screen and reality shake hands for one perfect moment.

 What Already Works (p5.js Only)

 

Module status:

  • Full-screen canvas, roads/rivers, predators, food — done (60 fps; collision OK, still needs work)
  • Duck sprite + full-axis movement — done (arrow keys for now)
  • Item counter + score/time keeping — done (only counted through code so far)
  • Final-river trigger zone — done (logic works)

 Real-World Interaction (Planned)

  • Prop: duck on a 30 cm linear rail (continuous-rotation servo + belt).

  • Motion: one smooth back to front glide only.

  • FX: underside RGB LED strip (water glow) + small piezo “quack” sample.

  • Cue: begins the instant p5 fires DUCK_GO and stops at rail limit.

 Arduino Paragraph (hardware plan)

Thinking of using the Arduino as the go-between that lets the computer game talk to the real duck. A little joystick and one “hop” button plug into it; the board simply reads how far you push the stick and whether the button is pressed, then sends those numbers to p5.js through the USB cable every split-second. Most of the time the Arduino just listens. When the game finally says “DUCK_GO”, the board springs into action: it turns on a motor that slides the rubber duck straight across a mini track, switches on soft blue-green lights under the “water,” and makes a quick quack with a tiny speaker. When p5.js later sends “DUCK_STOP,” the motor and lights shut off and the duck stays put. Because motors and lights can gulp a lot of power all at once, they’ll run from their own plug-in adapter so the Arduino itself never loses juice mid-move.
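The p5 side of that cue can be sketched as a one-shot trigger; `send` stands in for whatever serial write function the sketch ends up using, and firing exactly once is the important part:

```javascript
// Hypothetical one-shot trigger: sends DUCK_GO the first time the duck
// enters the final-river zone, and never again until reset.
function makeDuckTrigger(send) {
  let fired = false;
  return {
    update(duckY, zoneY) {
      if (!fired && duckY <= zoneY) { // duck reached the final river
        send("DUCK_GO");
        fired = true;
      }
    },
    reset() {
      send("DUCK_STOP");
      fired = false;
    }
  };
}
```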

 Next-Week Targets

  1. Prototype rail — mount servo + belt, confirm straight glide

  2. Minimal Arduino sketch — read joystick; act on DUCK_GO with LED blink

  3. Serial bridge live — replace console.log() with serial.write() in p5

  4. End-to-end smoke test — finish level, duck moves

 Risks & Mitigation

  • Servo overshoot → limit switches or timed cutoff.

  • Serial lag → short packets, high baud.

  • Scope creep → no extra rivers, no particle splashes until core loop is solid.

Final Project Progress and Design – Zayed Alsuwaidi

Commit to your Final Project Proposal, include the following explanations in your blog post:

    • Finalized concept for the project
    • Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5
    • Design and description of what P5 program will do and what it will send to and/or receive from Arduino
    • Start working on your overall project (document the progress)

The finalized concept for my project is an experimental simulation that visualizes the specific heart-rate pattern of the person interacting with it, produces experimental music in coordination with that data representation in real time, and allows the user to interact with the simulation.

Serial connection: from Arduino to p5.js (one-way).

Library links used to run the sketch and play the music (p5.js and Tone.js). The music itself is not mine, and if time allows I might create my own, but the current set is ambient and experimental music:

https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.2/p5.min.js

https://cdnjs.cloudflare.com/ajax/libs/tone/14.8.49/Tone.min.js

// Rebuilds the Tone.Sequence for the current sequence index and
// restarts it unless playback is paused.
function rebuildSequence() {
  sequence.stop();
  sequence = new Tone.Sequence((time, note) => {
    arpSynth.triggerAttackRelease(note, "16n", time);
    polySynth.triggerAttackRelease([note, note + "2"], "2n", time);
    shapes.push({
      x: random(width),
      y: random(height),
      size: random(50, 150),
      sides: floor(random(3, 8)),
      hue: (bpm + currentSeqIndex * 60) % 360,
      rot: 0,
      type: currentSeqIndex % 2 ? "polygon" : "circle"
    });
  }, sequences[currentSeqIndex], "4n");
  if (!buttons[1].paused) sequence.start(0);
}

function mousePressed() {
  for (let btn of buttons) {
    if (mouseX > btn.x && mouseX < btn.x + btn.w && mouseY > btn.y && mouseY < btn.y + btn.h) {
      if (btn.action === "back") {
        currentSeqIndex = (currentSeqIndex - 1 + sequences.length) % sequences.length;
        rebuildSequence();
      } else if (btn.action === "pause") {
        btn.paused = !btn.paused;
        btn.label = btn.paused ? "Play" : "Pause";
        if (btn.paused) {
          Tone.Transport.pause();
        } else {
          Tone.Transport.start();
          sequence.start(0);
        }
      } else if (btn.action === "forward") {
        currentSeqIndex = (currentSeqIndex + 1) % sequences.length;
        rebuildSequence();
      }
    }
  }
}

To emphasize the user interaction and fine-tune the functionality, the mousePressed function alters the algorithm by which the music is produced.

 

I faced several issues with this TypeError:

TypeError: Cannot read properties of undefined (reading ‘time’)

 

I am currently using a placeholder variable for the input (heart rate) from the pulseSensor. The reason is that I need to solder the piece on the right (the metal wires) to create a connection so that I can connect the Arduino to the pulseSensor. I am not experienced with soldering, so I will ask for further help to continue this stage.

Next Step: My next step is to solder the wires and to start testing the sensor and implement it into the project. From this, I will test which patterns I can identify to produce the required data visualization. This is a large part of the project so at the current phase it is 30% complete.

Here is my p5.js so far, with a working and interactive algorithm to select preferred ambient music, and functionality based on the heart rate (simulated and dummy variable controlled with slider).

For the arduino uno, I will use this code:

#include <PulseSensorPlayground.h>
#include <ArduinoJson.h>

const int PULSE_PIN = A0;
const int BTN1_PIN = 2;
const int BTN2_PIN = 3;
const int BTN3_PIN = 4;

PulseSensorPlayground pulseSensor;
StaticJsonDocument<200> doc;

void setup() {
  Serial.begin(9600);
  pulseSensor.analogInput(PULSE_PIN);
  pulseSensor.setThreshold(550);
  pulseSensor.begin();
  
  pinMode(BTN1_PIN, INPUT_PULLUP);
  pinMode(BTN2_PIN, INPUT_PULLUP);
  pinMode(BTN3_PIN, INPUT_PULLUP);
}

void loop() {
  // Update BPM only when a new beat starts; otherwise keep the last value.
  // sawStartOfBeat() is true only once per beat, so zeroing bpm on every
  // pass would report 0 for most 100 ms intervals.
  static int bpm = 0;
  if (pulseSensor.sawStartOfBeat()) {
    bpm = pulseSensor.getBeatsPerMinute();
  }
  
  doc["bpm"] = bpm;
  doc["btn1"] = digitalRead(BTN1_PIN) == LOW ? 1 : 0;
  doc["btn2"] = digitalRead(BTN2_PIN) == LOW ? 1 : 0;
  doc["btn3"] = digitalRead(BTN3_PIN) == LOW ? 1 : 0;
  
  serializeJson(doc, Serial);
  Serial.println();
  
  delay(100); // Send data every 100ms
}

and I will test it and document my progress in debugging as well.

 

Design considerations of the physical presentation of the project:

I am still thinking through different forms of cases, designs such as bracelets or medical tape to make the connection between the sensor and the person interacting with the program.

The design requires technical consideration: the connection between the sensor and the radial artery is already slightly weak (with a margin of error), so I need to document this as well and revisit my design after implementing the pulseSensor.

For the buttons, I am planning to mount them on some form of platform (something similar to the platform that the breadboard and Arduino are attached to).


Fullscreen for the best experience.

Final Project Concept: Plant Whisperer – Interactive Plant Companion

Plant Whisperer is a physical-digital interaction system that lets a real plant express how it feels through a friendly digital avatar. Using Arduino Uno and p5.js, the system monitors the plant’s environment, specifically light exposure and human interaction, and translates this data into visual and auditory feedback.

I want to promote awareness of nature and care through playful, intuitive technology. It reimagines how we perceive and respond to non-verbal living things by giving the plant a way to “talk back.”

Sensors and Components

    • Photoresistor (Light Sensor): Detects ambient light around the plant.

    • Capacitive Touch Sensor – DIY (using the CapacitiveSensor library): Detects when the user interacts with the plant.

    • RGB LED: Shows the plant’s current emotional state in color.

    • Piezo Buzzer: Plays tones based on mood or user interaction.

Avatar Behavior (in p5.js)

The avatar is a stylized digital plant that changes facial expression, movement, background color, and plays ambient sounds based on the sensor input.

Inspiration

Mood Logic:

  • Bright light → Happy: smiling plant, upright; green LED; gentle chime
  • Dim light → Sleepy: drooping plant, closed eyes; blue LED; soft drone
  • Very low light → Sad: frowning plant, faded color; purple LED; slow tone
  • Button pressed → Excited: eyes sparkle, leaf wiggle; yellow flash; upbeat trill
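The mood logic above could be coded as a single classifier on the p5 side; the light thresholds here are placeholders I would still need to tune against real photoresistor readings:

```javascript
// Hypothetical mood classifier following the mood logic above.
// `light` is the raw photoresistor reading (0-1023); `touched` is a boolean.
function plantMood(light, touched) {
  if (touched) return "excited";  // interaction wins over light level
  if (light > 700) return "happy";
  if (light > 300) return "sleepy";
  return "sad";
}
```

The avatar, LED color, and sound would then all key off this one returned mood string.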

Significance of the Project

My goal with this project is to encourage mindfulness, care, and emotional engagement with the environment. By giving a non-verbal living organism a digital voice, the system fosters empathy and attention.

This project is about more than just monitoring a plant; it’s about interaction. By gently blurring the line between organic life and digital expression, Plant Whisperer invites users to slow down, observe, and connect with their environment through technology.

Final Project Proposal

The project is a bomb defusal puzzle box, consisting of four mini-games that must be solved in sequence to reveal a 4-digit defusal code. Each completed challenge unlocks one digit of the code. After all four digits are revealed, the player must input the code correctly to determine which wire to “cut” to defuse the bomb.

The gameplay is immersive and pressure-driven, combining speed, precision, memory, and logic.

 The 4 Challenges

    1. Button Mash

    • Tap a button exactly 24 times as fast as possible so the user can move on to the next challenge without wasting too much time. It encourages rhythm and pressure timing and serves as a warm-up game.

    2. Math Lock

    • A simple math problem appears on screen. The user selects their answer by turning a potentiometer. A confirm button locks in the answer. Feedback on correctness is given through p5.js.

    3. Note Match

    • A musical note (from 4 pitches) is played using a buzzer. The player uses one of four buttons to play and listen to notes. When confident, the player presses a confirm button to lock in their selection. Visual feedback shows which note they selected.

    4. Morse Code

    • A Morse code pattern (with dots and dashes) representing a letter is shown briefly on screen. The user recreates the pattern with short and long presses of a button (for dots and dashes), then locks in their answer using a designated confirm button.
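The Morse input step can be sketched with two small p5-side helpers; the 250 ms dot/dash cutoff is an assumption to be tuned, and both function names are hypothetical:

```javascript
// Hypothetical: classify one button press as a dot or a dash by how
// long it was held (in milliseconds).
function classifyPress(durationMs, dashThresholdMs = 250) {
  return durationMs >= dashThresholdMs ? "-" : ".";
}

// Compare the player's recorded presses against the target pattern.
function morseMatches(presses, target) {
  return presses.join("") === target;
}
```

On confirm, p5 would run `morseMatches` against the target letter’s pattern and unlock the digit on success.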


Arduino Program: Input/Output Design

Inputs:

Buttons: four multipurpose buttons that serve as the options for the various challenges, plus one confirm button that users press to lock in their answers.

Potentiometer: For Math Lock answer selection.

Outputs:

Buzzer: For Note Match playback.
Serial Communication: Sends current state/selections to p5.js.

Arduino to p5.js:

Selections (0–3 for Note Match/Math Lock)
Dot/dash inputs during Morse Code
Tap count for Button Mash
“CONFIRM” for challenge submission

p5.js to Arduino:

Math problem and options
Correct note index
Target Morse code sequence
Challenge start triggers

Visuals and Logic – p5js

1. Rendering challenge screens
2. Displaying math problems, notes, and Morse codes
3. Receiving real-time input signals via serial
4. Confirming answers and unlocking digits
5. Displaying progress toward final defusal

After all 4 digits are unlocked, p5 transitions to a final code entry stage:

– The player enters the 4-digit code
– If correct, p5 shows which colored wire to cut (physically implemented via clip wire/jumpers)
– If incorrect, p5 gives a failure message

Assignment 12: Final Project Proposal (Repeat After Me)

For my final project, I am making an interactive reaction and memory game. The game will use physical components connected to an Arduino to play the game. It is inspired by retro games like Dance Dance Revolution. The project will challenge users to memorize and repeat increasingly complex light patterns using a diamond-shaped controller layout (similar to a DDR board). With each round, the sequence will grow longer, which will test the user’s memory and reaction speed.

Arduino Input/Output:
Inputs:

  • Four arcade buttons, arranged in a diamond shape (UP, DOWN, LEFT, RIGHT)
  • Buttons are read using digitalRead(). When one is pressed, a keyword corresponding to that button (UP, DOWN, etc.) is sent to P5 via serial communication

Output:

  • During the pattern display, P5 will send commands to the Arduino to light up the arcade button LEDs in a way that mirrors the on-screen pattern

P5:

Design:

  • A retro-looking DDR-inspired interface
  • Each direction will be animated (highlighted) when it is part of the pattern

Game Logic:

  • It will generate and store a random pattern of directions each round
  • Will animate the pattern both on the screen, and through the Arduino
  • It will wait for the player input, then compare it to the stored pattern
  • If the sequence is correct, it will increase the score, add a new step to the pattern, then begin the next round
  • If the sequence is incorrect, it will end the game and show the “Game Over” screen
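The round logic above can be sketched with two helpers; `DIRS` and the function names are mine, not final code:

```javascript
// Hypothetical helpers for the pattern game logic described above.
const DIRS = ["UP", "DOWN", "LEFT", "RIGHT"];

// Add one random direction to the stored pattern each round.
function extendPattern(pattern) {
  return [...pattern, DIRS[Math.floor(Math.random() * DIRS.length)]];
}

// Check the player's input so far against the stored pattern.
// Returns "fail", "continue" (correct so far), or "round-complete".
function checkInput(pattern, input) {
  for (let i = 0; i < input.length; i++) {
    if (input[i] !== pattern[i]) return "fail";
  }
  return input.length === pattern.length ? "round-complete" : "continue";
}
```

Checking after every press (rather than at the end) lets the game cut to the Game Over screen the moment a wrong button arrives over serial.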

Serial Communication:

  • It will receive inputs (“UP”, “DOWN”, “LEFT”, “RIGHT”) according to the button that the user presses on the Arduino
  • It will send directions to the Arduino during the pattern animation so its LEDs match the screen

What’s yet to be done:

Making the graphics to be used on the P5 screen (start screen, end screen, arrows), and finalising the code for each of the game states

Week 12 – Finalised Concept

Final Project Proposal: ExpressNotes

For my final project, I am creating ExpressNotes, an interactive audiovisual system that combines sound, visuals, and physical input to inspire real-time artistic expression. The system uses an Arduino connected to seven push buttons and a potentiometer. Each button represents a musical note in the A-G scale. When the user presses a button, the Arduino detects the interaction and sends the corresponding note information to a P5.js sketch running on a computer. The potentiometer allows the user to control the volume of the sounds being played. In response to these inputs, P5.js generates synchronized piano sounds and matching visual effects, forming a dynamic and engaging artistic experience that evolves with each interaction.

ExpressNotes is designed to turn the user into both a musician and visual artist. Each button press is more than a sound—it becomes a trigger for a burst of visual energy. The system fosters expressive exploration by offering real-time feedback. For example, pressing the “C” button might create a ripple of soft blue circles along with a gentle piano tone, while pressing “F” could unleash rotating magenta shapes accompanied by a brighter sound. Over time, the visual canvas builds up in complexity and motion, reflecting the rhythm and emotion of the user’s choices.

The Arduino is programmed to listen for button presses and read the current position of the potentiometer. When a button is pressed, it sends a message like “note:C” to the P5.js program. It also continuously reads the potentiometer value, which ranges from 0 to 1023, and sends volume updates in the form of messages like “volume:750.” The Arduino serves solely as a sender—it does not receive any data back from P5.js.

On the software side, P5.js is responsible for listening to the serial port and interpreting the messages from the Arduino. When it receives a note message, it plays the corresponding piano note and triggers a unique visual effect associated with that note. When it receives a volume message, it maps the value to a usable range and updates the volume of the audio accordingly. Each note is associated with a different visual response to create a richer and more immersive experience. For instance, the “A” note may create blue ripples, “B” may release golden triangle bursts, “C” may trigger expanding red squares, “D” might produce spiraling cyan patterns, “E” could generate flowing green waves, “F” may show rotating magenta shapes, and “G” could spark star-like white particles.
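The message format above (“note:C”, “volume:750”) can be handled with a small parser on the p5 side; `parseMessage` and its return shape are assumptions, not the project’s actual code:

```javascript
// Hypothetical parser for the Arduino messages described above.
// Returns {type: "note", note: "C"} for note messages,
// {type: "volume", level: 0..1} for volume messages, or null otherwise.
function parseMessage(msg) {
  const parts = msg.trim().split(":");
  if (parts.length !== 2) return null;
  const [key, value] = parts;
  if (key === "note" && /^[A-G]$/.test(value)) {
    return { type: "note", note: value };
  }
  if (key === "volume") {
    const raw = parseInt(value, 10);
    if (Number.isNaN(raw)) return null;
    // Map the 0-1023 potentiometer range to a 0-1 volume level.
    return { type: "volume", level: Math.min(1023, Math.max(0, raw)) / 1023 };
  }
  return null;
}
```

Each parsed note message would then trigger both the piano sample and that note’s visual effect, while volume messages only update the master gain.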

When the user interacts with the buttons and the volume knob, they create a personal audiovisual composition. The experience is fluid, immediate, and emotionally engaging. Each session is unique, and the visual display reflects the rhythm, timing, and feeling of the musical input. The evolving artwork becomes a living canvas of user expression.

In a nutshell, ExpressNotes demonstrates the creative potential of combining physical sensors with generative audio-visual programming. It allows users to explore sound and visual art intuitively, transforming simple button presses into an expressive performance. The project encourages artistic freedom, emotional engagement, and a playful connection between sound and image, making it a compelling example of interactive media design.