Very very late assignment 11 submission

 

 

Sorry for my late submission. I was facing a lot of problems that I did not know how to solve: apparently my browser (Opera GX) does not support p5–Arduino serial communication, and it took me ages to realize that. I compensated by putting extra effort into my assignment.

1)

 

2)

3)

// bounce detection and wind control
// pin setup
const int potPin = A0;  // pot for wind control
const int ledPin = 9;   // led that lights up on bounce

// vars
int potValue = 0;       // store pot reading
float windValue = 0;    // mapped wind value
String inputString = ""; // string to hold incoming data
boolean stringComplete = false; // flag for complete string
unsigned long ledTimer = 0;      // timer for led
boolean ledState = false;        // led state tracking
const long ledDuration = 200;    // led flash duration ms

void setup() {
  // set led pin as output
  pinMode(ledPin, OUTPUT);
  
  // start serial comm
  Serial.begin(9600);
  
  // reserve 200 bytes for inputString
  inputString.reserve(200);
}

void loop() {
  // read pot val
  potValue = analogRead(potPin);
  
  // map to wind range -2 to 2
  windValue = map(potValue, 0, 1023, -20, 20) / 10.0;
  
  // send wind value to p5
  Serial.print("W:");
  Serial.println(windValue);
  
  // check for bounce info
  if (stringComplete) {
    // check if it was a bounce message
    if (inputString.startsWith("BOUNCE")) {
      // turn on led
      digitalWrite(ledPin, HIGH);
      ledState = true;
      ledTimer = millis();
    }
    
    // clear string
    inputString = "";
    stringComplete = false;
  }
  
  // check if led should turn off
  if (ledState && (millis() - ledTimer >= ledDuration)) {
    digitalWrite(ledPin, LOW);
    ledState = false;
  }
  
  // small delay to prevent serial flood
  delay(50);
}

// serial event occurs when new data arrives
void serialEvent() {
  while (Serial.available()) {
    // get new byte
    char inChar = (char)Serial.read();
    
    // add to input string if not newline
    if (inChar == '\n') {
      stringComplete = true;
    } else {
      inputString += inChar;
    }
  }
}
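On the p5 side, the protocol above is just newline-delimited text: "W:&lt;value&gt;" lines coming in, and "BOUNCE\n" going out. A minimal sketch of that parsing, in plain JavaScript (the helper names are mine, not from the project):

```javascript
// Sketch of the p5-side handling for the serial protocol above.
// parseArduinoLine handles "W:<value>" wind messages; anything else is
// ignored. bounceMessage builds the newline-terminated string the
// Arduino's serialEvent() handler expects.
function parseArduinoLine(line) {
  const trimmed = line.trim();
  if (trimmed.startsWith("W:")) {
    const wind = parseFloat(trimmed.slice(2));
    if (!Number.isNaN(wind)) return { type: "wind", value: wind };
  }
  return null; // unrecognized or partial line
}

function bounceMessage() {
  return "BOUNCE\n"; // matches the inputString.startsWith("BOUNCE") check
}
```

In the sketch's draw loop, the parsed wind value would feed the physics, and `bounceMessage()` would be written to the serial port whenever the ball bounces.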

 

 

Week 12 – Finalized Idea

Concept:

My project explores the fascinating intersection between physical interaction and emergent systems through a digital flocking simulation. Inspired by Craig Reynolds’ “Boids” algorithm, I’m creating an interactive experience where users can manipulate a flock of virtual entities using both hand gestures and physical controls. The goal is to create an intuitive interface that allows people to “conduct” the movement of the flock, experiencing how simple rules create complex, mesmerizing patterns.

The simulation displays a collection of geometric shapes (triangles, circles, squares, and stars) that move according to three core flocking behaviors: separation, alignment, and cohesion. Users can influence these behaviors through hand gestures detected by a webcam and physical controls connected to an Arduino.

Arduino Integration Design

The Arduino component of my project will create a tangible interface for controlling specific aspects of the flocking simulation:

  1. Potentiometer Input:
    • Function: Controls the movement speed of all entities in the flock
    • Implementation: Analog reading from potentiometer (0-1023)
    • Communication: Raw values sent to P5 via serial communication
    • P5 Action: Values mapped to speed multiplier (0.5x to 5x normal speed)
  2. Button 1 – “Add” Button:
    • Function: Adds new entities to the simulation
    • Implementation: Digital input with debouncing
    • Communication: Sends “ADD” text command when pressed
    • P5 Action: Creates 5 new boids at random positions
  3. Button 2 – “Remove” Button:
    • Function: Removes entities from the simulation
    • Implementation: Digital input with debouncing
    • Communication: Sends “REMOVE” text command when pressed
    • P5 Action: Removes 5 random boids from the simulation

The Arduino code will continuously monitor these inputs and send the appropriate data through serial communication at 9600 baud. I plan to implement debouncing for the buttons to ensure clean signals and reliable operation.
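The potentiometer mapping described above (raw 0–1023 readings to a 0.5x–5x speed multiplier) can be sketched in plain JavaScript; the helper name is mine, and the linear map mirrors what p5's `map()` would do:

```javascript
// Sketch of the p5-side mapping: raw potentiometer values (0-1023)
// become a speed multiplier between 0.5x and 5x. Values outside the
// ADC range are clamped before mapping.
function potToSpeed(raw) {
  const clamped = Math.min(1023, Math.max(0, raw));
  return 0.5 + (clamped / 1023) * (5 - 0.5);
}
```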

P5.js Implementation Design

The P5.js sketch handles the core simulation and multiple input streams:

  1. Flocking Algorithm:
    • Three steering behaviors: separation (avoidance), alignment (velocity matching), cohesion (position averaging)
    • Adjustable weights for each behavior to change flock characteristics
    • Four visual representations: triangles (default), circles, squares, and stars
  2. Hand Gesture Recognition:
    • Uses ML5.js with HandPose model for real-time hand tracking
    • Left hand controls shape selection:
      • Index finger + thumb pinch: Triangle shape
      • Middle finger + thumb pinch: Circle shape
      • Ring finger + thumb pinch: Square shape
      • Pinky finger + thumb pinch: Star shape
    • Right hand controls flocking parameters:
      • Middle finger + thumb pinch: Increases separation force
      • Ring finger + thumb pinch: Increases cohesion force
      • Pinky finger + thumb pinch: Increases alignment force
  3. Serial Communication with Arduino:
    • Receives and processes three types of data:
      • Analog potentiometer values to control speed
      • “ADD” command to add boids
      • “REMOVE” command to remove boids
    • Provides visual indicator of connection status
  4. User Interface:
    • Visual feedback showing connection status, boid count, and potentiometer value
    • Dynamic gradient background that subtly responds to potentiometer input
    • Click-to-connect functionality for Arduino communication
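The three steering behaviors in item 1 can be sketched in plain JavaScript, without p5's vector class. Each function returns a raw steering direction for one boid given its neighbors; the adjustable weights mentioned above would simply scale these results before they are added to the boid's acceleration:

```javascript
// Cohesion: steer toward the average position of neighbors.
function cohesion(boid, neighbors) {
  if (neighbors.length === 0) return { x: 0, y: 0 };
  let cx = 0, cy = 0;
  for (const n of neighbors) { cx += n.x; cy += n.y; }
  return { x: cx / neighbors.length - boid.x, y: cy / neighbors.length - boid.y };
}

// Alignment: steer toward the average velocity of neighbors.
function alignment(boid, neighbors) {
  if (neighbors.length === 0) return { x: 0, y: 0 };
  let vx = 0, vy = 0;
  for (const n of neighbors) { vx += n.vx; vy += n.vy; }
  return { x: vx / neighbors.length - boid.vx, y: vy / neighbors.length - boid.vy };
}

// Separation: steer away from each neighbor (avoidance).
function separation(boid, neighbors) {
  let sx = 0, sy = 0;
  for (const n of neighbors) { sx += boid.x - n.x; sy += boid.y - n.y; }
  return { x: sx, y: sy };
}
```

A real implementation would also limit these to a maximum force and only consider neighbors within a perception radius, as in Reynolds' original formulation.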

Current Progress

So far, I’ve implemented the core flocking algorithm in P5.js and set up the hand tracking system using ML5.js. The boids respond correctly to the three steering behaviors, and I can now switch between different visual representations.

I’ve also established the serial communication framework between P5.js and Arduino using the p5.webserial.js library. The system can detect previously used serial ports and automatically reconnect when the page loads.

For the hand gesture recognition, I’ve successfully implemented the basic detection of pinch gestures between the thumb and different fingers. The system can now identify which hand is which (left vs. right) and apply different actions accordingly.

Next steps include:

  1. Finalizing the Arduino circuit with the potentiometer and two buttons
  2. Implementing proper debouncing for the buttons
  3. Refining the hand gesture detection to be more reliable
  4. Adjusting the flocking parameters for a more visually pleasing result
  5. Adding more visual feedback and possibly sound responses

The most challenging aspect so far has been getting the hand detection to work reliably, especially distinguishing between left and right hands consistently. I’m still working on improving this aspect of the project.

I believe this project has exciting potential not just as a technical demonstration, but as an exploration of how we can create intuitive interfaces for interacting with complex systems. By bridging physical controls and gesture recognition, I hope to create an engaging experience that allows users to develop an intuitive feel for how emergent behaviors arise from simple rules.

Finalized concept for the project

Finalized concept for the project

My final project helps users, especially beginners, to learn how to play the piano and learn how to jam to the blues music style. I created a mini-piano consisting of 2 octaves plus one key (spanning notes C4 to C6), which is a welcoming size for beginners. A visualization of a piano is displayed on the p5js sketch, which can be helpful for the piano player to see an animated pressed key and listen to the relevant audio for that pressed key.

The piano is color-coded by note, so that note “C” is white, “D” is orange, “E” is red, “F” is blue, and so on. This was a deliberate choice because seeing different colors on the piano can help users familiarize themselves with the positions of the keys over time. Additionally, I used this presentation slide deck with instructions to play the notes, color-coded, in order (example in Fig. 1). Thus, as users see the color-coded notes on the presentation and try to follow them, they can more quickly and easily match each one to the note on the physical piano that they should play.
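The note-to-color lookup can be sketched as a small map. Only C = white, D = orange, E = red, and F = blue are stated above; the colors I fill in for G, A, and B are placeholders for illustration, not the project's actual choices:

```javascript
// Sketch of the color coding described above. G/A/B colors are assumed.
const NOTE_COLORS = {
  C: "white", D: "orange", E: "red", F: "blue",
  G: "green", A: "yellow", B: "purple" // assumed, not from the post
};

function colorForKey(noteName) {
  // "C4", "F5", etc. -> color for the letter part of the note name
  return NOTE_COLORS[noteName[0]] || "gray";
}
```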

Fig. 1. Instructions to Play, In Left-Right Order, the Color-Coded Notes

Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5

  • Inputs:
    • Force-sensitive resistors (FSRs) – the Arduino program derives a note’s state by comparing the FSR reading against a press threshold. When the reading crosses the threshold, the new state is sent to p5.
    • Push buttons – the Arduino program derives a note’s state from whether the button is pressed.
  • No outputs

Design and description of what P5 program will do and what it will send to and/or receive from Arduino

In the communication from Arduino to p5, a string is sent line by line containing the note name, followed by “=”, followed by the note state (0 or 1, depending on whether the key is considered pressed). activeSerialNoteStates is an important variable that stores the latest state (0 or 1) received from the Arduino for each note name. Based on the current state of a particular key in activeSerialNoteStates, handlePressEvent() and display() are called for that key.
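A minimal sketch of that line format ("&lt;noteName&gt;=&lt;state&gt;", e.g. "C4=1") and the activeSerialNoteStates update, assuming note names like C4 through C6 with optional sharps:

```javascript
// Parses one serial line and updates activeSerialNoteStates in place.
// Returns true when the line parsed cleanly, false otherwise.
function applySerialLine(line, activeSerialNoteStates) {
  const m = line.trim().match(/^([A-G]#?\d)=(0|1)$/);
  if (!m) return false;
  activeSerialNoteStates[m[1]] = Number(m[2]); // latest state wins
  return true;
}
```

The draw loop would then consult `activeSerialNoteStates` to decide which keys to animate and which samples to trigger.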

Progress

Tomorrow and this Sunday, I could try to work on building the circuit.

References:

  • Blues scale, E blues scale: https://www.youtube.com/watch?v=CjJwxtahGtw
  • Major vs Minor Blues scales: https://www.youtube.com/watch?v=WWEchKvZwdE
  • Pentatonic scales and blues scales: https://www.youtube.com/watch?v=Vj-BOmKgdE4

Week 12: Finalized Concept

Finalized concept for the project:

My final project concept is inspired by a popular game called Piano Tiles. My idea is to create a sturdy, fully functional four-key piano connected to an Arduino. Users will be able to physically play the game using this piano, while the gameplay will be displayed on a laptop screen and recreated in p5js, with some differences like a life powerup.

Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5

My Arduino program will be in charge of sending the inputs from the push buttons to p5.js whenever a player presses a key on the piano. This will be similar to the musical instrument assignment we did in class, except the speaker will not be on the Arduino; instead, p5 will produce the sound on the computer and check whether the player pressed the right key in the right time frame.

Design and description of what P5 program will do and what it will send to and/or receive from Arduino:

Speaking of which, my p5.js program will run the graphics for the game itself, with users seeing the tiles they have to hit. It will receive input from the Arduino whenever the user presses a piano key and use logic to check that the press was correct according to the game rules. If not, the game will end, unless the player has an extra life, which can be earned in the game by pressing all 4 tiles 3 times at a certain point in the song.
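The "right key in the right time frame" check can be sketched as a small predicate. The window size and field names here are illustrative assumptions, not final values:

```javascript
// A press counts as a hit only if the matching tile's expected time is
// within the hit window when the Arduino button message arrives.
function isHit(tile, keyIndex, pressTimeMs, windowMs = 150) {
  return tile.key === keyIndex &&
         Math.abs(pressTimeMs - tile.expectedTimeMs) <= windowMs;
}
```

Widening `windowMs` is also one way to absorb the Arduino-to-p5 serial delay mentioned below.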

I’m currently working on the 3d design for the piano. Right now I found a file online that looks like this:

But I need it to have 4 keys, so I am going to see if I can alter the design to have 4 keys instead of these 7, and also add a hole for the wires to come out from. I also anticipate that timing the keys with p5 will be hard. Firstly, there is often a delay between the Arduino and p5, which I noticed when using the potentiometer in one of our assignments, and that could mess up my game. Secondly, making the tiles fall in sync with an audio track will be difficult and time-consuming. I may just make the tiles fall randomly, even if they are not in sync with the music. The game mechanics will still work as usual, though.

Week 11- Final Project Preliminary Concept

Concept

For the final project I’ll be building a color memory game where the user sees colors generated at random by p5.js and tries to match them by pressing arcade push buttons in the same order. The game gets tougher as it progresses. Every time the user gets a sequence right, a two-tiered trolley moves forward. The goal is to move the trolley/toy as far as possible, and as close to the finish line as possible, within a short period of time.

Arduino and p5js

There will be serial communication between the Arduino and p5.js, relaying the color pressed (on the Arduino) against what has been randomly displayed (in p5.js). If the two match, a message is sent back to the Arduino to move the trolley forward. If not, p5.js repeats the same pattern of colors previously displayed. There will also be a timer on the p5.js screen to show the amount of time spent by the player.
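The core matching logic can be sketched as a comparison between the displayed sequence and the button presses received so far (names and return values are mine, for illustration):

```javascript
// Compare the presses received so far against the displayed sequence.
// "wrong" means repeat the pattern; "complete" means advance the trolley.
function checkPresses(displayed, pressed) {
  for (let i = 0; i < pressed.length; i++) {
    if (pressed[i] !== displayed[i]) return "wrong";
  }
  return pressed.length === displayed.length ? "complete" : "in-progress";
}
```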

Week 12 – Project Proposal

This requires some explanation:

Originally, I had a well-fleshed-out idea for using motion sensors to play a volleyball game. While conceptually sound, in practice the motion sensors just could not cooperate. After a lot of testing with the Arduino and its sensors, I realized that the ball would move too quickly for the sensors to process properly. Instead, I decided to make a violin.

The main mechanism in charge of producing sound will be a potentiometer, set up so that when the bowstring is pulled back and forth, the potentiometer dial turns. Its analog output will be sent to p5, and detecting the bow’s movement will power a synthesizer to play sound. Next, the violin will have digital input buttons. Holding down a button gives the Arduino a digital input that is also sent to p5. In p5, detecting which button is being held is turned into a specific note in the scale. Each of the 8 buttons represents one note, forming a full scale. This gives us a functional violin.
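The button-to-note mapping can be sketched as a simple lookup. The post does not name the scale, so the C major scale here is an assumption; any eight-note scale would work the same way:

```javascript
// One note per button; C major scale is an illustrative assumption.
const SCALE = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"];

function noteForButtons(buttonStates) {
  // buttonStates: array of 8 booleans from the Arduino's digital reads.
  // Lowest-index held button wins; null means no note should sound.
  const i = buttonStates.findIndex(Boolean);
  return i === -1 ? null : SCALE[i];
}
```

In p5, the returned note would only actually be played while the potentiometer reports bow movement.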

Week 12-Finalised Concept

 Concept

It’s Crossy Road gone fairy-tale: guide an on-screen duck across busy roads and small streams in p5.js. Reach the final, wide river, and a real-world duck glides straight across a tabletop “river,” lights pulsing as a victory lap. Screen and reality shake hands for one perfect moment.

 What Already Works (p5.js Only)

 

Module                                              Notes
Full-screen canvas, roads/rivers, predators, food   60 fps; collision OK (still needs work)
Duck sprite + full-axis movement                    Arrow keys for now
Item counter + score/time keeping                   Only counted through code so far
Final-river trigger zone                            Logic works

 Real-World Interaction (Planned)

  • Prop: duck on a 30 cm linear rail (continuous-rotation servo + belt).

  • Motion: one smooth back to front glide only.

  • FX: underside RGB LED strip (water glow) + small piezo “quack” sample.

  • Cue: begins the instant p5 fires DUCK_GO and stops at rail limit.

 Arduino Paragraph (hardware plan)

Thinking of using the Arduino as the go-between that lets the computer game talk to the real duck. A little joystick and one “hop” button plug into it; the board simply reads how far you push the stick and whether the button is pressed, then sends those numbers to p5.js through the USB cable every split-second. Most of the time the Arduino just listens. When the game finally says “DUCK_GO”, the board springs into action: it turns on a motor that slides the rubber duck straight across a mini track, switches on soft blue-green lights under the “water,” and makes a quick quack with a tiny speaker. When p5.js later sends “DUCK_STOP,” the motor and lights shut off and the duck stays put. Because motors and lights can gulp a lot of power all at once, they’ll run from their own plug-in adapter so the Arduino itself never loses juice mid-move.
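The DUCK_GO / DUCK_STOP cue on the p5 side can be sketched as a tiny state check. The zone geometry, field names, and the idea of sending DUCK_STOP when the level resets are illustrative assumptions:

```javascript
// Decide which command (if any) p5 should write to serial this frame.
// Sends "DUCK_GO" once when the duck enters the final-river zone, and
// "DUCK_STOP" once when it leaves (e.g. on level reset).
function duckCommand(duckX, duckY, zone, alreadySent) {
  const inside = duckX >= zone.x && duckX <= zone.x + zone.w &&
                 duckY >= zone.y && duckY <= zone.y + zone.h;
  if (inside && !alreadySent) return "DUCK_GO\n";
  if (!inside && alreadySent) return "DUCK_STOP\n";
  return null; // nothing to send this frame
}
```

The caller would flip its `alreadySent` flag whenever a command is returned, so each command is written to the serial port exactly once.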

 Next-Week Targets

  1. Prototype rail — mount servo + belt, confirm straight glide

  2. Minimal Arduino sketch — read joystick; act on DUCK_GO with LED blink

  3. Serial bridge live — replace console.log() with serial.write() in p5

  4. End-to-end smoke test — finish level, duck moves

 Risks & Mitigation

  • Servo overshoot → limit switches or timed cutoff.

  • Serial lag → short packets, high baud.

  • Scope creep → no extra rivers, no particle splashes until core loop is solid.

Final Project Progress and Design – Zayed Alsuwaidi

Commit to your Final Project Proposal, include the following explanations in your blog post:

    • Finalized concept for the project
    • Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5
    • Design and description of what P5 program will do and what it will send to and/or receive from Arduino
    • Start working on your overall project (document the progress)

The finalized concept for my project is an experimental simulation that visualizes the heart-rate pattern of the person interacting with it, produces experimental music in coordination with that data representation in real time, and allows the user to interact with the simulation.

Serial connection: from Arduino to p5.js (one-way).

Links for music (not mine); if time allows, I might create my own music, but this is the current set of ambient and experimental music:

https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.2/p5.min.js

https://cdnjs.cloudflare.com/ajax/libs/tone/14.8.49/Tone.min.js

// Rebuilds the Tone.Sequence for the current sequence index.
// Shared by the "back" and "forward" buttons.
function rebuildSequence() {
  sequence.stop();
  sequence = new Tone.Sequence((time, note) => {
    arpSynth.triggerAttackRelease(note, "16n", time);
    polySynth.triggerAttackRelease([note, note + "2"], "2n", time);
    shapes.push({
      x: random(width),
      y: random(height),
      size: random(50, 150),
      sides: floor(random(3, 8)),
      hue: (bpm + currentSeqIndex * 60) % 360,
      rot: 0,
      type: currentSeqIndex % 2 ? "polygon" : "circle"
    });
  }, sequences[currentSeqIndex], "4n");
  if (!buttons[1].paused) sequence.start(0);
}

function mousePressed() {
  for (let btn of buttons) {
    if (mouseX > btn.x && mouseX < btn.x + btn.w && mouseY > btn.y && mouseY < btn.y + btn.h) {
      if (btn.action === "back") {
        currentSeqIndex = (currentSeqIndex - 1 + sequences.length) % sequences.length;
        rebuildSequence();
      } else if (btn.action === "pause") {
        btn.paused = !btn.paused;
        btn.label = btn.paused ? "Play" : "Pause";
        if (btn.paused) {
          Tone.Transport.pause();
        } else {
          Tone.Transport.start();
          sequence.start(0);
        }
      } else if (btn.action === "forward") {
        currentSeqIndex = (currentSeqIndex + 1) % sequences.length;
        rebuildSequence();
      }
    }
  }
}

To emphasize user interaction and fine-tune the functionality, the mousePressed() function alters the algorithm by which the music is produced.

 

I faced several issues with this TypeError:

TypeError: Cannot read properties of undefined (reading 'time')

 

I am currently using a placeholder variable for the input (heart rate) from the PulseSensor. The reason is that I need to solder the piece on the right (the metal wires) to create a connection between the Arduino and the PulseSensor. I am not experienced with soldering, so I will ask for help before continuing with this stage.

Next Step: My next step is to solder the wires and to start testing the sensor and implement it into the project. From this, I will test which patterns I can identify to produce the required data visualization. This is a large part of the project so at the current phase it is 30% complete.

Here is my p5.js so far, with a working and interactive algorithm to select preferred ambient music, and functionality based on the heart rate (simulated and dummy variable controlled with slider).

For the arduino uno, I will use this code:

#include <PulseSensorPlayground.h>
#include <ArduinoJson.h>

const int PULSE_PIN = A0;
const int BTN1_PIN = 2;
const int BTN2_PIN = 3;
const int BTN3_PIN = 4;

PulseSensorPlayground pulseSensor;
StaticJsonDocument<200> doc;

void setup() {
  Serial.begin(9600);
  pulseSensor.analogInput(PULSE_PIN);
  pulseSensor.setThreshold(550);
  pulseSensor.begin();
  
  pinMode(BTN1_PIN, INPUT_PULLUP);
  pinMode(BTN2_PIN, INPUT_PULLUP);
  pinMode(BTN3_PIN, INPUT_PULLUP);
}

void loop() {
  int bpm = pulseSensor.getBeatsPerMinute();
  if (!pulseSensor.sawStartOfBeat()) {
    bpm = 0; // Reset if no beat detected
  }
  
  doc["bpm"] = bpm;
  doc["btn1"] = digitalRead(BTN1_PIN) == LOW ? 1 : 0;
  doc["btn2"] = digitalRead(BTN2_PIN) == LOW ? 1 : 0;
  doc["btn3"] = digitalRead(BTN3_PIN) == LOW ? 1 : 0;
  
  serializeJson(doc, Serial);
  Serial.println();
  
  delay(100); // Send data every 100ms
}

and I will test it and document my progress in debugging as well.
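On the p5 side, each line printed by the sketch above is one JSON object, e.g. {"bpm":72,"btn1":0,"btn2":1,"btn3":0}. A minimal sketch of a tolerant reader, which returns null for partial or corrupted lines instead of throwing (the helper name is mine):

```javascript
// Parse one JSON telemetry line from the Arduino; null on bad input.
function parseTelemetry(line) {
  try {
    const d = JSON.parse(line);
    if (typeof d.bpm !== "number") return null; // missing/invalid bpm field
    return d;
  } catch (e) {
    return null; // incomplete line; wait for the next newline
  }
}
```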

 

Design considerations of the physical presentation of the project:

I am still thinking through different case designs, such as bracelets or medical tape, to make the connection between the sensor and the person interacting with the program.

The design requires technical consideration: since the connection between the sensor and the radial artery is already slightly weak (with a margin of error), I need to document this as well and revisit the design after implementing the PulseSensor.

For the buttons, I am planning to mount them on some form of platform (something similar to the platform that the breadboard and Arduino are attached to).


Fullscreen for the best experience.

Final Project Concept: Plant Whisperer – Interactive Plant Companion

Plant Whisperer is a physical-digital interaction system that lets a real plant express how it feels through a friendly digital avatar. Using Arduino Uno and p5.js, the system monitors the plant’s environment, specifically light exposure and human interaction, and translates this data into visual and auditory feedback.

I want to promote awareness of nature and care through playful, intuitive technology. It reimagines how we perceive and respond to non-verbal living things by giving the plant a way to “talk back.”

Sensors and Components

    • Photoresistor (Light Sensor): Detects ambient light around the plant.

    • Capacitive Touch Sensor – DIY (using the CapacitiveSensor library): Detects when the user interacts with the plant.

    • RGB LED: Shows the plant’s current emotional state in color.

    • Piezo Buzzer: Plays tones based on mood or user interaction.

Avatar Behavior (in p5.js)

The avatar is a stylized digital plant that changes facial expression, movement, and background color, and plays ambient sounds based on the sensor input.

Inspiration

Mood Logic:

Sensor Input     Mood     Visual in p5.js               LED Color      Sound
Bright Light     Happy    Smiling plant, upright        Green          Gentle chime
Dim Light        Sleepy   Drooping plant, closed eyes   Blue           Soft drone
Very Low Light   Sad      Frowning plant, faded color   Purple         Slow tone
Button Pressed   Excited  Eyes sparkle, leaf wiggle     Yellow flash   Upbeat trill
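The mood logic in the table can be sketched as a small classifier. The post only names qualitative light levels, so the photoresistor thresholds here are illustrative assumptions:

```javascript
// Classify the plant's mood from sensor input. Thresholds (0-1023 ADC
// range) are assumed for illustration; touch/button overrides light.
function plantMood(lightLevel, touched) {
  if (touched) return "excited";
  if (lightLevel >= 700) return "happy";  // bright light
  if (lightLevel >= 300) return "sleepy"; // dim light
  return "sad";                           // very low light
}
```

The p5 sketch would then pick the matching face, background, LED color message, and sound from this single mood value.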

Significance of the Project

My goal with this project is to encourage mindfulness, care, and emotional engagement with the environment. By giving a non-verbal living organism a digital voice, the system fosters empathy and attention.

This project is about more than just monitoring a plant; it’s about interaction. By gently blurring the line between organic life and digital expression, Plant Whisperer invites users to slow down, observe, and connect with their environment through technology.