Week 12 – Finalised Concept

 Concept

It’s Crossy Road gone fairy-tale: guide an on-screen duck across busy roads and small streams in p5.js. Reach the final, wide river and a real-world duck glides straight across a tabletop “river,” lights pulsing as a victory lap. Screen and reality shake hands for one perfect moment.

 What Already Works (p5.js Only)

 

Module | Done | Notes
Full-screen canvas, roads/rivers, predators, food | ✓ | 60 fps; collision OK (still needs work)
Duck sprite + full-axis movement | ✓ | Arrow keys for now
Item counter + score/time keeping | ✓ | Only counted through code so far
Final-river trigger zone | ✓ | Logic works

 Real-World Interaction (Planned)

  • Prop: duck on a 30 cm linear rail (continuous-rotation servo + belt).

  • Motion: one smooth back-to-front glide only.

  • FX: underside RGB LED strip (water glow) + small piezo “quack” sample.

  • Cue: begins the instant p5 fires DUCK_GO and stops at rail limit.

 Arduino Paragraph (hardware plan)

Thinking of using the Arduino as the go-between that lets the computer game talk to the real duck. A little joystick and one “hop” button plug into it; the board simply reads how far you push the stick and whether the button is pressed, then sends those numbers to p5.js through the USB cable every split-second. Most of the time the Arduino just listens. When the game finally says “DUCK_GO”, the board springs into action: it turns on a motor that slides the rubber duck straight across a mini track, switches on soft blue-green lights under the “water,” and makes a quick quack with a tiny speaker. When p5.js later sends “DUCK_STOP,” the motor and lights shut off and the duck stays put. Because motors and lights can gulp a lot of power all at once, they’ll run from their own plug-in adapter so the Arduino itself never loses juice mid-move.
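A minimal Arduino sketch of that behaviour (pin choices, servo speeds, and the exact packet format are placeholders; the quack sample and the LED animation are left out for brevity):

#include <Servo.h>

Servo railServo;                       // continuous-rotation servo driving the belt
const int JOY_X = A0, JOY_Y = A1;      // joystick axes (assumed pins)
const int HOP_BTN = 2;                 // "hop" button
const int SERVO_PIN = 9;
const int GLOW_PIN = 7;                // placeholder enable pin for the water-glow strip

String cmd = "";

void setup() {
  Serial.begin(9600);
  pinMode(HOP_BTN, INPUT_PULLUP);
  pinMode(GLOW_PIN, OUTPUT);
  railServo.attach(SERVO_PIN);
  railServo.write(90);                 // 90 = stopped on a continuous-rotation servo
}

void loop() {
  // Report joystick + button state to p5.js many times per second
  Serial.print(analogRead(JOY_X)); Serial.print(",");
  Serial.print(analogRead(JOY_Y)); Serial.print(",");
  Serial.println(digitalRead(HOP_BTN) == LOW ? 1 : 0);

  // Listen for newline-terminated commands from p5.js
  while (Serial.available()) {
    char c = Serial.read();
    if (c == '\n') {
      if (cmd == "DUCK_GO") {
        railServo.write(120);          // glide forward along the rail
        digitalWrite(GLOW_PIN, HIGH);  // water glow on
      } else if (cmd == "DUCK_STOP") {
        railServo.write(90);           // stop at the rail limit
        digitalWrite(GLOW_PIN, LOW);
      }
      cmd = "";
    } else {
      cmd += c;
    }
  }
  delay(50);
}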

 Next-Week Targets

  1. Prototype rail — mount servo + belt, confirm straight glide

  2. Minimal Arduino sketch — read joystick; act on DUCK_GO with LED blink

  3. Serial bridge live — replace console.log() with serial.write() in p5 (see the sketch after this list)

  4. End-to-end smoke test — finish level, duck moves
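
For target 3, the swap in p5 could look like the sketch below (assuming a serial wrapper object named serial that exposes write(), matching the serial.write() call named above; sendDuckCommand is a placeholder name):

// Replaces the old console.log("DUCK_GO") debug call
function sendDuckCommand(cmd) {        // cmd is "DUCK_GO" or "DUCK_STOP"
  serial.write(cmd + "\n");            // newline-terminated to match the Arduino parser
  console.log(cmd);                    // keep the log for debugging
}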

 Risks & Mitigation

  • Servo overshoot → limit switches or timed cutoff.

  • Serial lag → short packets, high baud.

  • Scope creep → no extra rivers, no particle splashes until core loop is solid.

Final Project Progress and Design – Zayed Alsuwaidi

Commit to your Final Project Proposal, include the following explanations in your blog post:

    • Finalized concept for the project
    • Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5
    • Design and description of what P5 program will do and what it will send to and/or receive from Arduino
    • Start working on your overall project (document the progress)

The finalized concept for my project is an experimental simulation that visualizes the heart-rate pattern of the person interacting with it, produces experimental music in real time in coordination with that data representation, and allows the user to interact with the simulation.

Serial connection: from Arduino to p5.js (one way).

Links for music (not mine); if time allows, I might create my own music, but this is the current set of ambient and experimental music:

https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.2/p5.min.js

https://cdnjs.cloudflare.com/ajax/libs/tone/14.8.49/Tone.min.js

function mousePressed() {
  // Check whether the click landed on one of the on-screen control buttons
  for (let btn of buttons) {
    if (mouseX > btn.x && mouseX < btn.x + btn.w && mouseY > btn.y && mouseY < btn.y + btn.h) {
      if (btn.action === "back") {
        // Step back to the previous sequence (wrapping around) and rebuild it
        currentSeqIndex = (currentSeqIndex - 1 + sequences.length) % sequences.length;
        sequence.stop();
        // Each sequence step plays the notes and spawns a new shape on the canvas
        sequence = new Tone.Sequence((time, note) => {
          arpSynth.triggerAttackRelease(note, "16n", time);
          polySynth.triggerAttackRelease([note, note + "2"], "2n", time);
          shapes.push({
            x: random(width),
            y: random(height),
            size: random(50, 150),
            sides: floor(random(3, 8)),
            hue: (bpm + currentSeqIndex * 60) % 360,
            rot: 0,
            type: currentSeqIndex % 2 ? "polygon" : "circle"
          });
        }, sequences[currentSeqIndex], "4n");
        if (!buttons[1].paused) sequence.start(0);
      } else if (btn.action === "pause") {
        // Toggle playback: pause/resume the Tone.js transport and the sequence
        btn.paused = !btn.paused;
        btn.label = btn.paused ? "Play" : "Pause";
        if (btn.paused) {
          Tone.Transport.pause();
        } else {
          Tone.Transport.start();
          sequence.start(0);
        }
      } else if (btn.action === "forward") {
        // Advance to the next sequence and rebuild it, as in the "back" branch
        currentSeqIndex = (currentSeqIndex + 1) % sequences.length;
        sequence.stop();
        sequence = new Tone.Sequence((time, note) => {
          arpSynth.triggerAttackRelease(note, "16n", time);
          polySynth.triggerAttackRelease([note, note + "2"], "2n", time);
          shapes.push({
            x: random(width),
            y: random(height),
            size: random(50, 150),
            sides: floor(random(3, 8)),
            hue: (bpm + currentSeqIndex * 60) % 360,
            rot: 0,
            type: currentSeqIndex % 2 ? "polygon" : "circle"
          });
        }, sequences[currentSeqIndex], "4n");
        if (!buttons[1].paused) sequence.start(0);
      }
    }
  }
}

To emphasize user interaction and fine-tune the functionality, the mousePressed() function switches the sequence that drives how the music is produced.

 

I faced several issues with this TypeError:

TypeError: Cannot read properties of undefined (reading 'time')

 

I am currently using a placeholder variable for the input (heart rate) from the pulseSensor. The reason is that I need to solder the piece on the right (the metal wires) to create a connection so that I can connect the Arduino to the pulseSensor. I am not experienced with soldering, so I will ask for help to continue this stage.

Next Step: My next step is to solder the wires, start testing the sensor, and implement it in the project. From there, I will test which patterns I can identify to produce the required data visualization. This is a large part of the project, so at the current phase it is about 30% complete.

Here is my p5.js sketch so far, with a working, interactive algorithm to select the preferred ambient music, and functionality based on the heart rate (currently simulated with a dummy variable controlled by a slider).

For the Arduino Uno, I will use this code:

#include <PulseSensorPlayground.h>
#include <ArduinoJson.h>

const int PULSE_PIN = A0;
const int BTN1_PIN = 2;
const int BTN2_PIN = 3;
const int BTN3_PIN = 4;

PulseSensorPlayground pulseSensor;
StaticJsonDocument<200> doc;

void setup() {
  Serial.begin(9600);
  pulseSensor.analogInput(PULSE_PIN);
  pulseSensor.setThreshold(550);
  pulseSensor.begin();
  
  pinMode(BTN1_PIN, INPUT_PULLUP);
  pinMode(BTN2_PIN, INPUT_PULLUP);
  pinMode(BTN3_PIN, INPUT_PULLUP);
}

void loop() {
  int bpm = pulseSensor.getBeatsPerMinute();
  if (!pulseSensor.sawStartOfBeat()) {
    bpm = 0; // Reset if no beat detected
  }
  
  doc["bpm"] = bpm;
  doc["btn1"] = digitalRead(BTN1_PIN) == LOW ? 1 : 0;
  doc["btn2"] = digitalRead(BTN2_PIN) == LOW ? 1 : 0;
  doc["btn3"] = digitalRead(BTN3_PIN) == LOW ? 1 : 0;
  
  serializeJson(doc, Serial);
  Serial.println();
  
  delay(100); // Send data every 100ms
}

and I will test it and document my progress in debugging as well.
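
On the p5.js side, a minimal sketch of parsing those JSON packets (assuming a serial helper that delivers one complete line at a time; handleSerialLine is a placeholder name, and bpm stands in for the global that the slider currently controls):

// Hypothetical handler: line is one packet, e.g. {"bpm":72,"btn1":0,"btn2":1,"btn3":0}
function handleSerialLine(line) {
  let data;
  try {
    data = JSON.parse(line);        // packets are serialized with ArduinoJson
  } catch (e) {
    return;                         // ignore partial or corrupted lines
  }
  if (data.bpm > 0) {
    bpm = data.bpm;                 // replaces the slider-driven dummy value
  }
  // Buttons arrive as 1 (pressed) / 0 (released)
  if (data.btn1 === 1) { /* e.g. previous sequence */ }
  if (data.btn2 === 1) { /* e.g. toggle pause/play */ }
  if (data.btn3 === 1) { /* e.g. next sequence */ }
}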

 

Design considerations of the physical presentation of the project:

I am still thinking through different forms of cases and designs, such as bracelets or medical tape, to make the connection between the sensor and the person interacting with the program.

The design requires technical consideration: the connection between the sensor and the radial artery is already slightly weak (with a margin of error), so I need to document this as well and revisit the design after the pulseSensor is implemented.

For the buttons, I am planning to mount them on some form of platform (something similar to the platform that the breadboard and Arduino are attached to).


Fullscreen for the best experience.

Final Project Concept: Plant Whisperer – Interactive Plant Companion

Plant Whisperer is a physical-digital interaction system that lets a real plant express how it feels through a friendly digital avatar. Using Arduino Uno and p5.js, the system monitors the plant’s environment, specifically light exposure and human interaction, and translates this data into visual and auditory feedback.

I want to promote awareness of nature and care through playful, intuitive technology. It reimagines how we perceive and respond to non-verbal living things by giving the plant a way to “talk back.”

Sensors and Components

    • Photoresistor (Light Sensor): Detects ambient light around the plant.

    • Capacitive Touch Sensor – DIY (using the CapacitiveSensor library): Detects when the user interacts with the plant.

    • RGB LED: Shows the plant’s current emotional state in color.

    • Piezo Buzzer: Plays tones based on mood or user interaction.

Avatar Behavior (in p5.js)

The avatar is a stylized digital plant that changes facial expression, movement, background color, and plays ambient sounds based on the sensor input.

Inspiration

Mood Logic:

Sensor Input | Mood | Visual in p5.js | LED Color | Sound
Bright Light | Happy | Smiling plant, upright | Green | Gentle chime
Dim Light | Sleepy | Drooping plant, closed eyes | Blue | Soft drone
Very Low Light | Sad | Frowning plant, faded color | Purple | Slow tone
Button Pressed | Excited | Eyes sparkle, leaf wiggle | Yellow flash | Upbeat trill
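
A minimal Arduino sketch of this mood logic (pin numbers and light thresholds are assumptions; the touch/“Excited” state and the buzzer tones are omitted for brevity):

const int LDR_PIN = A0;                                // photoresistor voltage divider
const int RED_PIN = 9, GREEN_PIN = 10, BLUE_PIN = 11;  // RGB LED on PWM pins
// const int BUZZER_PIN = 6;                           // mood tones would go here

void setColor(int r, int g, int b) {
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}

void setup() {
  Serial.begin(9600);
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  int light = analogRead(LDR_PIN);     // 0 (dark) .. 1023 (bright)

  if (light > 700) {                   // bright light -> Happy
    setColor(0, 255, 0);               // green
    Serial.println("happy");
  } else if (light > 300) {            // dim light -> Sleepy
    setColor(0, 0, 255);               // blue
    Serial.println("sleepy");
  } else {                             // very low light -> Sad
    setColor(128, 0, 128);             // purple
    Serial.println("sad");
  }
  delay(200);                          // p5.js reads the mood keyword over serial
}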

Significance of the Project

My goal with this project is to encourage mindfulness, care, and emotional engagement with the environment. By giving a non-verbal living organism a digital voice, the system fosters empathy and attention.

This project is about more than just monitoring a plant; it’s about interaction. By gently blurring the line between organic life and digital expression, Plant Whisperer invites users to slow down, observe, and connect with their environment through technology.

Final Project Proposal

The project is a bomb defusal puzzle box, consisting of four mini-games that must be solved in sequence to reveal a 4-digit defusal code. Each completed challenge unlocks one digit of the code. After all four digits are revealed, the player must input the code correctly to determine which wire to “cut” to defuse the bomb.

The gameplay is immersive and pressure-driven, combining speed, precision, memory, and logic.

 The 4 Challenges

    1. Button Mash

    Tap a button exactly 24 times as fast as possible, so the user can move on to the next challenge without losing too much time. It encourages rhythm under pressure and doubles as a warm-up game.

    2. Math Lock

    A simple math problem appears on screen. The user selects their answer by turning a potentiometer, and a confirm button locks it in. Feedback on correctness is given through p5.js.

    3. Note Match

    A musical note (one of 4 pitches) is played using a buzzer. The player uses one of four buttons to play and listen to notes. When confident, the player presses a confirm button to lock in their selection. Visual feedback shows which note they selected.

    4. Morse Code

    A Morse code pattern (dots and dashes) representing a letter is shown briefly on screen. The user recreates the pattern with short and long presses of a button, then locks in their answer using a designated confirm button.


Arduino Program: Input/Output Design

Inputs:

Buttons: four multipurpose buttons that serve as the options for the various challenges, plus one confirmation button that users press to lock in their answers.

Potentiometer: For Math Lock answer selection.

Outputs:

Buzzer: For Note Match playback.
Serial Communication: Sends current state/selections to p5.js.

Arduino to p5.js:

Selections (0–3 for Note Match/Math Lock)
Dot/dash inputs during Morse Code
Tap count for Button Mash
“CONFIRM” for challenge submission

p5.js to Arduino:

Math problem and options
Correct note index
Target Morse code sequence
Challenge start triggers
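
As a sketch of how the Morse-code input and the confirm press could feed the messages above (pin choices, the 250 ms dot/dash threshold, and the crude debounce are all assumptions):

const int MORSE_BTN = 2;              // assumed input button, wired with INPUT_PULLUP
const int CONFIRM_BTN = 3;            // assumed confirm button
const unsigned long DASH_MS = 250;    // presses longer than this count as a dash

unsigned long pressStart = 0;
bool wasDown = false;

void setup() {
  Serial.begin(9600);
  pinMode(MORSE_BTN, INPUT_PULLUP);
  pinMode(CONFIRM_BTN, INPUT_PULLUP);
}

void loop() {
  bool isDown = (digitalRead(MORSE_BTN) == LOW);

  if (isDown && !wasDown) {           // press started: remember when
    pressStart = millis();
  } else if (!isDown && wasDown) {    // press released: classify it by duration
    unsigned long held = millis() - pressStart;
    Serial.println(held >= DASH_MS ? "DASH" : "DOT");
  }
  wasDown = isDown;

  if (digitalRead(CONFIRM_BTN) == LOW) {
    Serial.println("CONFIRM");        // p5.js checks the sequence entered so far
    delay(300);                       // crude debounce for the confirm press
  }
}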

Visuals and Logic – p5.js

1. Rendering challenge screens
2. Displaying math problems, notes, and Morse codes
3. Receiving real-time input signals via serial
4. Confirming answers and unlocking digits
5. Displaying progress toward final defusal

After all 4 digits are unlocked, p5 transitions to a final code entry stage:

– The player enters the 4-digit code
– If correct, p5 shows which colored wire to cut (physically implemented via clip wire/jumpers)
– If incorrect, p5 gives a failure message

Assignment 12: Final Project Proposal (Repeat After Me)

For my final project, I am making an interactive reaction and memory game played with physical components connected to an Arduino. It is inspired by retro games like Dance Dance Revolution. The project will challenge users to memorize and repeat increasingly complex light patterns using a diamond-shaped controller layout (similar to a DDR board). With each round, the sequence will grow longer, testing the user’s memory and reaction speed.

Arduino Input/Output:
Inputs:

  • Four Arcade buttons, arranged in a diamond shape (UP, DOWN, LEFT, RIGHT)
  • Buttons are read using digitalRead(). When a button is pressed, the keyword corresponding to that button (UP, DOWN, etc.) is sent to P5 via serial communication

Output:

  • During the pattern display, P5 will send commands to the Arduino to light up the arcade button LEDs in a way that mirrors the on-screen pattern

P5:

Design:

  • A retro-looking DDR-inspired interface
  • Each direction will be animated (highlighted) when it is part of the pattern

Game Logic:

  • It will generate and store a random pattern of directions each round
  • Will animate the pattern both on the screen, and through the Arduino
  • It will wait for the player input, then compare it to the stored pattern
  • If the sequence is correct, it will increase the score, add a new step to the pattern, then begin the next round
  • If the sequence is incorrect, it will end the game and show the “Game Over” screen

Serial Communication:

  • It will receive inputs (“UP”, “DOWN”, “LEFT”, “RIGHT”) according to the button that the user presses on the Arduino
  • Send directions to the Arduino during the pattern animation so its LEDs match the screen
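
A minimal p5.js sketch of that serial logic (handleSerial and sendToArduino are placeholder names for whatever serial wrapper ends up being used; the pattern/score variables mirror the game logic above):

let pattern = [];        // the stored sequence for this round
let playerInput = [];    // what the player has entered so far
let score = 0;
let gameState = "playing";

// Hypothetical handler: called with one line of serial text from the Arduino
function handleSerial(msg) {
  msg = msg.trim();                                        // "UP", "DOWN", "LEFT" or "RIGHT"
  if (!["UP", "DOWN", "LEFT", "RIGHT"].includes(msg)) return;

  playerInput.push(msg);
  const i = playerInput.length - 1;
  if (playerInput[i] !== pattern[i]) {
    gameState = "gameover";                                // wrong step ends the game
  } else if (playerInput.length === pattern.length) {
    score++;                                               // full sequence matched
    pattern.push(random(["UP", "DOWN", "LEFT", "RIGHT"])); // add a new step
    playerInput = [];                                      // next round
  }
}

// During the pattern animation, mirror each step on the arcade-button LEDs
function showPatternStep(dir) {
  // ...highlight the matching arrow on screen here, then:
  sendToArduino(dir + "\n");                               // Arduino lights the matching LED
}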

What’s yet to be done:

Making the graphics to be used on the P5 screen (start screen, end screen, arrows), and finalising the code for each of the game states.

Week 12 – Finalised Concept

Final Project Proposal: ExpressNotes

For my final project, I am creating ExpressNotes, an interactive audiovisual system that combines sound, visuals, and physical input to inspire real-time artistic expression. The system uses an Arduino connected to seven push buttons and a potentiometer. Each button represents a musical note in the A-G scale. When the user presses a button, the Arduino detects the interaction and sends the corresponding note information to a P5.js sketch running on a computer. The potentiometer allows the user to control the volume of the sounds being played. In response to these inputs, P5.js generates synchronized piano sounds and matching visual effects, forming a dynamic and engaging artistic experience that evolves with each interaction.

ExpressNotes is designed to turn the user into both a musician and visual artist. Each button press is more than a sound—it becomes a trigger for a burst of visual energy. The system fosters expressive exploration by offering real-time feedback. For example, pressing the “C” button might create a ripple of soft blue circles along with a gentle piano tone, while pressing “F” could unleash rotating magenta shapes accompanied by a brighter sound. Over time, the visual canvas builds up in complexity and motion, reflecting the rhythm and emotion of the user’s choices.

The Arduino is programmed to listen for button presses and read the current position of the potentiometer. When a button is pressed, it sends a message like “note:C” to the P5.js program. It also continuously reads the potentiometer value, which ranges from 0 to 1023, and sends volume updates in the form of messages like “volume:750.” The Arduino serves solely as a sender—it does not receive any data back from P5.js.

On the software side, P5.js is responsible for listening to the serial port and interpreting the messages from the Arduino. When it receives a note message, it plays the corresponding piano note and triggers a unique visual effect associated with that note. When it receives a volume message, it maps the value to a usable range and updates the volume of the audio accordingly. Each note is associated with a different visual response to create a richer and more immersive experience. For instance, the “A” note may create blue ripples, “B” may release golden triangle bursts, “C” may trigger expanding red squares, “D” might produce spiraling cyan patterns, “E” could generate flowing green waves, “F” may show rotating magenta shapes, and “G” could spark star-like white particles.
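
A minimal sketch of that message handling in p5.js (handleMessage, playPianoNote, and spawnEffectFor are placeholder names; outputVolume() is the p5.sound output-volume call):

// Hypothetical handler: msg is one serial message, e.g. "note:C" or "volume:750"
function handleMessage(msg) {
  const [kind, value] = msg.trim().split(":");

  if (kind === "note") {
    playPianoNote(value);                        // e.g. trigger the piano tone for "C"
    spawnEffectFor(value);                       // e.g. "C" -> expanding red squares
  } else if (kind === "volume") {
    // Potentiometer range 0-1023 mapped to 0.0-1.0 output volume
    const vol = constrain(Number(value) / 1023, 0, 1);
    outputVolume(vol);                           // p5.sound output volume
  }
}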

When the user interacts with the buttons and the volume knob, they create a personal audiovisual composition. The experience is fluid, immediate, and emotionally engaging. Each session is unique, and the visual display reflects the rhythm, timing, and feeling of the musical input. The evolving artwork becomes a living canvas of user expression.

In a nutshell, ExpressNotes demonstrates the creative potential of combining physical sensors with generative audio-visual programming. It allows users to explore sound and visual art intuitively, transforming simple button presses into an expressive performance. The project encourages artistic freedom, emotional engagement, and a playful connection between sound and image, making it a compelling example of interactive media design.

Week 12: Final Project Proposal

Smart House System

For this project, I am designing a Smart House System using Arduino UNO and p5.js.
The idea was to automate key parts of a house and parking lot, and to create a more interactive experience using voice announcements made with p5.js.

The main features include:

  • Automatic main door that opens when a person approaches.

  • Car parking system that displays the number of available spots and automatically opens a parking barricade if spots are available.

  • Indoor lighting system that switches on when it gets dark, based on a photoresistor sensor.

  • Real-time voice announcements generated in p5.js based on Arduino sensor inputs, to make the system feel alive and responsive during an exhibition demo.

Design and Description of Arduino Program

The Arduino is responsible for all hardware sensor readings, actuator controls, and sending/receiving messages to and from p5.js.

Component | Type | Description
Distance Sensor (Main Door) | Input | Measures distance to detect a visitor near the main door.
Distance Sensor (Parking Entrance) | Input | Detects a car approaching the parking gate (entry).
Distance Sensor (Parking Exit) | Input | Detects a car leaving the parking area (exit).
Photoresistor (Indoor Lighting) | Input | Detects ambient light levels to determine if indoor lights should be turned on.
Servo Motor (Main Door) | Output | Opens or closes the main entrance door based on visitor detection.
Servo Motor (Parking Gate) | Output | Opens or closes the parking barricade when a car is allowed entry.
LED Display | Output | Shows the number of parking spots left.
Indoor LEDs / Lights | Output | Turns indoor lights on/off based on photoresistor readings.

Design and Description of p5.js Program

The p5.js program is responsible for interaction, feedback, and visualization.

Main Tasks:

  • Connects to Arduino through Serial communication.

  • Reads messages sent from Arduino (e.g., “door_open”, “lights_on”, “parking_spots:3”).

  • Plays real-time voice announcements using the p5.speech library based on events detected.

  • Displays a virtual dashboard showing the current number of available parking spots and statuses like door open/closed, lights on/off.

Voice Announcements Based on Arduino Events:

Arduino Message | p5.js Voice Announcement
door_open | “Welcome! The door is opening.”
car_entry | “Vehicle detected. Checking parking availability.”
car_exit | “Vehicle exited. Parking spot now available.”
parking_full | “Parking is full. Please wait.”
lights_on | “Lights are now on for your comfort.”
lights_off | “Lights are off to save energy.”
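
A minimal p5.js sketch of the announcement side (assuming the p5.speech library; handleSerial is a placeholder name for whatever delivers one serial message at a time):

let voice;
let availableSpots = 0;

const announcements = {
  door_open:    "Welcome! The door is opening.",
  car_entry:    "Vehicle detected. Checking parking availability.",
  car_exit:     "Vehicle exited. Parking spot now available.",
  parking_full: "Parking is full. Please wait.",
  lights_on:    "Lights are now on for your comfort.",
  lights_off:   "Lights are off to save energy."
};

function setup() {
  createCanvas(600, 400);
  voice = new p5.Speech();             // text-to-speech from the p5.speech library
}

// Hypothetical handler for one serial message from the Arduino
function handleSerial(msg) {
  msg = msg.trim();
  if (msg.startsWith("parking_spots:")) {
    availableSpots = Number(msg.split(":")[1]);    // update the dashboard counter
  } else if (announcements[msg]) {
    voice.speak(announcements[msg]);               // real-time voice announcement
  }
}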

Week 12 – Final Concept – Go Ichi-Go!

Concept and Overview:

For my final project, I want to create a game called “Go Ichi-Go!”. The game has a character called Ichigo (strawberry), who runs and has to jump over obstacles like puddles of whipped cream and towers of chocolate, and slide under floating slices of cake. After each jump or dive, randomised text shows a cute strawberry-themed pun, drawn from an array of puns.

There would also be sound effects for each jump or dive. Using the Arduino, there would be two buttons to jump and dive, and a button to start/restart the game. The player wins after successfully overcoming 10 obstacles.

If they win, it would display a congratulations image and a restart option. This triggers the movement of the servo motor, which will dispense candy as the reward for winning.

If they fail to do so, then it would just go to a Game Over state with a restart button. 

Things I’ll need :

  1. Arduino Uno
  2. Breadboard
  3. Servo Motor
  4. Three arcade buttons
  5. Jumper wires
  6. Alligator clips/wires
  7. Laser print arcade box
  8. Candy box dispenser

The serial communication would be like this:

From Arduino to p5: 

Pressing the “Start” button sends an “S” to p5, to start/restart the game.

Pressing the “Jump” button sends a “J” to p5, triggering the character to jump.

Pressing the “Dive” button sends a “D” to p5, triggering the character to dive.

if (digitalRead(startBtn) == LOW) {
  Serial.println("S");
  delay(100);
}
if (digitalRead(jumpBtn) == LOW) {
  Serial.println("J");
  delay(100);
}
if (digitalRead(diveBtn) == LOW) {
  Serial.println("D");
  delay(100);
}

From p5 to Arduino:

At the end, when they win, P5 sends a “C” to Arduino, triggering the movement of the servo motor. 

serial.write('C\n');
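
On the Arduino side, a minimal sketch of reacting to that 'C' (the servo pin and sweep angles are assumptions):

#include <Servo.h>

Servo dispenser;
const int SERVO_PIN = 9;               // assumed servo pin

void setup() {
  Serial.begin(9600);
  dispenser.attach(SERVO_PIN);
  dispenser.write(0);                  // resting position
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'C') {                  // win signal from p5
      dispenser.write(90);             // open the candy dispenser
      delay(1000);
      dispenser.write(0);              // close it again
    }
  }
}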

Schematic:

DESIGN :

I want to make this game in a very kawaii retro vibe, but also focus more on the interaction design of the game itself, something I learnt from my midterm. So this time, I used Canva to layer up the illustrations, instead of drawing them from scratch.

I also want to laser cut a box to hold all this together and make it look like one of those retro arcade game boxes. 

So far, I’ve just started the basic outline of the code, so I haven’t gotten very far with it. 

Next steps

  1. Finish game interface and troubleshoot 
  2. Laser cut arcade box pieces
  3. Make a box structure for the candy dispenser

 



Week 12 – Final Project Proposal

My final project is a bomb defusal game inspired by Keep Talking and Nobody Explodes. Just like in the original game, the player has to disarm several modules on the bomb in order to successfully defuse it. Currently I plan to include four types of modules. The first is Simon Says, using four buttons and corresponding LEDs. The second is a single larger button connected to a multicolored LED, which will be disarmed differently depending on the color. The third is an adaptation of cutting wires, where the player will have to either disconnect or rearrange the wires correctly. The last module requires the user to use a potentiometer as a tuning knob and try to home in on the correct frequency.

The Arduino will be responsible for reading the inputs from each module, and correctly lighting the LEDs where needed. I also intend to use the display screen from our kits to show the bomb’s timer if possible, and use a buzzer/speaker to create a beeping noise. The p5.js side will be responsible for initializing a new game and managing its state. It will receive the inputs from Arduino and verify them, and send back the confirmation if a module was completed. I also want to use it to render a representation of the bomb, including the correct LED lights and countdown timer. In line with the original game, it can also work to provide the information needed to disarm the more complicated modules. In terms of interaction, p5.js will be used to start the game and will display a win/loss screen after it ends.

I have started writing some basic code for both components:

/*
Final Project (WIP)
By Matthias Kebede
*/





// // // Global Variables
// // Inputs
const int tempIn = A1;

// // Outputs
const int tempOut = 8;
const int speakerPin = 10;

// // Communication and State Information
int sendInterval = 100;   // ms
int lastSendTime = 0;     // ms
int timer[] = {5, 0};     // minutes, seconds
int beepFreq = 1000;      // Hz
int beepDur = 50;         // ms
int beepInterval = 100;   // ms
int lastBeepTime = 0;





// // // Main Processes
void setup() {
  Serial.begin(9600);

  // // Inputs and Outputs
  pinMode(tempIn, INPUT);
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(tempOut, OUTPUT);

  // // Check built-in LED
  digitalWrite(LED_BUILTIN, HIGH);
  delay(200);
  digitalWrite(LED_BUILTIN, LOW);

  // // Start handshake w/ p5.js
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH);
    Serial.println("123"); // identifiable starting number
    delay(300);
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // // Wait for p5.js
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH);

    int firstVal = Serial.parseInt();
    if (Serial.read() == '\n') {
      analogWrite(tempOut, firstVal);
    }

    if (lastSendTime > sendInterval) {
      lastSendTime = 0;
      int sendVal = analogRead(tempIn);
      int mappedVal = map(sendVal, 0, 1023, 0, 255); // full 10-bit ADC range
      Serial.println(sendVal);
      delay(1); // stabilize ADC? check this
    }
    else {
      lastSendTime++;
    }

    digitalWrite(LED_BUILTIN, LOW);
  }
}





// // // Helper Functions
void timerSound() {
  if (lastBeepTime > beepInterval) {
    // // Less than 30 seconds remaining
    if (timer[0] == 0 && timer[1] < 30) {
      tone(speakerPin, beepFreq + 500, beepDur);
    }
    // // Final minute but more than 30 seconds
    else if (timer[0] == 0) {
      tone(speakerPin, beepFreq + 250, beepDur + 25);
    }
    else {
      tone(speakerPin, beepFreq, beepDur + 50);
    }
  }
  else {
    lastBeepTime++;
  }
}
/*
Final Project
By Matthias Kebede
*/


// // // Global Variables
let debug = true;
let gameState = 'menu'; // menu, playing, win, lose
let game;
// // For display
let play, menuTitle;




// // // Main Processes
function preload() {
  
}

function setup() {
  createCanvas(600, 450);
  createMenu();
}

function draw() {
  background(220);
  
  switch (gameState) {
    case 'menu':
      showMenu();
      break;
    case 'playing':
      game.display();
      break;
    default:
      break;
      
  }
}



// // // Classes
class Game {
  constructor() {
    this.gameover = false;
    this.start = millis();
    this.time = 30;
    this.modules = [];
  }
  display() {
    text(`Time: ${Math.floor(this.time)}`, width/2, height/2);
    if (!this.gameover) {
      this.update();
    }
    else {
      text("Gameover", width/2, height*0.2);
    }
  }
  update() {
    for (let module of this.modules) {
      module.checkStatus();   // each module reports whether it is solved
    }
    // // Update timer
    if (this.time > 0 + 1/frameRate()) {
      this.time -= 1/frameRate();
    }
    else {
      this.time = 0;
      this.gameover = true;
    }
  }
}

class Module {
  constructor() {}
  checkStatus() {}
}



// // // Interaction
function mouseClicked() {}



// // // Displays
function createMenu() {
  // // Menu Title
  menuTitle = createElement('h1', "Main Menu");
  menuTitle.size(9*18, 100);
  menuTitle.position((width-menuTitle.width)/2, height*0.2);
  // // Play Button
  play = createButton("Play");
  play.size(width/5, height/15);
  play.position((width-play.width)/2, height/2);
  play.mousePressed(function() {
    game = new Game();
    gameState = 'playing';
    removeElements();
  });
}
function showMenu() {}


// // // Helper Functions
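
// // // Serial Handshake (sketch only; the readSerial/writeSerial helper names
// // // are assumptions about whichever serial wrapper ends up being used)
function readSerial(data) {
  if (data == null) return;
  data = data.trim();

  // The Arduino keeps printing "123" until it hears anything back
  if (data === "123") {
    writeSerial("0\n");
    return;
  }

  // After the handshake, each line is a sensor reading to check against the
  // active modules; on success a confirmation would be sent back, e.g.
  // writeSerial("done\n")
  let reading = int(data);
  if (gameState === 'playing' && game) {
    // game.modules would verify the reading once the Module classes are filled in
  }
}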


Week 12 – Finalised Concept

For my final project, I am building a 3D Music Interface that allows users to trigger both 3D visual animations and musical sounds through a physical touch-sensitive controller. The interface will be made out of cardboard and tinfoil touchpads connected to an Arduino. Each physical touch will trigger a corresponding 3D box animation in p5.js, along with a musical note generated directly in the browser using the p5.sound library. The goal is to create a playful, intuitive instrument that connects physical gestures to both sound and visual expression immediately and reliably.

Arduino Program Design

The Arduino will act as the input device, reading signals from multiple touch-sensitive pads made of tinfoil and cardboard.

  • Inputs:

    • Each touchpad will be connected to a digital input pin.

    • When the user touches a pad, it will register a HIGH signal on that pin.

  • Processing:

    • The Arduino continuously checks the state of each touchpad inside the loop() function.

    • When a touch is detected, the Arduino sends a unique identifier through Serial communication.

    • Each touchpad is mapped to a different number (e.g., 0, 1, 2, 3, up to 7).

  • Outputs:

    • The Arduino only sends data to p5.js. It does not control any external outputs like LEDs or motors.

  • Communication with p5.js:

    • Sends a single-digit number (0–7) each time a pad is touched.

    • No information is received from p5.js back to Arduino because the communication is one-way (Arduino → p5).

Example behavior:

  • Touchpad 0 is touched → Arduino sends 0 over Serial.

  • Touchpad 1 is touched → Arduino sends 1 over Serial.

    …and so on for each pad.

    The Arduino sketch will look something like this (early draft structure):

    // Example pin assignments (actual pins still to be decided)
    const int touchPin1 = 2;
    const int touchPin2 = 3;
    // ...one const int per touchpad, up to eight

    void setup() {
      Serial.begin(9600);
      pinMode(touchPin1, INPUT);
      pinMode(touchPin2, INPUT);
      ...
    }
    
    void loop() {
      if (digitalRead(touchPin1) == HIGH) {
        Serial.println(0);
      }
      if (digitalRead(touchPin2) == HIGH) {
        Serial.println(1);
      }
      ...
    }
    

p5.js Program Design

The p5.js program will be responsible for receiving data from Arduino and producing both sound and visual feedback based on the input.

  • Receiving data:

    • p5.js listens to the Serial port using the serialEvent() function.

    • Each time it receives a number from Arduino (0–7), it triggers two actions:

      1. Visual: Activates a specific 3D box animation (e.g., spin, color change, movement).

      2. Audio: Plays a specific musical note using the p5.sound library.

  • Processing:

    • The incoming serial value is matched to the corresponding element in an array of 3D boxes.

    • A corresponding musical note (e.g., C2, D2, E2, F2…) is also mapped and played.

  • Outputs:

    • On the screen: Real-time 3D animations using WEBGL (such as spinning boxes).

    • In the browser: Sound output through the computer speakers using p5.sound.

  • Communication with Arduino:

    • p5.js only receives data from Arduino.

    • No messages are sent back

Example behavior:

  • Receive 0 → Box 0 spins and note C2 plays

  • Receive 1 → Box 1 spins and note D2 plays

  • Receive 2 → Box 2 spins and note E2 plays

    Early pseudocode for handling serial data:

function serialEvent() {
  let inData = Number(serial.read());
  if (inData >= 0 && inData < boxes.length) {
    boxes[inData].play(); // Spin box
    playNote(inData); // Play corresponding sound
  }
}
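
As one possible way to implement playNote() with p5.sound (the triangle-oscillator voice and the C-major mapping starting at C2 are assumptions; a sampled piano loaded as a p5.SoundFile could be swapped in the same way):

let osc, env;

function setup() {
  createCanvas(600, 400, WEBGL);
  osc = new p5.Oscillator('triangle');   // simple synth voice
  env = new p5.Envelope();
  env.setADSR(0.01, 0.2, 0.2, 0.4);      // attack, decay, sustain ratio, release
  env.setRange(0.8, 0);                  // attack level, release level
  osc.amp(0);
  osc.start();
}

// Map pad index 0-7 to C2, D2, E2, F2, G2, A2, B2, C3
function playNote(index) {
  const scaleOffsets = [0, 2, 4, 5, 7, 9, 11, 12];   // C-major steps in semitones
  const midiNote = 36 + scaleOffsets[index];         // MIDI 36 = C2
  osc.freq(midiToFreq(midiNote));
  env.play(osc);                                     // trigger the note
}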

Next Steps

  1. Build a basic cardboard and tinfoil prototype for the touch-sensitive interface

  2. Test Arduino with simple touch sensor code, confirming that touchpads reliably trigger digital input signals

  3. Set up p5.serialport and confirm that Arduino can send and p5.js can receive serial data properly
  4. Prepare an initial sound mapping using the p5.sound library

  5. Create a rough version of the 3D visualizations (boxes spinning in response to input)