Final Project Documentation

SonarGame Documentation

Overview

SonarGame is an interactive browser-based game that uses a sonar sensor connected to an Arduino to control gameplay. The game features two modes – Submarine and Exploration – where players navigate through underwater environments by physically moving their hand or an object in front of the sonar sensor.

System Requirements

  • Computer with a modern web browser (Chrome recommended for WebSerial API)
  • Arduino board (Uno or similar)
  • HC-SR04 ultrasonic sensor
  • USB cable to connect Arduino to computer
  • Basic electronic components (jumper wires, optional breadboard)

Installation

Web Application Setup

  1. Download the game files to your local machine:
    • index.html
    • style.css
    • sketch.js (main game file)
    • webSerial.js (for Arduino communication)
  2. Set up the file structure:
sonar-game/
├── index.html
├── style.css
├── js/
│   ├── sketch.js
│   └── webSerial.js

Arduino Setup

  1. Connect the HC-SR04 ultrasonic sensor to your Arduino:
    • VCC to 5V on Arduino
    • GND to GND on Arduino
    • TRIG to pin 9 on Arduino
    • ECHO to pin 10 on Arduino
  2. Upload the Arduino code (see Arduino Code section)

Game Setup

HTML Structure

Create an index.html file with the following structure:

html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Sonar Game</title>
    <link rel="stylesheet" href="style.css">
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.js"></script>
</head>
<body>
    <div id="gameContainer">
        <div id="gameCanvas"></div>
        
        <div id="controls">
            <div id="modeSelector">
                <h2>Select Game Mode</h2>
                <button id="submarineMode">Submarine Mode</button>
                <button id="explorationMode">Exploration Mode</button>
            </div>
            
            <div id="instructions" style="display: none;">
                <p id="gameInstructions"></p>
                <button onclick="document.getElementById('instructions').style.display='none';document.getElementById('modeSelector').style.display='flex';">Back to Mode Selection</button>
            </div>
            
            <div id="gameInfo">
                <div>Score: <span id="scoreDisplay">0</span></div>
                <div>Level: <span id="levelDisplay">1</span></div>
                <div>Lives: <span id="livesDisplay">3</span></div>
            </div>
            
            <div id="sonarControls">
                <div id="sonarSimulator">
                    <label for="sonarRange">Simulate Sonar (0-400cm): <span id="sonarValue">200</span></label>
                    <input type="range" id="sonarRange" min="0" max="400" value="200">
                </div>
                
                <div id="arduinoControls">
                    <button id="toggleSimulator">Toggle Simulator</button>
                    <button id="connectArduino">Connect Arduino</button>
                    <button id="disconnectArduino">Disconnect Arduino</button>
                </div>
            </div>
        </div>
    </div>

    <script src="js/webSerial.js"></script>
    <script src="js/sketch.js"></script>
</body>
</html>

CSS Styling

Create a style.css file:

css
body {
    margin: 0;
    padding: 0;
    font-family: Arial, sans-serif;
    background-color: #111;
    color: #0f0;
}

#gameContainer {
    display: flex;
    flex-direction: column;
    height: 100vh;
}

#gameCanvas {
    flex-grow: 1;
}

#controls {
    height: 60px;
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: 0 20px;
    background-color: #222;
}

#modeSelector {
    display: flex;
    flex-direction: column;
    position: absolute;
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    background-color: rgba(0, 0, 0, 0.8);
    padding: 20px;
    border-radius: 10px;
    text-align: center;
}

#instructions {
    position: absolute;
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    background-color: rgba(0, 0, 0, 0.8);
    padding: 20px;
    border-radius: 10px;
    text-align: center;
}

button {
    background-color: #0f0;
    color: #000;
    border: none;
    padding: 8px 16px;
    margin: 5px;
    border-radius: 5px;
    cursor: pointer;
}

button:hover {
    background-color: #0c0;
}

#gameInfo {
    display: flex;
    gap: 20px;
}

#sonarControls {
    display: flex;
    align-items: center;
}

#sonarSimulator {
    margin-right: 20px;
}

input[type="range"] {
    width: 200px;
    margin: 0 10px;
}

Game Mechanics

Game Modes

  1. Submarine Mode:
    • Navigate a submarine through underwater caves
    • Avoid obstacles like rocks, mines, and enemy submarines
    • Collect power-ups for extra lives and points
  2. Exploration Mode:
    • Explore ocean depths with diving gear
    • Avoid sharks, jellyfish, and trash
    • Collect treasures and discover sea creatures

Controls

The game is controlled using a sonar sensor. The distance measured by the sonar determines the vertical position of the player:

  • Closer to the sensor = higher position on screen
  • Farther from the sensor = lower position on screen
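This mapping can be sketched as a small helper. The function name and the 600 px canvas height are illustrative assumptions, not taken from the actual sketch.js:

```javascript
// Hedged sketch: map a sonar distance (0-400 cm) to a vertical
// screen position. 0 cm (closest) maps to the top of the screen,
// 400 cm (farthest) to the bottom.
function distanceToY(distanceCm, canvasHeight) {
  // Clamp to the sensor's usable range first.
  const d = Math.min(400, Math.max(0, distanceCm));
  return (d / 400) * canvasHeight;
}

console.log(distanceToY(0, 600));    // 0   (top)
console.log(distanceToY(200, 600));  // 300 (middle)
console.log(distanceToY(500, 600));  // 600 (clamped to bottom)
```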

Scoring System

  • Each obstacle successfully avoided: +1 point
  • Power-up collection: +10 points
  • Level increases every 20 points
  • Starting with 3 lives
  • Game over when lives reach 0
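The scoring rules above imply a simple level formula; a hedged sketch (levelFor is an illustrative name, not a function from the actual sketch.js):

```javascript
// Level increases every 20 points, starting at level 1.
function levelFor(score) {
  return Math.floor(score / 20) + 1;
}

console.log(levelFor(0));   // 1
console.log(levelFor(19));  // 1
console.log(levelFor(20));  // 2
console.log(levelFor(45));  // 3
```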

Game Elements

  1. Obstacles:
    • Submarine mode: rocks, mines, enemy submarines
    • Exploration mode: sharks, jellyfish, trash
    • Colliding with obstacles reduces lives
  2. Power-ups:
    • Extra life: Restores 1 life
    • Extra points: Adds 10 points
  3. Visual effects:
    • Sonar waves: Visually represent sonar activity
    • Particles: Created on collisions and power-up collection

P5JS
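The original sketch.js is not reproduced here. Below is a minimal, hypothetical outline of the serial handler it would need; webSerial.js calls readSerial with each piece of incoming text, and draw() would use the stored distance to position the player:

```javascript
// Hypothetical outline only; variable and function names are
// assumptions, not code from the original sketch.js.
let sonarDistance = 200; // latest reading, in cm

function readSerial(data) {
  const value = parseInt(data.trim(), 10);
  if (!Number.isNaN(value)) {
    // Keep the reading inside the 0-400 cm range the game expects.
    sonarDistance = Math.min(400, Math.max(0, value));
  }
  return sonarDistance;
}
```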

 

 

Arduino Integration

Arduino Code

cpp
// SonarGame Arduino Code
// Connects HC-SR04 ultrasonic sensor to send distance readings to browser

// Pin definitions
const int trigPin = 9;  // Trigger pin of the HC-SR04
const int echoPin = 10; // Echo pin of the HC-SR04

// Variables
long duration;
int distance;

void setup() {
  // Initialize Serial communication
  Serial.begin(9600);
  
  // Configure pins
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Clear the trigger pin
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  
  // Send a 10μs pulse to trigger
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  // Read the echo pin, duration in microseconds
  duration = pulseIn(echoPin, HIGH);
  
  // Calculate distance in centimeters
  // Speed of sound is 343m/s = 34300cm/s
  // Duration is time for sound to travel to object and back
  // So distance = (duration * 34300) / 2 / 1000000
  distance = duration * 0.034 / 2;
  
  // Limit range to 0-400cm to match game expectations
  if (distance > 400) distance = 400;
  if (distance < 0) distance = 0;
  
  // Send the distance to the Serial port
  Serial.println(distance);
  
  // Small delay before next reading
  delay(50);
}

WebSerial Integration

The game uses the WebSerial API to communicate with the Arduino. The webSerial.js file should handle:

  1. Requesting port access from the browser
  2. Opening serial connection
  3. Reading and parsing serial data
  4. Handling connection errors

Here’s a sample implementation:

javascript
// webSerial.js - Handles communication with Arduino via WebSerial API

let port;
let serialActive = false;
let reader;
let readableStreamClosed;

async function setUpSerial() {
  if ('serial' in navigator) {
    try {
      // Request port access
      port = await navigator.serial.requestPort();
      
      // Open the port with appropriate settings
      await port.open({ baudRate: 9600 });
      
      serialActive = true;
      
      // Set up reading from the port
      const decoder = new TextDecoder();
      // Assign to the outer readableStreamClosed (declared above)
      // rather than shadowing it with a new const
      readableStreamClosed = port.readable.pipeTo(new WritableStream({
        write(chunk) {
          // Process each chunk of data; { stream: true } handles
          // multi-byte characters split across chunks
          let string = decoder.decode(chunk, { stream: true });
          
          // Call the readSerial function from sketch.js
          if (typeof readSerial === 'function') {
            readSerial(string);
          }
        }
      }));
      
      console.log("Serial connection established successfully");
    } catch (error) {
      console.error("Error opening serial port:", error);
    }
  } else {
    console.error("WebSerial API not supported in this browser");
    alert("WebSerial is not supported in this browser. Please use Chrome or Edge.");
  }
}
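One caveat: serial chunks do not always align with line boundaries, so a single reading can be split across write() calls. A hedged sketch of a line-buffering helper (splitSerialChunk is an illustrative name, not part of the original file):

```javascript
// Reassemble complete newline-terminated readings from arbitrary
// chunks before handing them to the game's parser.
let serialBuffer = "";

function splitSerialChunk(chunk) {
  serialBuffer += chunk;
  const lines = serialBuffer.split("\n");
  // The last element may be a partial line; keep it for next time.
  serialBuffer = lines.pop();
  return lines.map(l => l.trim()).filter(l => l.length > 0);
}

console.log(splitSerialChunk("12"));      // []
console.log(splitSerialChunk("3\n45\n")); // ["123", "45"]
```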

Customization

Adding New Obstacles

To add new obstacles, modify the createObstacle() function in sketch.js:

javascript
function createObstacle() {
  // Increase the range for obstacle types
  let obstacleType;
  
  if (gameMode === "submarine") {
    obstacleType = floor(random(4)); // Increased from 3 to 4
  } else {
    obstacleType = floor(random(4)) + 4; // Adjusted accordingly
  }
  
  let obstacle = {
    x: width + 50,
    y: random(50, height - 50),
    width: random(30, 80),
    height: random(20, 60),
    speed: 2 + level * 0.5,
    type: obstacleType,
  };
  
  obstacles.push(obstacle);
}

// Then update the drawObstacles() function to handle the new type
function drawObstacles() {
  for (let obstacle of obstacles) {
    // ... existing code ...
    
    if (gameMode === "submarine") {
      // ... existing obstacle types ...
      else if (obstacle.type === 3) {
        // New obstacle type
        // Add drawing code here
      }
    } else {
      // ... existing code ...
      else if (obstacle.type === 7) {
        // New obstacle type for exploration mode
        // Add drawing code here
      }
    }
  }
}

Adjusting Difficulty

To adjust game difficulty, modify these variables:

  1. Initial values in the resetGame() function:
    javascript
    function resetGame() {
      score = 0;
      level = 1;
      lives = 5; // Increased from 3 for easier gameplay
      // ...
    }
  2. Obstacle generation rate in the updateGame() function:
    javascript
    if (millis() - lastObstacleTime > 2500 - level * 75) { // Adjusted timing
      createObstacle();
      lastObstacleTime = millis();
    }
  3. Obstacle speed in the createObstacle() function:
    javascript
    let obstacle = {
      // ...
      speed: 1.5 + level * 0.3, // Slower progression
      // ...
    };

Troubleshooting

Game Performance Issues

If the game is running slowly:

  1. Reduce the number of background elements
  2. Decrease particle effects
  3. Lower the frequency of sonar waves
javascript
function createBackgroundElements() {
  const numElements = 30; // Reduced from 50
  // ...
}

function createParticles(x, y, count, particleColor) {
  count = Math.floor(count / 2); // Half the number of particles
  // ...
}

Arduino Connection Problems

  1. Cannot connect to Arduino:
    • Ensure Arduino is connected via USB
    • Verify correct driver installation
    • Try using a different USB port
    • Restart browser and try again
  2. Erratic sonar readings:
    • Check sensor wiring connections
    • Ensure stable power supply
    • Add smoothing to the readings:
javascript
// In Arduino code, add smoothing:
const int numReadings = 5;
int readings[numReadings];
int readIndex = 0;
int total = 0;
int average = 0;

void setup() {
  // Initialize all the readings to 0
  for (int i = 0; i < numReadings; i++) {
    readings[i] = 0;
  }
  // ... existing setup code ...
}

void loop() {
  // ... existing sonar reading code ...
  
  // Subtract the last reading
  total = total - readings[readIndex];
  // Read from sensor
  readings[readIndex] = distance;
  // Add the reading to the total
  total = total + readings[readIndex];
  // Advance to the next position in the array
  readIndex = (readIndex + 1) % numReadings;
  // Calculate the average
  average = total / numReadings;
  
  // Send the smoothed value
  Serial.println(average);
  
  // ... delay code ...
}
  3. Browser compatibility:
    • WebSerial API is only supported in Chromium-based browsers (Chrome, Edge)
    • Ensure your browser is up to date

Final Documentation

Button Beats!

My final project, Button Beats, is a game inspired by the popular game Piano Tiles. Players play using physical buttons while the gameplay is displayed on a laptop screen and recreated in p5.js, with some differences such as a life power-up through a special golden tile.

Video of gameplay: https://drive.google.com/file/d/1q-BEMe4s6vl2vXgGhi7uOwFdDXdhTPeO/view?usp=sharing

Arduino

My Arduino program sends serial input to p5.js whenever a player presses one of the push-button piano keys. This is similar to the musical instrument assignment we did in class, except the sound does not come from a speaker on the Arduino; instead, music plays on the computer from p5.js as long as the player presses the right key in the right time frame. The Arduino code reads each button press and sends it to p5.js as A, B, C, or D, depending on which button was clicked. The code for this can be seen below:

const int buttonPins[] = {2, 3, 4, 5};
const char buttonChars[] = {'A', 'B', 'C', 'D'};
const int numButtons = 4;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < numButtons; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
}

// Sends the serial of the button pressed as the letter A, B, C, or D
void loop() {
  for (int i = 0; i < numButtons; i++) {
    if (digitalRead(buttonPins[i]) == LOW) {
      Serial.println(buttonChars[i]);
      delay(200);
    }
  }
}

P5JS

p5.js runs the entire game: it sets up all the game states and game logic. The player sees a welcome screen at the start that explains the rules and offers a way to connect to the Arduino. Once the user presses Enter, the game begins, the music starts playing, and tiles are randomly generated for the user to press on the Arduino. When a button is pushed on the Arduino, p5.js receives the input and checks whether there is currently a tile in that lane; if so, the tile is marked as pressed, and if not, the player loses a life. The player also loses a life if they fail to press the button before the tile leaves the screen. Once the player loses all their lives, the game goes to the game-over screen and offers the option to restart. Here are pictures of all the game states:
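The tile-press logic described above can be sketched as follows. The tile fields and hit-zone parameters are illustrative assumptions, not code from the actual game:

```javascript
// Check whether a button press in a lane hits an unpressed tile
// currently inside the hit zone. A miss costs a life.
function handleButton(lane, tiles, hitZoneTop, hitZoneBottom) {
  const tile = tiles.find(
    t => t.lane === lane && !t.pressed &&
         t.y >= hitZoneTop && t.y <= hitZoneBottom
  );
  if (tile) {
    tile.pressed = true;
    return { hit: true, tile };
  }
  // No tile in this lane's hit zone: the press was a mistake.
  return { hit: false, tile: null };
}
```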

Throughout the game, there is a chance to gain extra lives when the golden tile appears; it has a 2.5% probability of appearing at any time in the game. The black tiles also appear randomly, but they speed up as the game goes on, which increases the game’s difficulty. The score is based on how many tiles you press before losing all your lives. Here is the link to the game: https://editor.p5js.org/izza.t/full/moFOZkumG
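The spawn rule can be illustrated as follows. Only the 2.5% golden-tile probability comes from the text above; the speed-up rate and field names are assumptions, and the randomness source is injected so the rule can be checked deterministically:

```javascript
// Spawn a tile in a lane; a 2.5% chance makes it golden (extra life),
// and speed grows as more tiles are cleared (assumed rate).
function spawnTile(lane, tilesCleared, rand) {
  return {
    lane,
    y: 0,
    golden: rand() < 0.025,
    speed: 3 + tilesCleared * 0.05,
    pressed: false,
  };
}

console.log(spawnTile(0, 0, () => 0.5).golden);  // false
console.log(spawnTile(0, 0, () => 0.01).golden); // true
```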

The circuit consists of the 4 push buttons connected to pins 2, 3, 4, and 5 as well as ground. Here is the schematic below:

I’m particularly proud of the way the game handles the inputs and creates a game that looks pretty similar to the original piano tiles itself. It was really fun from a user perspective to be able to use actual buttons to play the game instead of tapping on a screen. The game working and knowing when you’ve missed a tile or when a tile goes off screen is also something I’m pretty proud of.

One thing though is that the tiles are randomly generated and there is only one song. In the future, it would be nice to have it such that the keys are synced up to the song as is in the game Piano Tiles itself as well as options to select different songs.

 

 

Week 14 – Final Project Documentation

Concept Overview

“Sketch and Switch” is a modern take on the Etch A Sketch with a twist: the user draws randomly colored lines with mechanical knobs. This version uses Arduino hardware to control a digital canvas through two potentiometers and a button. The potentiometers move the drawing point left/right and up/down, and a button press toggles drawing mode while randomly changing the line’s color and thickness.

Project Interaction

Video interaction: https://drive.google.com/file/d/1h1HtV-_-JUEKgieFiu1-NDM2Pb2Thfwr/view?usp=sharing

Interaction Design

Users interact with two potentiometers and a button:

    • Left Potentiometer: Controls horizontal (X-axis) movement.
    • Right Potentiometer: Controls vertical (Y-axis) movement.
    • Button: Toggles drawing mode and changes the drawing color randomly.

Arduino Code

const int potX = A0;         // Potentiometer for horizontal movement
const int potY = A1;         // Potentiometer for vertical movement
const int buttonPin = 2;     // Pushbutton for toggling draw mode

int lastButtonState = HIGH;              
unsigned long lastDebounceTime = 0;      // Timestamp of last button state change
unsigned long debounceDelay = 20;        // Debounce time to avoid false triggers

void setup() {
  Serial.begin(115200);                  
  pinMode(buttonPin, INPUT_PULLUP);      
}

void loop() {
  int xVal = analogRead(potX);           // Read horizontal potentiometer value
  int yVal = analogRead(potY);           // Read vertical potentiometer value

  int reading = digitalRead(buttonPin);  
  int btnState = LOW;                    // Default button state to not pressed

  if (reading != lastButtonState) {
    lastDebounceTime = millis();
  }

  if ((millis() - lastDebounceTime) > debounceDelay) {
    btnState = (reading == LOW) ? 1 : 0;  // Set button state (pressed = 1)
  }

  lastButtonState = reading;

  // Send formatted data: x, y, buttonState
  Serial.print(xVal);
  Serial.print(",");
  Serial.print(yVal);
  Serial.print(",");
  Serial.println(btnState);

  delay(20); // Small delay to reduce serial flooding
}

Circuit Schematic

 

p5.js Code

let port;
let connectButton, disconnectButton, finishButton, startButton, saveButton, statusText;
let xPos = 0;
let yPos = 0;
let drawingEnabled = false;
let isConnected = false;
let prevX, prevY;
let lastButtonState = 0;
let started = false;
let tutorialShown = false;
let currentColor;
let studentImg;
let tutorialButton;

function preload() {
  studentImg = loadImage("Shamsa.PNG"); // preload image for the intro screen
}

function setup() {
  createCanvas(1280, 720); // fixed landscape canvas size
  background("#F5F5DC");

  port = createSerial(); // setup WebSerial port
  currentColor = color(random(255), random(255), random(255)); // initial random color

  // Setup start screen
  startButton = createButton("Start");
  styleButton(startButton);
  positionCenter(startButton, 0, 60);
  startButton.mousePressed(() => {
    startButton.hide();
    tutorialShown = true;
    showTutorialScreen(); // go to tutorial screen before drawing
  });

  statusText = createP("Status: Not connected");
  statusText.position(10, 10);
  statusText.hide(); // hidden until drawing mode begins
}

function styleButton(btn) {
  // Apply consistent style to all buttons
  btn.style("background-color", "#CF877D");
  btn.style("color", "black");
  btn.style("border-radius", "10px");
  btn.style("padding", "10px 15px");
  btn.style("font-size", "14px");
  btn.style("border", "none");
}

function positionCenter(btn, offsetX, offsetY) {
  // Center button horizontally/vertically with optional offset
  btn.position((width - btn.size().width) / 2 + offsetX, (height - btn.size().height) / 2 + offsetY);
}

function showTutorialScreen() {
  clear();
  background("#F5F5DC");

  // Instructions and disclaimer
  textAlign(CENTER);
  fill("#a8423d");
  textSize(32);
  text("Welcome to Sketch & Switch!", width / 2, 80);

  textSize(20);
  fill(0);
  text(
    "Disclaimer:\nThe blue knobs may be difficult at first, so twist them slowly and gently.\n" +
    "The one on the right moves ↑↓, and the one on the left moves ←→",
    width / 2, 160
  );

  text(
    "Instructions:\n1. Press 'Connect' to connect to your drawing device\n2. Twist knobs to move\n" +
    "3. Press the button on the board to change color (it will be randomized)\n" +
    "4. When finishing the drawing, click 'Finish Drawing' to clear it,\n" +
    "   or click 'Save as PNG' to download your art.\n\n Tip: Clockwise = ↑ or →, CounterClockwise = ↓ or ←",
    width / 2, 320
  );

  // Begin actual drawing
  tutorialButton = createButton("Start Drawing");
  styleButton(tutorialButton);
  tutorialButton.position(width / 2 - 70, height - 100);
  tutorialButton.mousePressed(() => {
    tutorialButton.remove();
    clear();
    background(255);
    started = true;
    setupDrawingUI(); // load UI controls for drawing
  });
}

function setupDrawingUI() {
  // Create control buttons
  connectButton = createButton("Connect");
  connectButton.mousePressed(() => {
    if (!port.opened()) {
      port.open("Arduino", 115200); // open WebSerial at 115200 baud
    }
  });
  styleButton(connectButton);

  disconnectButton = createButton("Disconnect");
  disconnectButton.mousePressed(() => {
    if (port.opened()) {
      port.close(); // safely close serial port
    }
  });
  styleButton(disconnectButton);

  finishButton = createButton("Finish Drawing");
  finishButton.mousePressed(() => {
    background(255); // clear canvas
    drawingEnabled = false;
  });
  styleButton(finishButton);

  saveButton = createButton("Save as PNG");
  saveButton.mousePressed(() => {
    saveCanvas("drawing", "png"); // download current canvas
  });
  styleButton(saveButton);

  positionUI(); // arrange buttons
  statusText.show();
}

function positionUI() {
  // Align control buttons along the top
  let baseX = width / 2 - 250;
  let y = 10;
  connectButton.position(baseX, y);
  disconnectButton.position(baseX + 130, y);
  finishButton.position(baseX + 260, y);
  saveButton.position(baseX + 420, y);
}

function draw() {
  if (!started) {
    // Intro screen only if tutorial not yet shown
    if (!tutorialShown) {
      background("#F5F5DC");
      textAlign(CENTER, CENTER);
      textSize(48);
      fill("#a8423d");
      text("Sketch & Switch!", width / 2, height / 2 - 100);

      textSize(24);
      fill(0);
      text("Press Start to Begin", width / 2, height / 2 - 40);

      imageMode(CENTER);
      image(studentImg, width / 4, height / 2 - 30, studentImg.width / 3, studentImg.height / 3);
    }
    return;
  }

  // Serial data handling (reads only once per frame to prevent lag)
  if (port.opened()) {
    isConnected = true;
    statusText.html("Status: Connected");

    let data = port.readUntil("\n");
    if (data && data.trim().length > 0) {
      processSerial(data.trim()); // pass cleaned data to be handled
    }
  } else {
    isConnected = false;
    statusText.html("Status: Not connected");
  }

  // Draw a small dot when not drawing (cursor)
  if (!drawingEnabled && isConnected) {
    fill(currentColor);
    noStroke();
    ellipse(xPos, yPos, 6, 6);
  }
}

function processSerial(data) {
  // Parse "x,y,button" format from Arduino
  let parts = data.split(",");
  if (parts.length !== 3) return;

  let xVal = int(parts[0]);
  let yVal = int(parts[1]);
  let btn = int(parts[2]);

  if (isNaN(xVal) || isNaN(yVal) || isNaN(btn)) return;

  // Map potentiometer values to canvas dimensions
  xPos = map(xVal, 0, 1023, 0, width);
  yPos = map(yVal, 0, 1023, 0, height);

  // Toggle drawing mode when button is pressed
  if (btn === 1 && lastButtonState === 0) {
    drawingEnabled = !drawingEnabled;
    currentColor = color(random(255), random(255), random(255));
    prevX = xPos;
    prevY = yPos;
  }

  // Draw if in drawing mode
  if (drawingEnabled) {
    stroke(currentColor);
    strokeWeight(2);
    line(prevX, prevY, xPos, yPos);
    prevX = xPos;
    prevY = yPos;
  }

  lastButtonState = btn; // update for debounce
}

 

Arduino and p5.js Communication

Communication is established via serial connection:

    • Arduino: Sends comma-separated values (X, Y, ButtonState) at a set interval.
    • p5.js: Reads incoming serial data, parses the values, and updates the drawing accordingly.

 

Highlights

I’m proud that the push button consistently toggles between the drawing modes of randomized thickness and color. Moreover, with a well-chosen baud rate and careful data handling, the potentiometers respond with minimal lag, producing smoother, finer movement.

In addition, the project has clear on-screen instructions and simple controls that allow users of any age to get involved, whether they are experts or complete newcomers to physical computing.

Areas for Future Improvement

While Sketch & Switch functions smoothly, there’s still plenty of room to build on its foundations:

    • Add a feature that lets users position the drawing point before enabling drawing mode, giving them more control over line placement.
    • Add a color wheel and thickness slider to the UI so users can manually choose colors and line widths rather than relying solely on randomness.
    • Add an undo button to let users correct mistakes without having to clear the entire canvas.
    • Replace the current components with larger potentiometers and a larger push button for improved tactile feedback and easier control.

Final Project

Hand-Drawn Shapes Recognition System

Project Overview

Please find the code on my GitHub repository.

The Hand-Drawn Shapes Recognition system is an innovative interactive desktop application that combines computer vision and machine learning to recognize shapes drawn by users in real-time. The project emerged from the recognition that while humans can easily identify simple hand-drawn shapes, creating a computer system to replicate this capability presents significant challenges due to variations in drawing styles, imprecisions, and the inherent ambiguity of hand-drawn content. The system addresses these challenges through a sophisticated hybrid approach that leverages both traditional computer vision techniques and modern machine learning methods.

At its core, the application provides users with an intuitive drawing canvas where they can create shapes using either a mouse/touchpad or a connected Arduino controller. Once a shape is drawn, the system processes the image using OpenCV for preliminary analysis and contour detection, then employs a dual-recognition strategy combining geometric feature analysis with an SVM classifier to identify the drawn shape with high accuracy. This hybrid approach enables the system to recognize various shapes even with limited training data, making it both powerful and adaptable to individual users’ drawing styles.

Beyond mere recognition, the system offers a conversational interface that provides dynamic feedback based on recognition confidence and established interaction patterns. The application continually improves its recognition capabilities through user feedback, saving labeled drawings to a training database and supporting incremental model training in the background, effectively “learning” from each interaction.

System Architecture

The system employs a modular architecture with clearly defined components that handle specific aspects of the application’s functionality. This approach enhances maintainability, supports extensibility, and simplifies the debugging process. The architecture consists of four main component categories: Core Components, UI Components, Data Management, and Hardware Integration.

The core components form the backbone of the application’s functionality. The Recognition System serves as the central element, implementing the hybrid approach to shape recognition. It orchestrates the entire recognition process from image preprocessing to final shape classification. This component contains several specialized classes including the DrawingRecognizer that coordinates the recognition workflow, the ShapeFeatureExtractor for deriving geometric and statistical features from contours, the ShapeClassifier for machine learning classification, and the GeometricAnalyzer for traditional computer vision approaches to shape identification.

Supporting the recognition system, the Drawing Manager bridges the UI and recognition system, managing drawing operations and history tracking. The Conversation Manager handles the AI assistant’s responses, providing dynamic, context-aware feedback based on recognition results and interaction history. The Text-to-Speech component adds an auditory dimension to the user experience, verbalizing the AI assistant’s responses through multiple TTS engine options.

The UI components provide the visual interface through which users interact with the system. The Main Window contains the primary application interface, housing the drawing canvas and AI response display. The Canvas component serves as the interactive drawing surface, handling mouse events and supporting features like undo/redo, zoom, and grid display. Complementing these elements, the Toolbar offers access to drawing tools such as color selection and pen size adjustments, while various dialog screens provide access to settings, training data management, and shape labeling.

Data management components ensure persistence and organized data handling. The Database Interface manages data storage using SQLite, maintaining records of user settings, labeled shapes, and drawing history. User Settings handles application preferences, while Drawing History tracks past drawings and recognition results, allowing users to review their progression over time.

Recognition Technology

The recognition technology represents the system’s most sophisticated aspect, implementing a dual-approach strategy that combines traditional computer vision techniques with machine learning. This hybrid methodology provides robust baseline performance through geometric analysis while continuously improving through machine learning from user interactions.

The recognition process begins with image preprocessing, where the drawn shape is converted to grayscale, Gaussian blur is applied to reduce noise, and adaptive thresholding creates a binary image. The system then performs contour detection using OpenCV to identify shapes within the image, extracting the largest contour as the primary shape of interest. This approach effectively isolates the intended shape even when the drawing contains imperfections or stray marks.

Feature extraction forms the next critical step in the process. The ShapeFeatureExtractor class derives a comprehensive set of geometric and statistical features from the identified contour. These features include basic metrics such as area, perimeter, and bounding box dimensions; shape properties including circularity, convexity, and solidity; moment-based features like Hu Moments that provide rotation, scale, and translation invariance; multiple levels of contour approximation; corner analysis examining count, angles, and distributions; symmetry analysis measuring vertical and horizontal symmetry; and enclosing shape analysis testing fit against geometric primitives.
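As a concrete example of one of these features, circularity is conventionally defined as 4πA/P², which equals 1 for a perfect circle and decreases for elongated or irregular shapes. The project itself is written in Python with OpenCV; this small JavaScript illustration only shows the formula, not the project's code:

```javascript
// Circularity = 4 * pi * area / perimeter^2.
function circularity(area, perimeter) {
  return (4 * Math.PI * area) / (perimeter * perimeter);
}

// A circle of radius 10: area = pi*100, perimeter = 2*pi*10.
console.log(circularity(Math.PI * 100, 2 * Math.PI * 10)); // 1
// A 10x10 square: area 100, perimeter 40 -> ~0.785.
console.log(circularity(100, 40).toFixed(3)); // "0.785"
```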

With features extracted, the GeometricAnalyzer applies traditional computer vision approaches to classify the shape. This component implements specialized detectors for common shapes like rectangles, triangles, ellipses, and hexagons. Each detector analyzes the extracted features against known geometric patterns, calculating confidence scores that reflect how closely the drawing matches each potential shape type. This rule-based approach provides strong baseline recognition even before machine learning is applied.

The machine learning component, implemented in the ShapeClassifier class, adds another dimension to the recognition process. Using scikit-learn’s LinearSVC as the primary classifier, the system categorizes shapes based on their extracted features. The classification pipeline includes feature standardization to normalize values to zero mean and unit variance, feature selection using ANOVA F-value to focus on the most discriminative attributes, and finally, classification with LinearSVC including class balancing to handle imbalanced training data. This approach yields high accuracy even with limited training examples.

The final recognition decision combines results from both approaches. Geometric analysis provides baseline recognition scores, while machine learning classification results receive higher weighting when available. Confidence scores are normalized and ranked, with the system returning the top guesses along with their confidence levels. This dual-approach strategy leverages the strengths of both paradigms, producing recognition that is both accurate and continuously improving.
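The weighted fusion could look roughly like this. The 0.7 machine-learning weight and the top-3 cutoff are illustrative assumptions, not values taken from the project.

```python
def fuse_scores(geometric, learned, ml_weight=0.7):
    """Weighted fusion of two {shape: confidence} dicts, normalized, top 3."""
    combined = {}
    for shape in set(geometric) | set(learned or {}):
        g = geometric.get(shape, 0.0)
        m = (learned or {}).get(shape, 0.0)
        # ML scores get higher weight when a trained model is available;
        # otherwise fall back to the geometric baseline alone
        combined[shape] = (1 - ml_weight) * g + ml_weight * m if learned else g
    total = sum(combined.values()) or 1.0
    ranked = sorted(combined.items(), key=lambda kv: kv[1], reverse=True)
    return [(s, v / total) for s, v in ranked[:3]]
```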

 

Here is an example of the debug images generated during recognition.

User Interface

The user interface prioritizes intuitive interaction while providing access to the system’s advanced capabilities. Built with PyQt5, the interface combines simplicity with functionality to accommodate both novice and experienced users. The UI consists of several key elements designed to work together harmoniously while maintaining a clear separation of functions.

The drawing canvas serves as the primary interaction point, providing a responsive surface where users can create shapes. The canvas supports freehand drawing with customizable pen properties including color and thickness. Drawing operations benefit from features like undo/redo capability (supporting up to 50 steps), zoom and pan functionality for detailed work, optional grid display for alignment assistance, and pressure sensitivity support for hardware that offers this capability. An auto-save function ensures work is preserved even in the event of unexpected issues.
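The 50-step undo/redo history can be modeled as a bounded stack. This is a generic sketch of the mechanism, not the application's actual implementation.

```python
from collections import deque

class UndoStack:
    """Bounded undo/redo history (up to max_steps snapshots)."""

    def __init__(self, max_steps=50):
        self._undo = deque(maxlen=max_steps)  # oldest entries drop off
        self._redo = []

    def push(self, snapshot):
        self._undo.append(snapshot)
        self._redo.clear()  # a new action invalidates the redo chain

    def undo(self):
        if not self._undo:
            return None
        snapshot = self._undo.pop()
        self._redo.append(snapshot)
        return snapshot

    def redo(self):
        if not self._redo:
            return None
        snapshot = self._redo.pop()
        self._undo.append(snapshot)
        return snapshot
```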

Complementing the canvas, the toolbar provides access to essential drawing tools and functions. Users can select pen color from a palette or using a color picker, adjust stroke thickness through a slider control, toggle between pen and eraser modes, clear the canvas with a single click, and access undo/redo functions for correcting mistakes. The toolbar’s layout prioritizes frequently used functions while maintaining a clean, uncluttered appearance that doesn’t distract from the drawing process.

The information panel displays the AI assistant’s responses and recognition results. After recognition, this area shows the top shape guesses along with their confidence percentages, presented in a clear, easy-to-understand format. The assistant’s conversational responses provide context-aware feedback, varying based on recognition confidence and previous interactions to avoid repetitive messaging. This panel also offers buttons for confirming or correcting the AI’s guesses, facilitating the training feedback loop that improves recognition over time.

Dialog screens provide access to less frequently used functions without cluttering the main interface. The Settings Dialog allows users to customize application behavior through categories including general settings, drawing tool properties, recognition parameters, and text-to-speech options. The Training Dialog displays statistics on the training data, showing labeled shapes in the database and allowing management of training examples. The Label Dialog facilitates correction of misrecognized shapes, capturing user feedback that enhances model performance.

The user experience flow has been carefully designed to feel natural and responsive. When drawing a shape, users experience immediate visual feedback as their strokes appear on the canvas. Upon requesting recognition (either through the UI button or Arduino controller), the system processes the drawing and promptly displays results in the information panel. The conversational AI response provides context to the recognition, often suggesting improvements or offering praise based on recognition confidence. If the system misidentifies a shape, users can easily correct it, with the application acknowledging this feedback and incorporating it into future recognition attempts.

Hardware Integration

The system extends beyond traditional mouse and keyboard input through its Arduino integration, offering a novel physical interaction method that enhances the drawing experience. This hardware component connects through a serial interface and enables users to draw shapes using physical controls rather than conventional computer input devices.

The Arduino controller serves as an alternative input method, allowing users to draw with a joystick and trigger various actions with physical buttons. Five buttons are mapped to specific functions: triggering shape recognition, clearing the canvas, changing drawing color, adjusting stroke thickness, and toggling drawing mode. These correspond to pins 2 through 6 on the Arduino board. Drawing control comes from a two-axis joystick (a pair of potentiometers) read on analog pins A0 and A1, providing analog control of the cursor position. This physical interface provides a more tactile drawing experience that some users may find more intuitive than mouse-based drawing.

The system implements robust connection management for the Arduino controller. At application startup, the program automatically scans available serial ports to detect connected Arduino devices. Once detected, a dedicated thread continuously reads input data, translating it into drawing actions within the application. The connection management system includes auto-reconnect capability, allowing the application to recover from temporary disconnections without requiring user intervention. This ensures reliable hardware communication even in environments where connections might be intermittent.

Data processing for Arduino input employs a buffering system to handle the potentially variable data rate from the serial connection. Incoming data follows a structured format that indicates the input type (button press or analog input) and its value, with the application parsing this information and triggering appropriate actions in response. Analog inputs are normalized and mapped to canvas coordinates, ensuring smooth and predictable cursor movement despite potential variations in potentiometer readings.
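Since the exact wire format is not documented here, the following sketch assumes a hypothetical line-based protocol ("B:<n>" for button presses, "A:<x>,<y>" for raw analog readings) purely to illustrate the parsing and normalization steps.

```python
def parse_controller_line(line, canvas_w=800, canvas_h=600):
    """Parse one serial line into an action.

    Hypothetical wire format (an assumption, not the project's actual one):
      "B:<n>"      -> button n pressed
      "A:<x>,<y>"  -> raw 10-bit analog readings (0-1023 each)
    """
    line = line.strip()
    if line.startswith("B:"):
        return ("button", int(line[2:]))
    if line.startswith("A:"):
        x_raw, y_raw = (int(v) for v in line[2:].split(","))
        # Normalize ADC readings to canvas coordinates for smooth,
        # predictable cursor movement
        x = round(x_raw / 1023 * (canvas_w - 1))
        y = round(y_raw / 1023 * (canvas_h - 1))
        return ("move", (x, y))
    # Malformed data is ignored rather than crashing the reader thread
    return ("ignore", None)
```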

Error handling for hardware integration is particularly robust, accounting for common issues like connection loss, malformed data, or hardware failures. The system implements graceful degradation when hardware components are unavailable, automatically falling back to mouse/keyboard input if the Arduino connection cannot be established or is lost during operation. Users receive clear notifications about hardware status through the application’s status bar, ensuring they remain informed about available input methods.

Development Journey

The development of the Hand-Drawn Shapes Recognition system followed an iterative process with distinct phases that progressively enhanced functionality and performance. Each phase built upon previous achievements while addressing limitations and incorporating user feedback, though not without significant challenges along the way.

The project began with the foundation phase, establishing the basic architecture and developing core components. During this period, I implemented the PyQt5-based user interface with a functional drawing canvas and basic shape extraction using OpenCV. Initial recognition relied solely on geometric analysis using traditional computer vision techniques, providing reasonable accuracy for well-drawn shapes but struggling with imprecise or ambiguous drawings. This phase established the project’s technical foundation while highlighting the need for more sophisticated recognition approaches.

As development progressed, I introduced machine learning to enhance recognition capabilities. Initial experiments with various classifiers led to my selection of Support Vector Machines as the primary classification algorithm due to their effectiveness with limited training data. The first training dataset consisted of manually labeled examples across eight shape categories: cross, square, other, triangle, ellipse, rectangle, hexagon, and line. This initial training process demonstrated the potential of machine learning while revealing challenges in data collection and feature selection.

A significant milestone occurred when the training dataset expanded to over 10,000 samples. Console output from this period reveals the distribution across shape categories: ellipse with 2,970 samples, rectangle with 2,680 samples, triangle with 2,615 samples, and “other” with 1,950 samples represented the majority classes, while cross (36 samples), square (17 samples), hexagon (16 samples), and line (20 samples) constituted minority classes. Training the model with this imbalanced dataset required careful consideration of class weights to prevent bias toward majority classes. The training process extracted 36 numeric features from each sample, excluding non-numeric data that might compromise model performance.

The training process during this phase required several minutes of computation, highlighting performance considerations that would later be addressed through optimization. Despite these challenges, the model achieved impressive accuracy metrics with 0.99 training accuracy and 0.97 test accuracy. These results validated the machine learning approach while providing a baseline for future improvements. Upon completion, the model was saved to disk as “shape_classifier.pkl” for subsequent use by the application.

One of the most devastating challenges I faced during development was losing approximately 60% of the codebase due to a catastrophic data loss incident. Most critically, this included a highly refined model that I had trained on over 10 million shapes from Google’s Quick Draw dataset. This advanced model represented tens of hours of training time across multiple GPU instances and had achieved significantly higher accuracy rates than previous iterations, particularly for complex and ambiguous shapes. Rebuilding after this loss required considerable effort, recreating critical code components from memory and documentation while working to reconstruct a training dataset that could approach the quality of the lost model.

Hardware integration represented another significant development phase. The Arduino controller implementation expanded the application’s input options while introducing new technical challenges related to serial communication and input mapping. I worked through issues of connection reliability, data parsing, and input calibration to create a seamless experience across both traditional and hardware-based input methods. This integration demonstrated the system’s flexibility while providing a novel interaction method that some users preferred over mouse-based drawing.

Throughout the development journey, user feedback played a crucial role in refining the system. Early testing revealed usability issues in the drawing interface that I addressed through UI refinements. Recognition errors highlighted gaps in the training data, leading to targeted data collection for underrepresented shape categories. Performance concerns during recognition and training prompted the optimization efforts that would become a major focus in later development stages.

Performance Optimization

As the system evolved and the training dataset grew, performance optimization became increasingly important to maintain responsiveness and enable real-time recognition. I implemented several key optimizations that significantly improved both training speed and runtime performance, particularly critical after losing my previous highly-optimized model.

A fundamental enhancement involved replacing the original SVC (Support Vector Classifier) implementation with LinearSVC, dramatically reducing computational complexity from O(n³) to O(n). This change resulted in training times that scaled linearly rather than cubically with dataset size, making it practical to train with larger datasets and more features. For a dataset with over 10,000 samples, this optimization reduced training time from hours to minutes, enabling more frequent model updates and facilitating experimentation with different feature sets and hyperparameters.

The feature extraction process, initially a bottleneck during both training and recognition, benefited from several optimizations. I implemented parallel feature extraction using Python’s multiprocessing capabilities, distributing the CPU-intensive work of calculating geometric features across multiple processor cores. This approach achieved near-linear speedup on multi-core systems, significantly reducing processing time for large batches of training images. Additionally, vectorizing operations with NumPy replaced inefficient Python loops with optimized array operations, further accelerating the feature calculation process. These optimizations were essential not just for performance, but for helping me recover from the lost advanced model by making retraining more efficient.
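The multiprocessing approach can be sketched as follows. Here extract_features is a trivial stand-in for the real geometric feature computation, which is far more expensive per sample.

```python
from multiprocessing import Pool

import numpy as np

def extract_features(sample):
    """Stand-in for the per-image geometric feature computation."""
    arr = np.asarray(sample, dtype=float)
    # Vectorized NumPy operations instead of Python loops
    return [arr.mean(), arr.std(), float(arr.max() - arr.min())]

def extract_all(samples, workers=4):
    """Distribute CPU-bound feature extraction across processor cores."""
    with Pool(processes=workers) as pool:
        return pool.map(extract_features, samples)
```

Because each sample is independent, `Pool.map` achieves near-linear speedup on multi-core systems for CPU-bound work like this.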

Data management optimizations addressed I/O-related performance issues. The system implemented batch loading and preprocessing of training data, reducing disk access frequency and allowing more efficient memory utilization. Feature caching stored pre-computed features for training examples, eliminating redundant calculations when retraining the model or performing incremental updates. Database operations were optimized with appropriate indexing and query strategies, ensuring efficient retrieval of training examples and user settings even as the database grew in size.

The recognition pipeline itself underwent substantial optimization to support real-time feedback. The system implemented adaptive algorithm selection, applying simpler, faster recognition methods for clear, well-formed shapes while reserving more computationally intensive analysis for ambiguous cases. Feature selection using techniques like Principal Component Analysis (PCA) and SelectKBest reduced the dimensionality of the feature space without significantly impacting accuracy, accelerating both training and inference. Memory management techniques minimized allocations during recognition, reducing garbage collection overhead and preventing memory-related performance degradation.
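The PCA step can be sketched like this; the 95% retained-variance target is an illustrative assumption.

```python
import numpy as np
from sklearn.decomposition import PCA

def reduce_features(X, variance=0.95):
    """Keep just enough principal components to explain `variance` of the data."""
    pca = PCA(n_components=variance)  # float selects components by variance
    return pca.fit_transform(X), pca
```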

Command-line options added during this phase provided further optimization capabilities. The --retry flag enabled an automatic retry mechanism for failed samples, improving training robustness. Users could configure the maximum number of retry attempts with --max-retry-attempts (defaulting to 3) and specify the minimum required samples per shape class with --min-samples (defaulting to 10). For situations where machine learning was unnecessary or unavailable, the --geometric-only option limited recognition to the geometric analysis path, reducing computational requirements. The --output option allowed specifying a custom output path for the trained model, facilitating experimentation with different model configurations.
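These flags map naturally onto Python's argparse. The sketch below uses the flag names and defaults stated above; the --output default is an assumption based on the model filename mentioned earlier, and the help strings are illustrative.

```python
import argparse

def build_arg_parser():
    """CLI flags for training, as described in the text above."""
    p = argparse.ArgumentParser(description="Train the shape classifier")
    p.add_argument("--retry", action="store_true",
                   help="automatically retry failed samples")
    p.add_argument("--max-retry-attempts", type=int, default=3,
                   help="maximum retry attempts per failed sample")
    p.add_argument("--min-samples", type=int, default=10,
                   help="minimum samples required per shape class")
    p.add_argument("--geometric-only", action="store_true",
                   help="skip machine learning; use geometric analysis only")
    p.add_argument("--output", default="shape_classifier.pkl",
                   help="output path for the trained model")
    return p
```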

These optimization efforts transformed the application from a proof-of-concept demonstration to a practical tool suitable for regular use. Recognition response times decreased from seconds to sub-second levels, providing the immediate feedback essential for a satisfying user experience. Training times reduced dramatically, enabling more frequent model updates and supporting the incremental learning approach that helped the system adapt to individual users’ drawing styles.

Future Enhancements

The Hand-Drawn Shapes Recognition system establishes a solid foundation that can be extended in numerous directions to enhance functionality, improve performance, and expand applicability. While the current implementation successfully addresses the core recognition challenge, several potential enhancements have been identified for future development iterations.

Advanced machine learning represents a promising direction for further development. Integrating deep learning approaches, particularly convolutional neural networks (CNNs), could improve recognition accuracy for complex shapes without requiring explicit feature engineering. Transfer learning from pre-trained models would enable leveraging existing visual recognition capabilities while reducing the required training data volume. Implementing ensemble methods combining multiple classifiers could enhance recognition robustness, especially for ambiguous cases where different approaches might yield complementary insights.

User experience enhancements could make the application more intuitive and powerful. Implementing multi-shape recognition would allow the system to identify multiple distinct shapes within a single drawing, expanding its applicability to more complex diagrams. A shape suggestion system could provide real-time guidance as users draw, helping them create more recognizable shapes. Enhanced drawing tools including shape creation templates, text annotation, and layer support would transform the application from a recognition demonstrator to a complete drawing tool with intelligent recognition capabilities.

Platform expansion represents another potential development path. Creating web and mobile versions of the application would increase accessibility, allowing users to benefit from shape recognition across different devices. Cloud-based training and recognition would enable sharing improvements across the user base, with each user’s corrections potentially improving the system for everyone. API development would allow third-party integration, enabling other applications to leverage the recognition capabilities for their own purposes.

Educational applications offer particularly promising opportunities. Developing specialized modes for teaching geometry could help students learn shape properties through interactive drawing and recognition. Creating games based on shape recognition would make learning engaging while simultaneously gathering valuable training data. Implementing custom shape sets would allow teachers to create domain-specific recognition tasks targeting particular educational objectives.

Accessibility improvements could make the system more inclusive. Enhanced text-to-speech integration would better serve users with visual impairments, providing more detailed auditory feedback about recognition results and drawing state. Implementing alternative input methods beyond the current mouse/touchpad and Arduino options could accommodate users with different abilities and preferences. Creating profiles for different user needs would allow the interface and recognition parameters to adapt automatically based on individual requirements.

The continuous improvement framework established in the current implementation provides a solid foundation for these enhancements. The modular architecture facilitates adding new components without disrupting existing functionality, while the dual-approach recognition strategy can incorporate new techniques alongside proven methods. As the system evolves, it will continue building on its core strengths while expanding to address new challenges and opportunities in shape recognition and interactive drawing.

Conclusion

The Hand-Drawn Shapes Recognition system represents my creation of a sophisticated blend of computer vision, machine learning, and interactive design, resulting in an application that not only recognizes hand-drawn shapes but continuously improves through user interaction. By implementing a hybrid approach combining geometric analysis with machine learning, my system achieves high recognition accuracy even with limited initial training data, while establishing a framework for ongoing enhancement through user feedback.

My development journey illustrates the iterative process of building intelligent interactive systems, progressing from basic geometric analysis to sophisticated machine learning while continuously refining the user experience. This journey included overcoming significant setbacks, most notably losing 60% of the codebase and an advanced model trained on over 10 million Quick Draw shapes that represented tens of hours of training time. Despite these challenges, I persevered, rebuilding critical components and implementing performance optimizations that transformed promising algorithms into a responsive application suitable for regular use, demonstrating how theoretical approaches can be successfully adapted to practical applications through thoughtful engineering, resilience, and attention to user needs.

The system’s modular architecture and dual-approach recognition strategy provide a flexible foundation for future development, supporting enhancements from advanced machine learning techniques to expanded platform support and specialized applications. This extensibility ensures the project can evolve to address new requirements and incorporate emerging technologies while maintaining its core functionality and user-friendly design, and provides resilience against potential future setbacks similar to my experience with data loss.

Throughout development, the balance between sophistication and accessibility remained a central consideration. While implementing advanced recognition techniques, I maintained focus on creating an intuitive interface that hides complexity from users while providing transparent feedback about recognition results. This approach makes the technology accessible to users regardless of their technical background, fulfilling my project’s goal of bridging the gap between human perceptual abilities and computer vision capabilities.

The Hand-Drawn Shapes Recognition system stands as both a practical application and a technological demonstration of my work, showing how computer vision and machine learning can enhance human-computer interaction in creative contexts. The project also represents my perseverance through significant technical challenges and data loss, emerging stronger with more efficient algorithms and robust error handling. As the system continues to evolve under my development, it will further narrow the gap between how humans and computers perceive and interpret visual information, creating increasingly natural and intuitive interaction experiences while maintaining resilience against the inevitable challenges of complex software development.

Week 14 – Final Project

Concept

My project is a DIY Disco that combines Arduino and p5.js to create an interactive audio-visual experience. The Arduino handles the physical buttons, DC motor and controls a NeoPixel LED strip, while p5.js is responsible for generating a visualizer that reacts to the music.

How It Works

The project consists of two main components:

Arduino Control and LED Display

  • Three physical buttons are connected to the Arduino:
      • Button 1: Activates the airhorn sound effect
      • Button 2: Increases the speed of the music
      • Button 3: Shuffles to a different song
  • The NeoPixel strip lights up with different patterns that cycle when the Shuffle button (Button 3) is pressed

p5.js Visualizer and Music Control

  • p5.js generates a circular music visualizer that responds to the song’s frequencies. It receives signals from the Arduino through serial communication to trigger changes in the visualizer based on button presses.

Interaction Design

The interaction design is simple and engaging: pressing physical buttons on the Arduino changes the music’s speed, shuffles the track, or plays a sound effect while the NeoPixel lights and p5.js visualizer respond instantly.

Arduino Code

My project features three physical buttons connected to the Arduino. Button 1 sends a signal ("1") to p5.js that triggers the airhorn effect; Button 2 sends a signal ("2") that changes the playback speed of the song; and Button 3 sends a signal ("3") that shuffles to the next song and also cycles through five distinct LED animation patterns. Alongside this, the Arduino manages the motor that spins the vinyl. Through serial communication with p5.js, the motor's state is toggled based on signals received: a '1' turns the motor on, while a '0' stops it and clears the LED display. Below is the code:

#include <Adafruit_NeoPixel.h>

const int button1Pin = 2;
const int button2Pin = 3;
const int button3Pin = 4;
const int bin1Pin = 8;
const int bin2Pin = 7;
const int pwmBPin = 9;
bool motorRunning = true;

#define NEOPIXEL_PIN 6
#define LED_COUNT    60
#define Num_LED      56

Adafruit_NeoPixel strip(LED_COUNT, NEOPIXEL_PIN, NEO_GRBW + NEO_KHZ800);

unsigned long lastButtonPress1 = 0;
unsigned long lastButtonPress2 = 0;
unsigned long lastButtonPress3 = 0;
unsigned long lastActionTime1 = 0;
unsigned long lastActionTime2 = 0;
unsigned long lastActionTime3 = 0;
const unsigned long debounceDelay = 100;
const unsigned long cooldown = 1000;

int currentPattern = 0;
const int totalPatterns = 5;

unsigned long lastPatternUpdate = 0;
const unsigned long patternInterval = 80;

int snakeIndex = 0;
float hueOffset = 0;

void setup() {
  Serial.begin(9600);
  pinMode(button1Pin, INPUT_PULLUP);
  pinMode(button2Pin, INPUT_PULLUP);
  pinMode(button3Pin, INPUT_PULLUP);
  pinMode(bin1Pin, OUTPUT);
  pinMode(bin2Pin, OUTPUT);
  pinMode(pwmBPin, OUTPUT);
  strip.begin();
  strip.show();
  strip.setBrightness(180);
}

void loop() {
  unsigned long currentMillis = millis();

  handleButtons(currentMillis);
  handleSerial();
  controlMotor();

  if (currentMillis - lastPatternUpdate >= patternInterval) {
    lastPatternUpdate = currentMillis;
    runPattern(currentPattern);
  }
}

void handleButtons(unsigned long currentMillis) {
  if (digitalRead(button1Pin) == LOW && currentMillis - lastButtonPress1 >= debounceDelay && currentMillis - lastActionTime1 >= cooldown) {
    Serial.println("1");
    lastButtonPress1 = currentMillis;
    lastActionTime1 = currentMillis;
  }

  if (digitalRead(button2Pin) == LOW && currentMillis - lastButtonPress2 >= debounceDelay && currentMillis - lastActionTime2 >= cooldown) {
    Serial.println("2");
    lastButtonPress2 = currentMillis;
    lastActionTime2 = currentMillis;
  }

  if (digitalRead(button3Pin) == LOW && currentMillis - lastButtonPress3 >= debounceDelay && currentMillis - lastActionTime3 >= cooldown) {
    Serial.println("3");
    currentPattern = (currentPattern + 1) % totalPatterns;
    lastButtonPress3 = currentMillis;
    lastActionTime3 = currentMillis;
  }
}

void handleSerial() {
  if (Serial.available() > 0) {
    char incomingByte = Serial.read();
    if (incomingByte == '1') {
      motorRunning = true;
    } else if (incomingByte == '0') {
      motorRunning = false;
      digitalWrite(bin1Pin, LOW);
      digitalWrite(bin2Pin, LOW);
      analogWrite(pwmBPin, 0);
      strip.clear();
      strip.show();
    }
  }
}

void controlMotor() {
  if (motorRunning) {
    digitalWrite(bin1Pin, HIGH);
    digitalWrite(bin2Pin, LOW);
    analogWrite(pwmBPin, 50);
  } else {
    digitalWrite(bin1Pin, LOW);
    digitalWrite(bin2Pin, LOW);
    analogWrite(pwmBPin, 0);
  }
}

// === Pattern Dispatcher ===
void runPattern(int pattern) {
  switch (pattern) {
    case 0: discoFlash(); break;
    case 1: snakeCrawl(); break;
    case 2: colorWave(); break;
    case 3: sparkleStars(); break;
    case 4: fireGlow(); break;
  }
}

// Pattern 0: Disco Flash
void discoFlash() {
  for (int i = 0; i < Num_LED; i++) {
    strip.setPixelColor(i, randomColor());
  }
  strip.show();
}

// Pattern 1: Snake Crawl
void snakeCrawl() {
  strip.clear();
  int snakeLength = 6;
  for (int i = 0; i < snakeLength; i++) {
    int index = (snakeIndex + i) % Num_LED;
    strip.setPixelColor(index, Wheel((index * 5 + hueOffset)));
  }
  snakeIndex = (snakeIndex + 1) % Num_LED;
  hueOffset += 1;
  strip.show();
}

// Pattern 2: Smooth Rainbow Wave
void colorWave() {
  for (int i = 0; i < Num_LED; i++) {
    int hue = (i * 256 / Num_LED + (int)hueOffset) % 256;
    strip.setPixelColor(i, Wheel(hue));
  }
  hueOffset += 1;
  strip.show();
}

// Pattern 3: Sparkle Stars
void sparkleStars() {
  for (int i = 0; i < Num_LED; i++) {
    strip.setPixelColor(i, (random(10) < 2) ? strip.Color(255, 255, 255) : strip.Color(0, 0, 10));
  }
  strip.show();
}

// Pattern 4: Fire Glow
void fireGlow() {
  for (int i = 0; i < Num_LED; i++) {
    int r = random(180, 255);
    int g = random(0, 100);
    int b = 0;
    strip.setPixelColor(i, strip.Color(r, g, b));
  }
  strip.show();
}

// Helpers
uint32_t randomColor() {
  return strip.Color(random(256), random(256), random(256));
}

uint32_t Wheel(byte WheelPos) {
  WheelPos = 255 - WheelPos;
  if (WheelPos < 85) {
    return strip.Color(255 - WheelPos * 3, 0, WheelPos * 3);
  }
  if (WheelPos < 170) {
    WheelPos -= 85;
    return strip.Color(0, WheelPos * 3, 255 - WheelPos * 3);
  }
  WheelPos -= 170;
  return strip.Color(WheelPos * 3, 255 - WheelPos * 3, 0);
}

p5.js Code

The code uses createSerial() to open a serial port, allowing the p5 sketch and the Arduino to exchange data. Values received from the Arduino (via serialPort.readUntil("\n")) trigger different actions within the sketch: a value of 1 plays an airhorn sound, 2 toggles the playback speed of the song, and 3 shuffles to a new random song. Incoming values are checked in the checkButtonPress() function, which responds by playing sounds or changing song attributes.

The visualizer's logic relies on FFT (Fast Fourier Transform) analysis of the audio. The fft.analyze() function breaks the audio into frequency bands, returning a spectrum that represents the amplitude of each frequency in the sound. The visualizer maps these intensities into a circular arrangement around the center of the screen, where each bar's height is determined by the amplitude of its corresponding frequency band. Because the visualizer updates in real time, changing the song or its playback speed is reflected immediately.

// === GLOBAL VARIABLES === //
let state = "landing";
let song;
let fft;
let selectedSong = "";
let sensorValue = 0;
let serialPort;
let serialSpeed = 9600;

let partySongs = ["nowahala.mp3", "umbrella.mp3", "yeah.mp3", "onlygirl.mp3", "hips.mp3","feeling.mp3","romance.mp3","monalisa.mp3","move.mp3","saywhat.mp3","yamore.mp3","adore.mp3","gorah.mp3"];
let airhornSound;

let startButton, continueButton, restartButton;
let isFastSpeed = false;

let bgImg;
let Instructions;
let vis;

// === PRELOAD === //
function preload() {
  bgImg = loadImage('bg.png');
  Instructions = loadImage('instructions.png');
  vis = loadImage('vis.png');
  airhornSound = loadSound("airhorn.mp3");
}

// === SETUP === //
function setup() {
  createCanvas(1460, 760);
  textAlign(CENTER, CENTER);
  angleMode(DEGREES);
  colorMode(HSB);

  // Start Button
  startButton = createButton("Start");
  styleButton(startButton, width / 2 - 45, height / 2 + 200);
  startButton.mousePressed(() => {
    state = "instructions";
    hideAllButtons();
    continueButton.show();
  });

  // Continue Button
  continueButton = createButton("Continue");
  styleButton(continueButton, width / 2 - 60, height / 2 + 200);
  continueButton.mousePressed(() => {
    selectedSong = random(partySongs);
    state = "visualizer";
    hideAllButtons();
  });
  continueButton.hide();

  // Restart Button
  restartButton = createButton("Restart");
  styleButton(restartButton, 20, 20, true);
  restartButton.mousePressed(() => {
    if (song && song.isPlaying()) song.stop();
    song = undefined;
    state = "landing";
    isFastSpeed = false;
    hideAllButtons();
    startButton.show();
    if (serialPort.opened()) serialPort.write("0");
  });
  restartButton.hide();

  // Serial setup
  serialPort = createSerial();
  let previous = usedSerialPorts();
  if (previous.length > 0) {
    serialPort.open(previous[0], serialSpeed);
  }
}

// === STYLE BUTTONS === //
function styleButton(btn, x, y, small = false) {
  btn.position(x, y);
  btn.style("padding", small ? "8px 16px" : "12px 24px");
  btn.style("font-size", small ? "16px" : "18px");
  btn.style("background-color", "#ffc700");
  btn.style("border", "none");
  btn.style("border-radius", "12px");
  btn.mouseOver(() => btn.style("background-color", "#FFFFFF"));
  btn.mouseOut(() => btn.style("background-color", "#ffc700"));
  btn.hide();
}

function hideAllButtons() {
  startButton.hide();
  continueButton.hide();
  restartButton.hide();
}

// === DRAW === //
function draw() {
  let data = serialPort.readUntil("\n");
  if (data.length > 0) {
    sensorValue = int(data);
    checkButtonPress(sensorValue);
  }

  if (state === "landing") {
    showLanding();
    startButton.show();
    if (serialPort.opened()) serialPort.write("0");
  } else if (state === "instructions") {
    showInstructions();
    continueButton.show();
  } else if (state === "visualizer") {
    restartButton.show();
    if (song === undefined) {
      loadSong(selectedSong);
    }
    runVisualizer();
    if (serialPort.opened()) serialPort.write("1");
  }
}

// === LANDING SCREEN === //
function showLanding() {
  image(bgImg, 0, 0, width, height);
}

// === INSTRUCTIONS SCREEN === //
function showInstructions() {
  image(Instructions, 0, 0, width, height);
}

// === LOAD SONG === //
function loadSong(songName) {
  song = loadSound(songName, startSong);
  fft = new p5.FFT();
}

function startSong() {
  song.rate(isFastSpeed ? 1.5 : 1.0);
  song.loop();
}

// === VISUALIZER === //
function runVisualizer() {
  let spectrum = fft.analyze();
  let lowerLimit = 0;
  let upperLimit = Math.floor(spectrum.length / 2);
  let numBars = upperLimit - lowerLimit;
  let radius = 70;
  let angleStep = 360 / numBars;
  let maxBarHeight = height / 1.8;


  image(vis, 0, 0, width, height);

  push();
  translate(width / 2, height / 2);

  for (let j = 0; j < 4; j++) {
    push();
    rotate(j * 90);
    for (let i = lowerLimit; i < upperLimit; i++) {
      let angle = (i - lowerLimit) * angleStep;
      let barHeight = map(spectrum[i], 0, 500, 15, maxBarHeight);
      let xEnd = cos(angle) * (radius + barHeight);
      let yEnd = sin(angle) * (radius + barHeight);
      stroke('#ffc700');
      line(0, 0, xEnd, yEnd);
    }
    pop();
  }

  pop();
}

// === CHECK SERIAL INPUT === //
function checkButtonPress(sensorValue) {
  if (state === "visualizer") {
    if (sensorValue === 1) {
      playAirhorn();
    } else if (sensorValue === 2) {
      toggleSpeed();
    } else if (sensorValue === 3) {
      shuffleNextSong();
    }
  }
}

// === SHUFFLE SONG === //
function shuffleNextSong() {
  let nextSong = random(partySongs);
  if (song && song.isPlaying()) song.stop();
  selectedSong = nextSong;
  isFastSpeed = false;
  loadSong(selectedSong);
}

// === TOGGLE SPEED === //
function toggleSpeed() {
  if (song) {
    isFastSpeed = !isFastSpeed;
    song.rate(isFastSpeed ? 1.5 : 1.0);
  }
}

// === PLAY AIRHORN === //
function playAirhorn() {
  if (airhornSound.isLoaded()) {
    airhornSound.play();
  }
}

 

Reflection/Difficulties

The project was not without its difficulties. One of my primary challenges involved oversensitive buttons: a single press was triggering multiple actions, causing unintended rapid cycling of effects. To address this, I implemented a debounce mechanism and a cooldown timer, which ensured that each button press activated an action only once and prevented continuous cycling. This solution smoothed out the interaction, making the experience more intuitive for the user.
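The cooldown idea can be sketched as a small helper (illustrative JavaScript, not the exact project code; `makeDebouncer` is a hypothetical name):

```javascript
// Minimal debounce/cooldown sketch: an action fires only if enough time
// has passed since the last accepted trigger.
function makeDebouncer(cooldownMs) {
  let lastFired = -Infinity;
  // Returns true (and records the time) only when the cooldown has elapsed.
  return function tryFire(nowMs) {
    if (nowMs - lastFired >= cooldownMs) {
      lastFired = nowMs;
      return true;
    }
    return false;
  };
}

// With a 300 ms cooldown, a burst of readings at 0, 50, and 100 ms
// counts as a single press; a reading at 400 ms counts again.
const tryFire = makeDebouncer(300);
console.log(tryFire(0));   // true  (first press accepted)
console.log(tryFire(50));  // false (still within cooldown)
console.log(tryFire(100)); // false
console.log(tryFire(400)); // true  (cooldown elapsed)
```

In the sketch, the same pattern applies to each sensor-triggered action by passing `millis()` as the current time.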

Navigating the FFT (Fast Fourier Transform) in this project was also a challenge, as it involves converting an audio signal into its frequency components, which then drive the visual effects. The concept of analyzing an audio signal in terms of its frequency spectrum was at first a bit tricky to grasp. The FFT function takes the sound data, decomposes it into various frequency bands, and produces an array of amplitude values that represent the strength of each frequency. The biggest hurdle was understanding how to properly interpret and map these frequency values to create my visuals. For instance, the fft.analyze() function returns an array of amplitudes across a range of frequencies, but to effectively use this data, I needed to determine which frequency bands would be most useful for creating the visualizer’s elements. After some experimentation, I decided to focus on the lower and mid-range frequencies, which seemed to correspond best with the types of beats and musical elements I wanted to visualize.
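That mapping step can be sketched roughly like this (plain JavaScript; `averageBands` is a hypothetical helper, and the 1024-bin, 0-255 amplitude range matches what p5.FFT's analyze() returns by default):

```javascript
// Sketch of reducing an FFT amplitude array into a few coarse bands.
// Low-index bins correspond to bass frequencies, high-index bins to treble,
// so averaging groups of bins gives a handful of values to drive visuals.
function averageBands(spectrum, numBands) {
  const bandSize = Math.floor(spectrum.length / numBands);
  const bands = [];
  for (let b = 0; b < numBands; b++) {
    let sum = 0;
    for (let i = b * bandSize; i < (b + 1) * bandSize; i++) {
      sum += spectrum[i];
    }
    bands.push(sum / bandSize);
  }
  return bands;
}

// A fake spectrum with strong bass and weak treble: the first band
// averages high, the remaining bands average low.
const fake = new Array(1024).fill(0).map((_, i) => (i < 256 ? 200 : 20));
console.log(averageBands(fake, 4)); // → [200, 20, 20, 20]
```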

Another significant issue was the motor's weakness: it required a jump start to function properly, which confused users who struggled to understand why it wasn't working.

Overall, I am very happy with how my project turned out, as it accomplished most, if not all, of what I wanted it to do. If I were to improve it, I would make the NeoPixels visualize the music as well, sending the FFT data to the Arduino so the lights reflect changes in the music. I might also add more sound effects and additional interactive physical elements.

Final Project – Code Black

For my final project, I created a hybrid digital-physical escape room-style game inspired by timed bomb defusal sequences in video games. The entire experience blends physical interactivity through Arduino with dynamic visuals and logic in p5.js. The result is a tense, fast-paced game where players must race against the clock to complete four distinct challenges and successfully input a final disarm code before the device “explodes.”

The core idea was to simulate a bomb defusal setup using varied mini-games—each one testing a different skill: speed, logic, memory, and pattern recognition.

How the Game Works

The digital game runs in p5.js and connects to a single Arduino board that handles inputs from physical buttons, rotary encoders, and switches. Players are given 80 seconds to complete four escalating stages:

  1. Button Mash Challenge – Tap a physical button repeatedly until a counter hits the target.

  2. Math Riddle Quiz – Use a button to select answers and confirm; one wrong answer ends the game.

  3. Note Match – Listen to a played note and match it using one of four physical options.

  4. Morse Code Challenge – Decode a flashing Morse signal and reproduce it using a button.

After all four stages, the player must recall and enter a 4-digit code, derived from the hidden logic behind each stage. If they enter it correctly, the bomb is defused and a green screen confirms success. Otherwise—boom. Game over.
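Since the stages contribute the digits 4, 2, 9, and 7 (the stageDigit values in the code below), assembling the disarm code is just a concatenation in stage order; a minimal sketch:

```javascript
// Each completed stage reveals one hidden digit; the final disarm code
// is their concatenation in stage order (4, 2, 9, 7 in the project code).
const stageDigits = [4, 2, 9, 7];
const disarmCode = stageDigits.join("");
console.log(disarmCode); // "4297"
```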

A countdown timer runs persistently, and sound effects, animations, and images change based on player progress, creating an immersive narrative.

Video Demo:  https://drive.google.com/drive/folders/1xghtShbdS5ApygD3-LrT41DRQsbnQ98U?usp=share_link

Hardware & Physical Build

This game relies heavily on Arduino to provide tactile interaction. Here’s how each component contributes:

  • Button Mash: A simple digital input wired to a button switch.

  • Math Quiz: A designated button allows users to scroll through numeric answers, with a button to lock in their choice.

  • Note Match: A speaker plays a pitch generated from Arduino, and players must select the correct note using four distinct buttons.

  • Morse Code: The p5 screen shows a pattern, which the player must replicate with button presses (dots and dashes).
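The dot/dash classification mirrors the thresholds used in the Arduino code below (a press under 500 ms is a dot, longer is a dash, and anything under 20 ms is ignored as bounce); sketched in JavaScript:

```javascript
// Classify a button-press duration into a Morse symbol.
// Thresholds match the project's Arduino handleMorseCode() logic.
function classifyPress(durationMs) {
  if (durationMs < 20) return null;       // too short: treat as bounce
  return durationMs < 500 ? "." : "-";    // short press = dot, long = dash
}

console.log(classifyPress(10));  // null
console.log(classifyPress(200)); // "."
console.log(classifyPress(800)); // "-"
```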

To enhance the look, I created screen graphics for each stage and embedded them as assets in the p5.js sketch. I also used audio cues (success/failure sounds) to give the player more feedback.

Code + Serial Integration

The p5.js sketch acts as the game engine and controller, managing state transitions, timing, visuals, and logic. Arduino handles all the physical input and sends data to p5.js over serial using a consistent message format.

Initially, I experimented with sending raw characters for stage signals and player responses, but ran into reliability issues. Eventually, I switched to numeric values and simple prefixes, which made parsing much more predictable.
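The prefix scheme can be sketched as a tiny parser (`parseMessage` is a hypothetical helper; the actual sketch inlines this logic in handleSerialData):

```javascript
// Parse the prefixed serial messages exchanged between Arduino and p5.js,
// e.g. "NOTE:3", "SELECT:7", "PRESS:5", or bare commands like "CONFIRM".
function parseMessage(line) {
  const msg = line.trim();               // strip the trailing newline/CR
  const colon = msg.indexOf(":");
  if (colon === -1) {
    return { type: msg, value: null };   // bare command, no payload
  }
  return {
    type: msg.substring(0, colon),                   // prefix, e.g. "NOTE"
    value: parseInt(msg.substring(colon + 1), 10),   // numeric payload
  };
}

console.log(parseMessage("NOTE:3"));  // { type: 'NOTE', value: 3 }
console.log(parseMessage("CONFIRM")); // { type: 'CONFIRM', value: null }
```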

There’s a small but critical serial timing issue to manage — making sure Arduino doesn’t flood the buffer, and that p5 reads and trims data consistently. I handled this using readUntil("\n").trim() on the p5 side and line breaks on the Arduino side.

I also implemented a game reset trigger — pressing “R” after the game ends resets both the p5 and Arduino states and lets the player start over without refreshing the browser.

Arduino Code:

// initialize connections
const int BUTTON_PINS[] = {2, 3, 4, 5}; 
const int BUZZER_PIN = 8;
const int CONFIRM_PIN = 7;
const int POT_PIN = A0;
const int RED_LED_PIN_1 = 9;
const int YELLOW_LED_PIN_1 = 10;
const int GREEN_LED_PIN_1 = 12;
const int RED_LED_PIN_2 = 13;
const int YELLOW_LED_PIN_2 = 11;
const int GREEN_LED_PIN_2 = 6;

// initialize all game variables
int currentPressed = -1;
int targetNote = -1;
bool newRound = true;
bool morsePressed = false;
unsigned long morseStart = 0;
int buttonMashCount = 0;
int currentGame = 0;

bool bombDefused = false;
bool bombExploded = false;
bool gameEnded = false;

unsigned long gameStartTime;
const unsigned long GAME_DURATION = 80000; 

bool inCountdown = false;
unsigned long lastBeepTime = 0;
unsigned long beepInterval = 1000;

int blinkState = 0; 
unsigned long lastBlinkTime = 0;
unsigned long blinkInterval = 400;
bool ledOn = false;

void setup() {
  Serial.begin(9600);
  while (!Serial);
  // setup and initialize all physical connections
  for (int i = 0; i < 4; i++) pinMode(BUTTON_PINS[i], INPUT_PULLUP);
  pinMode(BUZZER_PIN, OUTPUT);
  pinMode(CONFIRM_PIN, INPUT_PULLUP);
  pinMode(POT_PIN, INPUT);
  pinMode(RED_LED_PIN_1, OUTPUT);
  pinMode(YELLOW_LED_PIN_1, OUTPUT);
  pinMode(GREEN_LED_PIN_1, OUTPUT);
  pinMode(RED_LED_PIN_2, OUTPUT);
  pinMode(YELLOW_LED_PIN_2, OUTPUT);
  pinMode(GREEN_LED_PIN_2, OUTPUT);

  randomSeed(analogRead(A1));
  gameStartTime = millis();
}

void loop() {
  if (Serial.available()) {
    String input = Serial.readStringUntil('\n');
    input.trim();

    if (input == "RESET") {
      resetGame(); // reset game variables when reset command is received
      return;
    }
    // identify the target note sent by p5
    if (input.startsWith("NOTE:")) {
      targetNote = input.substring(5).toInt();  // parse the number after the "NOTE:" prefix
      newRound = true;
      return;
    }
    // if bomb is defused on p5, it sends input to arduino and bomb is defused here as well
    if (input == "DEFUSED"){
      bombDefused= true; 
      gameEnded = true;
      return;
    }
    // in case user makes a mistake in games, p5 sends exploded to arduino
    if (input == "EXPLODED") {
      bombExploded = true;
      gameEnded = true;
      Serial.println("EXPLOSION_ACK");
      return;
    }
    // parse the game number sent by p5 each time a challenge is completed and we move to the next one
    currentGame = input.toInt();

    if (currentGame == 0) buttonMashCount = 0;
    if (currentGame == 2) newRound = true;
  }
  // when bomb is defused or explodes
  if (gameEnded) {
    noTone(BUZZER_PIN);
    return;
  }
  // turn off all LEDs
  if (bombExploded || bombDefused) {
    digitalWrite(RED_LED_PIN_1, LOW);
    digitalWrite(YELLOW_LED_PIN_1, LOW);
    digitalWrite(GREEN_LED_PIN_1, LOW);
    digitalWrite(RED_LED_PIN_2, LOW);
    digitalWrite(YELLOW_LED_PIN_2, LOW);
    digitalWrite(GREEN_LED_PIN_2, LOW);

    noTone(BUZZER_PIN);
  }

  unsigned long elapsed = millis() - gameStartTime;
  // handles alternating LED blinking while more than 30 seconds remain
  unsigned long remaining = GAME_DURATION - elapsed;
  if (!gameEnded && !bombDefused) {
  if (remaining > 30000) {
    if (millis() - lastBlinkTime >= 400) {
      lastBlinkTime = millis();
      ledOn = !ledOn;

      digitalWrite(RED_LED_PIN_1, LOW);
      digitalWrite(YELLOW_LED_PIN_1, LOW);
      digitalWrite(GREEN_LED_PIN_1, LOW);
      digitalWrite(RED_LED_PIN_2, LOW);
      digitalWrite(YELLOW_LED_PIN_2, LOW);
      digitalWrite(GREEN_LED_PIN_2, LOW);


      if (ledOn) {
        if (blinkState == 0) {
          digitalWrite(GREEN_LED_PIN_1, HIGH);
          digitalWrite(GREEN_LED_PIN_2, HIGH);}
        else if (blinkState == 1) {
          digitalWrite(YELLOW_LED_PIN_1, HIGH);
          digitalWrite(YELLOW_LED_PIN_2, HIGH);}
        else if (blinkState == 2) {
          digitalWrite(RED_LED_PIN_1, HIGH);
          digitalWrite(RED_LED_PIN_2, HIGH);}

        blinkState = (blinkState + 1) % 3;
      }
    }
  }
  // between 30 and 13 seconds remaining, yellow blinks with beeps
  else if (remaining > 13000) {
    if (millis() - lastBlinkTime >= 500) {
      lastBlinkTime = millis();
      ledOn = !ledOn;

      //ensure other LEDs are off
      digitalWrite(RED_LED_PIN_1, LOW);
      digitalWrite(RED_LED_PIN_2, LOW);
      digitalWrite(GREEN_LED_PIN_1, LOW);
      digitalWrite(GREEN_LED_PIN_2, LOW);

      // Yellow blinking
      digitalWrite(YELLOW_LED_PIN_1, ledOn ? HIGH : LOW);
      digitalWrite(YELLOW_LED_PIN_2, ledOn ? HIGH : LOW);
    }
    // beeps
    if (millis() - lastBeepTime >= 1000) {
      lastBeepTime = millis();
      tone(BUZZER_PIN, 1000, 100);
    }
  }
  // between 13 and 3 seconds remaining, red blinks with faster beeps
  else if (remaining > 3000) {
    if (millis() - lastBlinkTime >= 300) {
      lastBlinkTime = millis();
      ledOn = !ledOn;
      digitalWrite(RED_LED_PIN_1, ledOn ? HIGH : LOW);
      digitalWrite(RED_LED_PIN_2, ledOn ? HIGH : LOW);
    }

    if (millis() - lastBeepTime >= 500) {
      lastBeepTime = millis();
      tone(BUZZER_PIN, 1200, 100);
    }
  }
}
  // bomb explodes because time is up
  if (elapsed >= GAME_DURATION && !bombDefused) {
    bombExploded = true;
    gameEnded = true;
    Serial.println("EXPLODED");
    return;
  }

  switch (currentGame) {
    case 0: handleButtonMash(); break;
    case 1: handleMathQuiz(); break;
    case 2: handleNoteMatch(); break;
    case 3: handleMorseCode(); break;
  }
}
// handle physical input for each game
void handleButtonMash() {
  static unsigned long lastPressTime = 0;
  static bool lastButtonState = HIGH;
  bool currentState = digitalRead(CONFIRM_PIN);
  // each press sends 1 to p5, which increments counter for current presses 
  if (lastButtonState == HIGH && currentState == LOW && millis() - lastPressTime > 200) {
    buttonMashCount++;
    lastPressTime = millis();
    Serial.println("1");
  }
  lastButtonState = currentState;
}
// resetGame function defined to reset all game variables and start game again
void resetGame() {
  bombDefused = false;
  bombExploded = false;
  gameEnded = false;
  buttonMashCount = 0;
  currentPressed = -1;
  currentGame = 0;
  newRound = true;
  morsePressed = false;
  targetNote = -1;
  morseStart = 0;
  gameStartTime = millis();
  ledOn = false;
  blinkState = 0;
  lastBlinkTime = 0;
  lastBeepTime = 0;
  
  // turn off all LEDs and buzzer
  digitalWrite(RED_LED_PIN_1, LOW);
  digitalWrite(YELLOW_LED_PIN_1, LOW);
  digitalWrite(GREEN_LED_PIN_1, LOW);
  digitalWrite(RED_LED_PIN_2, LOW);
  digitalWrite(YELLOW_LED_PIN_2, LOW);
  digitalWrite(GREEN_LED_PIN_2, LOW);
  noTone(BUZZER_PIN);
}

void handleMathQuiz() {
  static int selectedNum = 0;
  static int lastButtonState = HIGH;
  static unsigned long lastDebounceTime = 0;
  const unsigned long debounceDelay = 200;

  int currentState = digitalRead(BUTTON_PINS[0]); // increment button on pin 2

  // handle incrementing selected number
  if (lastButtonState == HIGH && currentState == LOW && (millis() - lastDebounceTime > debounceDelay)) {
    selectedNum = (selectedNum + 1) % 10;
    Serial.print("SELECT:");
    Serial.println(selectedNum);
    lastDebounceTime = millis();
  }
  lastButtonState = currentState;

  // handle confirmation
  if (digitalRead(CONFIRM_PIN) == LOW) {
    delay(50); 
    // sends selected number to p5 when the confirm button is pressed
    if (digitalRead(CONFIRM_PIN) == LOW) {
      Serial.print("PRESS:");
      Serial.println(selectedNum);
      delay(300); 
    }
  }
}

void handleNoteMatch() {
  static unsigned long toneEndTime = 0;
  static bool isPlayingTarget = false;

  // handle new round target note
  if (newRound) {
    noTone(BUZZER_PIN);
    delay(5);
    digitalWrite(BUZZER_PIN, LOW);
    // plays target note sent by p5
    tone(BUZZER_PIN, getPitch(targetNote), 500);
    toneEndTime = millis() + 500;
    isPlayingTarget = true;
    newRound = false;
    return;
  }

  // handle tone playing completion
  if (isPlayingTarget && millis() > toneEndTime) {
    noTone(BUZZER_PIN);
    isPlayingTarget = false;
  }

  // playing note corresponding to button presses 
  if (!isPlayingTarget) {
    bool anyPressed = false;
    for (int i = 0; i < 4; i++) {
      if (digitalRead(BUTTON_PINS[i]) == LOW) {
        anyPressed = true;
        if (currentPressed != i) {
          noTone(BUZZER_PIN);
          delay(5);
          currentPressed = i;
          tone(BUZZER_PIN, getPitch(i));
          Serial.println(i);
        }
        break;
      }
    }
    // no note should play when button is not pressed
    if (!anyPressed && currentPressed != -1) {
      noTone(BUZZER_PIN);
      currentPressed = -1;
    }
    // send final answer confirmation when button is pressed
    if (digitalRead(CONFIRM_PIN) == LOW) {
      noTone(BUZZER_PIN);
      Serial.println("CONFIRM");
      delay(300);
      newRound = true;
    }
  }
}

void handleMorseCode() {
  static unsigned long lastDebounceTime = 0;
  const unsigned long debounceDelay = 50; // ms
  
  int btn = digitalRead(BUTTON_PINS[0]);  // button on pin 2 is used for sending data
  
  // Button press detection with debouncing
  if (btn == LOW && !morsePressed && (millis() - lastDebounceTime) > debounceDelay) {
    morseStart = millis();
    morsePressed = true;
    lastDebounceTime = millis();
  }
  
  // Button release detection
  if (btn == HIGH && morsePressed) {
    unsigned long duration = millis() - morseStart;
    morsePressed = false;
    
    // short press sends . and long press sends -
    if (duration >= 20) {
      Serial.println(duration < 500 ? "." : "-");
    }
    lastDebounceTime = millis();
    delay(100); 
  }

  // pressing the confirm button sends CONFIRM to p5, which checks whether the string formed by the user matches the Morse code provided
  if (digitalRead(CONFIRM_PIN) == LOW) {
    delay(50); // Debounce
    if (digitalRead(CONFIRM_PIN) == LOW) { 
      Serial.println("CONFIRM");
      while(digitalRead(CONFIRM_PIN) == LOW); 
      delay(300);
    }
  }
}
// 4 notes chosen for note match 
int getPitch(int index) {
  int pitches[] = {262, 294, 330, 349};
  return pitches[index];
}

p5.js Code:

let port;
let connectBtn;
let startBtn;
let baudrate = 9600;

// initiate all flags required
let showWelcome = true;
let showInstructions = false;
let gameStarted = false;
let currentGame = 0;
let gameCompleted = [false, false, false, false];
let codeDigits = [];
let userCodeInput = "";
let correctCode = "";
let bombDefused = false;
let bombExploded = false;
let stageCompleted = false;
let stageCompleteTime = 0;
let stageDigit = -1;
let imgWelcome, imgInstructions, imgButtonSmash, imgMathRiddle, imgNoteMatch;
let imgMorseCode1, imgMorseCode2, imgBombDefused, imgBombExploded, imgCodeEntry;
let buttonMashSuccessImg, mathQuizSuccessImg, noteMatchSuccessImg, morseCodeSuccessImg;
let bombSound;
let playedExplosionSound = false;
let successSound;
let playedSuccessSound = false;

// initiate all game variables
let totalTime = 80;
let startTime;

let pressCount = 0;
let targetPresses = 30;
let challengeActive = false;

let selectedAnswer = 0;
let correctAnswer = 5;  
let mathAnswered = false;
let feedback = "";

let currentSelection = -1;
let lockedIn = false;
let noteMessage = "";
let noteAnswerIndex = 0;

let morseCode = "";
let userInput = "";
let roundActive = false;
let showSuccess = false;
let showFailure = false;

function preload() {
  imgWelcome = loadImage("start.png");
  imgInstructions = loadImage("instructions.png");
  imgButtonSmash = loadImage("button_smash.png");
  buttonMashSuccessImg = loadImage("stage1_success.png");
  mathQuizSuccessImg = loadImage("stage2_success.png");
  noteMatchSuccessImg = loadImage("stage3_success.png");
  morseCodeSuccessImg = loadImage("stage4_success.png");
  imgMathRiddle = loadImage("math_riddle.png");
  imgNoteMatch = loadImage("note_match.png");
  imgMorseCode1 = loadImage("morse_code1.png");
  imgMorseCode2 = loadImage("morse_code2.png");
  imgBombDefused = loadImage("defused.png");
  imgBombExploded = loadImage("exploded.png");
  bombSound = loadSound('bomb.mp3');
  successSound = loadSound('success.mp3');
  imgCodeEntry = loadImage("code_entry.png")
}

function setup() {
  createCanvas(600, 600);
  textAlign(CENTER, CENTER);

  port = createSerial();

}

function startGame() {
  startTime = millis(); // Set the start time for the timer
  gameStarted = true; // Set the flag to start the game
  currentGame = 0; // Set the current game to 0
  sendGameSwitch(0); // Send game switch signal to Arduino
  startButtonMashChallenge(); // Start the Button Mash Challenge
}

function draw() {
  background(220);
  
  // displays screen for when bomb is defused along with sound effects
  if (bombDefused) {
      image(imgBombDefused, 0, 0, width, height);
    if (!playedSuccessSound) {
        successSound.play();
        playedSuccessSound = true;
      }
      return;
    }
  // displays screen for when bomb is exploded along with sound effects
  if (bombExploded) {
      image(imgBombExploded, 0, 0, width, height);
      if (!playedExplosionSound) {
        bombSound.play();
        playedExplosionSound = true;
      }
      return;
    }

  // Welcome Screen display
  if (showWelcome) {
    image(imgWelcome, 0, 0, width, height);
    return;
  }

  //Instructions Screen display
  if (showInstructions) {
    image(imgInstructions, 0, 0, width, height);
    return;
  }
  // calculates time to keep track of explosion and so on
  let elapsed = int((millis() - startTime) / 1000);
  let remaining = max(0, totalTime - elapsed);
  
  // if time runs out bomb is exploded
  if (remaining <= 0 && !bombDefused) {
    bombExploded = true;
    return;
  }

  // handle all incoming data by reading and sending to function after trimming
  if (port.opened() && port.available() > 0) {
    let data = port.readUntil("\n").trim();
    if (data.length > 0) {
      handleSerialData(data);
    }
  }

  // toggles success screens for all games 
  if (stageCompleted) {
  switch (currentGame) {
    case 0:
      // Show success screen for Button Mash
      image(buttonMashSuccessImg, 0, 0, width, height);
      break;
    case 1:
      // Show success screen for Math Quiz
      image(mathQuizSuccessImg, 0, 0, width, height);
      break;
    case 2:
      // Show success screen for Note Match
      image(noteMatchSuccessImg, 0, 0, width, height);
      break;
    case 3:
      // Show success screen for Morse Code
      image(morseCodeSuccessImg, 0, 0, width, height);
      break;
  }
    // removes success screen after 3 seconds and moves on to the next game
    if (millis() - stageCompleteTime > 3000) {
      codeDigits.push(stageDigit);
      currentGame++;
      sendGameSwitch(currentGame);
      stageCompleted = false;

      // start the next game
      switch (currentGame) {
        case 1: startMathQuiz(); break;
        case 2: startNoteMatchChallenge(); break;
        case 3: startMorseCodeChallenge(); break;
        case 4: correctCode = "4297"; break; 
      }
    }
    return;
  }

  // display game screens using functions defined 
  switch (currentGame) {
    case 0: drawButtonMashChallenge(); break;
    case 1: drawMathQuiz(); break;
    case 2: drawNoteMatchChallenge(); break;
    case 3: drawMorseCodeChallenge(); break;
    case 4: 
      correctCode = "4297";
      if (userCodeInput === "") startCodeEntry();
      drawCodeEntry();
      break;

  }
  // timer display at top of screen
  if (gameStarted && !bombDefused && !bombExploded) {
    textSize(20);
    fill(0);
    textAlign(CENTER, TOP);
    text("Time Remaining: " + remaining + "s", width / 2, 20);
  }

}

function handleSerialData(data) {
  
  if (bombDefused) {
    return; // no data should be handled if bomb has been defused 
  }
  // stop handling data once bomb explodes
  if (data === "EXPLODED") {
    bombExploded = true;
    if (port.opened()) {
      port.write("EXPLODED\n");
    }
    return;
  }
  switch (currentGame) {
    case 0:
      if (data === "1" && challengeActive) {
        pressCount++;
        //  checks success condition, when user presses button 30 times
        if (pressCount >= targetPresses) {
          challengeActive = false;
          // handle necessary flags for this stage and keep track of time for success screen display
          stageDigit = 4;  
          gameCompleted[0] = true;
          stageCompleted = true;
          stageCompleteTime = millis();
        }
      }
      break;

    case 1:
      if (data.startsWith("SELECT:")) { // parses data for this specific game
        selectedAnswer = int(data.substring(7));  // number after the "SELECT:" prefix
      } else if (data.startsWith("PRESS:")) {  // for confirm button press
        let val = int(data.substring(6));  // digit confirmed by the user
        // success condition
        if (val === correctAnswer) {
          feedback = "CORRECT";
          stageDigit = 2;
          gameCompleted[1] = true;
          stageCompleted = true;
          stageCompleteTime = millis();
        } else {
          // in case of wrong answer
          bombExploded = true;
          if (port.opened()) {
            port.write("EXPLODED\n");
          }
        }
      }
      break;
    // handling data for note match game
    case 2:
      // if user presses confirm button, checks answer
      if (!lockedIn) {
        if (data === "CONFIRM") {
          lockedIn = true;
          // if correct answer is selected 
          if (currentSelection === noteAnswerIndex) {
            noteMessage = "Correct!";
            stageDigit = 9;
            gameCompleted[2] = true;
            stageCompleted = true;
            stageCompleteTime = millis();
            // if user makes a mistake, they lose
          } else {
            bombExploded = true;
            if (port.opened()) {
              port.write("EXPLODED\n");
            }
          }
        } else if (!isNaN(int(data))) {
          currentSelection = int(data);  // reading data for option selected 
        }
      }
      break;
    // parsing user input based on arduino feedback to concatenate morse code and compare with original string
    case 3:
      if (data === "." || data === "-") {
        userInput += data;
        // if user confirms answer 
      } else if (data === "CONFIRM") {
        if (userInput === morseCode) {
          showSuccess = true;
          stageDigit = 7;
          gameCompleted[3] = true;
          stageCompleted = true;
          stageCompleteTime = millis();
          roundActive = false;
          // in case of incorrect answer
        } else {
          bombExploded = true;
          if (port.opened()) {
            port.write("EXPLODED\n");
          }
        } // clears feedback and user input after 5 seconds
        setTimeout(() => {
          showSuccess = false;
          showFailure = false;
          userInput = "";
        }, 5000);
      }
      break;

    case 4:
      // handles code entry
      if (data === "CONFIRM") {
        if (userCodeInput.length !== 4) return; // Ignore if code is incomplete
        if (userCodeInput === "4297") {
          bombDefused = true;
        } else {
          bombExploded = true;
          if (port.opened()) {
            port.write("EXPLODED\n");
          }
        }
      }
      break;
  }
}
// to tell arduino to switch to game being sent
function sendGameSwitch(gameNum) {
  if (port.opened()) {
    port.write(gameNum + "\n");
  }
}

// all game display functions
function startButtonMashChallenge() {
  pressCount = 0;
  challengeActive = true;
}

function drawButtonMashChallenge() {
  image(imgButtonSmash, 0, 0, width, height);
  fill(255);
  textSize(44);
  textAlign(CENTER, CENTER);
  text(pressCount, 300,325); 
}

function startMathQuiz() {
  feedback = "";
  correctAnswer = 5; 
  selectedAnswer = 0;
}


function drawMathQuiz() {
  image(imgMathRiddle, 0, 0, width, height);

  fill(29,148,94);
  rect(width / 2 - 40, 350, 80, 80);
  fill(0);
  textSize(48);
  text(selectedAnswer, width / 2, 370);

}

function startNoteMatchChallenge() {
  lockedIn = false;
  noteMessage = "";
  currentSelection = -1;
  noteAnswerIndex = floor(random(0, 4));
  sendNoteChallenge(noteAnswerIndex);
}

function drawNoteMatchChallenge() {
  image(imgNoteMatch, 0, 0, width, height);
  
  textSize(24);
  textAlign(CENTER, CENTER); 
  fill(0);

  let labels = ["C", "D", "E", "F"];
  let size = 80;
  let spacing = 20;
  let totalWidth = labels.length * size + (labels.length - 1) * spacing;
  let startX = (width - totalWidth) / 2;
  let y = 300;

  for (let i = 0; i < labels.length; i++) {
    let x = startX + i * (size + spacing);
    
    if (i === currentSelection) {
      fill(0, 0, 255);
    } else {
      fill(255);
    }

    rect(x, y, size, size);

    fill(0);
    text(labels[i], x + size / 2, y + size / 2); 
  }

  fill(0);
  textSize(20);
  text(noteMessage, width / 2, height - 50);
}


function startMorseCodeChallenge() {
  morseCode = "..-.--.";
  userInput = "";
  roundActive = true;
  showSuccess = false;
  showFailure = false;
  setTimeout(() => {
    roundActive = false;
  }, 5000);
}
// displays image with code for 5 seconds for user to memorize code 
function drawMorseCodeChallenge() {
  if (roundActive) {
    image(imgMorseCode1, 0, 0, width, height);
  } else {
    image(imgMorseCode2, 0, 0, width, height);}
  fill(50,50,50);
  textSize(24);
  text("User input: " + userInput, width / 2, 300);
}

function drawCodeEntry() {
  image(imgCodeEntry, 0, 0, width, height);
  textSize(24);
  fill(0);
  text(userCodeInput, width / 2, 170);

  for (let i = 0; i <= 9; i++) {
    let x = 140 + (i % 5) * 80;
    let y = 220 + floor(i / 5) * 80;
    fill(200);
    rect(x, y, 60, 60);
    fill(0);
    text(i, x + 30, y + 30);
  }

  fill(255);
  rect(width / 2 - 35, 410, 70, 40);
  fill(0);
  text("Clear", width / 2, 420);
}

function mousePressed() {
  // handles navigation from welcome screen to instructions screen and instructions screen and back
  if (showWelcome) {
    if (mouseX > 112 && mouseX < 224 && mouseY > 508 && mouseY < 547) {
      try {
        // creating serial connection
        if (!port.opened()) {
          let usedPorts = usedSerialPorts();
          if (usedPorts.length > 0) {
            port.open(usedPorts[0], baudrate);
          } else {
            port.open(baudrate);
          }
        }
        console.log("Connected to serial!");
        startGame();
        showWelcome = false;
      } catch (err) {
        console.error("Connection failed:", err);
      }
    }
    
    if (mouseX > 275 && mouseX < 544 && mouseY > 506 && mouseY < 545) {
      showInstructions = true;
      showWelcome = false;
    }
    return;
  }

  if (showInstructions) {
    // Click anywhere to go back
    showInstructions = false;
    showWelcome = true;
  }
  // checks code entry
  if (currentGame === 4 && !bombDefused && !bombExploded) {
    for (let i = 0; i <= 9; i++) {
      let x = 140 + (i % 5) * 80;
      let y = 220 + floor(i / 5) * 80;
      if (mouseX > x && mouseX < x + 60 && mouseY > y && mouseY < y + 60) {
        userCodeInput += i;
        // check the code as soon as the fourth digit is entered
        if (userCodeInput.length === 4) {
          if (userCodeInput === correctCode) {
            bombDefused = true;
            port.write("DEFUSED\n");
          } else {
            bombExploded = true;
          }
        }
        return;
      }
    }

    // clear button (hit area matches the rectangle drawn in drawCodeEntry)
    if (mouseX > width / 2 - 35 && mouseX < width / 2 + 35 &&
        mouseY > 410 && mouseY < 450) {
      userCodeInput = userCodeInput.slice(0, -1);
      return;
    }
  }
}
// asks the Arduino to play the chosen note so users can guess it
function sendNoteChallenge(noteIndex) {
  if (port.opened()) {
    port.write("NOTE:" + noteIndex + "\n");
  }
}
function startCodeEntry() {
  userCodeInput = "";
}

// resets all game variables for reset functionality once game ends 
function resetGame() {
  showWelcome = true;
  showInstructions = false;
  gameStarted = false;
  currentGame = 0;
  gameCompleted = [false, false, false, false];
  codeDigits = [];
  userCodeInput = "";
  correctCode = "";
  bombDefused = false;
  bombExploded = false;
  stageCompleted = false;
  stageCompleteTime = 0;
  stageDigit = -1;
  pressCount = 0;
  challengeActive = false;
  selectedAnswer = 0;
  correctAnswer = 5;
  feedback = "";
  currentSelection = -1;
  lockedIn = false;
  noteMessage = "";
  noteAnswerIndex = 0;
  morserrCode = "";
  userInput = "";
  roundActive = false;
  showSuccess = false;
  showFailure = false;
  playedExplosionSound = false;
  playedSuccessSound = false;
}
// if user presses key once game is over, it restarts everything
function keyPressed() {
  if (key === 'R' || key === 'r') {
    resetGame();
    port.write("RESET\n");
  }
}
Challenges & Lessons Learned
  • Serial Port Management: One recurring headache was managing serial port connections on browser refreshes and game resets. I had to add logic to prevent re-opening already open ports to avoid exceptions.

  • Real-Time Feedback: Timing and responsiveness were crucial. Since the game runs on a strict timer, any lag in serial communication or missed input could break the experience. Careful buffering and validation were necessary.

  • Game Flow Management: Keeping track of game state across 5 different modes, plus timers and sounds, took careful design. The stageCompleted flag and a timed transition window after each success proved essential.
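The port-guard logic from the first bullet can be isolated into a small helper. This is a hedged sketch: `safeOpen` is my name for it, and it assumes a p5.webserial-style port object with `opened()` and `open()`, mirroring the connection code in mousePressed() above.

```javascript
// Sketch of the "don't re-open an already open port" guard.
// Assumes a p5.webserial-style object exposing opened() and open().
function safeOpen(port, baudrate) {
  if (port.opened()) return true;   // already connected: nothing to do
  try {
    port.open(baudrate);
    return true;
  } catch (err) {
    console.error("Connection failed:", err);
    return false;
  }
}

// Minimal stub standing in for the real serial port, for illustration only:
const stubPort = {
  isOpen: false,
  opened() { return this.isOpen; },
  open(baud) { this.isOpen = true; }
};

safeOpen(stubPort, 9600); // opens the stub
safeOpen(stubPort, 9600); // no-op: the port is already open
```

Calling the helper on every reset is then safe, since a second call simply returns without touching the port.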

Week 14 – Final Project

Concept

A physically interactive “Piano Tiles” game where players tap hand buttons or foot pedals in time with falling, color-coded tiles. Red tiles correspond to hand buttons (pins 2–5), blue tiles to foot pedals (pins 6–9). Finish the song without losing all three lives to win!

Interaction Demo

User testing:

copy_71E82669-1896-402A-A155-F76C8BE9E1AD

Me testing it:

copy_0FB66FA1-2527-4DBE-B6C6-86ACE6076571 2

Implementation Overview

Startup

-Menu screen with background image and “Start”/“Info” buttons

-Info screen with instructions on how to play

Song and Difficulty Selection

-Custom “song.png” and “difficulty.png” backgrounds

-3 levels (Easy, Medium, Hard); as difficulty increases, the tiles fall faster

Gameplay

-Tiles fall at a speed set by difficulty

-Serial data from Arduino (1–4 = hand zones, 5–8 = foot zones) drives hit detection

-Score and lives update live; MISS flashes on missed tiles and wrong or spammed button presses

Game Over

-Game over screen shows the user’s score, whether they lost or won

-“Try Again” and “Home” buttons appear, and the game auto-returns to the home screen after 5 s
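The 5-second auto-return countdown shown above boils down to one line of arithmetic; isolated here as a sketch (the function name is mine) matching the computation in drawGameOver() further down:

```javascript
// Seconds remaining before auto-returning to the menu,
// as computed on the game-over screen.
const gameOverTimeout = 5000; // ms

function secondsRemaining(elapsedMs) {
  return Math.ceil((gameOverTimeout - elapsedMs) / 1000);
}
```

So right after game over `secondsRemaining(0)` is 5, and it ticks down to 1 just before the timeout fires.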

Interaction Design

Color coding: red/blue zones on screen match button colors

Audio cues: Crazy Frog sample plays in background

Lives and Score: heart icons and score button reinforce progress

Arduino Sketch

const int pins[] = {2,6,  3,7,  4,8,  5,9}; // even indices = hand, odd = foot

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 8; i++) pinMode(pins[i], INPUT);
}

void loop() {
  for (int i = 0; i < 8; i++) {
    if (digitalRead(pins[i]) == HIGH) {
      int baseZone = (i / 2) + 1;          // 1–4
      int sendVal  = (i % 2 == 0)
                     ? baseZone           // hand: 1–4
                     : baseZone + 4;      // foot: 5–8

      Serial.println(sendVal);

      // wait for release, then debounce
      while (digitalRead(pins[i]) == HIGH) delay(5);
      delay(300);
    }
  }
}

This Arduino sketch scans eight digital inputs (pins 2–9, paired as hand vs. foot buttons); whenever it detects a button press it:

  1. Determines which of the four “zones” you pressed (buttons in pairs 2/6: zone 1, 3/7: zone 2, etc).

  2. Adds 4 to the zone number if it was a “foot” button, so hands send 1–4 and feet send 5–8.

  3. Sends that zone ID over the serial port.

  4. Waits for you to release the button and debounces for 300 ms before scanning again.
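Steps 1–2 can be expressed as a pure function. This is a sketch (the name `sendValFor` is mine, not from the sketch above) that mirrors the Arduino mapping:

```javascript
// Mirrors the Arduino mapping: array index i (0–7) over
// pins {2,6, 3,7, 4,8, 5,9} -> value sent over serial.
// Even indices are hand buttons (send 1–4), odd are foot buttons (send 5–8).
function sendValFor(i) {
  const baseZone = Math.floor(i / 2) + 1; // zone 1–4
  return (i % 2 === 0) ? baseZone : baseZone + 4;
}

// e.g. index 0 = pin 2 (hand, zone 1) -> sends 1
//      index 1 = pin 6 (foot, zone 1) -> sends 5
//      index 7 = pin 9 (foot, zone 4) -> sends 8
```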

Circuit Schematic

p5.js Sketch

let menuImg, infoImg, gameOverImg, songImg, diffImg;
let axelSound;
let songStarted = false;
let isWinner = false; // win/lose flag set in advanceTile()

let serial, canvas, scoreBtn;
let gameState = 'menu';

let useSound = false; 
let osc;

function preload(){ //preloading images and sounds
  menuImg = loadImage('menu.png');
  infoImg  = loadImage('info.png');
  gameOverImg = loadImage('gameover.png');
  axelSound = loadSound('axelf.mov');
  songImg = loadImage('song.png');
  diffImg = loadImage('difficulty.png');

}



const tileH = 60, zoneH = 20,
      zoneY = 320,        // top of the hit zones
      removalY = 400,     // tiles past this point count as missed
      colWidth = 400 / 4;


// UI buttons for navigation buttons, menu and gameover
const navButtons = { back:{x:10,y:10,w:80,h:30,label:'Back'}, home:{x:100,y:10,w:80,h:30,label:'Home'} };

const menu = { title:'Piano Tiles', start:{x:159,y:222,w:86,h:20,label:'Start'}, info:{x:160,y:261,w:86,h:20,label:'Info'} };

const gameOverButtons = { tryAgain:{x:150,y:180,w:100,h:40,label:'Try Again'}, home:{x:150,y:240,w:100,h:40,label:'Home'} };

// Song and pattern deciding what foot/hand tile goes to what column in order
const songs = [
   { id: 'axelf', name: 'Crazy Frog',
     sample: axelSound, // reassigned in setup() once preload has loaded the sound
     pattern: [
  { col: 0, isFoot: true  },  
  { col: 1, isFoot: true },  
  { col: 0, isFoot: true  },  
  { col: 2, isFoot: false  },  
  { col: 3, isFoot: false },
  { col: 3, isFoot: true  },  
  { col: 0, isFoot: false }, 
  { col: 2, isFoot: false  },  
  { col: 0, isFoot: true  },  
  { col: 2, isFoot: false }, 
  { col: 1, isFoot: true  }, 
  { col: 1, isFoot: false },  
  { col: 3, isFoot: true  },  
  { col: 1, isFoot: true  },  
  { col: 0, isFoot: false },
  { col: 3, isFoot: true  },  
  { col: 2, isFoot: false },  
  { col: 1, isFoot: true  }, 
  { col: 0, isFoot: true  },  
  { col: 3, isFoot: true },
  { col: 0, isFoot: true  },  
  { col: 1, isFoot: true },  
  { col: 0, isFoot: true  },  
  { col: 2, isFoot: false  },  
  { col: 3, isFoot: false },
  { col: 3, isFoot: true  },  
  { col: 0, isFoot: false },  
  { col: 2, isFoot: false  },  
  { col: 0, isFoot: true  },  
  { col: 2, isFoot: false }, 
  { col: 1, isFoot: true  },  
  { col: 1, isFoot: false },  
  { col: 3, isFoot: true  }, 
  { col: 1, isFoot: true  },  
  { col: 0, isFoot: false },
  { col: 3, isFoot: true  },  
  { col: 2, isFoot: false },  
  { col: 1, isFoot: true  },  
  { col: 0, isFoot: true  },  
  { col: 3, isFoot: true }, 
 { col: 3, isFoot: true  },  
  { col: 2, isFoot: false },  
  { col: 1, isFoot: true  },  
  { col: 0, isFoot: true  },  
  { col: 3, isFoot: true },
  { col: 3, isFoot: true  },  
  { col: 2, isFoot: false }
] }
];
const songBoxes = []; 
// speed increases as difficulty increases
const difficulties = [ {label:'Easy',speed:4}, {label:'Medium',speed:6}, {label:'Hard',speed:8} ];
const diffBoxes = [];


let currentSong, noteIndex=0;
let score=0, lives=3, currentSpeed=0; // set score to 0 and lives 3
let tile=null, missTime=0;
let gameOverStartTime=0, gameOverTimeout=5000;

function setup(){
    songs[0].sample = axelSound;

  canvas=createCanvas(400,400);
  canvas.mousePressed(handleMouse);
  canvas.elt.oncontextmenu=()=>false;

  // audio
  useSound = typeof p5.Oscillator==='function';
  if(useSound) osc=new p5.Oscillator('sine');

  // serial
  serial=createSerial();
  const ports=usedSerialPorts(); if(ports.length) serial.open(ports[0],{baudRate:9600});

  // UI boxes for the song
  songBoxes.push({ x: 129, y: 216, w: 145, h:  75, idx: 0 });

  // and for difficulties:
  diffBoxes.length = 0; 

  // levels
  diffBoxes.push({ x: 158, y: 182, w: 86, h: 32, idx: 0 }); // easy
  diffBoxes.push({ x: 158, y: 235, w: 86, h: 32, idx: 1 }); // medium
  diffBoxes.push({ x: 158, y: 289, w: 86, h: 32, idx: 2 }); // hard

  // white button that represents score
  scoreBtn=createButton('Score: 0');
  scoreBtn.position(width-90,10);
  scoreBtn.style('background-color','#FFFFFF').style('color','rgb(25,1,1)').style('padding','6px 12px');
}

function draw(){
  
    // console.log(`x: ${mouseX}, y: ${mouseY}`); used for coordinates for the ui boxes

  background(30);
  switch(gameState){
    case 'menu':            drawMenu();         break;
    case 'info':            drawInfo();         break;
    case 'songSelect':      drawSongSelect();   break;
    case 'difficultySelect':drawDiffSelect();   break;
    case 'playing':         drawGame();         break;
    case 'gameOver':        drawGameOver();     break;
    
  }
}

function drawMenu() {
  // draw the menu background image first so the title and buttons appear on top
  image(menuImg, 0, 0, width, height);

  textAlign(CENTER, CENTER);
  fill(255); textSize(32);
  text(menu.title, width / 2, 100);
  drawButton(menu.start);
  drawButton(menu.info);
}

function drawInfo() {
  // full screen info background
  image(infoImg, 0, 0, width, height);

  // then draw back button on top:
  drawButton(navButtons.back);
}

function drawSongSelect() {
  image(songImg, 0, 0, width, height);
  drawButton(navButtons.back);
  drawButton(navButtons.home);
}

function drawDiffSelect() {
  image(diffImg, 0, 0, width, height);
  // // debug: draw the clickable areas
  // diffBoxes.forEach(b => {
  //   noFill();
  //   stroke(255, 0, 0);
  //   rect(b.x, b.y, b.w, b.h);
  // });
  drawButton(navButtons.back);
  drawButton(navButtons.home);
}

function drawGame() {
  // Lives
  noStroke(); fill('red');
  for (let i = 0; i < lives; i++) ellipse(20 + i * 30, 20, 20, 20);

  // Dividers and zones in the game
  rectMode(CORNER);
  stroke(255); strokeWeight(4);
  line(100, 0, 100, height);
  line(200, 0, 200, height);
  line(300, 0, 300, height);

  // draws the 4 coloured hit zones
  const colors = ['#3cb44b', '#4363d8', '#ffe119', '#e6194b'];
  noStroke();
  colors.forEach((c, i) => {
    fill(c);
    rect(i * colWidth, zoneY, colWidth, zoneH);
  });

// draws the falling tiles
if (tile) {
  tile.y += tile.speed;

  if (tile.y - tileH/2 > removalY) {
    missTime = millis();
    advanceTile(false);
  } else {
    rectMode(CENTER);
    noStroke();
    fill(tile.isFoot ? '#4363d8' : '#e6194b');
    rect(tile.x, tile.y, tile.w, tileH);
  }
}




  // reads button presses sent by the Arduino
  if (serial && serial.available() > 0) {
    let raw = serial.readUntil('\n').trim();
    let z = int(raw) - 1;
    if (z >= 0 && z < 8) { let col = z % 4; let isHand = z < 4; handleHit(col, isHand); }
  }


  // keeps count of the score
  noStroke(); 
  fill(255); 
  textAlign(LEFT,TOP); 
  textSize(16); 
  text('Score:'+score,10,40);
  if(millis()-missTime<500){ textAlign(CENTER,CENTER); textSize(32); fill('red'); text('MISS',width/2,height/2); }
}

function startPlaying() {
  // only play once
  if (currentSong.sample && !songStarted) {
    currentSong.sample.play();
    songStarted = true;
  }
}


function drawGameOver() {
  // stop the song once, when the game-over screen first shows
  if (currentSong.sample && songStarted) {
    currentSong.sample.stop();
    songStarted = false;
  }
  image(gameOverImg, 0, 0, width, height);

  let e = millis() - gameOverStartTime;
  if (e >= gameOverTimeout) { resetGame(); gameState = 'menu'; return; }
  let r = ceil((gameOverTimeout - e) / 1000);
  textAlign(RIGHT, BOTTOM);
  textSize(14);
  fill(255);
  text('Menu in:' + r, width - 10, height - 10);
}

function drawButton(b) {
  rectMode(CORNER);
  fill(100);
  rect(b.x, b.y, b.w, b.h);
  fill(255);
  textAlign(CENTER, CENTER);
  textSize(16);
  text(b.label, b.x + b.w / 2, b.y + b.h / 2);
}

function handleMouse() {
  // game over screen
  if (gameState === 'gameOver') {
    if (hitBox(gameOverButtons.tryAgain)) {
      resetGame();
      noteIndex = 0;
      spawnNextTile();
      // replay sample if any
      if (currentSong.sample) {
        let s = (typeof currentSong.sample === 'function')
                  ? currentSong.sample()
                  : currentSong.sample;
        if (s && typeof s.play === 'function') { s.play(); songStarted = true; }
      }
      gameState = 'playing';
    } else if (hitBox(gameOverButtons.home)) {
      resetGame();
      gameState = 'menu';
    }
    return;
  }

  // info screen
  if (gameState === 'info') {
    if (hitBox(navButtons.back)) {
      gameState = 'menu';
    }
    return;
  }

  // navigation (Home/Back) before the game starts
  if (gameState !== 'playing') {
    if (hitBox(navButtons.home)) {
      resetGame();
      gameState = 'menu';
      return;
    }
    if (gameState === 'songSelect' && hitBox(navButtons.back)) {
      gameState = 'menu';
      return;
    }
    if (gameState === 'difficultySelect' && hitBox(navButtons.back)) {
      gameState = 'songSelect';
      return;
    }
  }

  // main menu
  if (gameState === 'menu') {
    if (hitBox(menu.start)) {
      gameState = 'songSelect';
    } else if (hitBox(menu.info)) {
      gameState = 'info';
    }
    return;
  }

  if (gameState === 'songSelect') {
    songBoxes.forEach(b => {
      if (hitBox(b)) {
        currentSong = songs[b.idx];
        gameState = 'difficultySelect';
      }
    });
    return;
  }

  // select difficulty
if (gameState === 'difficultySelect') {
  diffBoxes.forEach(b => {
    if (hitBox(b)) {
      currentSpeed = difficulties[b.idx].speed;
      resetGame();
      noteIndex = 0;
      spawnNextTile();

      // play sample only once
      if (currentSong.sample && !songStarted) {
        currentSong.sample.play();
        songStarted = true;
      }

      gameState = 'playing';
    }
  });
  return;
}


}


function resetGame() {
  score = 0;
  scoreBtn.html('Score: 0');
  lives = 3;
  missTime = 0;
  gameOverStartTime = 0;
  tile = null;
  songStarted = false;   // reset song playing here
}


function handleHit(col, isHandButton) {
  if (!tile) return; // no tile on screen yet

  // wrong button type
  if ((tile.isFoot && isHandButton) ||
      (!tile.isFoot && !isHandButton)) {
    missTime = millis();
    lives--;
    if (lives <= 0) {
      if (currentSong.sample) axelSound.stop();
      if (useSound)         osc.stop();
      gameState = 'gameOver';
      gameOverStartTime = millis();
    }
    return;
  }

  // otherwise the normal hit test
  if (tile && col === tile.col
      && tile.y - tileH/2 < zoneY + zoneH
      && tile.y + tileH/2 > zoneY) {
    advanceTile(true);
  } else {
    missTime = millis();
    lives--;
    if (lives <= 0) {
      if (currentSong.sample) axelSound.stop();
      if (useSound)         osc.stop();
      gameState = 'gameOver';
      gameOverStartTime = millis();
    }
  }
}




function advanceTile(sc) {
  if (sc) {
    // check win condition: if this was the last note in the pattern, flag a win and go straight to Game Over
    if (noteIndex === currentSong.pattern.length - 1) {
      isWinner = true; // mark winner
      gameState = 'gameOver';
      gameOverStartTime = millis();
      return;
    }

    if (useSound && !currentSong.sample && currentSong.melody) {
      osc.freq(currentSong.melody[noteIndex]);
      osc.start();
      osc.amp(0.5, 0.05);
      setTimeout(() => osc.amp(0, 0.5), 300);
    }

    score++;
    scoreBtn.html('Score:' + score);

  } else {
    lives--;
    if (lives <= 0) {
      // lose flag
      isWinner = false;
      if (currentSong.sample) axelSound.stop();
      if (useSound)         osc.stop();

      gameState = 'gameOver';
      gameOverStartTime = millis();
      return;
    }
  }

  // advance to the next note
  noteIndex = (noteIndex + 1) % currentSong.pattern.length;
  spawnNextTile();
}


function spawnNextTile() {

  const p = currentSong.pattern[noteIndex];
  tile = {
    x:    p.col * colWidth + colWidth/2,
    y:    0,
    w:    colWidth,
    h:    tileH,
    speed: currentSpeed,
    col:  p.col,
    isFoot: p.isFoot,
    immune: false
  };
}



function hitBox(b){ return mouseX>=b.x&&mouseX<=b.x+b.w&&mouseY>=b.y&&mouseY<=b.y+b.h; }



 

Key p5.js Highlights

-preload() loads menu.png, info.png, gameover.png, song.png, difficulty.png, and the axelf.mov sample

-drawMenu()/drawInfo() render full screen images before buttons

-handleMouse() manages navigation 

-drawGame() reads serial data, moves and draws tiles, and enforces hit logic

-advanceTile() checks win/lose and plays audio
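The hit logic enforced in drawGame()/handleHit() reduces to an interval-overlap check between the falling tile and the colored hit zone. Isolated as a function (my naming), using the constants from the sketch:

```javascript
// Tile/zone overlap check behind the hit detection.
// Constants match the sketch: tiles are drawn with rectMode(CENTER).
const tileH = 60, zoneH = 20, zoneY = 320;

function tileInZone(tileY) {
  // tile spans [tileY - tileH/2, tileY + tileH/2];
  // zone spans [zoneY, zoneY + zoneH] = [320, 340]
  return tileY - tileH / 2 < zoneY + zoneH &&
         tileY + tileH / 2 > zoneY;
}
```

A press counts only if this overlap holds *and* the tile's column and hand/foot type match the button, which is exactly what handleHit() checks.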

Arduino and p5.js Communication

-Arduino sends an integer (1–8) on each button press

-p5.js reads values via serial.readUntil('\n'); (value−1) % 4 gives the column, and values 1–4 mean hand, 5–8 mean foot

-Immediate feedback loop: tile hit/miss on-screen mirrors physical input
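The decoding step on the p5.js side can be sketched as a small pure function (the helper name is mine) that mirrors the parsing in drawGame():

```javascript
// Decodes a serial value ("1".."8") into the column and button type,
// mirroring the parsing in drawGame(). Invalid lines are ignored.
function decodeZone(raw) {
  const z = parseInt(raw, 10) - 1;
  if (!(z >= 0 && z < 8)) return null; // garbage or out-of-range line
  return { col: z % 4, isHand: z < 4 };
}

// e.g. "1" -> column 0, hand;  "8" -> column 3, foot
```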

Proud of

I’m proud of how I designed my pedals: each one holds a button right under its top plate, so it doesn’t activate until stepped on. I’m also proud of how I soldered the wires together, since it was my first time. I also like how my code aesthetics turned out.

Future Improvements

A multiplayer mode would be cooler: split-screen hand vs. foot competition. I would also like to implement a leaderboard that saves high scores.

Week 13 – User Testing

My user testing for the DIY Disco Set revealed that the project was generally intuitive and easy to navigate, largely due to the instructions page I added and the exploratory nature of the buttons. Users were able to understand the mapping between controls and the resulting effects without much guidance, which speaks to the project’s accessible design. However, there was noticeable confusion around the vinyl mechanism. The instructions said to spin the vinyl to start DJ-ing; mechanically, this was only meant to jump-start the motor, which would then keep spinning on its own, but users continued to manually spin and touch the vinyl even after the motor had taken over, unintentionally disrupting its movement. This suggests that while the concept was understood, the transition from manual to motorized spinning was not entirely clear. To address this, the instructions could be refined to emphasize that spinning is only required at the beginning, and clearer visual feedback, such as an indicator light or animation, could help users recognize when the motor is engaged.

Week 12 – Finalized Concept

For my final project, I decided to shift away from my original idea which was a traditional music box, because I realized it wasn’t interactive enough. Instead, I’ve reworked the concept into a DIY DJ set. So, the new version still keeps the essence of the music box, especially the spinning element, but now users will be spinning a vinyl or maybe a disco ball.

The project will allow users to shuffle through songs, change the playback speed, and add sound effects to simulate DJ-ing. I’m also thinking of incorporating NeoPixels to either synchronize with the music or enhance the overall atmosphere visually. For this project, the Arduino handles the physical input, so when users press buttons, specific actions are triggered (e.g., shuffle, speed, sound effect). On the p5.js side, I would build a music visualizer that deconstructs the audio and represents it visually by showing changes in loudness and frequency over time.
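For the loudness part of the visualizer, one common basis is the root-mean-square (RMS) of a window of audio samples. A minimal sketch, not tied to any particular p5.sound API (the function name is mine):

```javascript
// RMS loudness of a window of samples in [-1, 1]; a common basis for
// amplitude-driven visuals (p5.sound's Amplitude object does similar work).
function rmsLoudness(samples) {
  const sumSq = samples.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sumSq / samples.length);
}

// e.g. a square-ish wave at half amplitude:
rmsLoudness([0.5, -0.5, 0.5, -0.5]); // 0.5
```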

 

Week 13 – User Test

Overall: all testers navigated through the Start, Song, and Difficulty screens successfully and began playing within 30 s.

copy_71E82669-1896-402A-A155-F76C8BE9E1AD

Confusion points:

Because the hit zones share colors with the hand buttons, testers assumed each tile’s color corresponded to the color of a specific hand button.

This confusion was avoided when testers actually read the instructions page.

What worked well:

Physical button feedback, Audio cues, Lives (heart icons) and score counter provided clear feedback.

Areas to improve:

Add visual cues for “red = hand” and “blue = foot”, such as an icon on each tile