SpideySense – Progress Report

Finalized Concept

For my final project, I’m building an interactive Spider-Man experience where you can “swing” around a simplified version of the NYUAD campus using a glove with a sensor in it. The glove basically becomes your controller, but in a way that feels way more like Spider-Man. When you do the signature web-shooting gesture, Spider-Man reacts instantly on screen. Instead of clicking buttons or pressing keys, you just move your hand, and the game turns that motion into a web-shooting movement across the campus. The whole idea is to make it feel intuitive and fun, like you’re actually guiding him through the space instead of controlling him from a distance.

Hardware

I’m using a capacitive touch sensor to detect the Spider-Man web-shooting gesture. The sensor is placed on the palm of the glove, and the fingertips are left exposed so that when the player touches the sensor with the correct fingers, mimicking Spider-Man’s iconic “thwip” pose, the system registers a web shot. When the hand relaxes and the fingers release the sensor, the web disappears in p5.

The challenge is integrating the sensor onto the glove in a way that keeps it comfortable and responsive. 

Arduino

The Arduino will:

    • Continuously read values from the capacitive touch sensor on the glove’s palm
    • Detect when the correct fingertips touch the sensor to register the Spider-Man web-shooting gesture
    • Send a signal to p5.js via Serial whenever the web-shooting gesture is made or released

This creates a one-way connection from the glove to the game, letting the hand gesture directly control Spider-Man’s web-shooting action in p5.js.
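The gesture signal described above amounts to edge detection: send a message only when the touch state changes, not on every reading. A minimal sketch in JavaScript for clarity (the real logic would live in the Arduino sketch); the threshold value and the "1"/"0" message format are my assumptions, not final choices.

```javascript
const TOUCH_THRESHOLD = 40; // hypothetical capacitive reading cutoff

let wasTouching = false;

// Returns the serial message to send, or null if the state hasn't changed.
function gestureMessage(reading) {
  const touching = reading > TOUCH_THRESHOLD;
  let msg = null;
  if (touching && !wasTouching) msg = "1"; // thwip: fingers pressed the palm sensor
  if (!touching && wasTouching) msg = "0"; // release: web disappears in p5
  wasTouching = touching;
  return msg;
}
```

Only sending on state change keeps the serial line quiet and makes the p5 side trivial: react to "1" and "0" instead of filtering a stream of raw readings.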

P5.js

    • Receive the readings from Arduino
    • Detect the web-shooting gesture to attach/release the web
    • Draw Spider-Man as a sprite moving across a simplified NYUAD campus
    • Apply physics for swinging: gravity, momentum, web forces
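The swinging physics in the last bullet could be sketched as gravity plus a spring-like pull toward the web's anchor point while attached. This is a sketch, not the final tuning; all constants are placeholders.

```javascript
// One step of a simple swing simulation. p = {x, y, vx, vy}.
function swingStep(p, webAnchor, attached) {
  const GRAVITY = 0.5;        // constant downward pull
  const WEB_STRENGTH = 0.02;  // how hard the web tugs toward its anchor
  const DAMPING = 0.99;       // keeps momentum from growing without bound

  p.vy += GRAVITY;
  if (attached) {
    // pull Spider-Man toward the web anchor like a weak spring
    p.vx += (webAnchor.x - p.x) * WEB_STRENGTH;
    p.vy += (webAnchor.y - p.y) * WEB_STRENGTH;
  }
  p.vx *= DAMPING;
  p.vy *= DAMPING;
  p.x += p.vx;
  p.y += p.vy;
  return p;
}
```

Calling this once per draw() frame, with `attached` driven by the glove signal, would give the release-and-fall, attach-and-swing feel without a full rope constraint.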

Currently, the p5 prototype is in progress. I plan to make the campus look like the setting of a PlayStation game by taking pictures of campus and asking AI to game-ify them.

Progress So Far

I have the basic structure of the Spider-Man glove game implemented in p5.js. This includes multiple screens (intro, instructions, game), a building with target points, and the beginning logic for shooting a web. Right now, the web-shooting gesture is simulated with the SPACE key, which triggers a web from Spider-Man to a random target for a short duration. The game is set up so that the Arduino input from the capacitive touch sensor can be integrated later to replace the key press.

 

 

Week 12: Tokyo Flight inspired by The Wind Rises (Final project concept)

Final Project Concept:

“Tokyo Flight” is an interactive flying game inspired by the Studio Ghibli film The Wind Rises. The player controls a cardboard airplane equipped with an accelerometer and a button, both connected to an Arduino. The user tilts the physical airplane, and the airplane in p5 follows the movement of the tilt. The goal of the game is to pass through as many floating rings as possible without touching them or flying out of the canvas. If the user presses the button on the airplane, the airplane in p5 gets a speed boost to help reach difficult rings. If the player collides with a ring or flies out of the screen, the flight ends.

 

Arduino Components:

    • Accelerometer (MPU6050) inside the airplane: measures forward and backward tilt, and possibly left/right tilt as well
    • Push button: triggers a temporary speed boost
    • LED lights: light up when the boost is active

 

What Arduino does:

    1. Read sensor values: 
      • Continuously read acceleration + gyroscope data from the MPU6050
      • Smooth the data to remove noise 
      • Convert the tilt into a single value (from -90 to 90 degrees)
    2. Read button state: 
      • Check whether the button is pressed
      • Send a signal (0 or 1) 
    3. Send the data to p5.js through WebSerial 
    4. Additional features: 
      • Receive signals from p5 to blink the LED when a boost is available 
      • Vibrate a motor when the user crashes 
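Steps 1–2 above could be sketched like this, written in JavaScript for clarity (the real version would be an Arduino sketch). The smoothing factor, the raw reading range, and the message format are assumptions.

```javascript
let smoothed = 0;

// Exponential smoothing: each new reading only nudges the stored value,
// which filters out accelerometer jitter.
function smoothTilt(raw, alpha = 0.2) {
  smoothed = smoothed + alpha * (raw - smoothed);
  return smoothed;
}

// Convert a smoothed reading to a single -90..90 degree tilt value,
// assuming the raw component spans roughly -100..100.
function toDegrees(value) {
  const clamped = Math.max(-100, Math.min(100, value));
  return (clamped / 100) * 90;
}

// One serial line per frame: "<tilt>,<button>"
function serialLine(tiltDeg, buttonPressed) {
  return `${tiltDeg.toFixed(1)},${buttonPressed ? 1 : 0}`;
}
```

On the p5 side the line can be split on the comma, so tilt angle and button state arrive together in a single read.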

 

What p5.js game will do:

    1. Connect to arduino via webserial 
      • We parse incoming serial strings and extract tilt angle and button state
    2. Control the on-screen airplane
      • We map tilt to the airplane’s vertical position. 
      • We also map horizontal tilt to slight horizontal drift
      • If speed boost button is pressed, we increase forward movement temporarily 
    3. Generate game objects: 
      • Rings appear from the right and move left 
      • Rings vary in size and vertical position 
      • We generate a wind to affect the airplane’s movement 
    4. Collision detection: 
      • If the airplane touches ring boundary, it is game over 
      • If the airplane goes beyond the canvas, it is game over. 
    5. Scoring System 
      • +5 for passing through small rings
      • +2 for passing through big rings 
    6. Visual and Audio design: 
      • We will have a soft blue sky with clouds and a green grass field at the bottom to represent The Wind Rises.
      • We will also play Ghibli audio while the game is playing 
    7. Send feedback to Arduino
      • Arduino flashes LED lights when the airplane crashes
      • Arduino keeps lights on when the airplane is on boost mode 
      • Arduino vibrates when the airplane goes through the rings.
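The collision and scoring rules (steps 4–5 above) could be sketched like this; the rim tolerance and the small/big size cutoff are invented placeholder values, and rings are treated as circles.

```javascript
// Classify the airplane's relation to one ring.
function checkRing(plane, ring) {
  const d = Math.hypot(plane.x - ring.x, plane.y - ring.y);
  const tolerance = 5; // how close to the rim counts as touching it

  if (Math.abs(d - ring.radius) < tolerance) return "crash"; // hit the rim
  if (d < ring.radius - tolerance && plane.x >= ring.x) {
    // well inside the ring and past its center: counts as a pass
    return ring.radius < 40 ? "small-pass" : "big-pass";
  }
  return "none";
}

function scoreFor(result) {
  if (result === "small-pass") return 5; // small rings are worth more
  if (result === "big-pass") return 2;
  return 0;
}
```

Running `checkRing` for every on-screen ring each frame, and ending the game on "crash" (or when the plane leaves the canvas), covers both bullet lists with one pass.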

 

Week 12 – Change of Final Project Idea and Proposal

Finalised Concept

For my final project, I will design and build an interactive penalty-kick football game that combines p5.js, Arduino, and a custom physical controller. My goal is to create a fun, responsive experience where players physically control the shooter’s movement and the direction of their kick using a real-world device. At the same time, the screen will display a dynamic game environment – complete with a goalkeeper, goalpost, and shooting animations.

This project will allow me to explore hardware-software integration, serial communication, real-time input handling, and user-centered interface design.

Gameplay

Player one will use a custom-built controller, which I will design using laser-cut casing and embed an Arduino joystick. The joystick will allow the player to select the ball’s direction – left, center, or right. Player one will also have a push button to initiate the kick toward the selected direction. The player sprite will move toward the ball when the button is pressed, and then the ball will move toward the chosen direction while the shooter remains in position.

Player two will control the goalkeeper using hand movements detected via ML5.js. By moving their hands left, right, or center in front of a camera, the player will slide the goalkeeper sprite across the goal to attempt a save. The game will simulate a real penalty shootout in a simplified and engaging way.

Arduino to p5
  • The joystick and push button inputs from the custom controller will be read by Arduino and sent to p5.js via serial.

  • These inputs will determine the direction of the ball and when the kick is initiated.

p5 to Arduino
  • After the ball reaches the goal or the goalkeeper makes a save, p5.js will send feedback to Arduino.

  • If a goal is scored, a green LED connected to the Arduino will turn on.

  • If the goalkeeper saves the ball, a red LED will turn on.

  • This provides tactile and visual feedback, creating a more immersive experience for the players.
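The two-way exchange above could be sketched as a small protocol. The message formats here ("DIR:left", "KICK", "GOAL", "SAVE") are assumptions for planning, not a final design.

```javascript
// Arduino -> p5: parse one serial line from the custom controller.
function parseControllerLine(line) {
  const msg = line.trim();
  if (msg.startsWith("DIR:")) {
    const dir = msg.slice(4);
    if (["left", "center", "right"].includes(dir)) return { type: "direction", dir };
  }
  if (msg === "KICK") return { type: "kick" };
  return { type: "unknown" };
}

// p5 -> Arduino: which LED to light once the shot resolves.
// A save lights the red LED, a goal lights the green one.
function feedbackMessage(shotDir, keeperDir) {
  return shotDir === keeperDir ? "SAVE" : "GOAL";
}
```

Keeping direction and kick as separate messages means the joystick can be waggled freely before committing, and only the button press starts the animation.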

Progress So Far

So far, I have focused on setting up the digital environment and assets for the game. I have:

  1. Prepared the visual assets

    • Loaded images for the football pitch, shooter, goalkeeper, ball, and goalpost into p5.js.

    • Scaled the images dynamically based on the window size to ensure the game will be responsive in full-screen mode.

    • Adjusted the positions of the shooter, ball, goalkeeper, and goalpost so that the gameplay layout is accurate and visually balanced.

  2. Implemented basic classes in p5.js

    • Created classes for the pitch, shooter, ball, goalkeeper, and goalpost.

    • Ensured each class has a display() method to render its sprite correctly on the canvas.

    • Tested the scaling logic and made the goalpost larger for a better gameplay area.

  3. Set up fullscreen functionality

    • Pressing the F key now toggles full-screen mode.

    • All sprites and background images adjust automatically to fit the screen size.

  4. Planned the shooting logic

    • Decided that the shooter will move toward the ball when the push button is pressed.

    • Once the shooter reaches the ball, the ball will move toward the selected direction (left, center, or right).

    • The shooter itself will remain in position while the ball moves.

  5. Laid groundwork for Arduino communication

    • Decided on bidirectional communication: joystick and push button inputs from Arduino will control the ball direction, and LEDs will provide visual feedback for goals and saves.

    • Began planning the serial communication structure between Arduino and p5.js.

Next steps:
  • Build and test the custom controller once I receive the Arduino joystick.

  • Implement keyboard-controlled ball movement to simulate the shooter action before integrating the controller.

  • Add hand-detection logic with ml5.js to control the goalkeeper.

  • Integrate LED feedback for scoring and saves.

Week 12 – Final Project Progress

(Progress pictures)
The idea of the project is still the same, but I renamed it to Mädenīet, which means “culture” in Kazakh. The game is focused on learning about various aspects of Kazakh culture. I found 3D models of a dombyra, kiiz ui, taqiya, and asyq online through Sketchfab. They were in FBX format, so I used an online converter to turn them into OBJ format, which p5 accepts. After trying to upload them to p5.js, they turned out to be too big, so I cloned everything into VS Code and decided to run the project locally instead.

On the Arduino side, I soldered an arcade button and added a circular Adafruit NeoPixel ring with 24 LEDs. To get into the details: I connected the NeoPixel to digital pin 11, the LED side of the arcade button to pin 9, and the switch side to pin 6. I laser-cut a box that I designed with Kazakh national ornaments in Adobe Illustrator, with holes in the back for the Arduino cable to go through. I will be making similar holes in the front for the NeoPixel cable as well.

I uploaded background music by my favourite Soviet Kazakh ‘Vocal and Instrumental Ensemble’, Dos Mukasan, and users will listen through headphones while experiencing the game. The user presses ‘Space’ to connect to the Arduino port, then is required to press the arcade button (the Arduino-to-p5.js connection). As soon as a hand is detected by the camera, the game and the music start. With the index finger the user can rotate the model, and pinching index and thumb together zooms in or out. Making a fist transitions to the next 3D model. On the left side of each 3D model there will be a short description of what it is and how valuable it is to Kazakh culture. Changing between 3D models sends a signal from p5.js to the Arduino, turning the NeoPixel a different color (blue, orange, red, or green).

After the Professor’s initial feedback, I decided not to pursue glove sensors and to use the Handpose ml5.js library to track the hand with a webcam, which made the coding much easier and the experience much more engaging.
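The pinch-zoom and fist gestures could be sketched like this, assuming Handpose-style keypoints (plain `{x, y}` objects per landmark). The distance thresholds and the zoom range are placeholders to tune against real webcam data.

```javascript
// Map the index-thumb pinch distance to a zoom factor between 0.5x and 2x.
function pinchZoom(indexTip, thumbTip, minDist = 20, maxDist = 200) {
  const d = Math.hypot(indexTip.x - thumbTip.x, indexTip.y - thumbTip.y);
  const clamped = Math.max(minDist, Math.min(maxDist, d));
  return 0.5 + ((clamped - minDist) / (maxDist - minDist)) * 1.5;
}

// A crude "fist" check: all fingertips close to the palm base.
function isFist(fingertips, palm, threshold = 50) {
  return fingertips.every(
    (tip) => Math.hypot(tip.x - palm.x, tip.y - palm.y) < threshold
  );
}
```

Triggering the model transition only on the first frame a fist appears (an edge, not a level) avoids skipping through several models while the hand stays closed.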

Final Project Concept

Project Concept

For my final project, I’m creating a gesture-based vocabulary learning system (more like a game, basically). The idea came from noticing how flashcard apps never really stick for me because they’re so passive, and I wanted to create something where your body is actually involved in the learning process. Hover your left hand to see just a hard definition, challenging you to guess the word. When curiosity gets the better of you, hover your right hand to reveal the answer. A quick swipe over both sensors is like saying “got it” and moves to the next word, while holding both hands still gives you an example sentence to see the word in context. The interface tracks which words trip you up and brings them back more often, while words you nail consistently take a break. It’s spaced repetition, but you’re physically interacting with it rather than just clicking buttons.

Whether it’s GRE prep, LSAT terms, fancy academic writing words, or a custom list of interesting words you’ve encountered and want to remember, the system visualizes your progress over time so you can actually see your vocabulary growing.

Arduino Design

Both photoresistors constantly feed readings to the Arduino, which compares them against threshold values to detect when your hands are hovering. The key is making the gesture detection smooth rather than jumpy, so I’m building in some debouncing logic that prevents flickering when your hand is at the edge of the sensor range. The timing matters too because the Arduino needs to distinguish between a quick swipe that says “next word” and deliberately holding both hands there to request an example sentence. I’m planning to add more to this but this is it for now.
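The swipe-versus-hold distinction and the debouncing described above could be sketched like this; both timing thresholds are guesses to be tuned against the real photoresistors.

```javascript
const HOLD_MS = 800; // covered longer than this = deliberate hold

// Same physical state ("both sensors covered") means two different things
// depending on how long it lasted.
function classifyBothCovered(coveredDurationMs) {
  return coveredDurationMs >= HOLD_MS ? "example" : "next";
}

// Debounce: only accept a state change once it has persisted, so a hand
// hovering at the edge of sensor range doesn't flicker.
function makeDebouncer(stableMs = 100) {
  let candidate = null;
  let since = 0;
  let stable = null;
  return function update(state, nowMs) {
    if (state !== candidate) {
      candidate = state;
      since = nowMs;
    } else if (nowMs - since >= stableMs && state !== stable) {
      stable = state; // the reading held long enough to trust it
    }
    return stable;
  };
}
```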

P5.js Program Design

Large, readable text dominates the screen. When you hover your hand over a sensor, there’s subtle visual feedback confirming the system detected you, so you’re never wondering if it’s working. The progress dashboard lives off to the side, quietly showing you how many words you’ve learned today and your overall mastery percentage. The “Word of the Day” feature adds a nice ritual to opening the system. The system saves your progress using local storage, so your learning history persists between sessions. Over time, you build up this visualization of your vocabulary growth that’s genuinely satisfying to watch. You can see which word sets you’ve conquered, which ones still need work, and how consistent you’ve been with your practice. It’s the kind of feedback that makes you want to keep going.

Implementation Progress

I’m starting with getting the Arduino sensors working reliably because everything else depends on that foundation. First step is just wiring up the photoresistors and watching the raw values in the Serial Monitor to understand what I’m working with. Once I know what “hand hovering” actually looks like in terms of sensor readings, I can write the gesture detection code with appropriate thresholds. After the gestures feel solid and responsive when I test them by hand, I’ll set up the serial connection to P5 and make sure the commands are flowing through correctly.  Then it’s on to building the P5 interface, starting simple with a hard-coded list of maybe ten words that cycle through in response to Arduino commands. Once that basic interaction loop works, I’ll layer in the vocabulary database, spaced repetition algorithm, and progress tracking features. The final polish phase is where I’ll add the Word of the Day, multiple vocabulary sets, and any visual refinements that make it feel complete. The goal is to have something functional quickly, then make it delightful.
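One possible shape for the spaced-repetition layer mentioned above is a simple Leitner system: missed words drop back to box 1 (shown often), mastered words climb to higher boxes (shown rarely). This is a sketch of one option, not the final algorithm.

```javascript
// Update a word's box after a review. word = { box: 1..maxBox, ... }.
function review(word, gotIt, maxBox = 5) {
  if (gotIt) {
    word.box = Math.min(word.box + 1, maxBox);
  } else {
    word.box = 1; // tripped you up: bring it back often
  }
  return word;
}

// Lower boxes come up more often: each word appears with weight (6 - box).
function pickNext(words) {
  const weighted = words.flatMap((w) => Array(6 - w.box).fill(w));
  return weighted[Math.floor(Math.random() * weighted.length)];
}
```

The box numbers also double as progress data for the dashboard, since "mastery" is just the distribution of words across boxes.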

Week 12: Final Project Proposal

Concept

The concept behind my final project is a simulation of one of my favorite activities to do, visiting smaller galleries. The project will include two elements, a virtual room that you can explore that holds the different artworks, and a physical floorplan board that the user will be able to interact with to move around the virtual room. There will be four artworks that the user can interact with each exploring a different element of a joined overarching theme. One work will be an auditory experience, where the user will contribute to the sound being created by the work, while one will be a visual experience where the user controls the flow of the visual. The other two works will have an experience that is interactive in the floorplan board where once again we explore audio and visuals, though through different elements.

Design & Description of the Arduino 

The Arduino acts as the physical interface between the user and the virtual gallery. It reads interactions from the floorplan board and sends that data to the p5.js program, which controls the virtual room and the artworks inside it. The board will send signals from the pressure sensors both to indicate which direction to go and how fast to move in that direction. The pressure sensors will be placed in front of the placeholders for the physical artworks. There will also be LEDs placed at each work; if the current artwork is the one the user is viewing, p5 sends a signal and that LED starts flashing. Additionally, two of the artworks will receive signals and create an output on the board, one with sound and one with visuals. While it would be more efficient to produce sound from p5, the idea behind the fourth work is the interaction of sound between p5 and the Arduino at the same time.
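One way the pressure readings could encode both direction and speed: which sensor fires picks the direction, and how hard it is pressed sets the pace. The sensor names, threshold, and scaling here are placeholders.

```javascript
// readings: { north, south, east, west } raw pressure values.
// Returns null when nothing is pressed hard enough.
function navigationCommand(readings, threshold = 100) {
  let best = null;
  for (const [dir, value] of Object.entries(readings)) {
    if (value > threshold && (!best || value > best.value)) {
      best = { dir, value }; // strongest press wins
    }
  }
  if (!best) return null;
  // harder press = faster movement, capped at 1.0
  const speed = Math.min((best.value - threshold) / 900, 1);
  return { dir: best.dir, speed };
}
```

The p5 side can then multiply its per-frame camera step by `speed`, so a gentle touch drifts toward a work and a firm press strides there.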

Design & Description of the P5

The p5 sketch will be the main hub of the project, where all the artworks are displayed. The room will be navigable: the user's input on the board is read by the Arduino and received by p5, which controls the navigation direction and speed. The works will also send a signal when they are pressed by a user; on the p5 side, the work will enlarge on the screen and the LEDs will be sent a signal to start blinking. Additionally, in the opposite direction from the Arduino, two of the works will send signals to control audio and visuals on the floorplan.

Progress 

Below is a rough draft of the vision I have for the floorplan board that the user can interact with to control their experience. It will be laser-cut acrylic with a pressure sensor below each of the artworks, used to navigate to that work. The light at the bottom of each work can be an LED that blinks if that is the work the user is currently in.

For the p5.js side, I explored possibilities for creating an interactive 3D room, and created a draft of what it could look like using WEBGL, orbitControl(), and camera controls. It was interesting to see the possibilities within p5 that I hadn't yet explored, and it really changed my perspective on what the project could look like.

 

Week 12- Final Project Proposal

Finalised Concept for the Project

Plant Care Station is an interactive, game-like system that combines physical sensors through Arduino with a browser-based visual interface through p5.js.
The project teaches users about plant care (light, water, soil, and touch) through playful interactions, animations, glowing suns, and confetti celebrations.

My project uses sensor data from the Arduino — four light sensors, capacitive touch sensors made of foil, and a soil moisture sensor — which is visualized in p5.js as a growing, living plant environment. As the user helps the plant receive enough light, fertilizer, and moisture, the interface responds through movement, glowing suns, page transitions, and celebratory effects.

Arduino to P5:

  • 4 light sensors: The p5 interface shows the user which sensors need more light. Once the light threshold is met, confetti is thrown and the user can move to the next page.
  • Capacitive touch sensor: I used aluminium foil to act as a capacitive touch sensor. This foil is stuck to the scissors, so every time the reading goes from NO-CONTACT to CONTACT, I count that as one leaf cut.
  • Capacitive touch / pressure sensor: Once the fertilizer is placed in the soil, we confirm that on the p5 screen and allow the user to move to the next page.
  • Soil moisture sensor: Once the plant is watered to a specific threshold, we notify the user on the p5 screen.
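The leaf-cut counting described above is a rising-edge count: only the NO-CONTACT → CONTACT transition increments, so holding the foil closed does not keep adding cuts. A sketch, with a placeholder threshold:

```javascript
// Returns an updater that takes raw capacitive readings and returns the
// running cut count.
function makeCutCounter(threshold = 500) {
  let inContact = false;
  let cuts = 0;
  return function update(reading) {
    const contact = reading > threshold;
    if (contact && !inContact) cuts += 1; // rising edge = one snip
    inContact = contact;
    return cuts;
  };
}
```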

 P5 to Arduino:

  • Once all the steps are complete, p5 will send a signal to the Arduino and the colourful LEDs will start flashing to celebrate the completion.

 

Final Project Proposal

Finalized Concept

For my final project, I am building a competitive 2-player game inspired by “Tomb of the Mask,” except physical steadiness determines who wins. Two players will race against each other to navigate a digital maze on the screen. Instead of using a keyboard, each player will hold a custom physical controller that detects tilt.

By tilting their controller forward, backward, left, or right, the players will steer their avatars on the screen. The game will require steady hands and precise movements to avoid crashing into walls while trying to beat the opponent to the finish line.

Arduino Design & Logic

The Arduino will translate physical movements into data.

Inputs: I will use two separate Accelerometers connected to a single Arduino board. One sensor is for Player 1, and the other is for Player 2.

Processing: The Arduino program will constantly read the X, Y, and Z tilt angles from both sensors to understand exactly how each player is holding their controller.

Communication (Sending): It will package these six values (Player 1 X/Y/Z and Player 2 X/Y/Z) and send them continuously to the computer via the USB cable.

P5.js Design & Logic

P5.js will act as the “brain” and “display” of the game.

Visuals: The program will draw the maze walls, the start/finish zones, and the two player avatars.

Communication (Receiving): P5.js will continuously listen for the tilt numbers coming from the Arduino.

Game Logic:

    • It takes the tilt numbers and uses them to change the speed and direction of the player dots. (e.g., Tilt left = Move left).

    • It checks for collisions. If a player hits a wall, they stop or bounce back.

    • P5 checks for the winner and notifies the Arduino to reward the winner.
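The first two game-logic bullets could be sketched as tilt becoming velocity, with a wall hit stopping the motion. The scale factor and the axis-aligned-rectangle wall representation are assumptions.

```javascript
// One frame of movement for a player dot. Walls are {x, y, w, h} rects.
function movePlayer(player, tiltX, tiltY, walls) {
  const SPEED = 0.1; // degrees of tilt -> pixels per frame
  const next = { x: player.x + tiltX * SPEED, y: player.y + tiltY * SPEED };

  const hitsWall = walls.some(
    (w) => next.x > w.x && next.x < w.x + w.w && next.y > w.y && next.y < w.y + w.h
  );
  // stop at walls instead of passing through them
  return hitsWall ? player : next;
}
```

Since both accelerometers arrive in the same serial packet, the same function runs once per player per frame with that player's tilt values.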

Project Roadmap To complete this project, I will follow these steps:

  1. Wiring: Connect both accelerometers to the Arduino and confirm I can get readings from both at the same time.

  2. Connection: Write the code to successfully send the sensor numbers from Arduino to P5.js.

  3. Gameplay: Create the maze graphics in P5 and program the physics so the dots move smoothly with the sensors.

  4. Building: Construct two simple handheld boxes (or 3D printed shells) to house the sensors so they are easy to hold and tilt.

Wild Runes – Final Project Proposal

Concept + How it Works

Inspired by Season 2 of the Netflix series Arcane, this project prioritizes the experience of the concept while incorporating subtle interactive responses. On the first page, there will be a brief instruction explaining how the experience works. In simple terms, an animated object floats on the screen once the user presses one of the keys on the computer. This floating object, inspired by the “Wild Rune” introduced as a seed of destruction and chaos, is a colorful sphere floating in a dark space. In the series, this is an unstable, magical object that is constantly distorting and causing chaos around it. For this reason, in my project, this sphere will be open to drastic changes: from changes in size (growing and shrinking) to changes in speed, all while some of the most popular songs from the season’s soundtrack play.

Goal of the game: “Stabilize the Wild Rune”

From Arduino to P5

For the interaction,  I aim to have p5 respond to two potentiometers and one button.

  • Button 1 will allow the users to change the songs played in the background while the “game state” is active. Using an array, each song plays on an infinite loop in the background and only changes when the player presses the button.
soundFormats("mp3", "ogg"); // Accepted sound file formats.

// Load the soundtrack (in preload(), so the files are ready before setup()).
songList = [
  loadSound("Woodkid_To_Ashes_and_Blood.mp3"),
  loadSound("Blood_Sweat_&_Tears.mp3"),
  loadSound("Imagine Dragons_Enemy.mp3"),
  loadSound("Stromae, Pomme_Ma_Meilleure_Ennemie.mp3"),
];

 

  • Potentiometer 1 will allow the users to change the size of the sphere. After connecting the Arduino UNO to p5, once the user presses “start” they will be able to increase or decrease the size of the sphere by twisting the potentiometer (as shown in the VIDEO).
  • Potentiometer 2 will allow the user to change the speed at which the sphere is rotating. Just as the previous analog sensor, the second potentiometer will be accessible as soon as the Arduino UNO is connected to the p5.
// -------------------------
// READ FROM ARDUINO
// -------------------------
let data = port.readUntil("\n");
data = trim(data);

if (data.length > 0) {
  // ---- POTENTIOMETER DATA ----
  if (data.startsWith("P:")) {
    let potValue = int(data.substring(2));
    if (!isNaN(potValue)) {
      sensorValue = potValue;
    }
  }

  // ---- BUTTON PRESSED ----
  else if (data === "BTN" && gameState === "playing") {
    playNextSong();
  }
}

// Later, in the draw loop:
// Smoothly transition sphere size
rune.size = lerp(rune.size, runeSize, 0.1);

// Draw the sphere
rune.display();

 

From P5 to Arduino

  • If the player grows the sphere past a specific radius on the canvas, a red LED will turn on. Once they fix the size or return to the home page, the LED turns off.
  • Similarly, if the player shrinks the sphere below a specific radius, the red LED will turn on, and it turns off once they fix the size or return to the home page.
  • When the player keeps the sphere at a certain size, a green LED will turn on and the user is taken to the “winning” page, after which the LED turns off.
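The three LED rules above could collapse into one function that p5 calls each frame before writing to serial: out-of-range sizes light the red LED, and holding the sphere in the stable range long enough lights the green one. The radius bounds and hold time are placeholders.

```javascript
let stableSinceMs = null; // when the sphere entered the stable range

// Returns "RED", "GREEN", or "OFF" for the Arduino to act on.
function ledCommand(radius, nowMs, minR = 80, maxR = 120, holdMs = 2000) {
  if (radius < minR || radius > maxR) {
    stableSinceMs = null; // left the stable range: restart the timer
    return "RED";
  }
  if (stableSinceMs === null) stableSinceMs = nowMs;
  // held stable long enough -> green LED and the winning page
  return nowMs - stableSinceMs >= holdMs ? "GREEN" : "OFF";
}
```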

Background Songs:

Imagine Dragons feat. JID – “Enemy (Opening Title Version)” (from Arcane) [Official Visualizer]

https://youtu.be/UqcE-IIevf0?si=1fEAqWTDhATXA7Av 

Woodkid – “To Ashes and Blood” (from Arcane Season 2) [Official Visualizer]

https://youtu.be/Gj-jmBi0aK8?si=6Yn53yDESy4y0R0K

Blood Sweat & Tears (from the series Arcane League of Legends)

https://youtu.be/ySbgRwQi3Rc?si=uf5mUQyy-rPUKWLz 

 

Objectives:

  • Add button to change music (in process)
  • Add potentiometer to adjust speed of spinning sphere
  • Fix the title (it must show when the game begins, but currently doesn’t, since the text is set up to render on a 2D layer)
  • Make the size (too big and too small) cause RED LED to show
  • Make the right size cause the user to win (show GREEN LED)

Tasks completed so far:

  • Created a sphere in a WEBGL space for the three dimensional elements to work (shades, colors, among others)
  • Added potentiometer from p5 to Arduino to successfully grow and shrink the sphere
  • Defined a structure of the p5 sketch including: introduction page, “playing state”, “winning page”.
  • Established the functions necessary for Arduino and p5 to exchange more than one piece of information (potentiometer + button 1)
  • Downloaded and uploaded GitHub files to p5 , learned in class, to stabilize bidirectional serial communication

VIDEO

Main Inspiration: Arcane Season 2 Clip 

https://youtu.be/8PU2iKx0YtQ?si=qtSexmSyrLLJLWcc

ChatGPT

Given the complexity of this project, I have used ChatGPT so far to assist me in organizing the code and guiding me step by step when something is missing or a mistake prevents the code from running. For example, I struggled to make both the button and the potentiometer work at the same time, so I explained my situation to ChatGPT, and it made me realize the error was in the part of my code where the Arduino exchanges information with p5. It also helped me realize I hadn’t uploaded the files from GitHub, causing the port connection to malfunction.

Week 12 – Final Project Proposal

Concept & Progress

For my final project, I chose to design an interactive healthy-choice game inspired by foods and habits familiar in Kazakh culture. By combining a digital game with physical inputs using Makey Makey, I want to create an experience that is playful, educational, and rooted in cultural references that feel personal and recognizable. The game features a girl character standing at the bottom of the p5.js canvas. The main mechanic of the game revolves around catching and avoiding falling items that drop from the top of the screen.  I have already made a design of the playing page, you can see it below.

The falling items are divided into healthy and unhealthy options that commonly appear in everyday life in Kazakhstan. Healthy items such as apples, dates, milk, and water increase the player’s score, while unhealthy items like burgers and Coca-Cola must be avoided. Each healthy item gives a specific number of points: apples give +20, dates +10, milk +20, and water +10. This scoring system encourages players not just to move and react quickly, but also to distinguish between foods visually. The game is structured in two phases, which alternate after a set time. In Phase 1, apples, dates, and burgers fall. In Phase 2, milk, water, and Coca-Cola appear. When the timer runs out, the round ends, and p5.js communicates with Arduino to display GAME OVER on an LCD module attached to the Arduino. Players can restart the game to continue the experience. Physical interaction happens through real apples connected with Makey Makey, which transforms the apples into left and right movement controls. Touching one apple makes the character move left, and touching the other moves her right. This adds a tangible element to the game and ties the controller design back to the cultural theme of the project.

The p5.js program handles all visuals, item generation, collision detection, scoring, phase transitions, and timer logic. It also sends a serial signal to Arduino at the end of the game so that the LCD screen displays “GAME OVER.” Arduino, meanwhile, receives movement inputs from Makey Makey and translates the end-of-game signal into LCD output. I have started working on the basic structure of the p5.js code, including the falling item class, the character movement, the timer, and the two-phase logic. I am also working on the sprite sheet for the girl character so she matches the visual style of the game.
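The scoring and two-phase logic described above could be sketched like this. The item names and point values mirror the description; the phase length, and the choice that catching an unhealthy item simply scores nothing (the text only says they "must be avoided"), are assumptions.

```javascript
const POINTS = { apple: 20, dates: 10, milk: 20, water: 10 };
const UNHEALTHY = ["burger", "cola"];

// New score after the character catches an item.
function catchItem(score, item) {
  if (UNHEALTHY.includes(item)) return score; // unhealthy items give no points
  return score + (POINTS[item] || 0);
}

// Phases alternate on a fixed interval: phase 1 = apples/dates/burgers,
// phase 2 = milk/water/cola.
function currentPhase(elapsedMs, phaseLengthMs = 15000) {
  return Math.floor(elapsedMs / phaseLengthMs) % 2 === 0 ? 1 : 2;
}
```

`currentPhase` can also decide which item types are allowed to spawn each frame, so the phase switch needs no extra state.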

To summarize the inputs and outputs:

Arduino → p5.js

  • Makey Makey “key” inputs from apples (left / right movement), interpreted in p5 as arrow keys or assigned keys

p5.js → Arduino

  • A “Game Over” message sent via serial

  • Arduino displays GAME OVER on the LCD screen