Final Project Concept

For my final project, I decided to stick with the idea of controlling music using Arduino sensors. I absolutely love listening to music, so this project is, in a way, an implementation of a personal interest. At some point in my childhood I thought of becoming a DJ, so here it is: my final project will be a set of effects and filters for a song, controlled by a self-made DJ panel that incorporates sensors, switches, and buttons from the Arduino kit.

What will Arduino do?

I will turn the objects and controls from the kit into a DJ toolbox.

1) I will use buttons and switches to control the songs: pause, play, repeat, etc. Moreover, apart from using real songs (I am thinking of doing four of them), I am planning to incorporate controls for the sounds that DJs use to enhance the listener’s experience, e.g. to make the transitions between songs smoother.

2) I will use sensors like ultrasonic distance sensors or potentiometers to control the music. A user will be able to apply filters (isolating the bass or the highs, speeding up or slowing down, switching songs, etc.) using hand motions and movements. I am not sure how diverse my DJ toolset will be, because it will depend heavily on the functions available inside p5.js (see the sketch after this list).

3) Potentially, if I have time, I will also incorporate control of the music visualization using buttons, switches, or sensors.
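
To get a feel for what the filter control could look like on the p5.js side, here is a minimal sketch assuming the Arduino sends one sensor reading (0–1023) per line over serial and that a handleSerialLine() helper is called with each received line; the serial plumbing, the file name, and the mapping range are placeholders rather than final decisions.

// Minimal p5.js sketch: map one sensor reading (0-1023) to a low-pass filter cutoff.
// Serial plumbing is omitted; handleSerialLine() is assumed to be called once per received line.
let song;               // one of the four songs
let lowPass;            // p5.LowPass filter from p5.sound
let sensorValue = 1023; // latest potentiometer / distance reading

function preload() {
  song = loadSound('song1.mp3'); // placeholder file name
}

function setup() {
  createCanvas(400, 200);
  lowPass = new p5.LowPass();
  song.disconnect();      // detach the song from the master output...
  song.connect(lowPass);  // ...and route it through the filter instead
  song.loop();
}

function draw() {
  background(20);
  // Map the raw 10-bit reading to an audible cutoff range (in Hz)
  let cutoff = map(sensorValue, 0, 1023, 100, 10000);
  lowPass.freq(cutoff);
  // A second sensor could drive playback speed in the same way, e.g. song.rate(...)
}

// Assumed to be called by the serial-handling code with one line of text
function handleSerialLine(line) {
  let v = parseInt(line.trim());
  if (!isNaN(v)) sensorValue = v;
}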

What will p5.js do?

Music visualization is the part that I want to implement through p5.js. If I find the right guidance on how exactly to accomplish it, it will significantly improve the project by adding nice visual effects. It is cool to hear the music, but it is even cooler to see it. I want the canvas to change according to the rhythm, tone, and other features of the song being played, and to respond to the filters applied from the Arduino DJ panel.
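
For the visualization itself, p5.sound’s FFT analysis is one likely starting point. The sketch below is only a rough illustration, assuming a song is already loaded and playing as in the previous sketch; the bars are placeholders for whatever visuals end up fitting the project.

// Minimal audio visualization using p5.FFT (listens to the overall sound output by default)
let fft;

function setup() {
  createCanvas(600, 300);
  fft = new p5.FFT(0.8, 64); // smoothing, number of frequency bins
  noStroke();
}

function draw() {
  background(0);
  let spectrum = fft.analyze();     // 64 amplitude values, 0-255
  let bass = fft.getEnergy('bass'); // overall energy in the bass band
  let barWidth = width / spectrum.length;
  for (let i = 0; i < spectrum.length; i++) {
    let h = map(spectrum[i], 0, 255, 0, height);
    fill(map(i, 0, spectrum.length, 100, 255), 100, 200);
    rect(i * barWidth, height - h, barWidth - 2, h);
  }
  // The bass energy could also drive a pulsing background, stroke weight, etc.
}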


Week 12: Final Project Draft

Concept:

The proposed project creates an interactive hand-gesture-based drawing system that enables users to draw on a digital canvas using hand movements. By combining P5.js, ML5.js Handpose, and Arduino, the project bridges physical gestures with digital creativity. The system uses hand tracking to allow natural interactions and integrates real-time physical feedback through LEDs and a vibration motor, offering an immersive experience.

The primary goal is to deliver a tactile and visual interface where users feel a sense of creation and engagement as their gestures directly influence the drawing and physical environment. The interplay between the digital canvas and physical feedback fosters an innovative and intuitive drawing system.


Arduino Program

The Arduino system responds to commands received from P5.js, controlling physical feedback components:

  • NeoPixel LEDs display colors that reflect the brush dynamics, such as the current color selected for drawing. Faster hand movements can increase brightness or trigger lighting effects.
  • Vibration motors provide tactile feedback during specific actions, such as clearing the canvas or activating special effects.

The Arduino continuously listens to P5.js via serial communication, mapping the incoming data (e.g., RGB values, movement speed) to the appropriate hardware actions.

P5.js Program

The P5.js program uses ML5.js Handpose to track the user’s hand and fingers. The tracked position—primarily the palm center or a fingertip—is mapped to the digital canvas for drawing.

As the user moves their hand, the P5.js canvas renders brush strokes in real-time. Brush properties such as size and color dynamically adjust based on gestures or key inputs. For instance:

  • Moving the palm creates brush strokes at the detected location.
  • Pressing specific keys changes brush size (+/-) or color (C).
  • Gestures like a quick swipe could trigger special visual effects.

The program communicates with Arduino to send brush-related data such as the selected color and movement intensity. This ensures the physical environment (e.g., LED lighting) mirrors the user’s actions on the canvas.
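
A rough sketch of how this drawing loop could look, assuming the older ml5.js Handpose API (ml5.handpose(video, callback) plus a 'predict' event; newer ml5 releases expose a different interface) and leaving the actual serial write as a placeholder:

// Hand-tracked drawing with movement data forwarded to the Arduino (serial write is a stub)
let video, handpose;
let predictions = [];
let brushColor = [255, 0, 0];
let brushSize = 8;
let prevPos = null;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  handpose = ml5.handpose(video, () => console.log('Handpose model ready'));
  handpose.on('predict', results => { predictions = results; });
  background(255);
}

function draw() {
  if (predictions.length > 0) {
    // Landmark 9 (middle-finger base) is used here as a rough palm centre
    let [x, y] = predictions[0].landmarks[9];
    if (prevPos) {
      stroke(brushColor);
      strokeWeight(brushSize);
      line(prevPos.x, prevPos.y, x, y);
      // Movement speed can scale LED brightness on the Arduino side
      let speed = dist(prevPos.x, prevPos.y, x, y);
      sendToArduino(brushColor, speed);
    }
    prevPos = { x, y };
  } else {
    prevPos = null; // hand lost: start a new stroke next time
  }
}

// Placeholder: the actual write depends on the serial library used,
// e.g. sending "r,g,b,speed\n" for the Arduino to parse
function sendToArduino(rgb, speed) {
}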

Interaction Flow

  1. The system starts with a webcam feed processed by ML5.js Handpose.
  2. P5.js tracks hand movements and maps positions to the canvas.
  3. Users draw by moving their hand, with brush strokes appearing at the tracked position.
  4. Real-time brush properties, such as color and size, are sent to Arduino.
  5. Arduino reflects these changes through LEDs and tactile feedback:
    • The LEDs light up with the same brush color.
    • Vibration occurs during specific gestures or interactions, providing physical confirmation of actions.
  6. The user experiences synchronized digital and physical interactions, offering a sense of control and creativity.

Goals and Outcomes

The project aims to deliver an interactive tool that provides:

  1. A dynamic and intuitive drawing experience using hand gestures.
  2. Real-time physical feedback through LEDs and vibration motors.
  3. A visually and physically engaging system that bridges the digital and physical worlds.

Current Code:

Week 12: Final Project Progress

Finalised Idea

My finalised idea, out of last week’s two options, is the NYUAD Puzzle Adventure.

In P5.js, the game will display a puzzle with shuffled pieces showing different places on the NYU Abu Dhabi campus. Taken together, the pieces will tell a story as a whole. The entire game will be controlled from the Arduino hardware.

Progress 

Up until this week, I have been working in P5.js on a possible structure for the puzzle, as seen in the image below:

The hardware design will consist of two potentiometers that will be used to move the puzzle pieces: one potentiometer for horizontal movement and one for vertical movement. I will also have two buttons that will be used whenever the player wants to lift or drop a piece in the game, and hidden motors (potentially vibration motors) to give players feedback as the game progresses. The hardware box will also have an opening mechanism to reward players with an NYUAD-inspired present if they correctly solve the puzzle, plus a grand prize for whoever beats the fastest time.
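
A rough sketch of how p5.js could interpret that hardware, assuming the Arduino sends one line per loop in the form "potX,potY,liftButton,dropButton" (for example "512,300,1,0"); the message format and the handleSerialLine() entry point are assumptions, not final decisions.

// Maps two potentiometer readings to the held piece's position and two buttons to lift/drop
let pieceX = 0, pieceY = 0;
let holdingPiece = false;

function handleSerialLine(line) {
  let parts = line.trim().split(',');
  if (parts.length !== 4) return;
  let [potX, potY, lift, drop] = parts.map(Number);
  // Map the 10-bit potentiometer readings to canvas coordinates
  pieceX = map(potX, 0, 1023, 0, width);
  pieceY = map(potY, 0, 1023, 0, height);
  if (lift === 1) holdingPiece = true;  // pick up the piece under the cursor
  if (drop === 1) holdingPiece = false; // drop it into place
}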

My hardware schematic will be as shown below:

Schematic


Week 12: Final Project Progress

Concept

Given the limited time, the transforming robot presented too great a challenge, necessitating a reevaluation of the narrative and of the concept as a whole.

At its core, the redesigned Project Utopia centers around an elegant yet challenging puzzle box. Each compartment represents a unique layer of interaction, requiring the user to solve riddles or manipulate physical components to unlock the next. Inside the compartments, users find carefully crafted informational cards and audio prompts that guide or enrich the experience.

Also, this redesign was inspired by the constraints of available technology, but it provided an opportunity to reimagine how interactive installations can connect with users through simplicity and creativity.

Features

At its heart, Project Utopia revolves around a puzzle box with multiple compartments. Each compartment contains riddles, cards with hidden clues, and audio prompts powered by p5.js. However, the twist lies in ARIS, the robotic overseer. As users engage with the puzzle box, ARIS intervenes—blocking, guiding, or assisting—depending on the challenge. This integration transforms the project into a blend of physical interaction, robotics, and storytelling, elevating the experience for participants.

The Box

A 3D-painted box, it has compartments, each secured with a unique mechanism that requires users to solve challenges to access them. Clearing the puzzles means the person can reach the prompts and clues inside.

The Bot

The bot moves on two motored wheels, with a castor wheel in the front for stability. It has one servo-motor-powered hand (made of cardboard), and:

  • LCD Screen Eyes: Display emotions like suspicion, curiosity, or delight.
  • Servo Hand: Blocks premature access to the box.

It is also an expressive robot, meaning that major actions from the user will prompt expressions and sounds from it.

The Code so far
#include <LiquidCrystal.h>
#include <Servo.h>

// Initialize LCDs
LiquidCrystal lcdLeft(2, 3, 4, 5, 6, 7);  
LiquidCrystal lcdRight(8, 9, 10, 11, 12, 13);

// Initialize the servos
Servo servoLeft;
Servo servoRight;

// Custom character arrays for expressions
byte happyEye[8] = {
  B00000,
  B01010,
  B00000,
  B10001,
  B01110,
  B00000,
  B00000,
  B00000
};

void setup() {
  // Begin LCDs
  lcdLeft.begin(16, 2);
  lcdRight.begin(16, 2);

  // Load custom characters
  lcdLeft.createChar(0, happyEye);
  // lcdLeft.createChar(1, confusedEye);

  lcdRight.createChar(0, happyEye);
  // lcdRight.createChar(1, confusedEye);

  // Start with a happy expression
  showHappyExpression();
}

void loop() {
  // For now, only the happy expression is shown every 3 seconds; other expressions will alternate here later
  showHappyExpression();
  delay(3000);
}

void showHappyExpression() {
  // Left Eye
  lcdLeft.clear();
  lcdLeft.setCursor(0, 0);
  lcdLeft.write(byte(0)); // Happy Eye

  // Right Eye
  lcdRight.clear();
  lcdRight.setCursor(0, 0);
  lcdRight.write(byte(0)); // Happy Eye
}

Week 12 – Progress On The Final Project

Concept

Having set off with the idea of transforming my midterm project from an offline PVE game into an online PVP game (which has been achieved by this time as my major progress this week—we’ll come back to this later), the concept of my final project now includes another dimension of human-robot competition: utilizing the PVP mechanism I have now and building a physical robot to operate one of the players in the game.

Of course, technically speaking, it would be mission impossible to develop a robot that could actually play and compete with humans within a week or so. Therefore, the physical installation of the robot would be a puppet that ostensibly controls one of the players, and the actual gaming logic would still be programmed in p5.

Illustration

The bottle is a placeholder for the hand robot to be realized.

Potential Hand Robot

Based on an open-source printable robot hand 3D model, servos could be attached to the fingers’ ends to bend each of them when needed—if parameters are tuned.

p5-Arduino Communication

As the gaming logic would be done within p5, what needs to be sent to Arduino are the translated movements of the robot hand, specifically of the servos: for example, how long a specific finger needs to press a key, which in turn triggers the movement of the player in the game.
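
A minimal sketch of that translation layer on the p5 side; the message format ("fingerIndex,pressDurationMs") and the helper names are assumptions for illustration only.

// Turns a game input into a command for the robot hand: which finger, and how long to press
const FINGER_FOR_KEY = { 'ArrowUp': 0, 'ArrowDown': 1, 'ArrowLeft': 2, 'ArrowRight': 3 };

function pressKeyWithRobot(key, durationMs) {
  let finger = FINGER_FOR_KEY[key];
  if (finger === undefined) return;
  sendToArduino(finger + ',' + durationMs + '\n');
}

// Example: hold "up" for 200 ms when the game logic decides the robot's ship should climb
// pressKeyWithRobot('ArrowUp', 200);

// Placeholder: the actual write depends on the serial library used;
// the Arduino would move that servo down for durationMs and then release it
function sendToArduino(message) {
}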

Current Progress (Code Available)

So far, based on the midterm, the following functions have been achieved:

  1. Control mode selection – determines if the game is going to be controlled by head movement (for humans) or direction keys (for the robot)
  2. Synchronized online battle – realizes real-time communication between two (or more) sketches on two (or more) computers by setting up an HTTPS WAN server using Node.js and socket.io. Necessary data from one sketch is shared with the other in real time (with a buffer time of 5 ms). Core code of this part of the sketch:
    // sketch.js
    
    // The global broadcast dictionaries used to communicate with the other player
    let globalBroadcastGet = {
      x: 0,
      y: 0,
      rotationX: 0,
      rotationY: 0,
      rotationZ: 0,
      health: 100,
      energy: 100,
      tacticEngineOn: false,
      laserCooldown: 100, // milliseconds
      lastLaserTime: 0,
      colliderRadius: 30, // Example radius for collision detection
      destroyCountdown: 90,
      toDestroy: false,
      laserFired: 0,
      damageState: false,
      readyToPlay: false,
      bkgSelected: 'background1'
    }
    let globalBroadcastSend = {
      x: 0,
      y: 0,
      rotationX: 0,
      rotationY: 0,
      rotationZ: 0,
      health: 100,
      energy: 100,
      tacticEngineOn: false,
      laserCooldown: 100, // milliseconds
      lastLaserTime: 0,
      colliderRadius: 30, // Example radius for collision detection
      destroyCountdown: 90,
      toDestroy: false,
      laserFired: 0,
      damageState: false,
      readyToPlay: false,
      bkgSelected: 'background1'
    }
    
    // Interval for sending broadcasts (in milliseconds)
    const BROADCAST_INTERVAL = 5; // send every 5 ms
    
    // Setup function initializes game components after assets are loaded
    function setup() {
      ...
    
      // Initialize Socket.io
      socket = io();
    
      // Handle connection events
      socket.on('connect', () => {
        console.log('Connected to server');
      });
    
      socket.on('disconnect', () => {
        console.log('Disconnected from server');
      });
    
      // Reconnection attempts
      socket.on('reconnect_attempt', () => {
        console.log('Attempting to reconnect');
      });
    
      socket.on('reconnect', (attemptNumber) => {
        console.log('Reconnected after', attemptNumber, 'attempts');
      });
    
      socket.on('reconnect_error', (error) => {
        console.error('Reconnection error:', error);
      });
    
      // Listen for broadcast messages from other clients
      socket.on('broadcast', (data) => {
        // console.log('Received broadcast');
        try {
          // Ensure the received data is a valid object
          if (typeof data === 'object' && data !== null) {
            globalBroadcastGet = data; // Replace the entire BroadcastGet dictionary
          } else {
            console.warn('Received data is not a valid dictionary:', data);
          }
        } catch (error) {
          console.error('Error processing received data:', error);
        }
      });
    
      // Set up the periodic sending
      setInterval(sendBroadcast, BROADCAST_INTERVAL);
    
      ...
    }
    
    // Function to send the BroadcastSend dictionary
    function sendBroadcast() {
    
      // Update BroadcastSend dictionary
      let BroadcastSend = globalBroadcastSend;
    
      // Send the entire dictionary to the server to broadcast to other clients
      socket.emit('broadcast', BroadcastSend);
      // console.log('Sent broadcast:', BroadcastSend);
    }
    // server.js
    
    /* Install socket.io and config server
    npm init -y
    npm install express socket.io
    node server.js
    */
    
    /* Install mkcert and generate CERT for https
    choco install mkcert
    mkcert -install
    mkcert <your_local_IP> localhost 127.0.0.1 ::1
    mv 192.168.1.10+2.pem server.pem
    mv 192.168.1.10+2-key.pem server-key.pem
    mkdir certs
    mv server.pem certs/
    mv server-key.pem certs/
    */
    
    const express = require('express');
    const https = require('https');
    const socketIo = require('socket.io');
    const path = require('path');
    const fs = require('fs'); // Required for reading directory contents
    
    const app = express();
    
    // Path to SSL certificates
    const sslOptions = {
      key: fs.readFileSync(path.join(__dirname, 'certs', 'server-key.pem')),
      cert: fs.readFileSync(path.join(__dirname, 'certs', 'server.pem')),
    };
    
    // Create HTTPS server
    const httpsServer = https.createServer(sslOptions, app);
    // Initialize Socket.io
    const io = socketIo(httpsServer);
    
    // Serve static files from the 'public' directory
    app.use(express.static('public'));
    
    // Handle client connections
    io.on('connection', (socket) => {
      console.log(`New client connected: ${socket.id}`);
    
      // Listen for broadcast messages from clients
      socket.on('broadcast', (data) => {
        // console.log(`Broadcast from ${socket.id}:`, data);
        // Emit the data to all other connected clients
        socket.broadcast.emit('broadcast', data);
      });
    
      // Handle client disconnections
      socket.on('disconnect', () => {
        console.log(`Client disconnected: ${socket.id}`);
      });
    });
    
    // Start HTTPS server
    const PORT = 3000; // Use desired port
    httpsServer.listen(PORT, () => {
      console.log(`HTTPS Server listening on port ${PORT}`);
    });

    Other changes in classes to replace local parameter passing with externally synchronized data, such as:

    // EnemyShip.js
    
    class EnemyShip {
      ...
    
      update() {
        this.toDestroy = globalBroadcastGet.toDestroy;
    
        this.health = globalBroadcastGet.health;
        this.energy = globalBroadcastGet.energy;
    
        if (this.toDestroy === false) {
          this.x = globalBroadcastGet.x;
          this.y = globalBroadcastGet.y;
          
          // Update rotation based on head movement
          this.rotationX = globalBroadcastGet.rotationX;
          this.rotationY = globalBroadcastGet.rotationY;
          this.rotationZ = globalBroadcastGet.rotationZ;
          
          if (globalBroadcastGet.tacticEngineOn === true && this.tacticEngineUsed === false) {
            this.tacticEngine()
          }
    
          // Tactic engine reset
          if (this.tacticEngineOn === true) {
            let currentTime = millis();
            if (this.model === assets.models.playerShip1) {
              this.health = 100;
            } else {
              this.energy = 100;
            }
            if (currentTime - this.tacticEngineStart > 15000) {
              this.tacticEngineOn = false;
              if (this.model === assets.models.playerShip1) {
                this.health = 100;
              } else {
                this.energy = 100;
              }
            }
          }
        }
      }
    
      ...
    
    }
  3. Peripheral improvements to smooth out the PVP experience, including recording the winner/loser, synchronizing the game configuration, etc.

Week 12 – Final Project Concept

Concept 

An interactive game that combines physical hardware inputs with digital visuals to create a fun, engaging experience. The objective of the game is to catch falling fruits of different colors into a basket by pressing the corresponding colored buttons. Each button plays a musical note, adding an auditory layer of feedback to the gameplay. The speed of the falling fruits increases as the game progresses, making it more challenging. Players have three lives, and the game tracks both the current score and the highest score achieved, fostering a sense of competition.

Inspiration 

The concept is inspired by classic arcade-style games like “Fruit Ninja” and “Tetris,” which involve quick reflexes and colorful visuals. I wanted to create a game that not only challenges the player’s reaction time but also provides a multisensory experience by integrating music and physical interaction. The picnic theme stems from the cheerful idea of catching fruits for a basket, evoking a sense of fun and lightheartedness. The addition of musical notes tied to the button presses makes the game playful and unique, while the increasing difficulty ensures sustained engagement.

Implementation Plan 

Hardware:

  1. Arduino Components:
    • Four colored buttons: Each button corresponds to a fruit color (e.g., red for apples, yellow for bananas).
    • A piezo buzzer or speaker: Plays a unique note when a button is pressed, creating a musical effect.
  2. Connections:
    • Each button will be connected to a specific digital pin on the Arduino.
    • A buzzer will also be connected to a pin to generate sounds.
  3. Power Supply: A USB cable will power the Arduino during gameplay.

Software:

  1. Arduino Code:
    • The buttons will detect user input and send signals to the p5.js sketch via serial communication.
    • Debounce logic will be implemented to prevent multiple readings from a single press.
  2. p5.js Sketch:
    • Fruit Animation:
      • Fruits of various colors will fall from random positions at the top of the screen toward a basket at the bottom.
      • The basket will automatically “catch” fruits when the correct button is pressed.
    • Game Logic (see the sketch after this list):
      • The player scores points for catching fruits correctly.
      • Missing a fruit or pressing the wrong button reduces a life.
      • The game ends when the player loses all three lives.
    • Difficulty:
      • The speed of the fruits will gradually increase to challenge the player as the game progresses.
    • Scoreboard:
      • A live scoreboard will display the current score and the highest score achieved.
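
Below is a simplified sketch of the core game loop described above; the fruit colors, spawn rate, speed values, and the onButtonPressed() entry point (called when a button press arrives from the Arduino) are all placeholders.

// Simplified fruit-catching logic: spawn, fall, catch on matching button press, track lives
let fruits = [];
let score = 0, highScore = 0, lives = 3;
let fallSpeed = 2;
const COLORS = ['red', 'yellow', 'green', 'purple']; // one per button

function setup() {
  createCanvas(400, 600);
}

function spawnFruit() {
  fruits.push({ x: random(width), y: 0, colorIndex: floor(random(4)) });
}

function draw() {
  background(230);
  if (frameCount % 60 === 0) spawnFruit(); // roughly one new fruit per second
  for (let i = fruits.length - 1; i >= 0; i--) {
    let f = fruits[i];
    f.y += fallSpeed;
    fill(COLORS[f.colorIndex]);
    circle(f.x, f.y, 30);
    if (f.y > height) { // missed fruit costs a life
      fruits.splice(i, 1);
      lives--;
    }
  }
  fallSpeed += 0.0005; // difficulty ramps up slowly
  if (lives <= 0) {
    highScore = max(highScore, score);
    noLoop(); // game over
  }
}

// Called when a button press arrives from the Arduino (0-3 = which colored button)
function onButtonPressed(buttonIndex) {
  let caught = false;
  for (let i = fruits.length - 1; i >= 0; i--) {
    if (fruits[i].colorIndex === buttonIndex && fruits[i].y > height - 100) {
      fruits.splice(i, 1); // the basket "catches" the lowest matching fruit near the bottom
      score++;
      caught = true;
      break;
    }
  }
  if (!caught) lives--; // wrong button costs a life
}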

Interaction 

  • User Inputs: The player uses physical buttons to interact with the game, linking the tactile experience of pressing buttons with the visual and auditory feedback.
  • Feedback Mechanisms:
    • Musical notes for correct actions.
    • On-screen effects for catching or missing fruits.
    • A real-time display of lives and scores for progress tracking.

Week 12: Final Project Proposal

Final Concept 

My final concept for my IM final will be an interactive map representing the diversity and culture of NYU students and our families through dance and music. I will have a framed, 2D paper map with buttons marked on different locations around the world. When the user presses a button, a song corresponding to that specific location will play. I plan to compile 10 songs, with their selection based on interviews with people I know at school. When a song is played, I’d also like to show a video of people dancing to it. My hope is that the music and dance video will prompt users to partake in dancing themselves as a kind of cultural exchange, but if I am being realistic, I’m not too sure that will happen. However, I will try to tweak the presentation of the dance video as much as possible so users feel invited to move their bodies in some way.

Concerns

I don’t think my ideas will be too difficult to implement, so in keeping it simple I hope to truly perfect each aspect. However, I do still have some concerns going into it. I have yet to make any final decisions on what the software side will look like; I want it to be simple yet playful but am not exactly sure how I’d like to implement that aesthetic yet. Additionally, I originally designed a frame that stands up for users to press the buttons, but after further consideration I realized they might not be able to properly push the buttons in because there is no support behind the map. My possible solutions for this are either using light sensors instead of buttons, so users basically just have to hover their finger over the location, or making a 2D frame of sorts that just lies flat on the table. I think the map lying flat will be easier to implement and design, so I will probably go with that as a solution, but I am still hesitant because I think it sacrifices a bit of the aesthetic of the overall design.

On the left is the standing-up version with what the support would look like, and on the right is the ‘tabletop’ design.

As I move further into the implementation stage, I also want to consider the possibility of users being able to add their own songs, but I don’t want to get too far ahead of myself before the foundation is established.

Arduino Component 

For the most part, the main thing the Arduino will be dealing with is receiving the information about which location the user selects. If time permits, I’d love to create some kind of LED visual to go along with the music, but that is not my priority for now. Although I was originally leaning towards using light sensors to detect the location a user selects, I am afraid of inconsistency in the data being read because there will be so much going on at the showcase, so I think the buttons will be more reliable. With that being said, the Arduino program will signal P5 when a city is chosen and then P5 will handle the rest in terms of playing the song. A small feature I am also interested in implementing is an LCD display, as another component, that simply writes “Welcome to [country of song’s origin]”.

Schematic

I did not include the LCD display in this schematic because I’m not sure how I’d balance those pins with what I need for the switches, so I am temporarily pausing that idea. I will see if I can split half of the switches onto analog pins and the other half onto digital pins, since the LCD needs to be on digital pins.

P5 Component 

On the P5 side, I would really like to have an animation of sorts that zooms in from the full map view to the specific city in order to play a song. I don’t think this will be too difficult to implement because I can just play a video, but I’m not exactly sure how I’d do it, because I would need to create a different animation for each of the 10 cities, which is a bit tedious. Here is a rough idea of what I mean by animation:

Once the animation plays, a little information card about the song, its origins, and the person who submitted it will appear. Then, the user will be prompted to start the song/video via an on-screen button and can repeat and explore the map from there. I do not plan on playing the entire song for each video; I am thinking 20–30 seconds of the main chorus or of the submitter’s favorite part of the song.

P5 will receive the information about which button was selected from the Arduino and will send a ‘current state’ flag back to the Arduino to prevent users from trying to select another location while the current one’s video is playing.
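
A minimal sketch of that handshake, assuming the Arduino sends the index of the pressed button as a line of text and p5 replies with "1" (busy) or "0" (ready); the message format and helper names are assumptions.

// Ignores new selections while a song/video is playing and tells the Arduino the current state
let isPlaying = false;

function handleSerialLine(line) {
  let city = parseInt(line.trim());
  if (isNaN(city) || isPlaying) return; // ignore presses while something is playing
  isPlaying = true;
  sendToArduino('1\n');                 // tell the Arduino the map is "busy"
  playCity(city);                       // zoom animation + info card + song/video
}

function onSongFinished() {
  isPlaying = false;
  sendToArduino('0\n');                 // ready for the next selection
}

function playCity(index) { /* animation and media playback for city `index` */ }
function sendToArduino(message) { /* serial write, depends on the library used */ }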

Week 12 – Final Project Progress

Concept

I have decided to work on the idea of using a box to control the platform for the hero. There will be one box that the user can move (the physical part), which will control the platform in p5.js. The project is inspired by the game “My Shadows are Bright” from GMTK Game Jam 2024.

For my game, the physical/Arduino part will include a cardboard box that can be moved along the x and y axes by the user. Depending on the position of the box, it will be drawn on the screen in p5.js in the form of a platform. If the box is nearer to the front sensor (or the user), it will be drawn bigger, and if it’s further away it will be smaller. This is a puzzle game, where the user has to move the box to get the correct size and position for the game character to move from the left side of the screen to the right side.

In terms of the story, I want to tell a story of how rejection is not always bad; sometimes it leads you to a path that is more suitable. It will be carried out through the character of a prince who is always rejected by the princess.

Implementation

Arduino

  • 2 photoresistors/ultrasonic sensors: 1 in front and 1 on the right-hand side
  • 1 LED to show game started
  • Resistors and wires
  • Cardboard

P5js

  • Platform/box class
  • Hero class
  • Collision detection: detect when hero is standing on the platform and when not standing on the platform
  • Stages (introduction, game play, ending)

Progress

I am not sure if this project is feasible or not, so I made a small prototype to see if I can use Arduino to track the position of the box. In this code, I get the analog read value from a photoresistor and draw a box based on how near or far it is from the sensor. It seems feasible so far.
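
For reference, a stripped-down version of that prototype on the p5.js side could look like the sketch below, assuming the Arduino prints one analog reading (0–1023) per line and that handleSerialLine() is called with each line; the mapping range is a placeholder.

// Draws a platform whose width depends on the latest sensor reading
let distanceReading = 0;

function setup() {
  createCanvas(600, 400);
  rectMode(CENTER);
}

function draw() {
  background(240);
  // Map the raw reading to a platform width (the direction of the mapping depends on the sensor used)
  let platformWidth = map(distanceReading, 0, 1023, 40, 300);
  fill(100, 150, 255);
  rect(width / 2, height * 0.7, platformWidth, 20);
}

function handleSerialLine(line) {
  let v = parseInt(line.trim());
  if (!isNaN(v)) distanceReading = v;
}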

Week 12: Final Project Proposal

For my final project, I decided to bring my midterm project to life and create controllers for the game characters that I designed. If possible, I would like to make two game pads that two players press physically with their feet, which then control the two characters on the screen. The game pads would be inspired by the video game Dance Dance Revolution (DDR). In addition to the game pads, I was also thinking about adding sound effects using the beeper when health decreases in the game, an LCD display to depict some artwork or a health bar, or LEDs to give feedback to the players. I would like to make use of the woodshop, but if time doesn’t permit, I wouldn’t mind scaling down to the 3D printers or laser cutters.

Some Code Snippet for the Week

const int UP_BUTTON = 11;
const int DOWN_BUTTON = 10;
const int RIGHT_BUTTON = 6;
const int LEFT_BUTTON = 5;

void setup() {
  // Start serial communication at 9600 baud
  Serial.begin(9600);

  // Set button pins as input with internal pull-up resistors
  pinMode(UP_BUTTON, INPUT_PULLUP); 
  pinMode(DOWN_BUTTON, INPUT_PULLUP);
  pinMode(RIGHT_BUTTON, INPUT_PULLUP);
  pinMode(LEFT_BUTTON, INPUT_PULLUP); 
}

void loop() {
  // Read the state of the UP_BUTTON (HIGH = not pressed, LOW = pressed)
  int upButtonState = digitalRead(UP_BUTTON); 
  int downButtonState = digitalRead(DOWN_BUTTON); 
  int rightButtonState = digitalRead(RIGHT_BUTTON); 
  int leftButtonState = digitalRead(LEFT_BUTTON); 

  // Print the value (HIGH or LOW) to the Serial Monitor
  Serial.print(upButtonState);
  Serial.print(",");
  Serial.print(downButtonState);
  Serial.print(",");
  Serial.print(rightButtonState);
  Serial.print(",");
  Serial.println(leftButtonState);

}
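
On the p5.js side, a possible counterpart for this snippet would parse each received line ("1,1,0,1" for up, down, right, left; with INPUT_PULLUP, 0 means the pad is pressed) into a pad state the game can read every frame. This is only a sketch under those assumptions, with the serial plumbing omitted.

// Parses the four comma-separated button states and moves a placeholder player each frame
let padState = { up: false, down: false, right: false, left: false };
let player = { x: 200, y: 200 };

function handleSerialLine(line) {
  let parts = line.trim().split(',').map(Number);
  if (parts.length !== 4) return;
  padState.up    = parts[0] === 0; // 0 = pressed because of the pull-up resistors
  padState.down  = parts[1] === 0;
  padState.right = parts[2] === 0;
  padState.left  = parts[3] === 0;
}

function draw() {
  if (padState.left)  player.x -= 5;
  if (padState.right) player.x += 5;
  if (padState.up)    player.y -= 5;
  if (padState.down)  player.y += 5;
}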


Week 12: Final Project Proposal

Hi everyone! 👋

I have a bit of a better idea now for my final project (though I’m not ruling out the possibility of doing something completely different 😅). I’m not really sure how best to describe it, but simply put, you have the power to manipulate objects with your hands, and you use it to protect yourself and fight against a villain, whom you must also chase down (a bit of a reference to my midterm).

Design Description

Arduino:
Inputs:
  • Pressure sensor (inside of a stress ball), for detecting when the player closes their hand.
  • Capacitive touch sensor strip, as a control for firing.
  • Some game state info (from p5)
Outputs:
  • Neopixel (addressable LED strip)
  • Send values of pressure sensor and touch sensor to p5.
Circuit View:

Note: I used a flex sensor in place of the touch sensor as that wasn’t available right now.
Schematic:

Note: I used a flex sensor in place of the touch sensor as that wasn’t available right now.
p5:
Inputs:
  • MoveNet pose detection, to get the angle of the player’s hand in relation to the screen (see the sketch at the end).
  • Pressure sensor and touch sensor values from Arduino.
Outputs:
  • The game on screen.
  • Some game state info to the Arduino.
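
As a small worked example of the pose-detection piece, the angle of the forearm (and hence, roughly, of the hand) can be computed from two pose keypoints with atan2. The helper below assumes the pose results expose a keypoints array with named parts, as MoveNet-based models typically do; the part names and the surrounding detection setup are assumptions.

// Returns the forearm angle in screen space: 0 = pointing right, PI/2 = pointing down
function handAngle(pose) {
  let wrist = pose.keypoints.find(k => k.name === 'right_wrist');
  let elbow = pose.keypoints.find(k => k.name === 'right_elbow');
  if (!wrist || !elbow) return null;
  return atan2(wrist.y - elbow.y, wrist.x - elbow.x);
}

// The game can then combine this angle with the two sensor readings from the Arduino:
// squeeze (pressure sensor) to grab an object, tap the touch strip to fire it along handAngle().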