Week 13: Final Project User Testing

User Testing Experience
I conducted a user test with my friend, creating a scenario where he had to interact with the clock from scratch. I completely reset the device and watched how he would handle the initial setup and configuration process.

Clock Setup

Clock interaction

What I Observed
The first hurdle came with the connection process. Since the clock needs to connect to WiFi and sync with the time server, this part proved to be a bit challenging for a first-time user. My friend struggled initially to understand that he needed to connect to the clock’s temporary WiFi network before accessing the configuration interface. This made me realize that this step needs better explanation or perhaps a simpler connection method.
However, once past the connection stage, things went much smoother. The configuration interface seemed intuitive enough – he quickly figured out how to adjust the display settings and customize the clock face. The color selection and brightness controls were particularly straightforward, and he seemed to enjoy experimenting with different combinations.

Key Insights
The most interesting part was watching how he interacted with the hexagonal display. The unusual arrangement of the ping pong balls actually made him more curious about how the numbers would appear. He mentioned that comparing different fonts was satisfying, especially with the lighting background effects.

Areas for Improvement
The initial WiFi setup process definitely needs work. I’m thinking about adding a simple QR code on the device that leads directly to the configuration page, or perhaps creating a more streamlined connection process. Also, some basic instructions printed on the device itself might help first-time users understand the setup process better.
The good news is that once configured, the clock worked exactly as intended, and my friend found the interface for making adjustments quite user-friendly. This test helped me identify where I need to focus my efforts to make the project more accessible to new users while keeping the features that already work well.

Moving Forward
This testing session was incredibly valuable. It showed me that while the core functionality of my project works well, the initial user experience needs some refinement. Going forward, I will focus particularly on making the setup process more intuitive for first-time users.

Week 10 Reading Response

The Future of Interaction

Reading this article really made me rethink how I interact with technology. I’ve always been fascinated by touchscreens and their simplicity, but I never stopped to consider how limiting they actually are. The author’s critique of “Pictures Under Glass” really hit me, especially when they described how flat and numb these interfaces feel. It’s true—I use my phone every day, but when I think about it, the experience of swiping and tapping feels so disconnected compared to how I interact with physical objects.

One part that really stood out to me was the comparison to tying shoelaces. It reminded me of when I was little and struggling to learn how to tie mine. My hands learned by feeling, adjusting, and figuring things out without needing to rely on my eyes all the time. That’s such a natural way for us to interact with the world, and it’s crazy to think how little that’s reflected in our technology today.

The section about making a sandwich was also a moment of realization for me. It’s such a simple, everyday task, but it involves so much coordination and subtle feedback from your hands—how the bread feels, the weight of the knife, the texture of the ingredients. None of that exists when I swipe through apps or scroll on a website. It made me wonder: why do we settle for technology that ignores so much of what our hands can do?

This article really inspired me to think differently about the future of technology. I agree with the author that we need to aim higher—to create interfaces that match the richness of our human abilities. Our hands are capable of so much more than sliding on glass, and it’s exciting to imagine what might be possible if we started designing for that.

Responses: A Brief Rant on the Future of Interaction Design

I found this follow-up just as thought-provoking as the original rant. The author’s unapologetic tone and refusal to offer a neatly packaged solution make the piece feel refreshingly honest. It’s clear that their main goal is to provoke thought and inspire research, not to dictate a specific path forward. I really appreciated the comparison to early Kodak cameras—it’s a great reminder that revolutionary tools can still be stepping stones, not destinations.

The critique of voice and gesture-based interfaces resonated with me too. I hadn’t really considered how dependent voice commands are on language, or how indirect and disconnected waving hands in the air can feel. The section on brain interfaces was particularly interesting. I’ve always thought of brain-computer connections as a futuristic dream, but the author flipped that idea on its head. Instead of bypassing our bodies, why not design technology that embraces them? The image of a future where we’re immobile, relying entirely on computers, was unsettling but eye-opening.

I love how the author frames this whole discussion as a choice. It feels empowering, like we’re all part of shaping what’s next. It’s made me more curious about haptics and dynamic materials—fields I didn’t even know existed before reading this. I’m left thinking about how we can create tools that actually respect the complexity and richness of human interaction.

 

Week 10 Project: Echoes of Light

Concept
Our project, “Light and Distance Harmony,” emerged from a shared interest in using technology to create expressive, interactive experiences. Inspired by the way sound changes with distance, we aimed to build a musical instrument that would react naturally to light and proximity. By combining a photoresistor and distance sensor, we crafted an instrument that lets users shape sound through simple gestures, turning basic interactions into an engaging sound experience. This project was not only a creative exploration but also a chance for us to refine our Arduino skills together.

Materials Used
Arduino Uno R3
Photoresistor: Adjusts volume based on light levels.
Ultrasonic Distance Sensor (HC-SR04): Modifies pitch according to distance from an object.
Piezo Buzzer/Speaker: Outputs the sound with controlled pitch and volume.
LED: Provides an adjustable light source for the photoresistor.
Switch: Toggles the LED light on and off.
Resistors: For the photoresistor and LED setup.
Breadboard and Jumper Wires
Code
The code was designed to control volume and pitch through the analog and digital inputs from the photoresistor and ultrasonic sensor. The complete listing below includes the sensor mappings and serial debugging lines for easy tracking.

// Define pins for the components
const int trigPin = 5; // Trigger pin for distance sensor
const int echoPin = 6; // Echo pin for distance sensor
const int speakerPin = 10; // Speaker PWM pin (must be a PWM pin for volume control)
const int ledPin = 2; // LED pin
const int switchPin = 3; // Switch pin
const int photoResistorPin = A0; // Photoresistor analog pin

// Variables for storing sensor values
int photoResistorValue = 0;
long duration;
int distance;

void setup() {
  Serial.begin(9600);               // Initialize serial communication for debugging
  pinMode(trigPin, OUTPUT);         // Set trigger pin as output
  pinMode(echoPin, INPUT);          // Set echo pin as input
  pinMode(speakerPin, OUTPUT);      // Set speaker pin as output (PWM)
  pinMode(ledPin, OUTPUT);          // Set LED pin as output
  pinMode(switchPin, INPUT_PULLUP); // Set switch pin as input with pull-up resistor
}

void loop() {
  // Check if switch is pressed to toggle LED
  if (digitalRead(switchPin) == LOW) {
    digitalWrite(ledPin, HIGH); // Turn LED on
  } else {
    digitalWrite(ledPin, LOW); // Turn LED off
  }

  // Read photoresistor value to adjust volume
  photoResistorValue = analogRead(photoResistorPin);

  // Map photoresistor value to a range for volume control (0-255 for PWM)
  // Higher light level (LED on) -> lower photoresistor reading -> higher volume
  int volume = map(photoResistorValue, 1023, 0, 0, 255); // Adjust mapping for your setup

  // Measure distance using the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  duration = pulseIn(echoPin, HIGH);

  // Calculate distance in cm
  distance = duration * 0.034 / 2;

  // Set frequency based on distance in the range of 2-30 cm
  int frequency = 0;
  if (distance >= 2 && distance <= 30) {
    // Map the valid 2-30 cm range so that closer = higher pitch, farther = lower pitch
    frequency = map(distance, 2, 30, 20000, 2000);
    tone(speakerPin, frequency);
    // Note: tone() and analogWrite() both try to drive the same pin, so the PWM duty
    // cycle interferes with the tone signal in practice; reliable volume control on a
    // piezo would need a different approach (e.g., a digital potentiometer or driver).
    analogWrite(speakerPin, volume); // Apply the volume based on photoresistor reading
  } else {
    noTone(speakerPin); // Silence the speaker if the distance is out of range
  }

  // Debugging output
  Serial.print("Photoresistor: ");
  Serial.print(photoResistorValue);
  Serial.print("\tVolume: ");
  Serial.print(volume);
  Serial.print("\tDistance: ");
  Serial.print(distance);
  Serial.print(" cm\tFrequency: ");
  Serial.println(frequency);

  delay(100); // Short delay for sensor readings
}

 

Video Demonstration
In our video demonstration, we showcase how the instrument responds to changes in light and proximity. We toggle the LED to adjust volume and move a hand closer or farther from the ultrasonic sensor to change pitch, demonstrating the instrument’s sensitivity and interactive potential.

Reflections
What Went Well: The project successfully combines multiple sensors to create a reactive sound device. The integration of volume and pitch control allows for intuitive, responsive sound modulation, achieving our goal of designing an engaging, interactive instrument.

Improvements: To improve this instrument, we would enhance the melody range, creating a more refined and versatile sound experience. This could involve using additional sensors or more sophisticated sound generation methods to provide a broader tonal range and a richer melody.

Week 13: User Testing

Overall Reaction 
My user testing experience went about how I expected it to. The person who participated was already familiar with my concept, so it was a bit challenging to get an authentic reaction, but we still had a productive conversation about what steps I need to take to improve. Although he was aware of my concept, I did not prompt him with any steps on how to run the software. Therefore, I was quite pleased that he was able to navigate it fairly seamlessly, but I am considering altering a few of the prompts just so users have a better idea of what to expect when they choose a location.

Areas for Improvement 

One thing he highlighted in our discussion afterwards was the quality of my mappings. Because what you see on the screen (with brief instructions) matches what you touch on the paper, it is clear that you should press a button to generate that location’s music. Right now, my biggest concern in terms of improvements is finishing adding the switches to the map. This weekend I focused my time on finishing the software, but I now realize how that limited the user experience for testing. To help remedy this, I plan to conduct user testing again this Saturday to get a full critique of the design.

Another feature we discussed adding is a small screen where users can see themselves dancing, to help promote full interaction with the piece. Given that my user already knew the intended goal of my design, he danced for the sake of full participation, but I’m not sure it’s clear to users that I also want them to dance on their own. Thus, in terms of next steps, my immediate goals are: finishing the hardware, adding the video capture component, and creating another tab with specific instructions on how to use the program.
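
As a first pass at the video capture component, a minimal p5.js sketch along these lines could show users a live view of themselves in a corner of the canvas; the camera size and placement here are just placeholders, not final design decisions.

let cam;

function setup() {
  createCanvas(800, 600);
  cam = createCapture(VIDEO);   // webcam feed
  cam.size(160, 120);
  cam.hide();                   // hide the default HTML element; draw it on the canvas instead
}

function draw() {
  background(240);
  // ...the existing map/music interface drawing would go here...
  image(cam, width - 170, height - 130);  // small "mirror" in the bottom-right corner
}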

Testing Video

Week 13: User Testing

Google Drive Video Link 

Through my user testing experience, I realized that I still have a lot of work to do before the IM Show. The person I had for my user testing experience had already played through the game on the web, but when it came to the physical components, there were a couple of mishaps.

One of the key issues was the user interface for moving between game states, from the start menu to the tutorial, to the instructions, to the game, etc. In setting up the serial port, I had the setup key connected to the SPACE bar. When the user tried to proceed to the next game state, they would sometimes instinctively reach for the SPACE bar, press it, and the program would get stuck and become unplayable. That is a critical point I need to address: how the player will interact with the game.
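
One way to avoid this, sketched below, would be to accept SPACE as the serial-setup key only while the program is still on the start menu; gameState, serialActive, and setUpSerial() are assumed names standing in for whatever the sketch actually uses.

function keyPressed() {
  if (key === ' ') {
    // SPACE only opens the serial-connection dialog on the start menu;
    // everywhere else it is ignored, so it cannot freeze the game mid-play.
    if (gameState === 'startMenu' && !serialActive) {
      setUpSerial();   // assumed helper that opens the serial port
    }
    return false;      // suppress the browser's default SPACE behavior (page scroll)
  }
}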

As for the game experience, I had them play with a prototype, so there were no obvious signs of what each of the buttons does besides the small scribbles of “up”, “down”, “left,” and “right.” I hope in the near future to get it cleaned up and make it more presentable, because the controls got a bit confusing as the game went on and more creatures appeared on the screen. However, I am very surprised that the prototype worked as well as it did, especially since it was made out of cardboard, straws, tape, and a dream. Some feedback I received was that the button was less sensitive towards the edges, so that is something I would like to change in the near future.

On the software side, while the game was a success, I would like to make the progression easier and less demanding on the players. Watching others play the game, and having experienced it myself, I felt the game became overwhelming really quickly. I want people to sit and enjoy the game with someone else, so I hope to make things easier and slow down the pacing of the game as well.

Final Project Concept

For my final project, I decided to stick with the idea of controlling music using Arduino sensors. I absolutely love listening to music, so this project is, in a way, an implementation of a personal interest. At some point in my childhood I thought about becoming a DJ, so here it is: my final project will be a set of effects and filters for songs, controlled by a self-made DJ panel that incorporates sensors, switches, and buttons from the Arduino kit.

What will Arduino do?

I will turn the objects and controls from the kit into the DJ toolbox.

1) I will use buttons and switches to control the songs. Pause, play, repeat, etc. Moreover, apart from using the real songs (thinking of doing 4 of them), I am planning to incorporate the controls for sounds that DJs use to enhance the listener’s experience, e.g. to make the transitions between songs smoother.

2) I will use sensors like ultrasonic distance sensors or potentiometers to control the music. A user will be able to apply filters (isolating the bass or the high notes, speeding up or slowing down, switching songs, etc.) using hand motions and movements. I am not sure yet how diverse my DJ toolset will be, because it will depend heavily on the functions available in p5.js; a sketch of what this could look like follows this list.

3) Potentially, if I have time, I will also incorporate the control of music visualization using buttons or switches or sensors.
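
To get a sense of what the p5.js side of this could look like, here is a minimal, hedged sketch using p5.sound: the song file name and the serial-derived variables distVal and potVal (assumed to be parsed readings in the 0-1023 range) are placeholders, not part of the actual project yet.

let song, lowPass;
let distVal = 512, potVal = 512;   // assumed parsed sensor readings from serial (0-1023)

function preload() {
  song = loadSound('assets/track1.mp3');   // placeholder file name
}

function setup() {
  createCanvas(400, 400);
  lowPass = new p5.LowPass();   // p5.sound low-pass filter
  song.disconnect();            // route the song through the filter only
  song.connect(lowPass);
}

function mousePressed() {
  userStartAudio();             // browsers need a user gesture before audio starts
  if (!song.isPlaying()) song.loop();
}

function draw() {
  background(20);
  // Distance sensor -> filter cutoff: a closer hand isolates the bass more
  lowPass.freq(map(distVal, 0, 1023, 200, 8000));
  // Potentiometer -> playback rate: speed the track up or slow it down
  song.rate(map(potVal, 0, 1023, 0.5, 1.5));
}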

What will p5.js do?

Music visualization is the part that I want to implement through p5.js. In my opinion, if I find the correct guidance on how exactly to accomplish that, it will significantly improve the project by adding nice visual effects to it. It is cool to hear the music, but it is even cooler to see it. I want the canvas to change according to the rhythm, tone, and other features of the song that I will be playing, corresponding to the filters that will be put on by the Arduino DJ panel.
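
For the visualization itself, p5.js’s built-in FFT analysis seems like a natural starting point. The sketch below is only a rough illustration; the bin count, colors, and bar layout are arbitrary choices, and it assumes a song is already playing through p5.sound.

let fft;

function setup() {
  createCanvas(600, 300);
  fft = new p5.FFT(0.8, 64);    // smoothing, number of frequency bins
  // fft.setInput(song);        // optionally analyze a specific SoundFile
}

function draw() {
  background(0);
  const spectrum = fft.analyze();        // amplitude (0-255) per frequency bin
  const bass = fft.getEnergy('bass');    // overall low-end energy
  noStroke();
  fill(120 + bass / 2, 80, 200);         // tint shifts with the bass
  for (let i = 0; i < spectrum.length; i++) {
    const x = map(i, 0, spectrum.length, 0, width);
    const h = map(spectrum[i], 0, 255, 0, height);
    rect(x, height - h, width / spectrum.length, h);   // simple bar visualizer
  }
}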


Week 12: Final Project Draft

Concept:

The proposed project creates an interactive hand-gesture-based drawing system that enables users to draw on a digital canvas using hand movements. By combining P5.js, ML5.js Handpose, and Arduino, the project bridges physical gestures with digital creativity. The system uses hand tracking to allow natural interactions and integrates real-time physical feedback through LEDs and a vibration motor, offering an immersive experience.

The primary goal is to deliver a tactile and visual interface where users feel a sense of creation and engagement as their gestures directly influence the drawing and physical environment. The interplay between the digital canvas and physical feedback fosters an innovative and intuitive drawing system.

 

Arduino Program

The Arduino system responds to commands received from P5.js, controlling physical feedback components:

  • NeoPixel LEDs display colors that reflect the brush dynamics, such as the current color selected for drawing. Faster hand movements can increase brightness or trigger lighting effects.
  • Vibration motors provide tactile feedback during specific actions, such as clearing the canvas or activating special effects.

The Arduino continuously listens to P5.js via serial communication, mapping the incoming data (e.g., RGB values, movement speed) to appropriate hardware actions.

P5.js Program

The P5.js program uses ML5.js Handpose to track the user’s hand and fingers. The tracked position (primarily the palm center or fingertip) is mapped to the digital canvas for drawing.

As the user moves their hand, the P5.js canvas renders brush strokes in real-time. Brush properties such as size and color dynamically adjust based on gestures or key inputs. For instance:

  • Moving the palm creates brush strokes at the detected location.
  • Pressing specific keys changes brush size (+/-) or color (C).
  • Gestures like a quick swipe could trigger special visual effects.
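
Putting the tracking and drawing pieces together, a minimal sketch could look like the one below. It assumes the older ml5 0.x Handpose API and uses the index fingertip as the brush position; it is a starting point, not the project’s final implementation.

let video, handpose;
let predictions = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  // Load the Handpose model and store its predictions as they arrive
  handpose = ml5.handpose(video, () => console.log('Handpose ready'));
  handpose.on('predict', results => { predictions = results; });
  background(255);
}

function draw() {
  if (predictions.length > 0) {
    // annotations.indexFinger[3] is the fingertip landmark [x, y, z]
    const [x, y] = predictions[0].annotations.indexFinger[3];
    noStroke();
    fill(20, 80, 200);
    circle(x, y, 12);   // simple brush: a dot at the tracked position
    // (coordinates are in video-pixel space; mirroring/scaling may be needed)
  }
}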

The program communicates with Arduino to send brush-related data such as the selected color and movement intensity. This ensures the physical environment (e.g., LED lighting) mirrors the user’s actions on the canvas.
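
On the communication side, the brush data could be flattened into a simple line of text each frame. In the sketch below, writeSerial() is a stand-in for whatever serial helper the project ends up using, and the comma-separated format is just an assumption.

// Send the current brush state to the Arduino as one CSV line per frame,
// e.g. "255,40,120,87\n" -> red, green, blue, movement speed (0-255).
function sendBrushData(r, g, b, speed) {
  const line = `${r},${g},${b},${Math.round(constrain(speed, 0, 255))}\n`;
  writeSerial(line);   // placeholder serial-write helper
}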

Interaction Flow

  1. The system starts with a webcam feed processed by ML5.js Handpose.
  2. P5.js tracks hand movements and maps positions to the canvas.
  3. Users draw by moving their hand, with brush strokes appearing at the tracked position.
  4. Real-time brush properties, such as color and size, are sent to Arduino.
  5. Arduino reflects these changes through LEDs and tactile feedback:
    • The LEDs light up with the same brush color.
    • Vibration occurs during specific gestures or interactions, providing physical confirmation of actions.
  6. The user experiences synchronized digital and physical interactions, offering a sense of control and creativity.

Goals and Outcomes

The project aims to deliver an interactive tool that provides:

  1. A dynamic and intuitive drawing experience using hand gestures.
  2. Real-time physical feedback through LEDs and vibration motors.
  3. A visually and physically engaging system that bridges the digital and physical worlds.

Current Code:

Week 12: Final Project Progress

Finalised Idea

My finalised idea, from last week’s two options, is the NYUAD Puzzle Adventure.

In p5.js, the game will display a puzzle with shuffled pieces showing different places on the NYU Abu Dhabi campus. The pieces will tell a story as a whole in the game. The entire game will be controlled from the Arduino hardware.

Progress 

Up until this week, I have been working on the p5.js side, designing a possible structure for the puzzle, as seen in the image below:

The hardware design will consist of two potentiometers that will be used to move the puzzle pieces: one potentiometer for horizontal movement and one for vertical movement. I will also have two buttons that will be used whenever the player wants to lift or drop a piece in the game. I will also have hidden motors (potentially vibration motors) to provide players with feedback as the game progresses. The hardware box will also have an opening mechanism to reward players with an NYUAD-inspired present if they correctly solve the puzzle, plus a grand prize for whoever beats the fastest time.
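
Although the hardware is not finished yet, the control mapping described above could look roughly like the p5.js sketch below. The parsed serial values (potX, potY, liftPressed, dropPressed) and the piece methods (contains, moveTo, dropAt) are placeholder names, not existing code.

let potX = 512, potY = 512;       // raw potentiometer readings (0-1023) from serial
let liftPressed = false;          // "lift" button state from serial
let dropPressed = false;          // "drop" button state from serial
let heldPiece = null;

function updatePuzzleControls(pieces) {
  // Two potentiometers -> a cursor position on the canvas
  const cx = map(potX, 0, 1023, 0, width);
  const cy = map(potY, 0, 1023, 0, height);

  if (liftPressed && heldPiece === null) {
    // Lift the piece under the cursor, if there is one
    heldPiece = pieces.find(p => p.contains(cx, cy)) || null;
  } else if (dropPressed && heldPiece !== null) {
    heldPiece.dropAt(cx, cy);     // place the piece at the cursor
    heldPiece = null;
  }

  if (heldPiece !== null) {
    heldPiece.moveTo(cx, cy);     // a held piece follows the cursor
  }
}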

My hardware schematic will be as shown below:

Schematic


Week 12: Final Project Progress

Concept

Given the limited time, the transforming robot presented a greater challenge than expected, necessitating a reevaluation of the narrative and of the concept as a whole.

At its core, the redesigned Project Utopia centers around an elegant yet challenging puzzle box. Each compartment represents a unique layer of interaction, requiring the user to solve riddles or manipulate physical components to unlock the next. Inside the compartments, users find carefully crafted informational cards and audio prompts that guide or enrich the experience.

Also, this redesign was inspired by the constraints of available technology, but it provided an opportunity to reimagine how interactive installations can connect with users through simplicity and creativity.

Features

At its heart, Project Utopia revolves around a puzzle box with multiple compartments. Each compartment contains riddles, cards with hidden clues, and audio prompts powered by p5.js. However, the twist lies in ARIS, the robotic overseer. As users engage with the puzzle box, ARIS intervenes—blocking, guiding, or assisting—depending on the challenge. This integration transforms the project into a blend of physical interaction, robotics, and storytelling, elevating the experience for participants.

The Box

The box is 3D-painted, with compartments each secured by a unique mechanism that requires users to solve a challenge to access them. Clearing the puzzles gives the person access to the prompts and clues inside.

The Bot

The bot moves on two motorized wheels, with a castor wheel at the front for stability. It has one servo-powered hand (made of cardboard) and LCD screens for eyes:

  • LCD Screen Eyes: Display emotions like suspicion, curiosity, or delight.
  • Servo Hand: Blocks premature access to the box.

It is also an expressive robot: major actions from the user prompt expressions and sounds from it.

The Code so far
#include <LiquidCrystal.h>
#include <Servo.h>

// Initialize LCDs
LiquidCrystal lcdLeft(2, 3, 4, 5, 6, 7);  
LiquidCrystal lcdRight(8, 9, 10, 11, 12, 13);

// Initialize the servos
Servo servoLeft;
Servo servoRight;

// Custom character arrays for expressions
byte happyEye[8] = {
  B00000,
  B01010,
  B00000,
  B10001,
  B01110,
  B00000,
  B00000,
  B00000
};

void setup() {
  // Begin LCDs
  lcdLeft.begin(16, 2);
  lcdRight.begin(16, 2);

  // Load custom characters
  lcdLeft.createChar(0, happyEye);
  // lcdLeft.createChar(1, confusedEye);

  lcdRight.createChar(0, happyEye);
  // lcdRight.createChar(1, confusedEye);

  // Start with a happy expression
  showHappyExpression();
}

void loop() {
  // For now, just keep showing the happy expression (alternating expressions to be added later)
  showHappyExpression();
  delay(3000);
}

void showHappyExpression() {
  // Left Eye
  lcdLeft.clear();
  lcdLeft.setCursor(0, 0);
  lcdLeft.write(byte(0)); // Happy Eye

  // Right Eye
  lcdRight.clear();
  lcdRight.setCursor(0, 0);
  lcdRight.write(byte(0)); // Happy Eye
}

Week 12 – Progress On The Final Project

Concept

Setting off from the idea of transforming my midterm project from an offline PVE game into an online PVP game (which has already been achieved as my major progress this week; we’ll come back to this later), the concept of my final project now includes another dimension of human-robot competition: using the PVP mechanism I have now and building a physical robot to operate one of the players in the game.

Of course, technically speaking, it would be mission impossible to develop a robot that could actually play and compete with humans within a week or so. Therefore, the physical installation of the robot would be a puppet that ostensibly controls one of the players, and the actual gaming logic would still be programmed in p5.

Illustration

The bottle is a placeholder for the hand robot to be realized.

Potential Hand Robot

Based on an open-source printable robot hand 3D model, servos could be attached to the fingers’ ends to bend each of them when needed—if parameters are tuned.

p5-Arduino Communication

As the gaming logic would be done within p5, what needs to be sent to the Arduino are the translated movements of the robot hand, specifically of the servos: for example, how long a specific finger needs to press a key, which in turn triggers the movement of the player in the game.
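
For illustration, the message from p5 to the Arduino could be as small as a finger index plus a hold duration. The "F<finger>:<ms>" format and the writeSerial() helper below are assumptions, not the final protocol; the Arduino side would parse the line and drive the corresponding servo.

// Ask the robot hand to "press" a key: which finger to bend, and for how long.
function pressKeyWithFinger(fingerIndex, holdMs) {
  writeSerial(`F${fingerIndex}:${holdMs}\n`);   // e.g. "F2:300" = finger 2 down for 300 ms
}

// Example: tap the key mapped to finger 1 for 250 ms
// pressKeyWithFinger(1, 250);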

Current Progress (Code Available)

So far, building on the midterm, the following functions have been achieved:

  1. Control mode selection – determines if the game is going to be controlled by head movement (for humans) or direction keys (for the robot)
  2. Synchronized online battle – realizes the real-time communication between two (or more) sketches on two (or more) computers by setting up an HTTPS WAN server using Node.js and socket.io. Necessary data from one sketch is shared with the other in real time (with a broadcast interval of 5 ms). Core code of this part of the sketch:
    // sketch.js
    
    // The global broadcast dictionaries used to communicate with the other player
    let globalBroadcastGet = {
      x: 0,
      y: 0,
      rotationX: 0,
      rotationY: 0,
      rotationZ: 0,
      health: 100,
      energy: 100,
      tacticEngineOn: false,
      laserCooldown: 100, // milliseconds
      lastLaserTime: 0,
      colliderRadius: 30, // Example radius for collision detection
      destroyCountdown: 90,
      toDestroy: false,
      laserFired: 0,
      damageState: false,
      readyToPlay: false,
      bkgSelected: 'background1'
    }
    let globalBroadcastSend = {
      x: 0,
      y: 0,
      rotationX: 0,
      rotationY: 0,
      rotationZ: 0,
      health: 100,
      energy: 100,
      tacticEngineOn: false,
      laserCooldown: 100, // milliseconds
      lastLaserTime: 0,
      colliderRadius: 30, // Example radius for collision detection
      destroyCountdown: 90,
      toDestroy: false,
      laserFired: 0,
      damageState: false,
      readyToPlay: false,
      bkgSelected: 'background1'
    }
    
    // Interval for sending broadcasts (in milliseconds)
    const BROADCAST_INTERVAL = 5; // broadcast every 5 ms
    
    // Setup function initializes game components after assets are loaded
    function setup() {
      ...
    
      // Initialize Socket.io
      socket = io();
    
      // Handle connection events
      socket.on('connect', () => {
        console.log('Connected to server');
      });
    
      socket.on('disconnect', () => {
        console.log('Disconnected from server');
      });
    
      // Reconnection attempts
      socket.on('reconnect_attempt', () => {
        console.log('Attempting to reconnect');
      });
    
      socket.on('reconnect', (attemptNumber) => {
        console.log('Reconnected after', attemptNumber, 'attempts');
      });
    
      socket.on('reconnect_error', (error) => {
        console.error('Reconnection error:', error);
      });
    
      // Listen for broadcast messages from other clients
      socket.on('broadcast', (data) => {
        // console.log('Received broadcast');
        try {
          // Ensure the received data is a valid object
          if (typeof data === 'object' && data !== null) {
            globalBroadcastGet = data; // Replace the entire BroadcastGet dictionary
          } else {
            console.warn('Received data is not a valid dictionary:', data);
          }
        } catch (error) {
          console.error('Error processing received data:', error);
        }
      });
    
      // Set up the periodic sending
      setInterval(sendBroadcast, BROADCAST_INTERVAL);
    
      ...
    }
    
    // Function to send the BroadcastSend dictionary
    function sendBroadcast() {
    
      // Update BroadcastSend dictionary
      let BroadcastSend = globalBroadcastSend;
    
      // Send the entire dictionary to the server to broadcast to other clients
      socket.emit('broadcast', BroadcastSend);
      // console.log('Sent broadcast:', BroadcastSend);
    }
    // server.js
    
    /* Install socket.io and config server
    npm init -y
    npm install express socket.io
    node server.js
    */
    
    /* Install mkcert and generate CERT for https
    choco install mkcert
    mkcert -install
    mkcert <your_local_IP> localhost 127.0.0.1 ::1
    mv 192.168.1.10+2.pem server.pem
    mv 192.168.1.10+2-key.pem server-key.pem
    mkdir certs
    mv server.pem certs/
    mv server-key.pem certs/
    */
    
    const express = require('express');
    const https = require('https');
    const socketIo = require('socket.io');
    const path = require('path');
    const fs = require('fs'); // Required for reading directory contents
    
    const app = express();
    
    // Path to SSL certificates
    const sslOptions = {
      key: fs.readFileSync(path.join(__dirname, 'certs', 'server-key.pem')),
      cert: fs.readFileSync(path.join(__dirname, 'certs', 'server.pem')),
    };
    
    // Create HTTPS server
    const httpsServer = https.createServer(sslOptions, app);
    // Initialize Socket.io
    const io = socketIo(httpsServer);
    
    // Serve static files from the 'public' directory
    app.use(express.static('public'));
    
    // Handle client connections
    io.on('connection', (socket) => {
      console.log(`New client connected: ${socket.id}`);
    
      // Listen for broadcast messages from clients
      socket.on('broadcast', (data) => {
        // console.log(`Broadcast from ${socket.id}:`, data);
        // Emit the data to all other connected clients
        socket.broadcast.emit('broadcast', data);
      });
    
      // Handle client disconnections
      socket.on('disconnect', () => {
        console.log(`Client disconnected: ${socket.id}`);
      });
    });
    
    // Start HTTPS server
    const PORT = 3000; // Use desired port
    httpsServer.listen(PORT, () => {
      console.log(`HTTPS Server listening on port ${PORT}`);
    });

    Other changes in classes to replace local parameter passing with externally synchronized data, such as:

    // EnemyShip.js
    
    class EnemyShip {
      ...
    
      update() {
        this.toDestroy = globalBroadcastGet.toDestroy;
    
        this.health = globalBroadcastGet.health;
        this.energy = globalBroadcastGet.energy;
    
        if (this.toDestroy === false) {
          this.x = globalBroadcastGet.x;
          this.y = globalBroadcastGet.y;
          
          // Update rotation based on head movement
          this.rotationX = globalBroadcastGet.rotationX;
          this.rotationY = globalBroadcastGet.rotationY;
          this.rotationZ = globalBroadcastGet.rotationZ;
          
          if (globalBroadcastGet.tacticEngineOn === true && this.tacticEngineUsed === false) {
            this.tacticEngine()
          }
    
          // Tactic engine reset
          if (this.tacticEngineOn === true) {
            let currentTime = millis();
            if (this.model === assets.models.playerShip1) {
              this.health = 100;
            } else {
              this.energy = 100;
            }
            if (currentTime - this.tacticEngineStart > 15000) {
              this.tacticEngineOn = false;
              if (this.model === assets.models.playerShip1) {
                this.health = 100;
              } else {
                this.energy = 100;
              }
            }
          }
        }
      }
    
      ...
    
    }
  3. Peripheral improvements to smooth out the PVP experience, including recording the winner/loser, synchronizing game configuration, etc.