Week 13: User Testing – NYUAD Puzzle Adventure

My game involves solving puzzles of NYU Abu Dhabi images.
The game involves some nice images of the NYU Abu Dhabi campus, which are initially split into square pieces. I decided to go with square pieces instead of a jigsaw pattern because it was easier to combine, move, and manipulate square pieces than any other shape.
I invited a friend of mine to test the game and the video can be seen below:

My schematic is shown below:

I have four buttons to control game states and actions and the joystick to move image pieces. The speakers are attached to the controller to provide feedback when the pieces move.

Reflection

1. Simplifying Instructions: I realized that users will skip long instructional text. To address this, I need to make my design as intuitive and self-explanatory as possible.

2. Sound Feedback: The sound feedback when pieces attach is effective in making players aware of their actions and creating a sense of progress. This helps avoid confusion or mistaken attachments.

3. Adjusting Final Sound: The final pop-up sound when an image is correctly solved was too loud and needs to be adjusted for a better user experience.

4. Speed of Image Pieces: The speed of the image pieces was too slow, which could make the gameplay feel boring. Increasing the speed would make the experience more engaging.

5. Magnetic Precision: To improve ease of use, I need to reduce the precision required for pieces to attach. Allowing pieces to magnetically snap together would make it easier for players to combine and move them (a rough sketch of this idea follows the list).

6. Background Music: Given that my user testing was quiet, and my volunteer was shocked by the celebration sound, I would like to add background music to make the game more interesting.
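
On the magnetic snapping idea from point 5, here is a minimal p5.js sketch of the logic I have in mind (the snap radius of 20 pixels is just a guess to be tuned):

function trySnap(piece) {
  // If a moved piece lands close enough to its correct spot,
  // pull it exactly into place.
  let d = dist(piece.x, piece.y, piece.targetX, piece.targetY);
  if (d < 20) { // snap radius in pixels (tunable)
    piece.x = piece.targetX;
    piece.y = piece.targetY;
    return true; // caller can play the attach sound here
  }
  return false;
}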

Final Project – Blue Lovable Utilitarian Reading Buddy (B.L.U.R.B.)

p5.js Sketch   |   GitHub URL   |   Schematic Diagram

Inspiration

For all my assignments so far, I’ve tried to do something that’s personal and meaningful to me in some way, and this project is no different. My favorite hobby, without a doubt, is reading—it’s something I’ve loved for as long as I can remember. So, when I started brainstorming for this project, I decided I wanted to create something that revolves around the joy of books and the experience of reading as a whole.

Now, of course I’ve always thought there was something magical about getting lost in a story and exploring new worlds through words. But as much as I love reading, it can sometimes feel like a solitary experience, especially when I come across a thought or idea I want to share but no one’s around to talk to. That’s what inspired me to create a book buddy robot—something that could bring a sense of companionship to reading, making it feel a little less lonely while still keeping the focus on the stories I love so much.

Concept

The concept behind my book buddy robot, B.L.U.R.B., is to create a small, interactive companion that enhances your reading experience. Made using p5.js and Arduino, B.L.U.R.B. is designed to be there with you as you dive into a good book, offering a few fun features along the way.

  • Flip Pages (Arduino): Using Arduino, B.L.U.R.B. has the ability to automatically flip the pages of your book. This page-flipping function helps keep the flow going without having to pause and turn the page yourself. It gives the feeling of having a personal assistant to handle the mundane task, allowing you to stay immersed in your story.
  • Play Ambient Music (p5.js): B.L.U.R.B. uses p5.js to play soft, ambient music tailored to enhance your reading experience. Whether it’s a gentle instrumental tune or nature sounds, the music helps set the atmosphere, making it easier to focus and get lost in your book. You can choose different genres or moods depending on the type of book you’re reading, helping you stay engaged and relaxed.
  • Track Reading Time (p5.js): With p5.js, B.L.U.R.B. can track how much time you’ve spent reading, giving you a sense of accomplishment as you progress through your book. It can display a timer on the screen, helping you set reading goals or simply monitor how long you’ve been immersed in the story. It’s a subtle way to keep track of time without taking away from the reading experience.
  • Give Break Suggestions (p5.js): Sometimes, reading for too long can leave you feeling drained. When you take a break, B.L.U.R.B. suggests helpful activities that could refresh your mind. These suggestions might include stretching, taking a short walk, drinking water, or even doodling on a piece of paper. It’s a fun way to make sure you’re taking care of yourself while you dive into your books.
  • Display Messages (Arduino): B.L.U.R.B. features an Arduino-powered LCD screen that displays small, personalized messages. These include a greeting from B.L.U.R.B. and an indication of whether you’re currently reading or on break. The idea is to add a little more interaction to the reading process, keeping you engaged and providing a sense of companionship while reading.
  • Move Head (Arduino): B.L.U.R.B.’s head movement is controlled by Arduino, reacting to your actions. Whenever you switch between reading and break modes, B.L.U.R.B. moves his head from side to side. It’s a fun way to make him feel more like a real companion that’s in tune with your reading habits.
  • Light Up Reading Lamp (Arduino): B.L.U.R.B. has a small reading lamp that can be activated by tapping its foot. This feature adds a practical element to your reading experience. When the lights are dim or you need a little extra light, a simple tap on B.L.U.R.B.’s foot will turn on the lamp, giving you the perfect amount of illumination to read comfortably.

Implementation

B.L.U.R.B.’s Design

B.L.U.R.B.’s design started with a cute 3D model found on Thingiverse. However, 3D printing issues led to the need to split the model into multiple parts, which were then glued together later. After printing and assembling the pieces, I spray-painted all the components in a soft pastel blue, matching B.L.U.R.B.’s calming design.

This is a video I took after the initial 3D printing stages, while trying to make his head move from side to side:

On the p5 side, B.L.U.R.B. is seen sitting on a cozy sofa in a softly lit library room. The setting is designed to create a comfortable atmosphere, enhancing the user’s reading experience. B.L.U.R.B. interacts with the user through animated speech bubbles, where the text appears in a typewriter effect, adding a dynamic element to the conversation. This combination of design and animation gives B.L.U.R.B. a friendly, approachable personality while keeping the focus on a relaxing, cozy environment.

Flipping Pages

To figure out how to flip pages mechanically, I watched a few videos online for inspiration. My initial idea was to use two servo motors and a temporary adhesive, like Blu Tack, that doesn’t dry out too easily. I built several prototypes and tested different adhesives, but none worked perfectly. Strong adhesives stuck too firmly to the paper and couldn’t unstick, while weaker ones didn’t have enough grip to flip a page.

The adhesive would be stuck to the lower end of the shorter stick of the prototype below:

Eventually, I decided to try a different approach. This involved using a combination of a wheel and a stick attached to a servo motor. The wheel made contact with the page to be flipped, rotating clockwise for some time to lift the page. Then, the stick would get into the gap that was created by this page lift and swing 180 degrees to complete the flip, before returning to its original position. To ensure the next page lay flat, the wheel briefly rotated counterclockwise to smooth it down.

The wheels available in the lab didn’t work well for this setup, so I had a custom wheel 3D printed. This improved the mechanism somewhat, but it still wasn’t perfect. The main issue was that the wheel’s motion would often lift multiple pages at once, leaving the system without enough power to flip them all successfully.

It took a lot of iterations to figure this out, but the mechanism worked much better in the end with a small change: in the initial few seconds of the wheel running (when it is likely to have slightly lifted the first page), the stick moves slightly inward to fill this gap. The wheel then runs a little more, and the servo motion happens as before. This ensures that the stick (more likely than not) gets into the right spot, under just one page, and the servo motion flips the page above it.

Playing Ambient Music

For this feature, integrated into the p5 part of the project, I added three different sound files: Lofi, Nature, and Piano. These give the user the option to choose between them whenever they want to listen to ambient music while reading. The user can easily switch between the sounds or stop the music entirely whenever they wish, giving them full control over their listening experience. These sounds were specifically chosen because they are calming and help create a focused atmosphere without being distracting.
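
As a rough sketch of how this switching could be handled with the p5.sound library (the file names here are placeholders, not my actual assets):

let sounds = {};
let current = null;

function preload() {
  // Placeholder file names for the three ambient tracks
  sounds.lofi = loadSound('lofi.mp3');
  sounds.nature = loadSound('nature.mp3');
  sounds.piano = loadSound('piano.mp3');
}

function playAmbient(name) {
  if (current) current.stop(); // stop whatever is currently playing
  current = sounds[name];
  current.loop(); // loop the chosen track
}

function stopAmbient() {
  if (current) current.stop();
  current = null;
}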

Tracking Reading Time

To help users monitor how long they spend reading, I added a reading timer to the project. The timer starts automatically when the user clicks the “Begin Reading” button in p5. If the user decides to take a break, the timer pauses and saves the current reading time. Once the user clicks to continue reading, the timer resumes from where it left off.

This feature is designed to work in the background, ensuring the user can focus on their reading without worrying about keeping track of time manually. It provides a simple and efficient way to measure reading sessions, making it easy for users to reflect on their habits or set reading goals.
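
Under the hood, this kind of pause/resume timer boils down to accumulating elapsed time with millis(). A minimal sketch of the idea (simplified, not my exact code):

let savedTime = 0; // milliseconds accumulated before the last break
let readingStart = 0; // when the current reading stretch began
let isReading = false;

function beginReading() { // called by the "Begin Reading" button
  readingStart = millis();
  isReading = true;
}

function takeBreak() { // pause: bank the elapsed time so far
  savedTime += millis() - readingStart;
  isReading = false;
}

function totalReadingTime() { // total reading time in milliseconds
  return isReading ? savedTime + (millis() - readingStart) : savedTime;
}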

Giving Break Suggestions

Whenever the user takes a break, the robot, B.L.U.R.B., adds a personal touch by saying, “Enjoy your break! I suggest you [insert break suggestion] :)” (all of these messages show up in p5). The break suggestion is randomly selected from a list of about 40 healthy activities stored in a CSV file. These activities are thoughtfully curated to encourage wellness and productivity during breaks.

The suggestions include ideas like:

  • Doing a quick exercise routine
  • Grabbing a healthy snack
  • Tidying up your desk
  • Meditating for a few minutes
  • Sketching or doodling
  • Taking a short walk

This feature not only adds personality to B.L.U.R.B. but also helps users make the most of their breaks by promoting healthy habits and keeping their minds refreshed.
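
A minimal sketch of the CSV lookup in p5.js (assuming a one-column file; 'suggestions.csv' is a placeholder name):

let suggestions;

function preload() {
  // Placeholder name for the ~40-activity file
  suggestions = loadTable('suggestions.csv', 'csv');
}

function randomSuggestion() {
  let row = floor(random(suggestions.getRowCount()));
  return suggestions.getString(row, 0); // first (and only) column
}

// Used as: "Enjoy your break! I suggest you " + randomSuggestion() + " :)"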

Displaying Messages

On the p5 side, B.L.U.R.B. introduces himself to the user and provides a brief explanation of the features he offers and how he can assist with reading.

On the Arduino side, an LCD screen is attached to B.L.U.R.B.’s neck to display messages and add personality. Initially, the screen shows a cheerful “Hellooo!” to greet the user.

  • When the user begins reading or resumes after a break, the screen updates to display “Reading Time” along with a beating heart animation. The beating heart adds a sense of life to the robot and is also in tune with the passing seconds on the reading timer.
  • When the user takes a break, the screen switches to display a friendly “Break Time :)” message, creating a fun and relaxing vibe during the pause.

Moving Head

To make the experience more interactive, B.L.U.R.B. moves his head from side to side whenever the user switches between reading and break modes. This movement is controlled by a servo motor attached to B.L.U.R.B.’s head, adding a playful and lifelike element to the robot.

Light Up Reading Lamp

To enhance the reading experience, a blue NeoPixel ring is used as a reading lamp, positioned near the book to provide sufficient lighting. The color of the light matches B.L.U.R.B.’s blue design, creating a cohesive look. The lamp is controlled through a simple and interactive mechanism: a piezo pressure sensor placed on B.L.U.R.B.’s foot, which is marked with a lamp sticker. Tapping the foot toggles the light on or off. This setup makes it easy for the user to control the lamp without interrupting their reading. The soft blue light from the NeoPixel ring adds a calming and functional touch to the environment.

Initial prototype:

Some Work In Progress Pictures


Overall Setup

The overall setup was designed with a focus on functionality and aesthetic appeal. A wooden plank of the right size was covered with brown paper, creating a smooth surface to lay the book and house the flipping mechanism. The mechanism includes a wheel with a motor attached to a block of wood, which is fixed to the plank, along with a servo mounted on the bottom edge to control the page flipping.

To keep the wiring organized, I used another box, also covered in brown paper, to house all the electronics. I carefully cut slits in the box to allow the wiring to extend outside when necessary. To add a touch of personality, I printed out cute book-related quotes that resemble stickers, cut them out, and stuck them to the box. The robot, B.L.U.R.B., was placed on top of the box, completing the setup and making it visually appealing while still functional. This design not only keeps everything neatly organized but also enhances the overall aesthetic of the reading space.

Final Setup in Progress

User Testing

This project was displayed in the Fall 2024 end-of-semester Interactive Media Showcase at NYU Abu Dhabi. Here are some photos/videos of interaction from the showcase:

Potential Improvements

There are a few areas where I could improve the project to make it even more engaging. One idea is to increase interaction with B.L.U.R.B. by having him move his head randomly to keep the user engaged. Additionally, if the screen in p5 is left idle for a while, I could make everything except the timer disappear, creating a more minimalistic and focused reading environment. Lastly, positioning the reading lamp above the book would provide better coverage of light across the pages, improving the overall reading experience. These changes would enhance both the functionality and the user experience, making the project even more dynamic and enjoyable :))

Reflections

I’m really proud and happy with how this project turned out! When I first decided on the idea, I had no clue whether it would be possible to bring everything together and make it all work. But I was determined to give it my best and worked on it every day, tackling one challenge at a time. The most difficult part, of course, was the page-flipping mechanism—I spent several days experimenting and testing different approaches before I finally got it to work.

In the end, seeing all the different components come together felt incredibly rewarding. Presenting it at the showcase was a really proud moment for me. Watching people interact with B.L.U.R.B. and enjoy the experience I had created made all the effort worth it. I’m happy with how the project reflects my passion for reading and combines functionality with a cute, thoughtful design. It’s been a journey full of challenges, but the result really does feel like an accomplishment :))

Week 13: Final Project – User Testing

Hey everyone! 👋

As we’re finishing up our final projects, we also have to conduct user testing to gain valuable feedback and identify potential issues before the showcase.

Here’s one of the short user testing sessions:

 

As you can see, I did not give any prompt or instructions, and they were able to figure it out nearly instantly (that this is a music synthesizing glove), which is a good sign. They were easily able to play around with it and make music, and understood that bending the fingers was controlling the sound.

However, one piece of feedback I got was that while it was obvious which finger controlled which bar, it wasn’t clear what they were mapped to. I had initially expected that the answer would reveal itself once the glove was played with for a while (especially with the helpful addressable LEDs providing hints and feedback on their actions), but clearly this wasn’t the case. Most people were able to somewhat guess the action of the first finger (controlling the pitch – actually the base frequency), but weren’t able to tell more than that. To be honest, even I wouldn’t have been able to immediately tell the action of the last two fingers. So, taking this into account, I added little labels at the bottom of the bars, indicating what each finger controlled.

For those curious enough, I also explained how the sound was actually being synthesized. While this does mean there was something to explain, it was more the theory behind the scenes, so it isn’t vitally important to know. In fact, I think people should be able to play around and enjoy the sound without knowing the actual technical details behind it, which was the case.

Other than that, I’ll polish up the interface, and although I would also like to improve the music generation, I doubt I’ll be able to change it much between now and the showcase.

Final Project – Sense of Music

Concept

It took me some time to come up with the idea for my final project. I could not decide for sure because, while most of my peers were either expanding their midterm project or creating a game that would use Arduino as a user-control input, I wanted to create something that matched my interests and would be a more unique development experience. Thanks to the assignment where we needed to find a way to control music with an Arduino sensor, and my love for music, I decided to do something associated with music, particularly enhancing its experience.

Since childhood, I have always liked to play with music controls. At some point, I even wanted to become a DJ. While that dream never came true, a couple of weeks ago I realized that I could use Arduino to create something similar to a DJ panel (a simplified version of it). Moreover, I could make images on the computer interact with the music and visualize it. After thorough research, I found out that there is a built-in p5.js library that allows you to work with sound. This is exactly what I needed.

My project now consists of a physical part – a box with Arduino-powered controls and sensors that allow a user to interact with the music and with p5.js – and a digital visualization that is based both on the user’s actions and on the music’s characteristics (which the user can also change to a certain extent).

Project Demonstration

Implementation

Interaction Design:

A user interacts with the Arduino through a box that has a number of input controls.
– 3 buttons on the right side turn on 3 different songs
– 3 buttons on the top control the style and effects of the music visualization on the digital screen
– The red button in the top left corner is a play/pause button
– White and black buttons on the left increase and decrease the pitch/speed of the music
– The potentiometer controls the volume
– The ultrasonic distance sensor controls the High Pass and Low Pass filters (a rough sketch of this mapping follows the list)
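
As referenced in the last item above, a minimal p5.sound sketch of how a distance reading could drive a filter (the distance and frequency ranges are assumptions, not my exact values):

let filter;

function setupFilter(song) {
  filter = new p5.Filter('lowpass'); // or 'highpass'
  song.disconnect(); // route the song through the filter
  song.connect(filter);
}

// Called with each distance (in cm) received over serial
function applyFilter(distanceCm) {
  // Map distance to cutoff frequency; the last argument clamps the range
  let cutoff = map(distanceCm, 0, 50, 200, 10000, true);
  filter.freq(cutoff);
}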

The input from Arduino through Serial Communication influences the music as well as the image that is generated inside p5.js.

Code:

p5.js: Using p5.FFT, the music’s characteristics are converted into the image. My code analyzes the music, e.g. its frequencies, and creates a dynamic visualization. There are 3 main music visualizers that I created – Circle (set by default), Sunburst (can be turned on/off using the button), and Particles (set by default, can be switched off). All of those patterns are based on the same idea of synchronizing with the music’s amplitude. The fft.getEnergy() function measures the energy in the bass frequency range. The value is then passed on and used to control the waveform of the Circle, the length of the Sunburst’s lines, and the speed of the Particles.
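
To illustrate the idea, a stripped-down sketch of this analysis loop (the variable names are illustrative, not my actual code):

let song, fft;

function setup() {
  createCanvas(400, 400);
  fft = new p5.FFT();
  song.loop(); // song loaded in preload()
}

function draw() {
  background(0);
  fft.analyze(); // must run each frame before getEnergy()
  let bass = fft.getEnergy('bass'); // 0-255 energy in the bass band
  let r = map(bass, 0, 255, 50, 180); // map energy to circle radius
  noFill();
  stroke(255);
  circle(width / 2, height / 2, r * 2);
}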

Additionally, I leveraged the knowledge from previous projects to equip Sense of Music with features like a fullscreen toggle and a Rotation effect that can be activated by pressing the corresponding button. Moreover, as a bonus, I decided to incorporate one of my favorite projects of this semester into Sense of Music. The Lines visual effect, although it is not influenced by music, can be activated using one of the buttons as a substitute for Particles. Personally, I enjoy combining the Lines with the Rotation effect.

Arduino: The code for Arduino is quite straightforward, as are the connections on the board (see the schematic below). I am really glad I learned how to use and connect potentiometers and ultrasonic distance sensors in class, so I had no problem intuitively using them for my project.

Link to the p5.js code, with the Arduino code as a comment at the bottom:
https://editor.p5js.org/rbashokov/sketches/SwAigPMVa

Thoughts and Conclusion

I am extremely happy with the final project that I have created. More than that, I am happy and thankful for the Intro to IM class and Professor Mang for helping and assisting me through the whole semester. I learned a lot in this class. More specifically, I stepped out of the environment I was used to by constantly making myself think outside the box and create something unique, something that can be called interactive art. Other than that, the most difficult part for me was creating the physical box and setting up the buttons and sensors on it. The last time I did something similar was in middle school, so I had literally zero knowledge of how to approach it. However, I managed to figure out a way to create schematics of the box, cut them out using the laser cutter, and connect the wires and buttons using glue. I am still very far from the results I would like to achieve, but for now, I feel like I made progress and reached the goals I set for this class at the beginning of the semester.

Speaking of my final project, Sense of Music was a great idea, and its implementation was good but not perfect. While the code was not a problem, there is still a lot to build on top of what I have now to further enhance the user experience. The same goes for the physical parts – the box and the DJ panel. I could make them look more aesthetic and pleasant by using different styles of buttons, perhaps adding more colors to the box, etc. Unfortunately, time was very limited, especially considering my trip to Shanghai, which took up the whole National Holiday break. Nevertheless, I am satisfied with the results, and as I see it, the fact that there is a lot of room for improvement shows that the idea is actually great.

Thank you!

 

Final Project: NYUAD’s Global Dance Party

Concept

From the design phase to the implementation phase, I am rather satisfied with how close I was able to stay to my original concept. In the design phase, I knew I wanted to create some type of map interaction with different countries and songs for users to dance to. When conceptualizing this project, my mind kept returning to the alphabet/poem rain installation we saw in the first half of the semester. I think it’s the best example we’ve seen of interactive art where there’s a message and a strong interactive component. Although I was not able to produce something of that quality, I am nevertheless pleased with the way in which I combined meaning and interaction in my piece.

As a study-away student, being immersed in NYUAD’s vibrant culture has genuinely changed my perspective of the world. Growing up in New York City, and the US in general, I’ve been taught a very specific definition of what diversity looks like, and being a student here has completely redefined that for me. Thus, as the semester comes to a close, I thought this project would be a perfect way for me to share a glimpse of what I’ve come to appreciate so much.

Design

As I mentioned previously, throughout the design process I had a pretty clear idea of what I wanted to create. There were other features I considered in order to strengthen the interactivity, such as body tracking or a point system, but I settled on a simpler design to ensure maximum functionality. Nevertheless, I took a lot of inspiration from the video game Just Dance and tried to aim for that level of interaction throughout the design and implementation process.

Implementation 

Once I began the ‘build’ process, the difficult part was mainly just getting started. Surprisingly enough, I also had a hard time figuring out which songs/countries would be represented. The hardware, although rather simple, tested a lot of skills in terms of soldering and using tools I’m not super familiar with, so I’m glad I was able to use what we learned in class to my advantage. Furthermore, this is the first time I’ve worked with any kind of hardware components on top of software design, so it took a bit of creativity to consider what materials I wanted to use to make my vision come alive.

Caption: I originally had a ton of extra wires because I thought I wanted to extend the Arduino behind my computer, but after assessing the materials available, I shortened them and added this box to tuck them away. – Phase 2

My prototype consisted of just using the map itself and a few switches to test both the hardware and software. Once that was good to go, my main concern became ‘cleaning up the look’ so that I had a clear, finished project to present. This was rather easy, I just needed to identify areas of concern, and figure out how I could further simplify them.

Prototype video from user testing – no sound

In terms of software implementation, once I decided how I wanted to exchange information between P5 and Arduino, I was able to make significant progress in my code. Basically, when a button on the map is pressed, a flag corresponding to that country is sent from Arduino to P5, which then sets off a series of commands to play its respective video. However, a major roadblock I hit was the 5 MB upload limit for video files in P5. My solution to this problem is fairly simple, but the route I took to get there was very long and complicated (as is most debugging). I eventually was able to implement an array-based system to cycle through clips of each video so that they could all play as one.
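
In essence, each video is split into sub-5 MB clips, and when one clip ends the next starts, so they play as one. A simplified sketch of that idea (the clip names are placeholders):

let clips = [];
let current = 0;

function loadClips() {
  // Placeholder clip names for one country's video, split into parts
  clips = [createVideo('colombia_1.mp4'), createVideo('colombia_2.mp4')];
  for (let c of clips) c.hide();
}

function playClip(i) {
  current = i;
  clips[i].show();
  clips[i].play();
  clips[i].onended(() => { // advance when this clip finishes
    clips[current].hide();
    if (current + 1 < clips.length) playClip(current + 1);
  });
}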

Schematic

Arduino Code

int colombiaButton = 13; // Button connected to pin 13
int indiaButton = 12;
int koreaButton = 11;
int chinaButton = 10;
int italyButton = 9; //black button with white & blue wire
int prButton = 7; //yellow button with green and blue wire 
int saButton = 5; //green button with white and yellow
int nigeriaButton = 2; //red button with orange and green 
int kazakhButton = 3; //blue button with green wires
int fpButton = 4; //black button yellow and white wire  

void setup() {
  // Start serial communication so we can send data
  Serial.begin(9600);

  // Configure the button pin as input with pullup
  pinMode(colombiaButton, INPUT_PULLUP);
  pinMode(indiaButton, INPUT_PULLUP);
  pinMode(koreaButton, INPUT_PULLUP);
  pinMode(chinaButton, INPUT_PULLUP);
  pinMode(italyButton, INPUT_PULLUP);
  pinMode(kazakhButton, INPUT_PULLUP);
  pinMode(prButton, INPUT_PULLUP);
  pinMode(saButton, INPUT_PULLUP);
  pinMode(nigeriaButton, INPUT_PULLUP);
  pinMode(fpButton, INPUT_PULLUP);
}

void loop() {
  // Check if button is pressed
  if (digitalRead(colombiaButton) == LOW) { // Button pressed (active low)
    Serial.println("colombia"); // Send the country name
  } else if (digitalRead(indiaButton) == LOW) { 
    Serial.println("india");
  } else if (digitalRead(koreaButton) == LOW) { 
    Serial.println("korea");
  } else if (digitalRead(chinaButton) == LOW) { 
    Serial.println("china");
  } else if (digitalRead(italyButton) == LOW){
    Serial.println("italy");
  } else if (digitalRead(prButton) == LOW){
    Serial.println("pr");
  } else if (digitalRead(saButton) == LOW){
    Serial.println("sa");
  } else if (digitalRead(nigeriaButton) == LOW){
    Serial.println("nigeria");
  } else if (digitalRead(fpButton) == LOW){
    Serial.println("fp");
  } else if (digitalRead(kazakhButton) == LOW){
    Serial.println("kazakh");
  } else {
    Serial.println(0);
  }
  delay(150); // Debounce delay
}

P5 Code

https://editor.p5js.org/jajones/sketches/SaANLV2D5

Aspects I’m Proud Of

Overall, I am pretty proud of my final product. What I struggled with the most in the midterm was designing realistic plans for my project, which left me feeling overwhelmed throughout the implementation process. This time, I gave myself enough space to explore new ideas while staying focused on my original plan, which I think really helped the overall process.

A more concrete component I am proud of is the solution to the file size upload issue I described earlier. I did a lot of reading on StackOverflow and random P5 forums to help me solve the issue, and so even though it isn’t perfect, I am really happy with how it turned out.

Future Improvement 

In terms of future improvement, I would love to incorporate more ways to encourage people to physically react. Although in my user testing people were active and participating, I feel as though if I weren’t standing there watching, they wouldn’t necessarily feel as inclined to do so. To help alleviate this, I did try to replicate a dance mirror effect when the videos play so users felt incentivized to dance, but I’m sure I could go even further and record them dancing and then have the program send it to them, or add it to some dance database which they might enjoy being a part of.

Progress on Final Project Week 12

Game Title: Maze Craze

Game Image:


Concept Overview: This project is an interactive maze game where players use a joystick connected to an Arduino Uno to navigate a ball through a maze displayed on a p5.js canvas. The game includes challenges like obstacles, a life system (represented by heart icons), and sound effects for feedback. The player wins by reaching a target point in the maze. The game also tracks and displays the fastest completion time. The walls are obstacles: the player loses a life when the ball touches a wall, which increases the game’s difficulty and makes sure players have to be careful while navigating the maze.

 

Arduino Program Design

Inputs:

  • Joystick (HW 504):
    • VRX (X-axis): Controls horizontal movement of the ball.
    • VRY (Y-axis): Controls vertical movement of the ball.
    • SW (Button): Can be used to reset the game.

Serial Communication:

  • Sends joystick X and Y axis values to the p5.js program in the format x,y.

 

Arduino Logic:
Read X and Y values from the joystick.
Send the data via serial to p5.js in the format x,y.

Arduino Code:

void setup() {
  Serial.begin(9600); // Initialize serial communication at 9600 baud rate
}

void loop() {
  int xPos = analogRead(A0); // Joystick X-axis
  int yPos = analogRead(A1); // Joystick Y-axis

  // Map analog readings (0-1023) to a more usable range if needed
  int mappedX = map(xPos, 0, 1023, 0, 1000); // Normalize to 0-1000
  int mappedY = map(yPos, 0, 1023, 0, 1000); // Normalize to 0-1000

  // Send joystick values as CSV (e.g., "500,750")
  Serial.print(mappedX);
  Serial.print(",");
  Serial.println(mappedY);

  delay(50); // Adjust delay for data sending frequency
}

P5.js Logic

  • Game Start: Display a start screen and wait for a mouse click or button press to begin.
  • Joystick Integration: Map joystick X and Y data to control the ball’s position on the canvas.
  • Collision Detection: Check for collisions with obstacles and deduct a life upon collision.
  • Game End: Display a victory or loss message based on the game outcome.

Code for handling serial data:

function handleSerialData() {
  let data = serial.readLine().trim(); // Read and trim data
  if (data.length > 0) {
    let values = data.split(",");
    if (values.length === 2) {
      joystickX = Number(values[0]);
      joystickY = Number(values[1]);
    }
  }
}
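
On the p5.js side, the mapped 0-1000 values are then turned into ball movement. A minimal sketch of one way to do this (the center value, speed range, and ballX/ballY variables are assumptions):

function moveBall() {
  // 500 is the joystick's resting center after mapping to 0-1000
  let dx = map(joystickX, 0, 1000, -5, 5);
  let dy = map(joystickY, 0, 1000, -5, 5);
  // Ignore small drift around the center (deadzone)
  if (abs(dx) > 0.5) ballX += dx;
  if (abs(dy) > 0.5) ballY += dy;
}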

 

 

Week 11 Reading Response

When Design Meets Disability

Reading Graham Pullin’s ‘Design Meets Disability’ made me rethink how we view assistive devices and how much design influences our perception of them. Pullin argues that assistive technology doesn’t have to be just functional—it can also be beautiful, creative, and reflective of individuality. This idea stood out to me because it flips the usual way we think about devices like hearing aids, wheelchairs, or prosthetics. Instead of being tools to hide or blend in, they can be seen as things that people can show off and be proud of, just like any other accessory or piece of technology.

One example Pullin mentions is hearing aids and how they’re often designed to be invisible. I never thought about how strange that is—why do we feel the need to hide something that helps people? What if hearing aids could be stylish, like jewelry, or customized to fit someone’s personality? It’s a simple shift in thinking, but it makes such a big difference. It reminds me of how glasses used to be seen as embarrassing, but now people wear bold frames to express their style. Why can’t assistive devices evolve in the same way? It’s not just about function; it’s about identity and empowerment.

This idea also connects to the bigger issue of how design often caters to an ‘average user’, which leaves a lot of people feeling excluded. Pullin’s focus on inclusive design challenges that by showing how products can be more adaptable and personal. It made me imagine what prosthetic limbs could look like if they were designed with personality in mind—like having different patterns, colors, or even glowing lights. A prosthetic arm could be just as much a fashion statement as a designer handbag or a cool pair of sneakers. This would help break down the stigma around disability by celebrating the creativity and individuality of the people using these devices.

Pullin also makes a really interesting point about beauty. He argues that beauty doesn’t have to mean perfection. Instead, it can come from things that are unique, unexpected, or even imperfect. This reminded me of the Japanese concept of wabi-sabi, which finds beauty in imperfection and the natural flow of life. If we applied that to assistive technology, we could design devices that are not only functional but also artistic and meaningful. For example, a wheelchair could have a sleek, futuristic look, or a prosthetic leg could be designed with intricate patterns that make it stand out in a good way. These designs could change how people think about disability, not as something to pity but as something to appreciate and admire.

In the end, Pullin’s book shows that design is never just about solving problems—it’s about making statements and shaping how people see the world. By bringing creativity into assistive technology, we can create a world that’s not only more inclusive but also more exciting and diverse. Design Meets Disability opened my eyes to how much potential there is in rethinking design and how even small changes can make a huge difference in people’s lives.

Final Project: Pengu Pounce

Pengu Pounce — IT’S OVER!

I am ECSTATIC to say that I am finally done with this project! It has certainly been a dynamic experience.

As I had proposed, I thought of incorporating one of my favorite things ever into my final project – my love for penguins. Therefore, I decided to create a fun little game where a penguin called Pengu has to jump over platforms — inspired by the Doodle Jump game.

A lot has changed since my previous User Proposal, as my idea is now fully fleshed out. In terms of the game itself, the primary objective is for Pengu to hop on different platforms until the timer ends — the player is supposed to last sixty seconds in the game. If the penguin falls off, they lose.

In terms of the physical implementation, this game has four buttons: Restart, Left, Right, and Jump. 

There are several challenges I faced, most of them mainly to do with the game itself rather than the Arduino.

For example, I was struggling with generating the actual platforms for the penguin to jump on. After I added the special ‘disappear’ platforms, it felt like the screen was being overcrowded. In addition, sometimes the penguin would start on a disappear platform and therefore lose the game immediately, so I decided on a set of three normal platforms for the penguin to jump on at the start of the game.

I also struggled with making the platforms disappear once the penguin moved up and making new ones appear. However, a friend taught me about the handy built-in concat and filter functions, as well as the spread operator, which I ended up finding useful and using here.
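
For reference, a simplified version of that platform-recycling idea (the names and numbers are illustrative, not my exact code):

function recyclePlatforms(platforms) {
  // filter: keep only the platforms still on screen
  let visible = platforms.filter(p => p.y < height);
  // build replacements above the top of the screen
  let fresh = [];
  while (visible.length + fresh.length < 8) {
    fresh.push({ x: random(width - 60), y: random(-100, 0) });
  }
  // concat (or equivalently the spread operator) joins the arrays
  return visible.concat(fresh); // same as [...visible, ...fresh]
}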


Here is a link to my complete p5.js sketch: https://editor.p5js.org/zv2029/sketches/otoQ9nLsh

Here is my code for the Arduino IDE that I used: 

 

const int jumpButtonPin = 10; // Button for jump
const int leftButtonPin = 13; // Button for move left
const int rightButtonPin = 4; // Button for move right
const int restartButtonPin = 2; // Button for restart

void setup() {
  pinMode(jumpButtonPin, INPUT_PULLUP);
  pinMode(leftButtonPin, INPUT_PULLUP);
  pinMode(rightButtonPin, INPUT_PULLUP);
  pinMode(restartButtonPin, INPUT_PULLUP); 
  Serial.begin(9600);
}

void loop() {
  // Read button states
  int jumpState = digitalRead(jumpButtonPin);
  int leftState = digitalRead(leftButtonPin);
  int rightState = digitalRead(rightButtonPin);
  int restartState = digitalRead(restartButtonPin);


  if (jumpState == LOW) {
    Serial.println("JUMP");
  } else if (leftState == LOW) {
    Serial.println("LEFT");
  } else if (rightState == LOW) {
    Serial.println("RIGHT");
  } else if (restartState == LOW) {
    Serial.println("RESTART");
  } else {
    Serial.println("IDLE"); 
  }

}

Here is a video of the game being played:

 

Assignment 10: Testing Phase

Title: Testing Phase of “Catch the Horse” – Insights and Improvements

With the development of “Catch the Horse” completed, I conducted user testing to evaluate the intuitiveness, playability, and overall experience of the game. The goal of this phase was to observe how players interacted with the game, identify areas where they struggled, and determine how the gameplay mechanics could be made more intuitive and accessible.

User Testing Overview

During testing, participants were asked to engage with the game without receiving any prior instructions or guidance. The goal was to simulate the experience of a first-time player encountering the game. I recorded their interactions and noted points of confusion, questions they asked, and their overall feedback.

Observations and Findings

  1. What Users Figured Out on Their Own
    • Jump, Bird, and Rock Buttons:
      Players intuitively understood the jump mechanic and the functions of the bird and rock buttons. These actions had immediate visual feedback (e.g., the cowboy jumped, birds appeared, or rocks were thrown), which made the controls feel natural and responsive.
  2. What Needed Explanation
    • Lasso Button (Choice Button):
      Players struggled to understand how the lasso button worked. I had to explain that pressing the lasso button initiated the “Choice Screen” and that they could select between Skill or Luck to catch the horse.

      • Skill vs. Luck:
        The difference between the Skill and Luck options was unclear without explanation. Participants were unsure why they would choose one option over the other.
    • Interaction with Rocks and Birds:
      Although players understood how to use the rock and bird buttons, they were initially confused about how the cowboy was supposed to interact with these obstacles. For example, they weren’t sure if the rocks could be dodged or destroyed, and whether the birds required a specific action to avoid them.

What Worked Well

  • Physical Integration:
    The physical crouching mechanic, detected by the ultrasonic sensor, added an engaging and immersive element to the game. Users enjoyed the novelty of having to physically move to interact with the game.
  • Visual Feedback:
    Immediate visual feedback for the jump, bird, and rock mechanics allowed players to quickly understand these actions without explanation.
  • Game Flow and Balance:
    Cooldowns for the horse’s abilities (mud, fences, and booster) were well-received, as they maintained a fair and balanced gameplay experience.

Lessons Learned

  • Mapping Between Controls and Gameplay:
    Intuitive mapping between controls and gameplay actions is critical. For example, the jump button and crouching were easy to grasp because the controls directly mirrored the in-game actions. However, abstract mechanics like the lasso required additional explanation due to their more complex interactions.
  • The Importance of Feedback:
    Immediate feedback helped players connect their actions to in-game effects. Enhancing feedback for less intuitive mechanics (like the lasso) will likely make the game easier to pick up.
  • Balancing Physical and Digital Gameplay:
    Players found the integration of physical actions (like crouching) and digital gameplay highly engaging. This balance between physical and virtual interaction should remain a cornerstone of the game’s design.

Next Steps

  1. Add an Instructions Page or Tutorial:
    Include a brief tutorial or instructions page at the beginning of the game to explain the mechanics of the lasso button, Skill vs. Luck, and how to interact with obstacles like birds and rocks.
  2. Enhance In-Game Prompts:
    Add dynamic text prompts or animations during gameplay to guide players through challenging mechanics. For example:
    • “Press J to Jump Over Rocks!”
    • “Dodge Birds to Avoid Game Over!”

Demonstration

Final Project – Motion Ship Against Robot Hands

Source Code Available Here

Previously in:

Contents


1. Project Concept

The initial idea for my final was to transform my midterm project from an offline PvE game into an engaging online PvP experience. Building upon the PvP framework, I decided in the first weeks of working on the final to incorporate a physical robot hand that seemingly operates one of the players within the game. This dual-player setup creates a dynamic competition between a human-controlled player and a robot-controlled player, leveraging the newly established online PvP mechanism. As the physical installation is actually an illusion, the project also serves as a thought experiment to observe to what extent users will discover the illusion during the experience.


2. Project Demonstration


3. Implementation Details

Interaction Design

The key components include:

  • Game Logic (p5.js): Manages game states, player actions, and AI behaviors.
  • Robot Hand (Arduino): Translates game commands into physical movements by controlling servos that simulate key presses.
  • Serial Communication: Facilitates real-time data exchange between the p5.js application and the Arduino-controlled robot hand, ensuring synchronized actions.

Physical Installation and Arduino Integration

  1. 3D Printing:
    • Materials: PLA filaments
    • Process:
      Experimental Print

      Separate Print Of Joints and Body
    • Hinges Construction:
      • Materials: 3D-printed molds and hot glue gun.
      • Purpose: Form sturdy and flexible hinges for finger movements.
      • Process: Injected hot glue into the molds and installed the hinges between the joints.
    • Tendon Implementation:
      • Materials: Fishing lines.
      • Purpose: Act as tendons to control finger movements.
      • Process: Attached fishing lines to servos and the tips of the fingers.
    • Servo Control:
      • Components: 6 9g servo motors.
      • Control Mechanism: Driven by serial commands from the Arduino, allowing the robot hand to mimic key presses (`w`, `a`, `s`, `d`, `space`, `x`) by turning to specific angles.
    • Assembly of the Installation:
      • Components: All listed above, plus LEDs, jump wires, and acrylic plates.
      • Process:
        Acrylic assembly

#include <Servo.h>

unsigned long previousMillis = 0; // Store the last time the servos were updated
const long interval = 250; // Interval to wait (250 ms)

// Define servo objects for each finger
Servo indexServo;
Servo middleServo;
Servo ringServo;
Servo pinkyServo;
Servo indexServo2;
Servo ringServo2;

// Define servo pins
const int indexPin = 2;
const int middlePin = 3;
const int ringPin = 4;
const int pinkyPin = 5;
const int indexPin2 = 6;
const int ringPin2 = 7;

// Define LED pins
const int LEDPins[] = {8, 9, 10, 11, 12, 13};
// indexLEDPin, middleLEDPin, ringLEDPin, pinkyLEDPin, indexLEDPin2, ringLEDPin2

// Array to hold servo objects for easy access
Servo servos[6];

// Blink LED while waiting for serial data
const int ledPin = LED_BUILTIN;

// Array to hold default angles
const int fingerDefaultAngles[] = {0, 15, 20, 20, 60, 30};

void setup() {
  // Initialize serial communication
  Serial.begin(9600);
  
  // Attach servos to their respective pins
  servos[0].attach(indexPin);
  servos[1].attach(middlePin);
  servos[2].attach(ringPin);
  servos[3].attach(pinkyPin);
  servos[4].attach(indexPin2);
  servos[5].attach(ringPin2);

  // Set LED pins to output mode
  pinMode(8, OUTPUT);
  pinMode(9, OUTPUT);
  pinMode(10, OUTPUT);
  pinMode(11, OUTPUT);
  pinMode(12, OUTPUT);
  pinMode(13, OUTPUT);
  
  // Initialize all servos to 0 degrees (open position)
  for(int i = 0; i < 6; i++) {
    servos[i].write(0);
    delay(100);
  }
  
  // Initialize LED pin
  pinMode(ledPin, OUTPUT);
  
  // Handshake: Wait for p5.js to send initial data
  while (Serial.available() <= 0) {
    digitalWrite(ledPin, HIGH); // LED on while waiting
    Serial.println("0,0,0,0,0,0"); // Send initial positions
    delay(300);
    digitalWrite(ledPin, LOW);
    delay(50);
  }
}

void loop() {
  // Check if data is available from p5.js
  while (Serial.available()) {
    // digitalWrite(ledPin, HIGH); // LED on while receiving data
    
    // Read the incoming line
    String data = Serial.readStringUntil('\n');
    data.trim(); // Remove any trailing whitespace
    
    // Split the data by commas
    int angles[6];
    int currentIndex = 0;
    int lastComma = -1;
    for(int i = 0; i < data.length(); i++) {
      if(data[i] == ',') {
        angles[currentIndex++] = data.substring(lastComma + 1, i).toInt();
        lastComma = i;
      }
    }
    // Last value after the final comma
    angles[currentIndex] = data.substring(lastComma + 1).toInt();
    
    // Get the current time
    unsigned long currentMillis = millis();

    // Check if the interval has passed
    if (currentMillis - previousMillis >= interval) {
      // Save the last time the servos were updated
      previousMillis = currentMillis;

        // Update servo positions
      for(int i = 0; i < 6; i++) {
        servos[i].write(angles[i]); // Set servo to desired angle
      } 
    }

    for(int i = 0; i < 6; i++) {
      digitalWrite(LEDPins[i], angles[i] != fingerDefaultAngles[i]? HIGH : LOW); // Light the LED accordingly
    }
    // Echo back the angles
    Serial.print(angles[0]);
    for(int i = 1; i < 6; i++) {
      Serial.print(",");
      Serial.print(angles[i]);
    }
    Serial.println();
    // digitalWrite(ledPin, LOW); // Turn off LED after processing
  }
}

p5.js

Project structure (semi-collapsed)

project-root/
├── certs/
├── node_modules/
├── pics/
├── public/
│   ├── gameAssets/
│   ├── src/
│   │   ├── components/
│   │   ├── controllers/
│   │   ├── utilities/
│   │   ├── index.html
│   │   ├── ml5.js
│   │   ├── p5.js
│   │   ├── p5.sound.min.js
│   │   ├── p5.web-server.js
│   │   ├── sketch.js
│   │   └── style.css
├── robot_hand_test/
├── install_dependencies.bat
├── LICENSE.txt
├── package-lock.json
├── package.json
├── README.md
├── run_local_server.bat
└── server.js

Online Game Setup

  1. Node.js and Socket.io:
    • Purpose: Establish real-time, bi-directional communication between clients.
    • Implementation: Set up a local server using Node.js and integrated Socket.io to handle event-based communication for synchronizing game states.
  2. Local Server for Data Communication:
    • Function: Manages user connections, broadcasts game state updates, and ensures consistency across all clients.
  3. Synchronized Game State:
    • Outcome: Ensures that both players have an up-to-date and consistent view of the game, enabling fair and competitive interactions.
// server.js

/* Install socket.io and config server
npm init -y
npm install express socket.io
node server.js
*/

/* Install mkcert and generate CERT for https
choco install mkcert
mkcert -install
mkcert <your_local_IP> localhost 127.0.0.1 ::1
mv <localIP>+2.pem server.pem
mv <localIP>+2-key.pem server-key.pem
mkdir certs
mv server.pem certs/
mv server-key.pem certs/
*/

const express = require('express');
const https = require('https');
const socketIo = require('socket.io');
const path = require('path');
const fs = require('fs'); // Required for reading directory contents

const app = express();

// Path to SSL certificates
const sslOptions = {
  key: fs.readFileSync(path.join(__dirname, 'certs', 'server-key.pem')),
  cert: fs.readFileSync(path.join(__dirname, 'certs', 'server.pem')),
};

// Create HTTPS server
const httpsServer = https.createServer(sslOptions, app);
// Initialize Socket.io
const io = socketIo(httpsServer);

// Serve static files from the 'public' directory
app.use(express.static('public'));

// Handle client connections
io.on('connection', (socket) => {
  console.log(`New client connected: ${socket.id}`);

  // Listen for broadcast messages from clients
  socket.on('broadcast', (data) => {
    // console.log(`Broadcast from ${socket.id}:`, data);
    // Emit the data to all other connected clients
    socket.broadcast.emit('broadcast', data);
  });

  // Handle client disconnections
  socket.on('disconnect', () => {
    console.log(`Client disconnected: ${socket.id}`);
  });
});

// Start HTTPS server
const PORT = 3000; // Use desired port
httpsServer.listen(PORT, () => {
  console.log(`HTTPS Server listening on port ${PORT}`);
});
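
For completeness, the client side mirrors this with emit/on calls. A minimal sketch (assuming the same 'broadcast' event name; the state variables are placeholders):

// Client-side (e.g. in sketch.js); the socket.io client script is served by the server
const socket = io(); // connect back to the serving host

// Send this player's state to everyone else
function sendState(state) {
  socket.emit('broadcast', state);
}

// Apply the remote player's state when it arrives
socket.on('broadcast', (data) => {
  remoteState = data; // placeholder for updating the other player
});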

Computer Player Algorithm

The computer player, controlled by AI within p5.js, employs sophisticated algorithms to simulate human-like behaviors, including:

  • Threat Detection and Evasion:
    • Mechanism: Continuously scans for incoming threats (e.g., enemy lasers, objects) and calculates optimal evasion paths to avoid collisions.
  • Strategic Movement and Firing:
    • Behavior: Moves toward or away from the enemy and fires lasers when within range, balancing offensive and defensive strategies based on current game states.
  • Tactic Engine Activation:
    • Function: Activates special abilities (e.g., infinite health or energy) when certain conditions are met, enhancing strategic depth and competitiveness.
// ComputerPlayer.js

class ComputerPlayer extends Player {
    constructor(model, texture, difficulty = 1, behaviorPriority = 'attack') {
      super(model, texture);
      this.difficulty = difficulty; // Higher values mean smarter AI
      this.behaviorPriority = behaviorPriority; // 'survival' or 'attack'
      this.enemy = game.enemy; 
      this.lastActionTime = millis();
      this.actionCooldown = map(this.difficulty, 1, 10, 500, 50); // in milliseconds
      this.actionQueue = []; // Queue of actions to perform
      this.currentAction = null;
      this.firingRange = 100; // Define firing range threshold
      this.bornTime = millis();
      this.difficultyTime = frameCount;
    }
  
    updateAI() {
      // Set local enemy target
      this.enemy = game.enemy; 

      // Count in frames; 1200 frames = 20 s, to increase AI difficulty over time
      if (frameCount - this.difficultyTime > 1200) {
        this.difficulty++;
        this.difficultyTime = frameCount; // Reset so difficulty rises once per interval
      }

      const currentTime = millis(); // Needed for the action cooldown check
      if (currentTime - this.lastActionTime > this.actionCooldown) {
        console.log(`[AI][${this.behaviorPriority.toUpperCase()}] Deciding next action...`);
        this.decideNextAction();
        this.lastActionTime = currentTime;
      }
  
      // Execute actions from the queue
      this.executeActions();
    }
  
    decideNextAction() {
      // Determine behavior based on priority
      if (this.behaviorPriority === 'survival') {
        this.decideSurvivalActions();
      } else if (this.behaviorPriority === 'attack') {
        this.decideAttackActions();
      } else {
        // Default behavior
        this.decideAttackActions();
      }
    }
  
    decideSurvivalActions() {
      // Abandoned method, will not be used 
      // (unless another behavior mode 'Survival' is to be used)
    }
  
    decideAttackActions() {
      console.log(`[AI][DECIDE] Assessing attack strategies...`);

      // 1. Detect and handle threats
      let threats = this.detectThreats();
      if (threats.hasThreats) {
        console.log(`[AI][DECIDE] Threats detected: ${threats.allThreats.length} threats.`);
        
        if (threats.hasCriticalObjectThreat && this.energy >= 30) {
          console.log(`[AI][DECIDE] Critical object threat detected. Attempting to destroy it.`);
          for (let j = 0; j < 3; j++) {
            this.queueAction('fireAt', threats.criticalObject);
          }
        }
        
        // Evade all detected threats
        let evadeDirection = this.calculateEvasionDirection(threats.allThreats);
        console.log(`[AI][EVADE] Evasion direction: ${JSON.stringify(evadeDirection)}`);
        this.queueMovement(evadeDirection);
      
      } else {
        console.log(`[AI][DECIDE] No immediate threats detected.`);
        // 2. No immediate threats
        if ((this.energy < 40) && (this.enemy.health > 15)) {
          console.log(`[AI][DECIDE] Energy low (${this.energy.toFixed(2)}).`);
          
          if (30 <= this.energy) {
            console.log(`[AI][DECIDE] Energy low. Wait for replenish.`);
          } else {
            // Move towards the closest energyOre to gain energy
            let closestEnergyOre = this.findClosestEnergyOre();
            if (closestEnergyOre) {
                console.log(`[AI][DECIDE] Closest energy ore at (${closestEnergyOre.x}, ${closestEnergyOre.y}). Moving towards it.`);
                
                this.moveTowardsObject(closestEnergyOre);
                for (let j = 0; j < 3; j++) {
                this.queueAction('fireAt', closestEnergyOre); // Attempt to destroy it to collect energy
                }
            } else {
                console.log(`[AI][DECIDE] No energy ore found. Proceeding to attack.`);
                
                // Move towards the enemy and attack
                this.moveTowardsEnemy();
                for (let j = 0; j < 3; j++) {
                this.queueAction('fireAt', this.enemy);
                }
            }
          }
        } else {
          console.log(`[AI][DECIDE] Energy healthy (${this.energy.toFixed(2)}). Moving towards enemy to attack.`);
          
          // Move towards the enemy and attack
          this.moveTowardsEnemy();
          for (let j = 0; j < 3; j++) {
            this.queueAction('fireAt', this.enemy);
          }
        }
      }
  
      // 3. Utilize tactic engine if advantageous
      if (this.shouldUseTacticEngineAttack()) {
        console.log(`[AI][DECIDE] Activating tactic engine.`);
        this.difficulty ++;
        this.queueAction('activateTacticEngine');
      }
    }
  
    executeActions() {
      while (this.actionQueue.length > 0) {
        this.currentAction = this.actionQueue.shift();
        switch (this.currentAction.type) {
          case 'move':
            this.simulateMovement(this.currentAction.direction, this.currentAction.duration);
            break;
          case 'fireAt':
            this.simulateFireAt(this.currentAction.target);
            break;
          case 'activateTacticEngine':
            this.simulateTacticEngine();
            break;
          default:
            break;
        }
      }
    }
  
    simulateMovement(direction, duration = 500) {
      // Log the movement simulation
      console.log(`[AI][MOVE] Simulating movement directions: ${JSON.stringify(direction)} for ${duration}ms.`);
  
      // Direction is an object { up: bool, down: bool, left: bool, right: bool }
      // Duration is in milliseconds; map duration to number of frames based on difficulty
      const frames = Math.max(Math.floor((duration / 1000) * 60 / (11 - this.difficulty)), 1); // Higher difficulty, fewer frames
      console.log(`[AI][MOVE] Calculated frames for movement: ${frames}`);
  
      for (let i = 0; i < frames; i++) {
        if (direction.up) game.aiKeysPressed.w = true;
        if (direction.down) game.aiKeysPressed.s = true;
        if (direction.left) game.aiKeysPressed.a = true;
        if (direction.right) game.aiKeysPressed.d = true;
      }
    }
  
    simulateFire() {
      let currentTime = millis();
      if (currentTime - this.bornTime > stateBufferTime) {
        console.log(`[AI][FIRE] Simulating space key press for firing laser.`);
        // Simulate pressing the space key
        game.aiKeysPressed.space = true;
      } else {
        console.log(`[AI][CEASEFIRE] AI waiting for the game to finish loading.`);
      }
    }
  
    simulateFireAt(target) {
      // Calculate distance to target before deciding to fire
      let distance = dist(this.x, this.y, target.x, target.y);
      console.log(`[AI][FIRE_AT] Distance to target (${target.type}): ${distance.toFixed(2)}.`);
  
      if (distance <= this.firingRange) {
        console.log(`[AI][FIRE_AT] Target within firing range (${this.firingRange}). Firing laser.`);
        // Target is close enough; simulate firing
        this.simulateFire();
      } else {
        console.log(`[AI][FIRE_AT] Target out of firing range (${this.firingRange}). Skipping fire.`);
        // Optional: Implement alternative actions if target is out of range
      }
    }
  
    simulateTacticEngine() {
      console.log(`[AI][TACTIC_ENGINE] Simulating 'x' key press for tactic engine activation.`);
      // Simulate pressing the 'x' key
      game.aiKeysPressed.x = true;
    }
  
    queueMovement(direction) {
      // console.log(`[AI][QUEUE] Queuing movement: ${JSON.stringify(direction)}.`);
      this.actionQueue.push({ type: 'move', direction: direction, duration: 500 });
    }
  
    queueAction(actionType, target = null) {
      if (actionType === 'fireAt' && target) {
        // console.log(`[AI][QUEUE] Queuing fireAt action for target: ${target.type} at (${target.x}, ${target.y}).`);
        this.actionQueue.push({ type: actionType, target: target });
      } else {
        // console.log(`[AI][QUEUE] Queuing action: ${actionType}.`);
        this.actionQueue.push({ type: actionType });
      }
    }
  
    detectThreats() {
      let threatsFound = false;
      let criticalObjectThreat = null;
      let allThreats = [];
  
      const laserThreatRange = 5 * this.difficulty; // Adjustable based on difficulty
      const objectThreatRange = 25 * this.difficulty; // Larger range for objects
  
      // Detect laser threats
      for (let laser of game.enemyLaser) {
        let distance = dist(this.x, this.y, laser.x, laser.y);
        if (distance < laserThreatRange) {
          threatsFound = true;
          allThreats.push(laser);
          // console.log(`[AI][DETECT] Laser threat detected at (${laser.x}, ${laser.y}) within range ${laserThreatRange}.`);
        }
      }
  
      // Detect object threats
      for (let obj of game.objects) {
        let distance = dist(this.x, this.y, obj.x, obj.y);
        if (distance < objectThreatRange) {
          // Additionally check z-axis proximity
          if ((obj.z - this.z) < 200) { // Threshold for z-axis proximity
            threatsFound = true;
            criticalObjectThreat = obj;
            allThreats.push(obj);
            // console.log(`[AI][DETECT] Critical object threat detected: ${obj.type} at (${obj.x}, ${obj.y}) within range ${objectThreatRange} and z-proximity.`);
          } else {
            threatsFound = true;
            allThreats.push(obj);
            // console.log(`[AI][DETECT] Object threat detected: ${obj.type} at (${obj.x}, ${obj.y}) within range ${objectThreatRange}.`);
          }
        }
      }
  
      return {
        hasThreats: threatsFound,
        hasCriticalObjectThreat: criticalObjectThreat !== null,
        criticalObject: criticalObjectThreat,
        allThreats: allThreats
      };
    }
  
    calculateEvasionDirection(threats) {
      // Determine evasion direction based on all threats
      let moveX = 0;
      let moveY = 0;
  
      for (let threat of threats) {
        if (threat.z > -2000) {
          let angle = atan2(this.y - threat.y, this.x - threat.x);
          moveX += cos(angle);
          moveY += sin(angle);
          console.log(`[AI][EVADE] Calculating evasion for threat at (${threat.x}, ${threat.y}). Angle: ${angle.toFixed(2)} radians.`);
        }
      }
  
      // Normalize and determine direction
      if (moveX > 0.5) moveX = 1;
      else if (moveX < -0.5) moveX = -1;
      else moveX = 0;
  
      if (moveY > 0.5) moveY = 1;
      else if (moveY < -0.5) moveY = -1;
      else moveY = 0;
  
      return {
        up: moveY === 1,
        down: moveY === -1,
        left: moveX === -1,
        right: moveX === 1
      };
    }
  
    findClosestEnergyOre() {
      let energyOres = game.objects.filter(obj => obj.type === 'energyOre'); // Assuming objects have a 'type' property
      if (energyOres.length === 0) {
        console.log(`[AI][ENERGY] No energy ore available to collect.`);
        return null;
      }
  
      let closest = energyOres[0];
      let minDistance = dist(this.x, this.y, closest.x, closest.y);
  
      for (let ore of energyOres) {
        let distance = dist(this.x, this.y, ore.x, ore.y);
        if (distance < minDistance) {
          closest = ore;
          minDistance = distance;
        }
      }
  
      console.log(`[AI][ENERGY] Closest energy ore found at (${closest.x}, ${closest.y}) with distance ${minDistance.toFixed(2)}.`);
      return closest;
    }
  
    moveTowardsObject(target) {
      // Determine direction towards the target object
      let dx = target.x - this.x;
      let dy = target.y - this.y;
  
      // 20 px dead zone per axis so the ship doesn't jitter once it is
      // roughly aligned with the target (y grows downward in p5.js)
      let direction = {
        up: dy < -20,
        down: dy > 20,
        left: dx < -20,
        right: dx > 20
      };
  
      console.log(`[AI][MOVE_TO_OBJECT] Moving towards ${target.type} at (${target.x}, ${target.y}). Direction: ${JSON.stringify(direction)}.`);
      this.queueMovement(direction);
    }
  
    moveTowardsEnemy() {
      // Determine direction towards the enemy
      let dx = this.enemy.x - this.x;
      let dy = this.enemy.y - this.y;
  
      // Same 20 px dead zone as moveTowardsObject()
      let direction = {
        up: dy < -20,
        down: dy > 20,
        left: dx < -20,
        right: dx > 20
      };
  
      console.log(`[AI][MOVE_TO_ENEMY] Moving towards enemy at (${this.enemy.x}, ${this.enemy.y}). Direction: ${JSON.stringify(direction)}.`);
      this.queueMovement(direction);
    }
  
    shouldUseTacticEngineSurvival() {
      // Abandoned method
    }
  
    shouldUseTacticEngineAttack() {
      // Decide whether to activate tactic engine based on attack advantage
      if (!this.tacticEngineUsed) {
        if (this.health < 30) {
          console.log(`[AI][TACTIC_ENGINE] Conditions met for tactic engine activation (Health: ${this.health}, Energy: ${this.energy}).`);
          return true;
        }
        if (this.model === assets.models.playerShip2) {
          // Additional condition: If enemy health is low and need more energy to destroy it
          if (game.enemy.health < 30 && this.energy < 50) {
            console.log(`[AI][TACTIC_ENGINE] Condition met for playerShip2: Enemy health is low (${game.enemy.health}).`);
            return true;
          }
        }
      }
      return false;
    }
  
    render() {
      // Add indicators or different visuals for ComputerPlayer
      super.render();
      // Draw AI status
      push();
      fill(255);
      textFont(assets.fonts.ps2p);
      textSize(12);
      textAlign(LEFT, TOP);
      text(`X: ${this.x.toFixed(1)} Y: ${this.y.toFixed(1)}`, this.x - 50, this.y - 75);
      text(`AI Difficulty: ${this.difficulty}`, this.x - 50, this.y - 60);
      if (this.currentAction != null) {
        text(`Behavior: ${this.currentAction.type}`, this.x - 50, this.y - 45);
      }
      pop();
    }
  }
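
Since the listing above shows only the class itself, here is a minimal sketch of how it might be driven from the p5.js loop. executeActions() is the real method from the listing; the decision method's name and the re-planning cadence are placeholders, not the project's actual API:

// Hypothetical game-loop wiring for the AI (illustrative only)
function draw() {
  game.update();

  // Re-plan at a fixed cadence, e.g. twice per second at 60 fps (assumed)
  if (frameCount % 30 === 0) {
    ai.decide(); // placeholder name for the threat-scan/decision method
  }

  ai.executeActions(); // drain the queued moves, shots, and tactic activations

  game.render();
}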

Servo Motion Control

Commands from the AI are translated into servo movements to control the robot hand:

  1. Command Translation:
    • Process: Maps each AI decision (a virtual key press) to a corresponding servo angle, so that the physical hand accurately mirrors the game input.
  2. Async Update:
    • Outcome: Sends servo updates at a fixed rate, so the robot hand’s physical actions are not crowded out by serial communication while staying in sync with the game’s digital state.
class RobotHandController {
    constructor() {
      this.lastUpdateTime = millis();
    }

    init() {
      // Nothing to initialize yet
    }
    
    update() {
      // Push updated finger bend angles to the Arduino
      this.updateFingerAngles();
    }

    // Update Fingers according to the virtual keys
    updateFingerAngles() {
      // Stop function if no serial connections
      if (!serialActive) return;

      let currentTime = millis();

      const keys = ['w', 'a', 's', 'd', 'space', 'x'];
      const angles = [30, 50, 50, 60, 75, 70]; // Different angles for each key
    
      for (let i = 0; i < 6; i++) {
        if (game.aiKeysPressed[keys[i]] === true) {
          if (fingerAngles[i] != angles[i]) {
            fingerAngles[i] = angles[i];
          } 
        }
      }

      // Send data every 120 frames (about every two seconds at 60 fps)
      if (frameCount % 120 === 0) {
        this.sendAngles(fingerAngles);
        
        // Schedule release: return the fingers to their default angles after 1 s
        setTimeout(() => {
          this.sendAngles(fingerDefaultAngles);
        }, 1000);
      }
      
      this.lastUpdateTime = currentTime;
      
    }

    // Send Current Angles to Arduino via Serial
    sendAngles(angles) {
      if (serialActive) {
        let message = angles.join(",") + "\n";
        writeSerial(message);
        console.log("Sent to Arduino:", message.trim());
      }
    }
}

/*
function readSerial(data) {
  // Handle incoming data from Arduino
  // For this project, we primarily send data to Arduino
}
*/
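
To show where the controller sits in the sketch, here is a minimal, assumed wiring; the instantiation point and the game.update() call are context-based guesses rather than the project's actual setup code:

// Assumed wiring (sketch only): one controller instance, updated each frame.
// updateFingerAngles() itself rate-limits serial writes to every 120 frames.
const handController = new RobotHandController();

function draw() {
  game.update();           // game logic sets the game.aiKeysPressed flags
  handController.update(); // mirrors those flags onto the finger servos
}

On the wire, sendAngles() emits six comma-separated angles terminated by a newline (e.g. "30,50,50,60,75,70\n"), one value per servo, which the Arduino sketch can parse by splitting on commas.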

4. Project Highlights

Network Communication

  • Real-Time Synchronization: Successfully implemented real-time data exchange between clients using Node.js and Socket.io (a minimal relay sketch follows this list).
  • Robust Server Setup: Developed a stable local server that handles multiple connections.
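
The server code is not reproduced here, but a broadcast relay of this kind can be quite small. The following is an illustrative sketch; the event name keyState and the port are assumptions, not the project's actual values:

// Minimal Socket.io broadcast relay (illustrative sketch)
const http = require('http').createServer();
const io = require('socket.io')(http);

io.on('connection', (socket) => {
  console.log('Client connected:', socket.id);

  // Forward each client's key-state updates to every other client
  socket.on('keyState', (data) => {
    socket.broadcast.emit('keyState', data);
  });

  socket.on('disconnect', () => console.log('Client left:', socket.id));
});

http.listen(3000, () => console.log('Relay listening on port 3000'));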

Physical Installation

  • Robot Hand Fabrication: Crafted a functional robot hand using 3D printing, hot-glued hinges, and fishing line tendons.
  • Servo Integration: Connected and controlled multiple servos via Arduino to simulate human key presses.

AI Player Algorithm

  • Dynamic Threat Handling: Developed an AI that intelligently detects and responds to multiple simultaneous threats, prioritizing evasion and strategic attacks based on predefined behavior modes.

5. Future Improvements

Strengthening the Robot Hand

  • Enhanced Strength: Upgrade the materials and servos to increase the robot hand’s strength and responsiveness, so that it can reliably press the physical buttons.

Network Communication Structure

  • Peer-to-Peer Networking: Transition from a broadcast-based communication model to a peer-to-peer (P2P) architecture, supporting more than two players and reducing dependence on a central server (see the sketch below).
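
As a rough sketch of that direction, the browser’s built-in WebRTC API provides peer-to-peer data channels. The loopback example below keeps both peers in one page so it runs as-is; in practice the offer/answer/ICE exchange would travel through a signaling channel (for example, the existing Socket.io server), and all names here are illustrative:

// WebRTC data-channel loopback (illustrative sketch, browser-only)
const peerA = new RTCPeerConnection();
const peerB = new RTCPeerConnection();

// Forward ICE candidates directly; a signaling server would normally carry these
peerA.onicecandidate = (e) => e.candidate && peerB.addIceCandidate(e.candidate);
peerB.onicecandidate = (e) => e.candidate && peerA.addIceCandidate(e.candidate);

// Peer A opens a channel for game-state updates
const channel = peerA.createDataChannel('gameState');
channel.onopen = () => channel.send(JSON.stringify({ keys: { w: true } }));

// Peer B receives the channel and logs incoming key states
peerB.ondatachannel = (e) => {
  e.channel.onmessage = (msg) => console.log('Received:', JSON.parse(msg.data));
};

// Standard offer/answer handshake
(async () => {
  const offer = await peerA.createOffer();
  await peerA.setLocalDescription(offer);
  await peerB.setRemoteDescription(offer);
  const answer = await peerB.createAnswer();
  await peerB.setLocalDescription(answer);
  await peerA.setRemoteDescription(answer);
})();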