Final Project – Meditative Moth

At its core, the Meditative Moth consists of a physical setup with an ultrasonic sensor, mounted on a servo motor, which detects objects within its range. This data is then transmitted to a p5.js sketch running on a laptop, where a virtual moth mirrors the movements of the physical sensor on screen. Adding another layer to the experience, a stage light follows the virtual moth’s path, creating an immersive and dynamic visual display.

The interaction is simple yet profound. As the sensor detects objects, the virtual moth flutters and dances across the screen, its movements guided by the presence and position of the objects. This interplay between the physical and digital, the real and the virtual, encourages us to reflect on our own attention and how we engage with the world around us.

The question of control in the Meditative Moth project adds a layer of intrigue to its artistic interpretation. Whether your movements directly guide the moth’s flight, with the spotlight following in its wake, or you command the spotlight, drawing the moth towards its illumination, the experience delves into the complexities of attention. The first scenario emphasizes conscious direction, where you actively choose your focus, while the second highlights the subconscious forces that influence our attention, drawing us towards certain stimuli. Ultimately, the ambiguity of control invites contemplation on the intricate interplay between conscious choice and subconscious influence, prompting us to explore the depths of our own attention and its ever-shifting nature.

Arduino and p5 files:

Arduino code:

#include <Servo.h>

// Define servo and sensor pins
const int servoPin = 9;
const int trigPin = 10;
const int echoPin = 11;

// Define variables for servo movement and distance
int distance;
int targetAngle = 90;  // Initial servo position
int sweepDirection = 1; // Sweep direction: 1 for right, -1 for left
int sweepAngle = 30;    // Angle to sweep from the target angle
int minDist = 50;

Servo myServo;

void setup() {
  myServo.attach(servoPin);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Scan for objects by sweeping the servo
  scanForObjects();

  // Track the object if found
  if (distance < minDist) {
    trackObjects();
  } 
  delay(50);
}

void scanForObjects() {
  //Serial.println("scanning");
  for (int angle = 20; angle <= 120; angle += 2) {
    myServo.write(angle);
    delay(50);
    distance = getDistance();
    Serial.print(angle);
    Serial.print(',');
    Serial.println(distance);
    if (distance < minDist) {
      //Serial.println("target found");
      targetAngle = angle;
      return;
    }
  }
}


void trackObjects() {
  while (distance < minDist) {
    distance = getDistance();
    //Serial.println("tracking");
    myServo.write(targetAngle);
  }
  sweepForObjects();
}

void sweepForObjects() {
  //Serial.println("sweeping");
  int currentAngle = targetAngle;
  for (int i = 0; i < 2; i++) { // Sweep left and right
    for (int angle = currentAngle; angle >= 20 && angle <= 120; angle += sweepDirection) {
      myServo.write(angle);
      delay(50);
      distance = getDistance();
      Serial.print(angle);
      Serial.print(',');
      Serial.println(distance);
      if (distance < minDist) {
        //Serial.println("target found while sweeping");
        targetAngle = angle;
        trackObjects(); // Return to tracking
        return;
      }
    }
    // Change sweep direction
    sweepDirection *= -1;
  }
  // If the object is not found during sweeping, return to scanning
  scanForObjects();
}

int getDistance() {
  long duration;
  int distanceCm;

  // Trigger a 10-microsecond ultrasonic pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Measure the echo time and convert to centimeters:
  // sound travels roughly 29 microseconds per cm, and the
  // pulse covers the distance twice (out and back)
  duration = pulseIn(echoPin, HIGH);
  distanceCm = duration / 29 / 2;

  return distanceCm;
}

p5 Project (the project will not display anything without an Arduino):

p5 Project (non-interactive version, no Arduino needed):


Implementation:

Here’s a breakdown of the different components:

  • Physical Setup: The ultrasonic sensor and servo motor act as the eyes of the project, constantly scanning for objects and relaying their positions.
  • Arduino Code: The Arduino acts as the brain, controlling the servo motor and sending angle and distance data to the p5.js sketch via serial communication.
  • p5.js Sketch: The sketch receives the data and translates it into the movements of the virtual moth and spotlight on the screen. The moth’s flight path, as well as the spotlight’s location, directly corresponds to the detected object’s position.
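As a hedged illustration of this serial link: the Arduino code below prints each reading as an "angle,distance" line, which the p5.js side could parse with a small helper like this one. The name parseSensorLine is hypothetical; the sketch's actual parsing code is not shown here.

```javascript
// Hypothetical helper: parse one "angle,distance" line sent by the Arduino.
function parseSensorLine(line) {
  const parts = line.trim().split(",");
  if (parts.length !== 2) return null; // ignore malformed lines
  const angle = Number(parts[0]);
  const distance = Number(parts[1]);
  if (Number.isNaN(angle) || Number.isNaN(distance)) return null;
  return { angle, distance };
}
```

Inside p5.js's serial callback, the sketch could then feed the parsed angle into the moth's on-screen target position, e.g. `const reading = parseSensorLine(data);`.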
Everything put together
Connection between sensor and motor, using Velcro pads
Project taken apart
All the internal wiring before the project was put together
An error message after my laptop updated

User interactions

Reflections and Future Directions:

One of the project’s highlights is the successful implementation of the tracking algorithm in the Arduino code. Although it does not perform quite as well as I initially hoped, that is more a hardware limitation than a code issue. This intricate dance between the physical and virtual environments forms the foundation of the entire experience, and the integration of physical and virtual elements creates a truly captivating and thought-provoking experience for the audience.

At the end of the project, right before the deadline, I ran into a severe error after my laptop updated, which prevented me from connecting my Arduino to p5. I tried many ways to debug it and eventually even borrowed a new laptop from the library. None of that worked; however, when I plugged the Arduino into the new laptop, something in it updated, and the next time I plugged it into my own laptop, the project started working again.

Looking ahead, there are many possibilities to enhance the Meditative Moth:

  • Enhanced Visuals: Refining the visual representation of the moth and the stage light effects could create an even more mesmerizing and aesthetically pleasing experience.
  • Auditory Expansion: Introducing music and crowd cheering could deepen the audience’s engagement and further enrich the meditative aspects of the project.
  • Movement Exploration: Experimenting with different movement patterns and behaviors for the virtual moth could evoke a wider range of emotional responses and add another layer of depth to the project.

IM Fest

The moth did not see much time on stage during the IM Festival, for a multitude of reasons. For one, it was not a game, so it struggled to retain people’s attention. In a hall full of fun games, a small art project would hardly catch anyone’s eye, especially since the message of the project is not immediately apparent.

Additionally, people struggled with the intuitiveness of the controls. It was not entirely clear from the project that there was an optimal distance at which the sensor could track the viewer. Many people, I noticed, would try to activate the sensor by putting their hands in front of it, which never worked. I think I should have put some tape on the floor to indicate the optimal range for interacting with the moth.

My monitor would often turn off during the festival, obscuring the moth. I tried running a YouTube video in the background, but this failed to keep the monitor active.

I would occasionally see the sensor activate when people passed it. This would grab their attention, but not enough for them to engage with the whole project. I would also sometimes see the sensor tracking people who were interacting with the adjacent projects. That at least told me the sensor and servo were doing exactly what I wanted them to do; unfortunately, not many people were paying attention to see it.

Fruit Ninja Final Project Documentation

Concept:

The Fruit Ninja project recreates the classic fruit-slicing game using hand tracking and an accelerometer. Players slice virtual fruits displayed on the screen by moving their hands, which are tracked by a camera and the ml5 handpose model. The vertical movement of the virtual knife is controlled by an accelerometer connected to an Arduino.

Implementation

      • Hand Tracking: The p5.js sketch utilizes the ml5 handpose model to track the user’s hand movements through the camera. By calculating the average position of hand landmarks, it determines the x-axis position of the virtual knife.
      • Accelerometer Input: The Arduino reads the y-axis values from the accelerometer and transmits them to the p5.js sketch via serial communication. This data controls the vertical movement of the virtual knife on the screen.
      • Fruit and Bomb Generation: The p5.js sketch generates fruits and bombs at the bottom of the screen, propelling them upwards in a projectile motion.
      • Collision/slicing Detection: The sketch detects collisions between the virtual knife and the fruits/bombs. When a fruit is sliced successfully, it splits in two, the player’s score increases, and a slicing line appears. Hitting a bomb results in a penalty, a visual effect (like a flashbang), and the loss of a life.
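The collision check described above can be sketched as a simple point-in-circle test: treat each fruit as a circle and the knife as a point, and register a slice when the knife tip enters the circle. This is an illustrative stand-in, not the project's actual code; the names (knifeX, fruit.r, hitsFruit) are assumptions.

```javascript
// Hypothetical slicing check: returns true when the knife tip (knifeX, knifeY)
// is inside the fruit's circular hitbox. Comparing squared distances avoids
// an unnecessary Math.sqrt call.
function hitsFruit(knifeX, knifeY, fruit) {
  const dx = knifeX - fruit.x;
  const dy = knifeY - fruit.y;
  return dx * dx + dy * dy <= fruit.r * fruit.r;
}
```

The same test would work for bombs; only the consequence (score versus lost life) differs.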

Interaction Design

      • Hand Movements: Players use slicing motions with their hands to control the virtual knife and slice the fruits.

      • Visual Feedback: The game provides visual cues like slicing lines, explosion effects, and score updates to enhance player feedback.

      • Score and Lives: The score increases with each fruit sliced, and players have a limited number of lives, represented by hearts.

      • Game Over: The game ends when the player loses all lives by hitting bombs. The final score is displayed, and a restart option is offered.

User testing

Schematic

Arduino Code

The Arduino code reads the accelerometer’s y-axis values and transmits them to the p5.js sketch through serial communication. It also includes a button that resets the base position of the accelerometer for calibration (useful for debugging if the user is too far from the screen).

const int buttonPin = 2;    // Button pin
const int xPin = A0;        // X-axis analog pin (unused)
const int yPin = A1;        // Y-axis analog pin connected to ADXL335

// Calibration values
const float xZero = 512.0;  // Raw ADC value at 0g for X (unused)
const float yZero = 512.0;  // Raw ADC value at 0g for Y

// Variables 
float baseX = 0, baseY = 0;

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin, INPUT_PULLUP);  // internal pull-up resistor

  // Read initial values
  baseX = analogRead(xPin); 
  baseY = analogRead(yPin);
}

void loop() {
  // Read the button state
  static bool lastButtonState = HIGH;
  bool currentButtonState = digitalRead(buttonPin);

  // Check for button press to reset the base position
  if (lastButtonState == HIGH && currentButtonState == LOW) {
    baseX = analogRead(xPin);  // Unused
    baseY = analogRead(yPin);
  }
  lastButtonState = currentButtonState;

  // Read current accelerometer value for Y-axis
  float yVal = analogRead(yPin);

  float yG = (yVal - baseY); 

  // Send data to p5 through serial
  Serial.println(yG, 3);  // Using 3 decimal places for precision

  delay(100);  // Reduce data rate
}

p5.js Code

 

Communication between Arduino and p5.js

The Arduino sends the accelerometer’s y-axis data to the p5.js sketch via serial communication, which is then used to control the vertical movement of the virtual knife in the game.
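As a rough sketch of that mapping (not the project's actual code): the Arduino sends yVal - baseY, so the reading stays roughly within -512..512. The helper name accelToKnifeY and the exact range are assumptions for illustration.

```javascript
// Hypothetical mapping from the accelerometer's relative y reading to a
// knife y-coordinate on the canvas (in p5.js, y = 0 is the top).
function accelToKnifeY(yG, canvasHeight) {
  const t = (yG + 512) / 1024;                 // normalize to roughly 0..1
  const clamped = Math.min(1, Math.max(0, t)); // guard against extreme tilt
  return clamped * canvasHeight;
}
```

In the serial callback, the sketch might do something like `knifeY = accelToKnifeY(Number(data), height);`.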

Areas of Pride

      • Integration of Hand Tracking and Accelerometer: The project successfully combines hand tracking for horizontal movement and accelerometer data for vertical movement, creating a more interactive and engaging gameplay experience.

      • Visual Effects and Gameplay: The visual elements, including fruit slicing and bomb explosions, enhance the gameplay experience and provide satisfying feedback to the player.

Future Improvements

      • Variety of Fruits and Challenges: Introduce more fruit types, obstacles, and challenges to increase gameplay complexity and engagement.

      • Calibration and Optimization: Improve the calibration process for the accelerometer and optimize the code for smoother performance.

      • Sound Effects and Music: Implement sound effects for slicing and explosions.

IM SHOWCASE



Final Project – Barrow Controller

Concept

Since we are learning physical computing, I am particularly interested in human-computer interaction. The relationship between human and machine is very bare-bones in my project: the human is the controller, while the motor is being controlled. However, I feel it should not always be like that. In a time of AI development, when machines are gaining awareness of their surroundings, machines should not only be controlled. Therefore, for this project, I want to give some basic control to the motor so that it will not always be under the human’s command.

Images of the project

Design inspiration:

Physical Parts:

P5.js images:

Schematic Diagram

Screenshot

p5.js Embedded
Link to fullscreen

User Testing Videos

 

Implementation

Interaction Design:

The main component of the project is the handpose model from the ml5.js library. Using this model, the hands serve as a controller for the motor. There are several types of actions the user can perform. First are simple hand poses such as showing the palm, making a fist, or turning the hand 90 degrees. Each of these gives a different command to the motor: stop, go, or turn left/right, respectively.

Since the recognition of the hand joints is done by the handpose library, I only needed to write conditional actions based on the positions of the fingers. It was quite difficult to recognize the correct patterns of the different hand poses initially, and it took a lot of trial and error to distinguish them.

There are many illustrations on the screen serving as the project’s instructions. I also made a small icon of the Arduino car as a companion for the users; it displays the direction of the user’s command by moving in that direction.

Below is a block of code for recognizing the hand poses and giving the corresponding order:

if (indexFinger[1] > thumb[1] && middleFinger[1] > thumb[1] && ringFinger[1] > thumb[1] && pinky[1] > thumb[1]){
  if (middleFinger[0] > thumb[0] + 80){
    // console.log("run");
    // console.log("turn right");
    commandRight = 1;
    commandLeft = 0;
    x++;

    push();
    imageMode(CENTER);
    image(right, 100, 100, 100, 100);
    pop();

  }
  else if (middleFinger[0] < thumb[0] - 80){
    // console.log("stop");
    // console.log("turn left");
    commandRight = 0;
    commandLeft = 1;
    x--;

    push();
    imageMode(CENTER);
    image(left, 100, 100, 100, 100);
    pop();

  }
  else{
    // console.log("straight");
    commandRight = 1;
    commandLeft = 1;
    y--;

    push();
    imageMode(CENTER);
    image(fist, 100, 100, 100, 100);
    pop();

  }
}
else{
  // console.log("stop");
  commandLeft = 0;
  commandRight = 0;
  push();
  imageMode(CENTER);
  image(straight, 100, 100, 100, 100);
  pop();

}

However, you may notice that there is no command for going backward. That is the motor’s own decision. Currently, there is no actual machine learning algorithm in the project; it simply delegates a portion of the decision making to the motor. The motor decides to go backward if and only if there is an obstacle blocking its way, detected with an ultrasonic distance sensor: when the reading falls below a certain threshold, the motor automatically goes backward and turns 180 degrees. Below is the portion of code for that:

  if (rand) {
    digitalWrite(ain1Pin, HIGH);
    digitalWrite(ain2Pin, LOW);
    analogWrite(pwmAPin, potVal);

    // digitalWrite(bin1Pin, LOW);
    // digitalWrite(bin2Pin, LOW);
    analogWrite(pwmBPin, 0);
  }
  else{
    digitalWrite(bin1Pin, LOW);
    digitalWrite(bin2Pin, HIGH);
    analogWrite(pwmBPin, potVal);

    // digitalWrite(ain1Pin, LOW);
    // digitalWrite(ain2Pin, LOW);
    analogWrite(pwmAPin, 0);

  }

  unsigned long currentMillis = millis();

  if (currentMillis - previousMillis >= interval) {
    // save the last time you blinked the LED
    previousMillis = currentMillis;
    back = false;
  }
}

Furthermore, to give it some human-like behavior, when it comes into contact with an obstacle it expresses itself by saying that it does not want to do that. Since the showcase environment may be too noisy, the sound is played on the p5.js side through the computer’s speaker.

P5.js and Arduino Communication:

The hand poses are detected on the p5.js side. Since there are two wheels, I keep two separate variables for the left and right wheels, and what p5.js communicates to the Arduino sketch is the status of these left and right control variables. The Arduino then uses that information to run the corresponding action.
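As an illustration only: one way p5.js could pack the two wheel flags into a single serial message. The variables commandLeft and commandRight come from the hand-pose snippet above; makeWheelMessage is a hypothetical helper name, not the project's actual code.

```javascript
// Hypothetical helper: combine the two wheel flags into one newline-terminated
// serial line, e.g. "1,0\n" for "left wheel on, right wheel off".
function makeWheelMessage(commandLeft, commandRight) {
  return commandLeft + "," + commandRight + "\n";
}
```

The sketch could then send it with something like `writeSerial(makeWheelMessage(commandLeft, commandRight))`, and the Arduino side would read the two values back in order.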

Below is the Arduino code for processing the p5.js information (the p5.js code for this was already included in an earlier section):

if (!back){
  if (left) { //counterclockwise
    digitalWrite(ain1Pin, LOW);
    digitalWrite(ain2Pin, HIGH);
    analogWrite(pwmAPin, potVal);
  }
  else { //clockwise
    // digitalWrite(ain1Pin, LOW);
    // digitalWrite(ain2Pin, LOW);
    analogWrite(pwmAPin, 0);
  }

  if (right) { //counterclockwise
    digitalWrite(bin1Pin, HIGH);
    digitalWrite(bin2Pin, LOW);
    analogWrite(pwmBPin, potVal);
  }
  else { //clockwise
    // digitalWrite(bin1Pin, LOW);
    // digitalWrite(bin2Pin, LOW);
    analogWrite(pwmBPin, 0);
    // analogWrite(pwmBPin, 255 - potVal / 4);
  }
} else{
  if (cm < 10){
    Serial.print(back);
    Serial.print(',');

    digitalWrite(ain1Pin, HIGH);
    digitalWrite(ain2Pin, LOW);
    analogWrite(pwmAPin, potVal);

    digitalWrite(bin1Pin, LOW);
    digitalWrite(bin2Pin, HIGH);
    analogWrite(pwmBPin, potVal);
  }

For the communication from the Arduino back to p5.js: since the small illustration on the screen also needs to display the backing-up status, this information is sent to p5.js so it knows when the motor is going backward and can display it accordingly. Below is the block of code for that:

if (fromArduino[0] == "1"){
  y++;
  if(prev1 == "0"){
    sound1.play();

  }
}

What I am proud of:

I really like the way the motor can run by itself without commands. It uses the state of its surroundings to make decisions on its own. Even though this is not a machine learning project and it cannot truly think for itself, a very bare-bones form of intelligence can be built with just an Arduino.

The implementation of the ml5 library was also something new to me. It took me quite a bit of time to figure out the correct thresholds for differentiating the hand poses, and it still does not work entirely smoothly due to errors in hand detection.

Resources used:

ml5 handpose documentation

ml5 examples

YouTube instruction

Challenges:

It was quite difficult to make sure that, while the motor has control, the user can no longer control which direction it moves. Initially, I thought I should handle this on the p5.js side: if the variable “back” is true, p5.js stops sending the information to the Arduino. However, this stopped the whole communication between p5.js and the Arduino. Therefore, I moved the control to the Arduino side: the variable “back” tracks the state and can only be reset after the motor finishes its maneuver.

Apart from this, I needed to implement the decision to turn 180 degrees right after running backward for a while. Since I cannot use delay(), which would stop the motor from running, I used the Blink Without Delay technique to set the status and duration of the turn. Below is an illustration of this logic:

unsigned long currentMillis = millis();

if (currentMillis - previousMillis >= interval) {
  // save the last time you blinked the LED
  previousMillis = currentMillis;
  back = false;
}

Improvements:

One thing I would like to do is map the illustration on the screen to the actual distance travelled by the motor. However, since motor power does not correspond directly to speed, I was not able to implement this.

I would also like to give the motor more decision-making ability, beyond simply reversing and playing a short speech clip. I think this would require a more complex analysis of the roles of the human and the machine.

IM Showcase:

The showcase went smoothly in general, though there were minor problems with the motor. For example, a wire got stuck, preventing the motor from running. One of the motors is, for some reason, significantly weaker than the other, so the device does not travel in a straight line. Also, because of the lighting in the room, the ml5 library repeatedly failed to detect the hand poses. I learned that the environment plays a big role in keeping a project running smoothly.

Image:

Below are the videos of the user interaction during the show:

Final Project [Sizzling Steak]

Concept and the Game Description

Inspired by one of my childhood favorite games, Cooking Mama, I aimed to create a cooking simulation game called Sizzling Steak. Sizzling Steak is an interactive game that simulates the whole steak-cooking experience, from buying the ingredients to putting the sauce on the steak. The game walks through this process (shopping for ingredients, cooking the steak on a grill, and applying the sauce) as a series of mini-games using various sensors and resistors.

First, the user has to buy ingredients for making the steak. This stage simulates the experience of shopping for ingredients. The player controls the cart using the button switch to avoid obstacles and collect necessary ingredients. 

Once the user collects some ingredients, they proceed to the steak-cooking stage. Here, the player uses a potentiometer to control the fire level on the grill, which determines the flipping interval: a greater potentiometer value means a shorter interval between flips. When it is time to flip the steak, the LED button lights up, and the player must press it at the right moment. Flipping too late burns the steak, and the game immediately ends if the steak is burnt. The user must flip the steak at the perfect time several times in a row to succeed in this stage.
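A minimal sketch of the timing rule just described: a higher potentiometer reading produces a shorter flip interval. The raw range (0..1023) matches analogRead; the 2-6 second window and the helper name flipInterval are assumptions, not the game's actual values.

```javascript
// Hypothetical mapping from the raw potentiometer value to the number of
// milliseconds before the flip LED lights up: higher fire, shorter interval.
function flipInterval(potValue, minMs = 2000, maxMs = 6000) {
  const t = Math.min(1, Math.max(0, potValue / 1023)); // normalize to 0..1
  return maxMs - t * (maxMs - minMs);
}
```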

After successfully cooking the steak, the player moves on to putting sauce on it. Here the user squeezes the sauce bottle, fitted with a flex sensor, to control the amount of sauce applied. The player must apply the right amount of pressure to achieve perfect sauce coverage on the steak.

Game pages (without buttons)

Here are some pages and elements I have designed for the interface of this project. I have utilized Canva to design the pages to be utilized in this project:

Start page

Stage 1 instructions

Stage 1 lost page

Stage 2 instructions

Stage 2 lost page

Stage 3 instructions

Stage 3 lost page

Game ending page

Project Images

Prototype Development:

Final Product:

User Testing 

 

For the user testing, I let some of my friends play the game without giving them any instructions (the game’s instruction pages were still there, though) and observed how they used it. Thankfully, all of my friends were able to figure out how to play immediately. After they played, I asked whether any parts were confusing, and they told me the game was actually very intuitive, so there was no particularly confusing part. However, I discovered that many of them found collecting 5 ingredients in stage 1 quite difficult. Hence, to adjust the difficulty, I changed the winning condition for stage 1 to collecting 3 ingredients.

How does the implementation work?

User interaction design

From the brainstorming stage of this project, I wanted to ensure that the player could easily understand and engage with the game mechanics. I attempted to do so by making the instructions for each stage as clear and concise as possible and by making the physical component of the project as intuitive as possible. Before the first attempt at each stage, an instruction page guides the user on how to play. I put particular effort into the instructions page for stage 1: while I found the mini-games for stages 2 and 3 quite intuitive, I recognized that stage 1, which involves collecting ingredients, might be more challenging. To address any confusion, I added images of the ingredients the player needs to collect to clarify the game mechanics.

In terms of the physical components of the game, I tried to make them intuitive by using objects similar to those used in making a steak. First, I used a cute little potentiometer that resembles the knobs on gas stoves to indicate that it controls the “fire intensity”. Then, I inserted the flex sensor into a sauce bottle to signify that the player must “squeeze” the bottle in the stage 3 mini-game.

Communication between Arduino and p5.js

 

In this project, the Arduino and p5.js both send data to each other. The Arduino sends the values of the different sensors (two buttons, a potentiometer, and the flex sensor) to p5.js, while p5.js sends a “signal” to light up one of the buttons. The Arduino sends the sensor values as a comma-separated string, and p5.js splits this string into an array of elements (each element being one sensor value) and uses them as needed. p5.js also sends its signal as a string: in stage 2, it sends “0\n” when it is not time to light up the button and “1\n” when it is. The Arduino interprets this value using Serial.parseInt().
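A minimal sketch of the parsing step described above: the Arduino's "button1,pot,button2,flex" line becomes an object of numbers on the p5.js side. The field names and the helper name parseSensors are assumptions based on the order of values in the Arduino code below.

```javascript
// Hypothetical helper: split one comma-separated sensor line from the Arduino
// into named numeric fields, ignoring incomplete lines.
function parseSensors(line) {
  const fields = line.trim().split(",");
  if (fields.length !== 4) return null;
  return {
    button1: Number(fields[0]), // stage 1 button state
    pot: Number(fields[1]),     // potentiometer (fire level)
    button2: Number(fields[2]), // stage 2 LED button state
    flex: Number(fields[3]),    // flex sensor (sauce bottle)
  };
}
```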

Arduino Code: 

//stage 1 button led
int led1 = 3;
int button1 = 2;
//stage 2 button led
int led2 = 5;
int button2 = 4;

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  // Initial blink without using delay
  digitalWrite(led1, HIGH);
  while (Serial.available() <= 0) {
    Serial.println("0,0"); // send a starting message
  }
}

void loop() {
  // Handling incoming data
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // LED on while receiving data
    // interpreting the received data from p5.js
    int lightup = Serial.parseInt();

    // reading the sensors
    int buttonState = digitalRead(button1);
    int sensor1 = analogRead(A5);  // potentiometer
    int buttonState2 = digitalRead(button2);
    int sensor3 = analogRead(A3);  // flex sensor

    // sending data to p5.js
    Serial.println(String(buttonState) + "," + String(sensor1) + "," + String(buttonState2) + "," + String(sensor3));

    // changing the behavior of the LED button based on the data sent from p5.js
    digitalWrite(led2, lightup);
  }
}

The Arduino code is quite simple. After the initial handshake with p5.js, once serial communication is available it starts handling incoming data. Using Serial.parseInt(), it reads and interprets the data from p5.js, then uses that value to change the state of the LED with digitalWrite() later in the code. As shown in the snippet, the Arduino reads the states of the buttons with digitalRead() and the values of the flex sensor and potentiometer with analogRead(). It then prints everything on one line with Serial.println() to send the data to p5.js.

P5.js Code

Since the p5.js code is very lengthy, I also attach the link to the entire p5.js sketch (for easier access):

https://editor.p5js.org/sihyunkim/sketches/ulUBUNDbb

Since my game consists of three mini-games and additional pages, flags were a must. The most important flag in my game is gameState. I made gameState a global variable and change its value according to the stage of the game. Here, I will explain the code snippets for the different values of gameState: “connect port”, “stage1”, “stage2”, and “stage3”.

The following is how the game works for each value of gameState:

  • When gameState= “connect port” 
if (gameState == "connect port") {
  imageMode(CORNER);
  image(page0, 0, 0, windowWidth, windowHeight);
}

if (serialActive && gameState == "connect port") {
  gameState = "start";
}

The game starts with gameState set to “connect port”. If the user connects the port using the space bar and makes the serial connection active, gameState changes to “start”.

  • When gameState= “stage1”
//stage 1
 if (gameState == "stage1") {
   //boost condition 
   if (int(stage1Data) == 1) {
     if (!boostSound.isPlaying()) {
       boostSound.play();
     } else {
       boostSound.stop();
       boostSound.play();
     }
     boost = true; //boost flag to trigger the gamer (cart) to jump
   } else {
     boost = false;
   }

   //background image
   imageMode(CORNER);
   image(stage1bg, 0, 0, windowWidth, windowHeight);

   //calling gamer related functions
   gamer.move();
   gamer.show(cartimg);
   gamer.jump();

   //creating obstacles(unnecessary ingredients) and needed ingredients
   if (millis() - lastObstacleTime > obstacleInterval) {
     //x and y positions of the obstacle
     let obstacleX = windowWidth + 100;
     let obstacleY = windowHeight * 0.75;
     //x and y positions of the ingredient
     let ingredientX = windowWidth + 100;
     let ingredientY = windowHeight * 0.75;
     //initiating the obstacle and ingredient class
     let newIngredient = new Ingredients(ingredientX, ingredientY);
     let newObstacle = new Obstacles(obstacleX, obstacleY);

     //randomly choosing if ingredient or obstacle will be created

     let choice = random(0, 2);

     if (choice >= 1) {
       ingredients.push(newIngredient);
     } else {
       obstacles.push(newObstacle);
     }

     lastObstacleTime = millis();
   }

   for (let i = ingredients.length - 1; i >= 0; i--) {
     ingredients[i].update(-5); //speed of the ingredient coming towards the cart
     ingredients[i].show(ingredientsimg); //depicting the cart
     ingredients[i].checkCollisionGamer(gamer, metIngredientSound); // checking if the ingredient met the gamer

     //removing ingredients if they are off the screen
     if (ingredients[i].position.x + ingredients[i].size / 2 < 0) {
       ingredients.splice(i, 1);
     }
     //letting the ingredients disappear if they meet cart
     else if (metGamer == true) {
       ingredients.splice(i, 1);
       count++;
       metGamer = false;
     }
   }

   for (let i = obstacles.length - 1; i >= 0; i--) {
     obstacles[i].update(-5); //speed of the obstacle coming towards the cart
     obstacles[i].checkCollision(gamer); //checking collision with the cart (gamer)
     obstacles[i].show(obstaclesimg); //depicting the obstacle image

     // removing obstacles if they are off-screen
     if (obstacles[i].position.x + obstacles[i].radius < 0) {
       obstacles.splice(i, 1);
     }
   }
   //if the user collected 3 ingredients it proceeds to the next step.
   if (count == 3) {
     stage1results = "won";
   }
 }

 //results page for stage1
 if (
   gameState == "stage1" &&
   (stage1results == "won" || stage1results == "lost")
 ) {
   metIngredientSound.stop();
   boostSound.stop();
 }

 if (gameState == "stage1" && stage1results == "won") {
   completeSound.play();
   gameState = "stage2instructions";
 } else if (gameState == "stage1" && stage1results == "lost") {
   failSound.play();
   gameState = "stage1lost";
 }

 if (gameState == "stage1lost") {
   imageMode(CORNER);
   image(page3, 0, 0, windowWidth, windowHeight);
   //restart button
   image(
     button3,
     windowWidth / 2 - windowWidth * 0.1,
     windowHeight * 0.75 - windowHeight * 0.05,
     windowWidth * 0.2,
     windowHeight * 0.1
   );
 }

Stage 1 uses the same logic as my midterm project for Introduction to Interactive Media. The boost flag is triggered when the data received from the Arduino is 1; when it is, the cart jumps in the game. The background image is drawn with image(), and then the gamer (cart) functions are called: gamer.move() keeps the cart inside the canvas and updates its position with gravity, gamer.show() draws the cart itself, and gamer.jump() performs the "jump" when the boost is triggered. The if (millis() - lastObstacleTime > obstacleInterval) statement controls the creation of obstacles and ingredients: it checks whether enough time has elapsed since the last one was created and, if so, generates a new one. random() then decides whether it will be an ingredient or an obstacle; if the value is greater than or equal to 1, an ingredient is added to the ingredients list, otherwise an obstacle is added to the obstacles list. We loop through both lists to draw each item, check collisions with the gamer, and update positions. When an ingredient collides with the gamer (cart), the count increases, but when an obstacle hits the gamer, stage1results changes to "lost". When the user collects 3 ingredients, stage1results becomes "won", which lets the user proceed to stage 2.
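The timed spawn logic described above can be sketched in isolation. This is a hypothetical reduction, not the project's actual code: `makeSpawner`, and the stubbed `random` and `millis` parameters, are names I introduce here so the logic can run outside p5.js.

```javascript
// Hypothetical sketch of the stage 1 spawn logic: every `obstacleInterval`
// milliseconds, a coin flip decides whether an ingredient or an obstacle
// joins the game. `random` and `millis` are injected so this runs without p5.
function makeSpawner(obstacleInterval, random, millis) {
  let lastObstacleTime = 0;
  const ingredients = [];
  const obstacles = [];

  return {
    ingredients,
    obstacles,
    update() {
      if (millis() - lastObstacleTime > obstacleInterval) {
        // choice in [0, 2): >= 1 means ingredient, otherwise obstacle
        if (random(0, 2) >= 1) {
          ingredients.push({ kind: "ingredient" });
        } else {
          obstacles.push({ kind: "obstacle" });
        }
        lastObstacleTime = millis();
      }
    },
  };
}
```

In the real sketch, p5's own random() and millis() take the place of the injected stubs, and the pushed objects are Ingredients/Obstacles class instances.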

  • When gameState == "stage2"
if (gameState == "stage2") {
    fireIntensity = map(int(stage2Data), 0, 1023, 0, 100);
    flipped = int(stage2Data2);
    flipSignal = false;
    imageMode(CORNER);
    image(stage2bg, 0, 0, windowWidth, windowHeight);

    if (!isNaN(fireIntensity)) {
      // flipping the steak
      timeToFlip = map(fireIntensity, 0, 100, 10000, 2000);
      steak.draw(steakimg);
      if (flipped == 1) {
        if (!grillingSound.isPlaying()) {
          grillingSound.play();
        } else {
          grillingSound.stop();
          grillingSound.play();
        }
        steak.flip();
      }
    }
  }
  //stage 2 pages
  if (
    gameState == "stage2" &&
    (stage2results == "won" || stage2results == "lost")
  ) {
    grillingSound.stop();
  }
  if (gameState == "stage2" && stage2results == "won") {
    gameState = "stage3instructions";
    completeSound.play();
  } else if (gameState == "stage2" && stage2results == "lost") {
    gameState = "stage2lost";
    failSound.play();
  }
  if (gameState == "stage2lost") {
    imageMode(CORNER);
    image(page5, 0, 0, windowWidth, windowHeight);
    //restart button
    image(
      button3,
      windowWidth / 2 - windowWidth * 0.1,
      windowHeight * 0.75 - windowHeight * 0.05,
      windowWidth * 0.2,
      windowHeight * 0.1
    );
    steak.reset(); //resetting when the stage2 lost
  }

When the gameState is "stage2", fireIntensity is mapped from the potentiometer value sent by the Arduino, and flipped is the value of the button. When fireIntensity is not NaN, i.e., a value is actually arriving from the Arduino side, timeToFlip is mapped from fireIntensity inversely: as fireIntensity grows larger, timeToFlip becomes smaller. The steak image is drawn with steak.draw(). Then, when flipped == 1, i.e., the button is pressed, steak.flip() is called. All "flipping-related" behavior lives in steak.flip(): it updates flipCount, which increases when the user flips the steak with perfect timing, and checks whether the steak is "burnt" because the user flipped it too late. When the steak is burnt, the player loses. When flipCount reaches 6, stage2results becomes "won" and the player eventually gets a chance to play the third stage.
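The inverse mapping described above can be written out explicitly. The helper below is p5's map() formula re-expressed in plain JS (a sketch for illustration; the real code calls p5's map() directly): as fireIntensity rises from 0 to 100, the flip window shrinks from 10000 ms down to 2000 ms.

```javascript
// Plain-JS equivalent of p5's map(value, inMin, inMax, outMin, outMax).
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Inverse mapping: higher fire intensity leaves less time to flip.
const timeToFlipAt = (fireIntensity) =>
  mapRange(fireIntensity, 0, 100, 10000, 2000);
```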

  • When gameState == "stage3"
    if (gameState == "stage3") {
        addPerPress = map(int(stage3Data), 32, 1023, 0, 20);
        if (addPerPress != 0) {
          //playing the sound of the bottle being squeezed
          if (!squishSound.isPlaying()) {
            squishSound.play();
          } else {
            squishSound.stop();
            squishSound.play();
          }
        }
        imageMode(CORNER);
        image(stage3bg, 0, 0, windowWidth, windowHeight);
        sauce.drawSauce(windowWidth / 2, windowHeight * 0.3, 40, 20, sauceimg);
        sauce.drawPercentage(
          windowWidth / 6,
          windowHeight * 0.1,
          windowHeight * 0.1,
          font
        );
    
        let status = sauce.checkSauceLevel();
    
        fill(0);
        textSize(16);
        textAlign(CENTER);
        if (status === "gameWon") {
          stage3results = "won";
        } else if (status === "gameOver") {
          stage3results = "lost";
        }
      }
    
      //results page for stage 3
    
      if (
        gameState == "stage3" &&
        (stage3results == "won" || stage3results == "lost")
      ) {
        squishSound.stop();
      }
    
      if (gameState == "stage3" && stage3results == "won") {
        gameState = "game ended";
        completeSound.play();
      } else if (gameState == "stage3" && stage3results == "lost") {
        gameState = "stage3lost";
        failSound.play();
      }
    
      if (gameState == "stage3lost") {
        sauce.amount = 0;
        imageMode(CORNER);
        image(page7, 0, 0, windowWidth, windowHeight);
        //restart button
        image(
          button3,
          windowWidth / 2 - windowWidth * 0.1,
          windowHeight * 0.75 - windowHeight * 0.05,
          windowWidth * 0.2,
          windowHeight * 0.1
        );
      }
    

    When gameState == "stage3", addPerPress is mapped from the flex sensor data sent by the Arduino. addPerPress feeds the sauce amount, which drives sauce.drawSauce(), the function that draws the sauce. Inside sauce.drawSauce(), addPerPress is continuously added to the sauce amount, and variables called sauceWidth and sauceHeight are mapped from that amount. Because these variables are used to resize the sauce image, the image grows as addPerPress increases. sauce.drawPercentage() shows the progress of the sauce being released; 100% indicates the perfect amount. The color of the text changes based on the percentage range of the sauce amount.
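The sauce bookkeeping described above can be sketched as a tiny class. This is a hypothetical reduction: the win/lose thresholds (90% and 110%) are assumptions for illustration, not the project's actual values, which live inside the real Sauce class.

```javascript
// Hypothetical sketch of the stage 3 sauce logic. Thresholds are assumed.
class Sauce {
  constructor(target) {
    this.amount = 0;
    this.target = target; // amount that counts as 100%
  }
  squeeze(addPerPress) {
    this.amount += addPerPress; // flex sensor keeps adding while bent
  }
  percentage() {
    return (this.amount / this.target) * 100;
  }
  checkSauceLevel() {
    const p = this.percentage();
    if (p > 110) return "gameOver"; // overshot: too much sauce
    if (p >= 90) return "gameWon";  // close enough to the perfect amount
    return "ongoing";
  }
}
```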

What are some aspects of the project that you’re particularly proud of?

Honestly, I am very proud of the aesthetics of my project. The starting point of this project was to make "something aesthetically pleasing yet interactive and easy to play with," so I put a lot of effort into making the interface cute. Believe it or not, it took me a surprisingly long time in Canva to create the pages and elements used in the project. Having put that much effort into the interface, I am very proud of the visual aspects of my project.

 

Links to resources used

Challenges faced and how you tried to overcome them

While I did not encounter any huge challenges working on this project, I ran into a few minor ones that took me quite a long time to figure out. First, the serial communication would suddenly stop while the p5.js sketch was running. I was very confused, as I thought I had coded everything properly; it turned out I had not initiated the handshake between the Arduino and p5.js correctly, and it had only been working out of luck. Following the slides from class, I rewrote my Arduino code. Another issue concerned resizing elements. While most elements resized with the window, the class instances created inside setup() did not, since setup() is only called once. I tried noLoop(), if-else statements, and other approaches, but nothing worked. When I finally re-initialized those classes inside windowResized(), it worked. I believe this is because windowResized() is called only when the window is actually being resized, unlike draw(), which runs continuously.

What are some areas for future improvement?

My project could be improved in many ways. One would be to add more stages, for instance chopping some garnishes and cooking them as well. I also think it would be interesting to let the user select the steak doneness (rare, medium, well done) and adjust the flipping time and flip count accordingly. Another improvement could be adding a judge who evaluates the steak the user cooked. I think these changes could make my game more interesting and fun.

IM SHOWCASE DAY

A video of someone playing my game :

Some pictures from the Showcase: 

The pictures above show some of the people playing my game. My game was far more popular than I had thought! Many people came to play it, and they especially loved stage 3. Lots of people asked me how I made physically "squeezing" the sauce affect the game itself (thanks to the flex sensor!). From brainstorming to the showcase, the journey of creating this final project was stressful yet very fun. As always, I truly enjoyed making the game and working with the Arduino. I cannot believe how fast the time flew and that all the IM projects are done. This journey was truly an unforgettable learning experience.

Final Project Proposal (week 12)

For my final project, I am creating a Sign Language Glove, aimed at facilitating communication and improving accessibility for individuals who are deaf or hard of hearing. For now, it is limited to fingerspelling words using the American Sign Language alphabet. The glove incorporates flex sensors on each finger that detect how much the finger is bent. The Arduino processes this data and sends the finger configurations to a p5.js sketch, which interprets the gestures and recognizes the corresponding letters of the alphabet. The p5.js screen displays the recognized letters visually and reads them aloud using text-to-speech.

There will be two options the user can select from: translating ASL to English and translating English to ASL. In the first mode, the user spells out a word using the sign for each letter and p5 reads it aloud. In the second, the user types a word on the keyboard and the corresponding ASL sign for each letter is displayed on screen. This interactive system enables sign language users and non-signers to communicate with each other effectively in both directions.

Final Project – Treasure Hunt

Concept

Embark on an exciting treasure hunt adventure in a mystical world filled with hidden riches and formidable foes. In this immersive game, you’ll take on the role of either a skilled hunter or a stealthy ninja, depending on your taste.

Your quest begins in a green plain, where ancient treasures lie scattered, waiting to be discovered. However, time is of the essence, as you will have only a limited duration to uncover the hidden treasures. Utilize your character's agility and the radar to navigate the treacherous terrain, avoiding obstacles and evading any lurking dangers.

As you progress, the challenge intensifies. In level 2, you'll find yourself navigating a desolate space where slimes, slimy creatures with a voracious appetite, roam freely. These formidable foes will stop at nothing to impede your treasure-hunting endeavors, so you'll need to stay vigilant and employ strategic maneuvers to outmaneuver them.

To aid you in your quest, you’ll have access to a powerful radar system. By activating the radar, you’ll receive visual cues indicating the proximity of nearby treasures. However, use this ability judiciously, as it has a limited cooldown period, during which you’ll be vulnerable to potential threats.

Will you have what it takes to overcome the obstacles, outsmart the slimes, and emerge victorious with a bountiful collection of treasures? Prepare yourself for an adventure like no other, where quick reflexes, strategic thinking, and unwavering determination will be the keys to your success.

Immerse yourself in this captivating treasure hunt, where every step brings you closer to uncovering the secrets of a long-forgotten civilization. Embrace the thrill of the chase, and let your adventurous spirit guide you through this extraordinary journey.

Implementation

Interaction design:

In terms of interaction design, I had to consider two types of interaction: the user's interaction with the game, and the interaction between the Arduino board and p5.js. For the serial communication, the Arduino sends four values: buttonX, buttonY, and the positionX and positionY of the thumb joystick. In the p5.js code, the readSerial function splits and trims the incoming data, and each variable is assigned its respective incoming value. The p5.js side, in turn, sends the detected value, which controls the LED.
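The parsing step described above can be sketched in plain JS, with p5's split() and trim() replaced by their standard-library equivalents so it runs anywhere. `parseControllerData` is a name I introduce here for illustration; the field order follows the Arduino snippet later in this post (posX, posY, then the two button states).

```javascript
// Hypothetical sketch of readSerial's parsing. An incoming line looks like
// "512,300,1,0": joystick X, joystick Y, button 1 state, button 2 state.
function parseControllerData(data) {
  const fromArduino = data.trim().split(",");
  if (fromArduino.length !== 4) return null; // ignore malformed lines
  const [positionX, positionY, buttonX, buttonY] = fromArduino.map(Number);
  return { positionX, positionY, buttonX, buttonY };
}
```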

As for user interaction, I tried to make the instructions as clear and concise as possible, but even then certain aspects of the game remained slightly vague. Inside the game, the user goes through three instruction panels: one introducing the game and stating the objectives, a second showing the remote-controller instructions, and a third between the two levels that explains the content of level 2. There are also two result screens, one for the winning and one for the losing condition, plus another panel after the instructions that lets the user choose between two characters. Within the game states themselves, several elements of interaction were needed. For instance, to represent the radar's cooldown, I made a pulsing effect that only starts while the radar is cooling down. When the character touches a slime, its animation changes for a brief second to let the user know it has been defeated. When a treasure is found, it stays on the map for a brief period and then vanishes, incrementing the counter.

P5JS Code:

The p5.js code is where the main game mechanics are implemented. In the preload function, I load the sprites, sounds, background pictures, and the slime array. In the setup function, I configure the hunter, ninja, and slime sprite animations and make other necessary changes. The draw function handles multiple game states: intro, choose, level1, between, level2, win, and lose. There are also several helper functions at the bottom of the code that support the in-game mechanics.

  1. Game State “intro”:
    • Displays the background image (bg1).
    • Shows the title “Treasure Hunt” and a description of the game with instructions on how to play.
    • Prompts the user to press the space bar to start the game.
  2. Game State “instructions”:
    • Displays the background image (bg2).
    • Shows the title “Instructions” with details about controls (joystick for movement, green button for confirmation, red button for radar activation with cooldown).
    • Prompts the user to press enter to continue.
  3. Game State “choose”:
    • Displays the background image (bg3).
    • Presents the options to choose a character between “Ninja” and “Hunter” and how to select (using the green button and joystick button).
    • Allows the player to choose a character by moving the joystick and pressing the green button.
    • Transitions to "level1" once a character is selected.
  4. Game State “level1”:
    • Displays the background image (bg4).
    • Updates the timer and displays the remaining time.
    • Spawns treasure chests and updates the treasure count based on collected treasures.
    • Handles character movement and treasure detection based on the chosen character (Ninja or Hunter).
    • Updates the game state to “between” when all treasures are collected.
  5. Game State “between”:
    • Displays the background image (bg3).
    • Presents a transition message to indicate the start of the next level (“Level2”).
    • Waits for the player to click to continue and transitions to “level2” when clicked.
  6. Game State “level2”:
    • Displays the background image (bg1).
    • Resets the timer and displays the remaining time for the new level.
    • Spawns new treasure chests and updates the treasure count based on collected treasures.
    • Handles character movement, treasure detection, and enemy interactions (slimes).
    • Updates the game state to “win” if all treasures are collected or to “lose” if the timer runs out or the character is attacked by slimes.
  7. Game State “win”:
    • Displays the background image (bg2).
    • Stops the game’s background sound.
    • Congratulates the player on winning, displays the time taken, and prompts to restart by clicking the green button.
  8. Game State “lose”:
    • Displays the background image (bg3).
    • Stops the game’s background sound.
    • Informs the player of losing and prompts to restart by clicking the green button.

P5js Code Snippet:

if (chosenCharacter === "ninja"){
      hunterSprite.position.x = -200;
      updateSprite(ninjaSprite, "walk", 2);//function that enables the movement of sprite
      removeFoundTreasure(ninjaSprite, treasures);//function that removes the found treasures
      if (buttonY && canDetectTreasure) {
        detected = checkSurroundingForTreasure(ninjaSprite);
        canDetectTreasure = false; // Disable treasure detection for cooldown
        setTimeout(() => canDetectTreasure = true, 3000); // Enable detection after 3 seconds
  }
    } else if (chosenCharacter === "hunter"){
      ninjaSprite.position.x = -200;
      updateSprite(hunterSprite, "walk", 2);
      removeFoundTreasure(hunterSprite, treasures);
      if (buttonY && canDetectTreasure) {
        detected = checkSurroundingForTreasure(hunterSprite);
        canDetectTreasure = false; // Disable treasure detection for cooldown
        setTimeout(() => canDetectTreasure = true, 3000); // Enable detection after 3 seconds
  }
    }

Arduino:

The Arduino code sets up a system to interface with a thumb joystick and two buttons with LED indicators. The pins for these components are defined at the beginning: `D_pin` for the joystick switch, `Y_pin` for the Y output of the joystick, and `X_pin` for the X output. Two digital pins, `button1Pin` and `button2Pin`, are designated for the buttons, while `gledPin` and `rledPin` are used for the green and red LEDs, respectively. In the setup function, the pins are configured accordingly: `D_pin` is set as an input for the switch, and the LED pins are set as outputs. Additionally, serial communication is initialized at a baud rate of 9600.

The main loop of the code continuously reads the analog values from the X and Y outputs of the joystick, as well as the digital states of the two buttons. It also listens for incoming serial data. When serial data is available and ends with a newline character, the code parses the data into an integer (`serialValue`). If `serialValue` is 1, it turns on the green LED and turns off the red LED; otherwise, it does the opposite. The joystick position (X and Y values) and button states are then sent to the serial port in a comma-separated format for external processing or monitoring. Finally, a small delay of 200 milliseconds is included before the loop repeats, allowing for consistent and controlled operation of the system.

Arduino Code Snippet:

void loop() {
  int posX = analogRead(X_pin);
  int posY = analogRead(Y_pin);

  button1State = digitalRead(button1Pin);
  button2State = digitalRead(button2Pin);
  
  if (Serial.available()) {
    int serialValue = Serial.parseInt();//incoming detection value
    if (Serial.read() == '\n'){
      if (serialValue == 1) {
        digitalWrite(gledPin, HIGH); // Turn LED on
        digitalWrite(rledPin, LOW);
    } else {
        digitalWrite(gledPin, LOW);  // Otherwise, turn LED off
        digitalWrite(rledPin, HIGH);
      }
    }
  }
  //sending data to p5js
  Serial.print(posX);
  Serial.print(",");

  Serial.print(posY);
  Serial.print(",");

  Serial.print(button1State);
  Serial.print(",");

  Serial.println(button2State);

  delay(200); // create a small delay before repeat loop
}

Part That I Am Proud Of:

Within this project, I am proud of the different sprite interactions I was able to implement. Whether it was finding a treasure or colliding with a slime, they were all very interesting to think through and code. I am also proud of the hardware construction, especially implementing the joystick and building the outer box. The hard part was fitting and fastening the board inside the box, but luckily I had a relatively big box.

Resources used:

https://support.arduino.cc/hc/en-us/articles/4408887452434-Flash-USB-to-serial-firmware-in-DFU-mode

https://p5js.org/reference/

https://www.arduino.cc/reference/en/libraries/

Challenges faced:

One of the biggest challenges I faced was a rather mysterious one: my laptop was not recognizing the Arduino board. I tried different cords, laptops, and microcontrollers, but nothing worked; apparently the problem was the wire connected to the 5V pin on the microcontroller, and everything started working after I changed it. Another difficult part of the project was soldering the LED buttons. The melted alloy kept falling off after some time, which was quite frustrating, so I spent a relatively long time soldering the wires and making sure they were well connected.

Future Improvement:

There are definitely some improvements to be made, both in the game mechanics and on the hardware side. For instance, more characters could be added to the game to give users more variety. Since it is a level-based game, there is always the opportunity to make more levels. It could also be nice to let the player spend collected treasures on something, whether good boots for speeding up or a better radar with a wider range, making the game smoother and more coherent. On the hardware side, I would add more output options, such as a piezo speaker to let the user know whether the radar has detected a treasure; this could be a much better choice than LED lights. Furthermore, I would update the box so that it looks like an actual gaming console, like an Xbox or PlayStation.

Project Images:


Project Videos:

IMG_4601

IMG_4602

User Testing:

IMG_4595

Im Show Documentation:

The IM show went well. Several people played the game, but because of the small radar and the large map, people took a rather long time to win; otherwise it went very smoothly. Because I kept the serial communication open too long, there were times when it stopped responding to the buttons, so I had to refresh it whenever it did not work.

IMG_4608

IMG_4603

IMG_4605

IMG_4606

IMG_4607

Final Project | Dungeon Escapist

Game Description

Dungeon Escapist is a single-player game where players take control of an adventurer and delve through a dungeon. By using a physical controller that reacts with tilting, players can move their character, collect coins, and reach the exit before the time runs out!

Check out DUNGEON ESCAPIST Here!

Project Inspiration

Dungeon Escapist is a game heavily inspired by older titles such as Adventure (1980), with interactive elements from The Legend of Zelda: Breath of the Wild (2017). This game is also a direct sequel to my previous IM game: Pyro Dancer.

Escapist (adjective)

relating to avoiding an unpleasant or boring life by thinking, reading, etc. –Cambridge Dictionary

Unlike the previous game, however, Dungeon Escapist focuses more on problem-solving rather than player reaction. The main character, an adventurer, wishes to gain riches! He began to look around the Adventurer’s Guild to see if there were commissions. Noticing the dungeon-clearing request, our protagonist sets off to explore the uncharted areas in hopes of bathing in fame and fortune.

Gameplay Images

Main Menu Screen

Game Screen with different maps

Gameplay Videos

How the Game Works

The game is built using the p5 Play library. The dungeon (or maze) is an array of sprites. The library can then construct these using their built-in sprite and tiling system. Because of this, the game is able to have three distinct maps easily.
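As a hypothetical illustration of this tile-map idea (the actual maps live in the sketch's data), each map can be an array of strings, one character per tile. The characters "w", ".", "x", and "h" below are assumptions mirroring the tile characters in the isOpen() helper shown later in this post.

```javascript
// A tiny illustrative map: "w" wall, "." floor, "x" coin, "h" exit.
const level1Map = [
  "wwwww",
  "w..xw",
  "w.w.w",
  "w..hw",
  "wwwww",
];

// Tiles are addressed as mapName[row][column], matching mapName[j][i]
// in the game's isOpen() helper.
function tileAt(mapName, x, y) {
  return mapName[Math.floor(y)][Math.floor(x)];
}
```

p5play's tiling system can then turn each character into the matching sprite, which is why supporting three distinct maps is easy: each map is just another array of strings.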

// Print the sensor data to the serial monitor
//currentDirection, X, Y, Z
Serial.print(currentDirection);
Serial.print(",");
Serial.print("X:");
Serial.print(x);
Serial.print(",");
Serial.print("Y:");
Serial.print(y);
Serial.print(",");
Serial.print("Z:");
Serial.println(z);

The Arduino sends out a message consisting of a number (currentDirection) alongside the XYZ information to the serial output. p5 reads that number into controllerDirection, and moves are made based on it.

//Serial Library function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    //Only run when there is a Serial message
    let fromArduino = split(trim(data), ",");
    if (fromArduino.length == 4) {
      controllerDirection = int(fromArduino[0]);
    }
  }
}

// Then....

//UP
    if (
      controllerDirection == 12 &&
      isOpen(avatar.x, avatar.y - 1, currentMap) &&
      gameState == 1
    )

Then, inside the game, a function called isOpen() checks whether the space in the array next to the player is free. If it is and the controllerDirection is specific to that direction, the character will move infinitely towards that direction until it hits a wall. This simplicity of the character movement is something I am proud of.

function isOpen(x, y, mapName) {
  //Function to check whether the tile is empty to allow moving
  let i = floor(x);
  let j = floor(y);
  let tile = mapName[j][i];
  if (tile == "." || tile == "x" || tile == "h") {
    return true;
  } else {
    return false;
  }
}

Two major issues that were painful while developing the game were figuring out the scoring system and the gameState logic. It took me a few hours to work out why using multiple game states would crash the game; ultimately, it came down to checking the current game state before changing to another one.

The scoring system depends on the game time and the coins picked up. I wanted the longer a player spends solving the dungeon, the lower their score, with coins acting as a score multiplier. I also found that instead of starting the timer immediately, I could start it once the player has collected at least one coin.
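The scoring idea above can be sketched as a single function. The constants here (a 1000-point base and a 10-points-per-second decay) are illustrative assumptions, not the game's actual values.

```javascript
// Hypothetical scoring sketch: time spent lowers the base score,
// and coins act as a multiplier. Constants are assumed for illustration.
function computeScore(secondsElapsed, coins) {
  const base = Math.max(0, 1000 - secondsElapsed * 10); // slower run, lower base
  return base * Math.max(1, coins); // coins multiply the result
}
```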

User Testing & Feedback

During the production phase (2 & 4 May 2024), the game was tested by several users (thank you). Here are some of the issues that the game encountered:

The first thing I noticed was how players were confused about what to do with the physical controller. To combat this confusion, I decided to add a tutorial screen after the title screen. It also specifically mentions to “tilt” the physical controller for more legibility.

Notable things that I observed were in particular how long it took for players to solve the maze. Originally, I thought it would take <1 minute.

Initial User Testing.

However, given the time it takes for the player to figure out the control scheme, solve the maze, and discover things, the time averages around 4 minutes. The scoring system, which depends on the time and coins, has therefore been adjusted to give the players some leniency.

Another notable issue was player sprite color. Because it looks very similar to the walls, the color has been adjusted for better visibility as well.

I also noticed that people kept comparing the game to Tomb of the Mask, a hit mobile game, because of their similar style.

Production Images

From prototypes to actual product

Future Improvements
  • I originally wanted to make the physical controller pretty. However, by the time I finished the whole game, another project was waiting for me. I will try to see what I can do before the IM Show just to make things a bit more pretty.
  • Scoreboards. I wanted to implement scoreboards so players could compare their scores. However, this was quite difficult to implement. So I’ll try to see what I can do in the meantime.
I.M. Show Reaction!

Many thanks to everyone who played Dungeon Escapist and managed to finish the level! Here are some thoughts, reactions, and comments about the game:

“I was confused with the controls in the beginning … It is quite difficult!” -A professor

“Pretty cool.” -I.M Senior

“It’s difficult to control in the beginning. But once you get the hang of it, it gets easier.” -A lost engineer

“Cool game. The UI/UX can be improved though” -An I.M Professor

I am glad that Dungeon Escapist captured the attention of many. I noticed that the game is ‘easier’ to grasp for younger audiences, whereas the controls and movement of the game are a barricade for adult audiences. By observing the human-machine interaction, I gathered so many insights on how to improve the game. While the difficulty might need to be adjusted, specifically in the map design, I believe the skill ceiling from mastering the physical controller of Dungeon Escapist makes it enjoyable and rewarding.

Resources Used

Assets

Home · Kenney

Royalty Free 8-Bit Background Music Downloads | FStudios (fesliyanstudios.com)

Artstation

Tutorials

How to Use Gyroscopes on the Arduino – Ultimate Guide to the Arduino #43 (youtube.com)

Yes, you can make a Gyroscope using #arduino (youtube.com)

Coding Challenge #66: JavaScript Countdown Timer (youtube.com)

p5play: Using tiles, tilesets, and making pixelated background images with Photoshop (youtube.com)

Final Project Update: Interactive Mood Lamp

Concept

For my final project, I made an interactive mood lamp with five push-buttons corresponding to five moods: Day, Calm, Study, Energetic, and Night. Each mood lights up with a matching color on the Arduino and in p5, along with background music from p5.

Implementation

I used a total of 5 push-buttons (3 from the Arduino kit and 2 I had at home), each for one of the moods. I also used a potentiometer to control the light brightness, an RGB LED for the light itself, and some 220ohm resistors I had at home (because I lost the ones from the kit), as well as the breadboard and jumper wires.

Each button is connected to one of the Arduino's digital pins; I used the internal pull-up resistors to reduce the number of components needed and keep the circuit simpler. The RGB LED is connected through resistors to PWM pins so brightness can be controlled. The potentiometer is connected to an analog input.

When the Arduino starts, the light is set to orange as a starting color, with a matching screen in p5 prompting the user to select a mood. Whenever one of the five buttons is pressed, the light switches to that mood's color, the p5 screen changes to match, and the corresponding music plays. Light brightness can be adjusted continuously with the potentiometer.
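A minimal sketch of how the five moods might be organized on the p5 side follows. The RGB values and the `scaledColor` helper are illustrative assumptions, not the project's actual code.

```javascript
// Hypothetical mood table: each mood pairs with a color (RGB values assumed).
const moods = {
  day:       { color: [255, 200, 80] },
  calm:      { color: [80, 180, 255] },
  study:     { color: [255, 255, 255] },
  energetic: { color: [255, 60, 60] },
  night:     { color: [40, 40, 120] },
};

// Brightness from the potentiometer (0-1023) scales each channel.
function scaledColor(mood, potValue) {
  const b = potValue / 1023;
  return moods[mood].color.map((c) => Math.round(c * b));
}
```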

User Testing

I gave each of my sisters my project to try.

My first sister managed to figure it out without any instructions from me, and surprisingly even knew which COM port to select. She did not figure out that the brightness can be controlled with the potentiometer, though.

IMG_7816

My other (younger) sister managed the same, but did not know what to do when the COM port prompt came up. She also did not realize that the brightness can be controlled.

IMG_7817

Areas for Improvement

I should add instructions for the brightness within the program, so that people are aware it can be adjusted.

I also have yet to make a casing for my project, which I hope to finish soon.

Week 12: Serial Communication (Aneeka + Amiteash)

Serial Exercise 1

For the first exercise, we used a flex sensor in a voltage divider circuit as the analog input on pin A0 to control the x-axis motion of an ellipse on the screen. The more the flex sensor is bent, the higher its resistance.
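The p5 side of this exercise boils down to one mapping from the analog reading to an x-position. As a sketch, here is p5's map() re-implemented as a plain function (the 640-pixel canvas width is an assumption for illustration):

```javascript
// p5's map(value, inMin, inMax, outMin, outMax), as a plain function.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// A 10-bit analog reading (0–1023) mapped to an x-position on a
// hypothetical 640-pixel-wide canvas.
const width = 640;
const x = mapRange(512, 0, 1023, 0, width);
console.log(x); // roughly the middle of the canvas
```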

Schematic:

p5 Sketch:

Serial Exercise 2

For the second exercise, we represented the movement of an ellipse along the x-axis, as controlled by mouseX, as the brightness of two LEDs on pins 9 and 11. The green LED increases in brightness as the ellipse moves to the right, while the red LED increases in brightness as the ellipse moves to the left.
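The complementary-brightness calculation can be sketched as plain math (illustrative only — the actual pin writes happen on the Arduino; ledBrightness is a hypothetical name):

```javascript
// Complementary LED brightness values from the ellipse's x-position.
function ledBrightness(x, width) {
  const green = Math.round((x / width) * 255); // brighter to the right
  const red = 255 - green;                     // brighter to the left
  return { green, red };
}

console.log(ledBrightness(0, 640));   // { green: 0, red: 255 }
console.log(ledBrightness(640, 640)); // { green: 255, red: 0 }
```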

Schematic:

p5 Sketch:

Serial Exercise 3

Here we edited Prof. Aaron Sherwood’s GravityWind example in p5 to connect to the Arduino through serial communication. This allows the potentiometer on pin A0 to control the wind speed and direction depending on how far the potentiometer is turned (0 to 1023 mapped to -5 to 5), and makes the LEDs on pins 9 and 11 blink alternately whenever the ball crosses the bottom of the screen.
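The wind mapping described above (0–1023 onto -5 to 5) can be written as one line; a quick sketch, with windFromPot as a hypothetical name:

```javascript
// Map a 10-bit potentiometer reading (0–1023) onto a wind value
// in the range -5 (left) to 5 (right), centered near 0.
function windFromPot(reading) {
  return (reading / 1023) * 10 - 5;
}

console.log(windFromPot(0));    // -5 (full wind one way)
console.log(windFromPot(1023)); //  5 (full wind the other way)
```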

Schematic:

p5 sketch:

Code (for all three)

On the Arduino side, we used common code for all three exercises, mainly to avoid errors from continuously re-uploading the sketch. The setup() function established serial communication and defined the behavior while a connection was yet to be established. The loop() function was as follows:

void loop() {
  // pins declared earlier in the sketch, per the wiring above:
  // const int leftLedPin = 9, rightLedPin = 11, potPin = A0;
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data
    int left = Serial.parseInt();    // two comma-separated ints from p5
    int right = Serial.parseInt();
    if (Serial.read() == '\n') {     // only act on a complete line
      digitalWrite(leftLedPin, left);
      digitalWrite(rightLedPin, right);
      int sensor = analogRead(potPin);
      delay(5);
      Serial.println(sensor);        // reply with the sensor reading
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}

On the p5 side, our code didn’t differ greatly from the serial communication template. It differed mainly in mapping the analog input to the range of values needed by the sketch (the sketch width for Exercise 1, the maximum viable wind speed for Exercise 3), while variables called ‘left’ and ‘right’ tracked changes to the LED states driven by the sketch.
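The message format this exchange relies on follows from the Arduino loop above: p5 sends two integers and a newline (consumed by the two Serial.parseInt() calls and the '\n' check), and the Arduino replies with one sensor reading per line. A sketch of the p5-side helpers, with hypothetical names:

```javascript
// Format the outgoing message the Arduino's Serial.parseInt() calls
// expect: two integers followed by a newline.
function formatOutgoing(left, right) {
  return `${left},${right}\n`;
}

// Parse the sensor reading the Arduino sends back with Serial.println()
// (which terminates lines with "\r\n").
function parseIncoming(line) {
  return parseInt(line.trim(), 10);
}

console.log(JSON.stringify(formatOutgoing(1, 0))); // "1,0\n"
console.log(parseIncoming("512\r\n"));             // 512
```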

One piece of code we were particularly proud of in Exercise 3 was how we alternated the blinking of the LEDs depending on the number of times the ball had bounced:

if (position.y > height - mass / 2) {
  velocity.y *= -0.9; // A little dampening when hitting the bottom
  position.y = height - mass / 2;
  left = (left + 1) % 2; // toggle the left LED on each bounce
  if (left == 0) {
    right = 1; // the right LED lights whenever the left goes dark
  } else {
    right = 0;
  }
}

The code for assigning the value to the right LED may look needlessly complicated; it may seem that both LEDs could simply use (x + 1) % 2, with one initialized to 0 and the other to 1. However, our method keeps both LEDs dark before the first bounce of each ball (which can be seen in the demo if slowed down), which we felt was key to the prompt.
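The difference between the two approaches can be checked with a quick simulation (plain JavaScript outside p5; the function name simulate is ours):

```javascript
// Compare the two toggling schemes over a few bounces.
// Ours: left toggles, right is derived from left (both start dark).
// Alternative: both toggle independently, initialized to 0 and 1.
function simulate(bounces) {
  let ourLeft = 0, ourRight = 0;
  let altLeft = 0, altRight = 1;
  const states = [{ ours: [ourLeft, ourRight], alt: [altLeft, altRight] }];
  for (let i = 0; i < bounces; i++) {
    ourLeft = (ourLeft + 1) % 2;
    ourRight = ourLeft === 0 ? 1 : 0;
    altLeft = (altLeft + 1) % 2;
    altRight = (altRight + 1) % 2;
    states.push({ ours: [ourLeft, ourRight], alt: [altLeft, altRight] });
  }
  return states;
}

const s = simulate(2);
console.log(s[0]); // before any bounce: ours = [0, 0], alt = [0, 1]
console.log(s[1]); // after bounce 1:    ours = [1, 0], alt = [1, 0]
console.log(s[2]); // after bounce 2:    ours = [0, 1], alt = [0, 1]
```

The two schemes agree from the first bounce onward; they differ only in the initial state, which is exactly the behavior described above.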

Demo (for all three)
