# midterm project

### Introduction

For my midterm project, I created a spooky juggling game called Morbid Juggler. I thought of the concept in week 2, when, for a project involving arrays and classes, I made a simple game that lets users add balls using the keyboard and juggle them with the cursor. Morbid Juggler version 2 is built on the same motion logic, but instead of standard input (mouse and keyboard), players use their webcam and their hands to interact with the game. To add a new ball, the player makes the “🤘” gesture with their left hand. The balls automatically move along a parabolic trajectory, and the goal of the game is to not drop them, i.e., to catch each ball before it leaves the screen. To catch a ball, the player pinches the fingers of either hand near it and drags it across the screen; releasing the pinch throws the ball again.

### How it works

The following explains the balls’ motion and is from my documentation for my week 2 project:

I save the initial time when an (eye)ball is created, and measure the time that has passed since. This allows me to use values of `elapsedTime` as a set of x values, which, when plugged into a quadratic equation, give a parabola. Changing the coefficients in the equation allows me to modify the shape of the parabola and thus the trajectory of the (eye)ball, such as how wide the arc created by its motion is. I played around with the coefficients and decided to use `(0.4x)(2-x)`, which works nicely for a canvas of this size.

A more detailed explanation of how the balls are stored in memory and moved with each frame update can be found here.
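As a rough sketch of that motion (the class shape and the scaling constants here are illustrative assumptions, not the project’s exact code), each ball records its creation time, and every frame plugs the elapsed time into the quadratic to get its position:

```javascript
class Ball {
  constructor(x, y) {
    this.startX = x;
    this.startY = y;
    this.birth = Date.now(); // remember when this (eye)ball was created
  }

  // Position after the ball has been alive for t seconds, using the
  // quadratic (0.4t)(2 - t) for the vertical arc.
  position() {
    const t = (Date.now() - this.birth) / 1000; // elapsed time in seconds
    const arc = 0.4 * t * (2 - t); // rises, peaks at t = 1, then falls
    return {
      x: this.startX + t * 100, // drift horizontally as time passes
      y: this.startY - arc * 200, // subtract: canvas y grows downward
    };
  }
}
```

At t = 2 the arc term returns to zero, so a ball launched from the bottom of the canvas comes back down roughly two seconds later.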

For tracking hands using the webcam, I used a JavaScript library called Handsfree.js. I was initially going to use PoseNet, but I realized it wasn’t the best fit for this use case. PoseNet’s landmark model tracks 17 points on the human body, but I didn’t need all that: I just needed to track the user’s fingers, and PoseNet only returns one keypoint for the wrist. So I looked up other libraries and found Handsfree.js, which is built on the same framework as PoseNet, Google’s MediaPipe, but is more geared toward hand tracking. It tracks every finger and knuckle, and it even has built-in gesture recognition to detect pinches, which is what my users would do to drag balls around the screen. Furthermore, it was very easy to train the model to recognize new gestures on the library’s website, which lets you collect your own data and generate a gesture model to plug into your code. For example, this is the code for recognizing the “🤘” gesture:

```javascript
handsfree.useGesture({
  algorithm: "fingerpose",
  models: "hands",
  confidence: 9,
  description: [
    ["addCurl", "Thumb", "HalfCurl", 1],
    ["addDirection", "Thumb", "DiagonalUpRight", 1],
    ["addDirection", "Thumb", "VerticalUp", 0.19047619047619047],
    ["addCurl", "Index", "NoCurl", 1],
    ["addDirection", "Index", "VerticalUp", 0.3888888888888889],
    ["addDirection", "Index", "DiagonalUpLeft", 1],
    ["addCurl", "Middle", "FullCurl", 1],
    ["addDirection", "Middle", "VerticalUp", 1],
    ["addDirection", "Middle", "DiagonalUpLeft", 0.47058823529411764],
    ["addCurl", "Ring", "FullCurl", 1],
    ["addDirection", "Ring", "VerticalUp", 1],
    ["addDirection", "Ring", "DiagonalUpRight", 0.041666666666666664],
    ["addCurl", "Pinky", "NoCurl", 1],
    ["addDirection", "Pinky", "DiagonalUpRight", 1],
    ["addDirection", "Pinky", "VerticalUp", 0.9230769230769231],
  ],
});
```

Then I can use this information like so:

```javascript
// Only allow one new ball per second, so a single "🤘" gesture (which stays
// recognized across many frames) doesn't add dozens of balls at once.
let coolingDown = false;

function addBall() {
  const hands = handsfree.data?.hands;
  if (hands?.gesture && !coolingDown) {
    if (hands.gesture[0]?.name === "addBall") {
      // landmark 9 is a point on the hand; mirror x to match the webcam
      let x = sketch.width - hands.landmarks[0][9].x * sketch.width;
      let y = hands.landmarks[0][9].y * sketch.height;
      balls.push(new Ball(x, y));
      // ignore the gesture for a second before accepting another ball
      coolingDown = true;
      setTimeout(() => {
        coolingDown = false;
      }, 1000);
    }
  }
}
```

The hardest part of this project was working with Handsfree.js. There is criminally little documentation available on the website, and I had to teach myself how to use it by studying the few demo projects the author of the library had created. For hand tracking, there was a piece of code that closely approximated what I wanted to do. It loops through the hands in the object returned by `handsfree.data`, and for each hand, it loops through all the fingers. Each finger is identified by its landmark index, and its location and other information can be used elsewhere in the program. For example, in `handsfree.data.landmarks[handIndex][fingertips[finger]]`, `handIndex=0` and `finger=8` represent the tip of the left index finger. In Morbid Juggler, when `handsfree.data.pinchState` for any hand becomes `held` near a ball, the ball sticks to the tip of the pointer finger. When it becomes `released`, the ball restarts its parabolic motion.
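A sketch of that pinch-to-drag loop looks roughly like this. The field paths follow the shape Handsfree.js exposes elsewhere in this post, but `balls`, `sketch`, and the `Ball` methods (`isNear`, `stickTo`, `relaunch`) are illustrative assumptions, not the library’s API:

```javascript
function updateDragging(handsfree, balls, sketch) {
  const data = handsfree.data?.hands;
  if (!data?.pinchState) return;

  // pinchState is a 2D array: one entry per hand, one per finger,
  // with values like "start", "held", and "released".
  data.pinchState.forEach((fingers, handIndex) => {
    const state = fingers[0]; // index-finger pinch for this hand
    const tip = data.landmarks[handIndex][8]; // landmark 8 = index fingertip
    const x = sketch.width - tip.x * sketch.width; // mirror webcam x
    const y = tip.y * sketch.height;

    for (const ball of balls) {
      if (state === "held" && ball.isNear(x, y)) {
        ball.stickTo(x, y); // follow the fingertip while pinched
      } else if (state === "released" && ball.held) {
        ball.relaunch(x, y); // resume parabolic motion from here
      }
    }
  });
}
```

Called once per frame, this checks every tracked hand, so either hand can grab a ball, matching the gameplay described above.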

To see the full code for the project, refer to the p5 web editor.

### Playing the game

The game can be played here. When the project has loaded, click on ‘start game’. It takes a second for Handsfree to load. You can check if the game is fully loaded by waving your hands in front of your webcam. You should see the skeleton for your hand mapped on the canvas.

Make a “🤘” sign with your left hand (tuck in your thumb, and keep your index finger and pinky straight!). You’ll see a new ball pop out from between your fingers. Add a few more balls like this, but be quick! Don’t let them leave the screen, or the counter in the corner will tell you how badly you’re losing, even after death. To keep the balls from falling off, pinch your fingers like you would to pick up a real juggling ball, drag the ball you’re holding across the screen, and release the pinch near the bottom left corner so the ball can travel again. You can add as many balls as you like — at your own risk.

### Final thoughts

I think there’s still room to fine-tune the hand tracking. If I were to upgrade the game, I would probably use a different library than Handsfree.js. The tracking information isn’t the most accurate, nor is it consistent. For example, even when a hand is held up still, the keypoints on screen visibly jitter. Since a dragged ball sticks to the tip of the pointer finger, the balls jittered badly too; I later added some smoothing using the `lerp()` function to fix that. I also had to do a lot of trial and error when picking a gesture to trigger the function that adds a new ball. The model wasn’t very confident about a lot of the other gestures and kept adding balls erroneously. The “🤘” sign was the final choice because it was explicit enough and did not resemble other gestures a user might inadvertently make while playing the game.
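The smoothing idea is simple; here is a sketch of it (p5.js provides `lerp()`, re-implemented here so the snippet stands alone, and the smoothing factor is just a value I would tune by eye):

```javascript
// Move a fraction of the way toward the fingertip each frame instead of
// snapping straight to it, which filters out the tracker's jitter.
const lerp = (a, b, t) => a + (b - a) * t; // p5.js provides this built in

const SMOOTHING = 0.3; // 0 = never moves, 1 = no smoothing at all

function smoothFollow(ball, targetX, targetY) {
  ball.x = lerp(ball.x, targetX, SMOOTHING);
  ball.y = lerp(ball.y, targetY, SMOOTHING);
}
```

Called every frame, this acts like a simple low-pass filter: each noisy fingertip reading only moves the ball 30% of the way toward it, so high-frequency wobble is damped while deliberate drags still track closely.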

One thing worth special mention is the background music used in the game, which was produced by my friend by sampling another friend’s vocals. I heard the track, liked its spooky yet playful feel, and asked if I could use it in my project. My friend agreed, and now I have a bespoke, perfectly fitting soundtrack to accompany my project.