Midterm Project Draft: Interactive Hand-Gesture Controlled Game

Project Concept
The core idea behind my midterm project is to develop an interactive game controlled entirely through hand gestures. The game will leverage a hand-tracking library to interpret the player’s hand movements as input commands. I aim to create an engaging and intuitive user experience that does not rely on traditional input devices like keyboards, mice, or game controllers.

Design & User Interaction
Players will interact with the game through various hand gestures. For instance, an open hand gesture will start the game, while making a fist and holding it for a brief moment will pause or resume the game. The game’s mechanics and objectives will be designed around these gestures, ensuring that the player’s physical movements are seamlessly translated into in-game actions.
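
One way to keep this mapping from gestures to game actions explicit is a small state machine. The sketch below is illustrative of my current design rather than final code; the state and gesture names are my own labels, and the "hold" requirement for the fist is handled separately by a timer:

```javascript
// Minimal gesture -> game-state machine (design sketch).
// States: 'idle' (before start), 'running', 'paused'.
// Gestures: 'openHand', or 'closedFist' once held past the toggle delay.
function nextState(state, gesture) {
  if (state === 'idle' && gesture === 'openHand') return 'running';   // open hand starts the game
  if (state === 'running' && gesture === 'closedFist') return 'paused'; // held fist pauses
  if (state === 'paused' && gesture === 'closedFist') return 'running'; // held fist resumes
  return state; // any other combination leaves the state unchanged
}
```

Keeping the transitions in one pure function makes it easy to add gestures later without touching the detection code.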

To detect and interpret these gestures, I will use a hand-tracking library that provides real-time hand position and gesture recognition. The player’s hand movements will be captured through a webcam and processed to identify specific gestures. Based on the detected gestures, the game will execute corresponding actions, such as starting, pausing, or resuming gameplay.

Code Design
Gesture Detection Functions: I have implemented detectHands() as the per-frame entry point for gesture handling. It uses the hand-tracking library’s predictions to analyze the hand’s position and orientation.

Hand Highlighting: The highlightHand() function visualizes the player’s hand position on the screen, enhancing user feedback and interaction.

Gesture Recognition Algorithms: The functions isOpenHand() and isClosedFist() distinguish between different hand gestures by analyzing the distances between hand landmarks. These algorithms are crucial for converting physical gestures into game commands.

let video;                    // webcam capture
let handpose;                 // ml5 handpose model
let predictions = [];         // latest hand predictions from the model
let isGamePaused = false;
let fistDetectedTime = 0;     // timestamp (ms) used to time the fist gesture
const fistToggleDelay = 2000; // how long (ms) a fist must be held to toggle pause

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  handpose = ml5.handpose(video, modelReady);
  handpose.on('predict', results => {
    predictions = results;
  });
}

function modelReady() {
  console.log("Handpose model ready!");
}

function draw() {
  background(255);
  image(video, 0, 0, width, height);
  detectHands();
}

function detectHands() {
  if (predictions.length > 0) {
    const landmarks = predictions[0].landmarks;
    highlightHand(landmarks, 'green');

    if (isClosedFist(landmarks)) {
      if (fistDetectedTime === 0) {
        fistDetectedTime = millis(); // fist just appeared; start the hold timer
      } else if (millis() - fistDetectedTime > fistToggleDelay) {
        isGamePaused = !isGamePaused;
        console.log(isGamePaused ? "Game Paused" : "Game Resumed");
        fistDetectedTime = millis(); // restart the timer so a long hold toggles at most once per delay
      }
    } else {
      fistDetectedTime = 0; // fist released; reset the hold timer
    }
  }
}

function highlightHand(landmarks, color) {
  fill(color);
  landmarks.forEach(point => {
    ellipse(point[0], point[1], 10, 10);
  });
}

function isOpenHand(landmarks) {
  let minDist = Infinity;
  for (let i = 4; i <= 20; i += 4) {
    for (let j = i + 4; j <= 20; j += 4) {
      let dist = distanceBetweenPoints(landmarks[i], landmarks[j]);
      if (dist < minDist) {
        minDist = dist;
      }
    }
  }
  return minDist > 50;
}

function isClosedFist(landmarks) {
  let maxDist = 0;
  for (let i = 4; i < landmarks.length - 4; i += 4) {
    let dist = distanceBetweenPoints(landmarks[i], landmarks[i + 4]);
    if (dist > maxDist) {
      maxDist = dist;
    }
  }
  return maxDist < 40;
}

function distanceBetweenPoints(point1, point2) {
  return Math.sqrt(Math.pow(point2[0] - point1[0], 2) + Math.pow(point2[1] - point1[1], 2) + Math.pow(point2[2] - point1[2], 2));
}

Challenges & Risk Mitigation
The most challenging aspect of this project was developing reliable gesture recognition algorithms that can accurately interpret the player’s intentions from the hand’s position and movement. Misinterpretation of gestures could lead to a frustrating user experience.

To address this challenge, I focused on refining my gesture recognition algorithms (isOpenHand() and isClosedFist()) to improve their accuracy and robustness. I tested with different hand sizes and lighting conditions to ensure the algorithms remain reliable across a wide range of scenarios. Additionally, I implemented visual feedback (via highlightHand()) to help players adjust their gestures for better recognition.
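
One refinement I am considering for hand-size robustness, which is not yet in the posted code, is to normalize fingertip distances by a per-hand reference length so the open/closed thresholds become ratios rather than raw pixel values. The sketch below assumes the handpose landmark layout (21 points of [x, y, z], with index 0 the wrist and index 9 the middle-finger knuckle); the 0.8 ratio threshold is a placeholder I would tune by testing:

```javascript
// 3D distance between two [x, y, z] landmarks
function distance3d(a, b) {
  return Math.hypot(b[0] - a[0], b[1] - a[1], b[2] - a[2]);
}

// Scale-invariant closed-fist check (hypothetical refinement).
// Divides the largest adjacent-fingertip distance by the wrist-to-knuckle
// length, so the same threshold works for large and small hands.
function isClosedFistNormalized(landmarks, ratioThreshold = 0.8) {
  const handScale = distance3d(landmarks[0], landmarks[9]); // per-hand reference length
  let maxTipDist = 0;
  for (let i = 4; i <= 16; i += 4) { // adjacent fingertip pairs: (4,8), (8,12), (12,16), (16,20)
    maxTipDist = Math.max(maxTipDist, distance3d(landmarks[i], landmarks[i + 4]));
  }
  return maxTipDist / handScale < ratioThreshold; // a ratio, not raw pixels
}
```

The same ratio idea could replace the fixed 50- and 40-pixel thresholds in my current isOpenHand() and isClosedFist().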

Next Steps
This project represents a significant step toward more natural and immersive gaming experiences. Going forward, I aim to explore new possibilities in game design and interaction by leveraging hand gestures as input.
