This project combines the thrill of techno music with the interactivity of hand gestures, providing players with a unique gaming experience. Using a webcam, players can control the spaceship’s movement with simple hand gestures, making the game both engaging and intuitive.
The game environment features a spaceship navigating through a field of objects. The objective is to avoid colliding with the objects while collecting points. The game uses a hand-pose machine learning model to detect gestures, allowing players to steer the spaceship by opening their hand and moving it left or right. Additionally, players can make a fist to pause or resume the game, adding an extra layer of interaction.
One aspect of the project that I’m particularly proud of is the seamless integration of hand gestures for controlling the spaceship. By leveraging the hand pose model provided by the ml5.js library, I was able to accurately detect and interpret hand movements in real time, providing players with responsive and intuitive controls. Additionally, the dynamic gameplay, with objects spawning randomly and increasing in speed over time, keeps players engaged and challenged throughout the game.
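The post doesn't show the spawning code itself, but the ramping difficulty it describes could be factored out like this. All names and constants below are illustrative, not taken from the game:

```javascript
// Hypothetical pacing helpers: objects fall faster and spawn more
// often the longer the player survives. Constants are guesses that
// would need playtesting.
const BASE_SPEED = 2;       // starting fall speed, pixels per frame
const SPEED_RAMP = 0.05;    // extra speed gained per second survived
const BASE_SPAWN_MS = 1200; // starting interval between spawns
const MIN_SPAWN_MS = 300;   // floor so the game stays playable

function objectSpeed(elapsedSeconds) {
  // Speed grows linearly with time survived
  return BASE_SPEED + SPEED_RAMP * elapsedSeconds;
}

function spawnInterval(elapsedSeconds) {
  // Interval shrinks linearly, clamped to a playable minimum
  return Math.max(MIN_SPAWN_MS, BASE_SPAWN_MS - 10 * elapsedSeconds);
}
```

Keeping the pacing in pure functions like these makes the difficulty curve easy to tune without touching the draw loop.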
```javascript
function detectHands() {
  if (predictions.length > 0) {
    const landmarks = predictions[0].landmarks;
    // highlightHand(landmarks, 'green'); // Optional: uncomment to see hand landmarks
    if (isClosedFist(landmarks)) {
      let currentTime = millis();
      if (currentTime - fistDetectedTime > fistToggleDelay) {
        isGamePaused = !isGamePaused;
        console.log(isGamePaused ? "Game Paused" : "Game Resumed");
        fistDetectedTime = currentTime;
      }
    } else if (isOpenHand(landmarks)) {
      let averageX = landmarks.reduce((acc, val) => acc + val[0], 0) / landmarks.length;
      if (averageX < width / 2) {
        spaceship.setDir(-1);
      } else {
        spaceship.setDir(1);
      }
    } else {
      spaceship.setDir(0);
    }
  }
}

// Check if the hand is open
function isOpenHand(landmarks) {
  let minDist = Infinity;
  for (let i = 4; i <= 20; i += 4) {
    for (let j = i + 4; j <= 20; j += 4) {
      let dist = distanceBetweenPoints(landmarks[i], landmarks[j]);
      if (dist < minDist) {
        minDist = dist;
      }
    }
  }
  return minDist > 50;
}
```
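`detectHands()` relies on `isClosedFist()` and `distanceBetweenPoints()`, which aren't shown above. A plausible reconstruction might look like this; the wrist-distance threshold is an assumption that would need tuning to the webcam resolution:

```javascript
// Euclidean distance in the x/y plane between two landmark points
// ([x, y] or [x, y, z] arrays, as returned by ml5's handpose model)
function distanceBetweenPoints(a, b) {
  const dx = a[0] - b[0];
  const dy = a[1] - b[1];
  return Math.sqrt(dx * dx + dy * dy);
}

// A fist is assumed when every fingertip (landmarks 8, 12, 16, 20)
// sits close to the wrist (landmark 0). The 60px threshold is a
// guess, mirroring the 50px threshold used in isOpenHand().
function isClosedFist(landmarks) {
  const wrist = landmarks[0];
  for (let i = 8; i <= 20; i += 4) {
    if (distanceBetweenPoints(landmarks[i], wrist) > 60) {
      return false;
    }
  }
  return true;
}
```

Because both checks can fail on the same frame, `detectHands()` falls through to `spaceship.setDir(0)`, so the ship simply stops when the gesture is ambiguous.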
Another key element that enhances the immersive experience of the Gesture-Controlled Game is the synchronization of the music with the background elements. By integrating dynamic sound effects and music, the game creates a cohesive audio-visual experience that engages players on multiple sensory levels.
```javascript
function drawGameElements() {
  let level = amplitude.getLevel();
  let size = map(level, 0, 1, 5, 20); // larger size for more impact
  // Cycle through HSB colors
  colorMode(HSB, 360, 100, 100, 100);
  colorPhase = (colorPhase + 1) % 360;
  for (let i = 0; i < 400; i++) {
    let x = noise(i * 0.1, frameCount * 0.01) * width;
    let y = noise((i + 100) * 0.1, frameCount * 0.01 + level * 5) * height;
    // Dynamic stroke color
    let hue = (colorPhase + i * 2) % 360;
    let alpha = map(level, 0, 1, 50, 100); // alpha based on volume for dynamic visibility
    // Simulated glow effect
    for (let glowSize = size; glowSize > 0; glowSize -= 4) {
      let glowAlpha = map(glowSize, size, 0, 0, alpha);
      stroke(hue, 80, 100, glowAlpha);
      strokeWeight(glowSize);
      point(x, y);
    }
  }
  colorMode(RGB, 255);
  spaceship.show();
  spaceship.move();
  handleParticles();
}
```
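The volume-to-visuals mapping is the heart of the sync, and it can be pulled out as a pure function so it's testable outside the sketch. Here `mapRange` re-implements p5's built-in `map()`, and `visualsForLevel` is a hypothetical helper, not code from the game:

```javascript
// Linear interpolation from one range to another, matching p5's map()
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + (outMax - outMin) * ((value - inMin) / (inMax - inMin));
}

// Point size and alpha both scale with the music's loudness (0..1),
// using the same ranges as drawGameElements()
function visualsForLevel(level) {
  return {
    size: mapRange(level, 0, 1, 5, 20),
    alpha: mapRange(level, 0, 1, 50, 100),
  };
}
```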
Improvements:
Optimization and Performance: Consider optimizing the game's performance, particularly in graphics and rendering. This means minimizing unnecessary calculations and draw calls, especially within loops like drawGameElements().
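For instance, the glow layers in drawGameElements() are identical for every one of the 400 stars in a frame, so they could be computed once per frame instead of once per star. `glowLayers` below is a hypothetical helper that mirrors the original `map()` call:

```javascript
// Precompute (strokeWeight, alpha) pairs for one frame's glow effect.
// glowAlpha = map(glowSize, size, 0, 0, alpha) simplifies to
// alpha * (1 - glowSize / size): inner layers are brighter.
function glowLayers(size, alpha) {
  const layers = [];
  for (let glowSize = size; glowSize > 0; glowSize -= 4) {
    layers.push({ weight: glowSize, alpha: alpha * (1 - glowSize / size) });
  }
  return layers;
}
```

In the draw loop, calling this once and iterating over the cached array for each star removes hundreds of redundant `map()` calls per frame; cutting the star count or the glow step size would reduce the much larger cost of the `stroke()`/`point()` calls themselves.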
Game Mechanics Refinement: Assess and refine the game mechanics for a better gameplay experience. This could involve adjusting spaceship movement controls, particle spawning rates, or particle effects to enhance engagement and challenge.