Week 5 – Midterm Progress

For my midterm, I knew I wanted to incorporate a machine learning library, specifically for gesture recognition. I initially explored building a touchless checkout interface where users could add items to a cart using hand gestures. However, I realized the idea lacked creativity and emotional depth.

I’ve since pivoted to a more expressive concept: a Mind Palace Experience (not quite a game), where symbolic “memories” float around the screen – some good, some bad. The user interacts with these memories using gestures: revealing, moving, or discarding them. The experience lets users metaphorically navigate someone’s inner world and discard unwanted memories, ideally the painful ones. Here’s a basic canvas sketch of what the UI could look like.

At this stage, I’ve focused on building and testing the gesture recognition system using Handsfree.js. The core gestures (index finger point, pinch, open palm, and thumbs down) are working, and I’ll map them to interaction logic as I build out the UI and narrative elements next.
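One lightweight way to do that mapping is a lookup table from gesture name to action. This is only a sketch of the idea, not the final design: the gesture names match the detectors described below, but the action names and memory fields here are placeholders I made up.

```javascript
// Hypothetical gesture-to-action map. Each action takes a memory object
// and returns an updated copy; the field names are illustrative only.
const actions = {
  'point': (memory) => ({ ...memory, highlighted: true }),
  'pinch': (memory) => ({ ...memory, held: true }),
  'open palm': (memory) => ({ ...memory, revealed: true }),
  'thumbs down': (memory) => ({ ...memory, discarded: true }),
};

function applyGesture(gesture, memory) {
  const action = actions[gesture];
  return action ? action(memory) : memory; // unknown gestures do nothing
}

const memory = { label: 'the argument', isGood: false };
console.log(applyGesture('thumbs down', memory));
// { label: 'the argument', isGood: false, discarded: true }
```

Keeping the mapping in one table means the detectors stay decoupled from the UI: swapping which gesture discards a memory is a one-line change.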

Here is the detection code for the different gestures. The landmark indices follow the MediaPipe hand model that Handsfree.js exposes (0 = wrist, 4 = thumb tip, 8/12/16/20 = fingertips), with coordinates normalized to 0–1 and y increasing downward.

// True when the thumb tip and index tip are close together.
// dist() is p5.js's built-in Euclidean distance; since coordinates
// are normalized to 0–1, 0.05 is roughly 5% of the frame.
function isPinching(landmarks) {
  const thumbTip = landmarks[4];
  const indexTip = landmarks[8];
  const d = dist(thumbTip.x, thumbTip.y, indexTip.x, indexTip.y);
  return d < 0.05;
}

// True when the thumb tip is below the wrist (y grows downward)
// and all four fingers are curled.
function isThumbsDown(landmarks) {
  const thumbTip = landmarks[4];
  const wrist = landmarks[0];
  return (
    thumbTip.y > wrist.y &&
    !isFingerUp(landmarks, 8) &&   // index
    !isFingerUp(landmarks, 12) &&  // middle
    !isFingerUp(landmarks, 16) &&  // ring
    !isFingerUp(landmarks, 20)     // pinky
  );
}

// True when all four fingertips are extended.
function isOpenPalm(landmarks) {
  return (
    isFingerUp(landmarks, 8) &&
    isFingerUp(landmarks, 12) &&
    isFingerUp(landmarks, 16) &&
    isFingerUp(landmarks, 20)
  );
}

// A finger counts as "up" when its tip sits clearly above the joint
// two landmarks below it (the PIP joint in the MediaPipe hand model).
function isFingerUp(landmarks, tipIndex) {
  const midIndex = tipIndex - 2;
  return (landmarks[midIndex].y - landmarks[tipIndex].y) > 0.05;
}
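One nice property of these detectors is that they can be exercised without a webcam: a MediaPipe-style hand is just 21 normalized points, so a fake hand is easy to construct. Below is a sketch of that idea. The detectors are repeated so the snippet runs standalone, `dist()` gets a stand-in since p5.js isn't loaded, and `classifyGesture` plus the synthetic hand are my additions, not part of the sketch.

```javascript
// Stand-in for p5.js's built-in dist().
const dist = (x1, y1, x2, y2) => Math.hypot(x2 - x1, y2 - y1);

// Detectors repeated from above so this snippet runs on its own.
function isFingerUp(landmarks, tipIndex) {
  const midIndex = tipIndex - 2;
  return (landmarks[midIndex].y - landmarks[tipIndex].y) > 0.05;
}

function isPinching(landmarks) {
  const d = dist(landmarks[4].x, landmarks[4].y, landmarks[8].x, landmarks[8].y);
  return d < 0.05;
}

function isOpenPalm(landmarks) {
  return [8, 12, 16, 20].every(tip => isFingerUp(landmarks, tip));
}

// Hypothetical dispatcher: checks pinch first so a pinched hand
// is never also reported as an open palm.
function classifyGesture(landmarks) {
  if (isPinching(landmarks)) return 'pinch';
  if (isOpenPalm(landmarks)) return 'open palm';
  return 'none';
}

// A fake "open palm": 21 landmarks with every fingertip well above its
// PIP joint (small y = higher on screen) and the thumb far from the index.
const openPalm = Array.from({ length: 21 }, () => ({ x: 0.5, y: 0.5 }));
[8, 12, 16, 20].forEach(tip => {
  openPalm[tip] = { x: 0.5 + tip / 100, y: 0.3 };     // fingertips high
  openPalm[tip - 2] = { x: 0.5 + tip / 100, y: 0.6 }; // PIP joints low
});
openPalm[4] = { x: 0.2, y: 0.5 }; // thumb tip far from index tip

console.log(classifyGesture(openPalm)); // 'open palm'
```

Moving the thumb tip onto the index tip flips the result to 'pinch', which is a quick sanity check that the priority ordering in the dispatcher behaves.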

The sketch link:

https://editor.p5js.org/sc9425/full/n6d_9QDTg
