Concept
For this project, I wanted to create more than just a static self-portrait. My concept was to build a “digital puppet”: a TV-show-host personality that lives inside the browser.
The goal was to achieve a level of “aliveness” using generative motion (breathing, blinking, head bobbing) and interactivity.
I wanted the character to react to the user’s cursor, shifting from professional composure to excitement depending on how the viewer interacts with the canvas.
I am particularly proud of the interactive expression system. Here is the logic that detects where the mouse is and calculates the “target” emotions, blending them smoothly over time:
// === INTERACTIVE EXPRESSIONS BASED ON MOUSE POSITION ===
// (runs inside draw(); smileAmount, eyebrowRaise, targetSmile, and
// targetEyebrowRaise are declared as sketch-level state)

// Define the interaction zone
let mouseInSmileZone = mouseY > height * 0.6 && mouseY < height * 0.85;

// SMILE: if the mouse is in the lower portion, target a big smile
if (mouseInSmileZone) {
  targetSmile = map(mouseY, height * 0.6, height * 0.85, 0.3, 1.0);
  targetEyebrowRaise = 0.3; // slight eyebrow raise when smiling
} else {
  targetSmile = 0.1; // return to a subtle default smile
  targetEyebrowRaise = 0;
}

// Smooth transitions to make it feel natural
smileAmount = lerp(smileAmount, targetSmile, 0.1);
eyebrowRaise = lerp(eyebrowRaise, targetEyebrowRaise, 0.1);
This allows the smile to grow gradually as the mouse moves lower while simultaneously pulling the eyelids into a “happy squint,” making the reaction feel genuine rather than robotic.
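To show how the blending behaves over time, here is a standalone sketch of the same logic with p5’s map() and lerp() reimplemented so it can run outside the browser. The updateExpression function and state object are my illustrative refactor, not the sketch’s actual structure; the zone bounds and the 0.1 lerp factor match the excerpt above.

```javascript
// p5 helpers, reimplemented for a standalone run
const map = (v, a, b, c, d) => c + ((v - a) / (b - a)) * (d - c);
const lerp = (start, stop, amt) => start + (stop - start) * amt;

// One frame of the expression update (hypothetical refactor of the sketch code)
function updateExpression(state, mouseY, height) {
  const inSmileZone = mouseY > height * 0.6 && mouseY < height * 0.85;
  const targetSmile = inSmileZone
    ? map(mouseY, height * 0.6, height * 0.85, 0.3, 1.0)
    : 0.1; // subtle default smile
  const targetEyebrowRaise = inSmileZone ? 0.3 : 0;
  // Ease the current values toward the targets (exponential smoothing)
  state.smileAmount = lerp(state.smileAmount, targetSmile, 0.1);
  state.eyebrowRaise = lerp(state.eyebrowRaise, targetEyebrowRaise, 0.1);
  return state;
}

// Example: mouse held at 80% of a 400px-tall canvas for one second (60 frames)
let state = { smileAmount: 0.1, eyebrowRaise: 0 };
for (let i = 0; i < 60; i++) updateExpression(state, 320, 400);
console.log(state.smileAmount.toFixed(2)); // converges toward 0.86
```

Because each frame only moves 10% of the remaining distance, the smile never snaps; after about a second at 60 fps it has essentially settled on its target.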
Reflection and Future Improvements
This process taught me how much detail goes into simulating “life.” A simple sine wave can simulate breathing, and a random timer can simulate blinking, but combining them is what creates the illusion of a living character. One significant challenge was the hand: getting the fingers to look anatomical while drawing them entirely with code.
For future improvements, I would like to add audio reactivity: connecting the mouth movement to the microphone so the character can “lip sync” to my voice.
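As a rough sketch of how that could work: p5.sound’s p5.AudioIn() reports a 0–1 loudness via mic.getLevel(), and that level could drive a mouth-opening amount with the same lerp smoothing used for the smile. Everything here is a hypothetical plan, and the 0.02/0.3 level bounds are guesses I would tune by ear:

```javascript
const clamp01 = (v) => Math.min(1, Math.max(0, v));

// Map mic loudness (0-1) to mouth openness (0-1), easing toward the target
// each frame so the jaw doesn't jitter. Pure function, so it runs anywhere.
function updateMouth(mouthOpen, micLevel) {
  // Quiet (<0.02) → closed, loud (>0.3) → fully open; bounds are guesses
  const target = clamp01((micLevel - 0.02) / (0.3 - 0.02));
  return mouthOpen + (target - mouthOpen) * 0.2; // lerp factor 0.2
}

// In the p5 sketch this would run every frame, roughly:
//   let mic = new p5.AudioIn(); mic.start();
//   mouthOpen = updateMouth(mouthOpen, mic.getLevel());
```

Keeping the mapping in a pure function would also make it easy to swap the microphone for a pre-recorded amplitude track later.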