Week 2 – Creative Reading – Megan

Casey Reas Eyeo talk on chance operations

How are you planning to incorporate random elements into your work? Where do you feel is the optimum balance between total randomness and complete control?

Before answering these questions, I wanted to implement the concepts from the video myself. Randomness. One quote by Casey Reas really stayed with me: “change it by a little so that movement creates chaos in an ordered way.” I think that perfectly represents what I was trying to do in my work. What I changed to create this element of randomness was data: small pieces of data that might seem insignificant at first, but machines are built on that data, so even the smallest thousandth can make a difference.

This brings me to another quote from the video, about Dadaism: “Dada wished to replace the logical nonsense of the men of today with an illogical nonsense.” To what point is something considered logical? Computers are supposed to be based on pure logic. And yet, artists find ways to turn that logic into illogical sense. And yet, it still has meaning. It’s like deconstructing the logic embedded in the machine with the purpose of illogically creating something that has meaning behind it. Or the other way around, creating something without meaning by using the logic of the computer. Either way, I feel this can be applied to the artworks shown in the video.

A quote by Gerhard Richter captures this idea very well: “Above all, it’s never a blind chance; it’s a chance that is always planned but also always surprising.” These artists construct from chance as a base. It’s about bringing in disorder that has been curated with intention, in a way that even surprises the artist themselves. I think that finding this balance between total randomness and complete control is about using logic and repetition, patterns and algorithms that allow you to repeat a process, while the outcome is completely different, but the essence of it remains present.

Something that really caught my attention in the video was A Million Random Digits with 100,000 Normal Deviates, the RAND book of random number tables. It honestly amazed me to think about the power, the expectation, and the importance that randomness has in our lives when we understand that art imitates or simulates the real world, and the real world is chaotic, not curated by anyone, but rather filled with randomness. Without a doubt, this video and this work opened my eyes and helped me understand even better the importance of chaos within order, and order within chaos.

Week 2 – Zere – Creative Reading Response

Casey Reas’s talk made me look at the concept of randomness differently. I used to think randomness took away the “magic” of art, since it takes away the artist’s control over what the piece will look like. Yet Reas’s talk gave me a new perspective: randomness can be used with intentionality, through logical rules given and created by the artist, which gives them a sense of control over their work. Generative art is exactly that, in my opinion. The “magic” of art in this case is the artist designing the structure, giving the rules, setting the limits on variation, and so on. Artists here showcase that code can be used as an art medium much like chalk, acrylic paint, colored pencils, and various other mediums. Looked at from the perspective that computers and generative tools were created by humans for humans, the same as paint and canvas, generative art is art. An artist can control the limits of the “random” decisions made by the computer, and that can be exciting. Art is meant to be exciting, at least in my view. I am not saying it is meant to reach a deep part of your soul every time, but one of the reasons many people create art is for that sense of excitement it brings.

Generative art is unique because of its randomness. One of the things that kept appearing in my head is chance operations in dance. I took Intro to Modern Dance last semester, and our professor introduced us to Merce Cunningham’s work. His idea was to have a set of moves, number each of them, then randomly select the numbers and thereby build a unique dance each time. This is one of the ways I would like to use random elements in my own work: having a set of elements to which I attach particular meaning, then randomizing their order to see how many new and unique combinations I can get. In my opinion this is also an example of balancing randomness and control: you give an algorithm or a machine a set of elements and variables that matter to you as an artist, but leave the combinatorial part up to the machine.
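Cunningham's chance operation described above can be sketched in a few lines of JavaScript (the move names here are invented for illustration; a Fisher-Yates shuffle stands in for drawing numbers at random):

```javascript
// Hypothetical sketch of Cunningham-style chance choreography: number a
// fixed set of moves, then let chance decide the order of the phrase.
const moves = ["tilt", "curve", "jump", "turn", "arch", "fall"];

// Fisher-Yates shuffle: every ordering of the set is equally likely.
function shuffle(arr) {
  const out = arr.slice();
  for (let i = out.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [out[i], out[j]] = [out[j], out[i]];
  }
  return out;
}

console.log(shuffle(moves).join(" -> ")); // a new unique phrase each run
```

The artist chooses the elements; the machine chooses the combination.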

Week 2 – Reading Reflection – Kamila Dautkhan

This video really put into words something I’ve been figuring out with my own work. It’s not “randomness” itself that I find interesting, but what happens when you give a system a little bit of freedom to play with the code itself. I really like it when my work starts with something simple and then escalates into something complex and interesting! Also, I really liked the idea of the artist as a gardener, not a painter.

At the same time, the video’s question about “true randomness” and convergence really caught my attention. If you let systems run long enough, they tend to eventually form some sort of pattern. And that is exactly the kind of thing I enjoy the most: you start very broadly and then some kind of order appears. That is exactly what I want to see in my own generative pieces. I don’t want pure static, and I don’t want something completely predictable. I want that middle ground where you can still recognize my own aesthetic in the code I wrote. This video helped me realize that I’m not just making art with controlled chaos; I’m actually designing a system’s tendencies and then letting the computer show me just how far they can be pushed.

 

there are no perfect circles in nature

This assignment was no less than an endeavor in its own right. All I had in mind when I started was circles. I thought of them colliding and getting bigger, or something like that. The hardest part was to come up with something that uses loops, not from the coding end but from the creative one. Then it suddenly hit me: draw() itself is one big while(true) loop. Honestly, I still didn’t know what to do with that, but at least I had a starting point.
I wanted to add an interactive side to it as well, so the first thing I did was create a custom pointer for the cursor, so that interactions are visible. I borrowed the monitor frame from my self portrait to add some character to the framing.

The moment I had two circles bouncing off each other, I noticed the repeating motion. To observe it in detail I wanted to add a trail. I had some trouble doing that because of the monitor underneath: the trail was hidden under it. I asked ChatGPT about it, and it made me realize that I don’t want my monitor to be drawn again and again. So I moved it up into setup(). Now I could see their movement.

The most interesting part was that, left undisturbed, the circles never collided, because of the movement variables I set. But if they are disturbed by the mouse, they kind of get stuck in a loop. This discovery is what I am most proud of, even though I am not sure which part of the code produces it. It randomly reminded me of how it is necessary to go off road to meet the people or places that quite possibly create the biggest impact in our lives.
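The bouncing motion behind this can be sketched as a small per-axis update (hypothetical names; the real sketch keeps its positions and speeds in its own variables): each frame the position advances by its velocity, and the velocity flips sign at the canvas edges, which is why undisturbed circles repeat their path forever.

```javascript
// Hypothetical one-axis bounce update, called once per draw() frame.
// pos advances by vel; crossing an edge reflects the position back
// inside the canvas and reverses the velocity.
function bounce(pos, vel, lo, hi) {
  let p = pos + vel;
  let v = vel;
  if (p < lo) { p = 2 * lo - p; v = -v; } // reflect off the low edge
  if (p > hi) { p = 2 * hi - p; v = -v; } // reflect off the high edge
  return [p, v];
}

console.log(bounce(395, 10, 0, 400)); // crosses the right edge: [ 395, -10 ]
```

Because this update is deterministic, two circles started on non-intersecting orbits stay on them until the mouse perturbs a position.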
I used the p5.js text() reference (https://p5js.org/reference/p5/text/) to write the text. It represents a vague idea about living that I concluded from the above.

Lastly, the part of the code I think is the highlight is as follows:

x += 1;
if (x % 220 == 0) {
  c1 = random(0, 255);
  c2 = random(0, 255);
  c3 = random(0, 255);
  fill(c1, c2, c3);
}

I like this because this is where I manipulated the draw loop to change the circle colors.

The Sketch

What I can add to this: I feel the piece is still very rough, and I would want to make it calmer and smoother.

Assignment 1: Self Portrait

Concept:

For the very first assignment in Intro to Interactive Media, I made a 2D portrait of my own face. It includes parts of my identity, such as lifted eyebrows and bigger eyes, and some text about who I am, like being a polyglot from Thailand. This was also my very first time coding, as I had no previous programming experience.

Below is my finished portrait:

Screenshot

https://editor.p5js.org/po2127/full/E8eNn1n2a

How it’s made:

This portrait was made with p5.js using 2D shapes and functions such as fill(), ellipse(), stroke(), strokeWeight(), noStroke(), circle(), arc(), and line(). These basic functions gave me a good foundation for making a realistic face and for exploring and adding more specific features such as eyebrows.

When I started, it was pretty difficult. It took me quite a while to watch the three videos available on YouTube and go through the link provided to get a basic grasp of the platform, because I hadn’t coded much before and didn’t know which code would make the shape of each facial feature fit the body.

Screenshot

I started with the face frame using fill() and ellipse() with no stroke, then the eyes using ellipse() and the pupils using fill() and circle(). The eyebrows use arc(), and the nose lines are set farther apart because my nose is bigger. The lips, hair, and ears use arc(), the neck uses lines, and the shirt uses text() and stroke().

Later on, adjusting more specific details, such as the arcs of the lips and eyebrows, so that the portrait fits how I look and my body proportions was another hard step.

As highlighted code I’m proud of:

The part I am proud of is adding different parts of my identity to my shirt: giving it a purple theme, writing things on it that really resonate with me, and drawing the design of an NYU shirt on the portrait.

Screenshot

Reflection:

I enjoyed creating my self portrait. Although I had some difficulties throughout the journey, especially positioning the shapes and getting used to how the grid works, I was able to get through them. The main things I struggled with were when to apply noStroke() and finding the right strokeWeight() for specific places such as the lips.

In the future, once I get better and more used to the program, I could add some extra effects such as animations and different gimmicks to the portrait. But overall, I think as a first work it was already pretty fun, and I feel proud to have created my very own first portrait.


User Testing

In the user testing, most of the users figured out the idea without me needing to explain it; however, there were some pitfalls where users were a bit confused:

    • When two users played at the same time and both started firing, they sometimes missed the effects and failed to notice that the fire portals to the other screen.
    • The hardware setup I had during testing was not finished, so some users couldn’t tell which buttons correspond to which direction.
    • Button coloring: some users recommended making the fire button a different color so they know it performs a different action than movement.
    • Some users asked about the keyboard controls, even though they suspected it would be either the arrows or WASD. Also, the ‘V’ firing key wasn’t obvious to anyone but gamers.

What worked really well was the communication between p5 and the NeoPixel screen. Once the users got the hang of the game, they enjoyed it a lot, especially the animation of getting hit. They also liked the color separation between the players, yellow and blue, including the colors of the fire coming out of each of them. Some were impressed by the gameplay and how the pixels smoothly switch between the two screens.

To fix the earlier issues, I would add a clear instructions page on game startup that clarifies the controls on both sides and explains the scoring system. I would also explain the core idea of the game up front to get users excited to try it out.

Week 14 – FINAL PROJECT

A Walk Through Time: Final Project Documentation

Concept

The idea behind A Walk Through Time is to let the viewer control the flow of time with simple hand gestures. When the viewer waves their hand on one side, time moves forward. When they wave on the other side, time reverses. When no one interacts, time pauses. The system changes both the physical world and the digital world at the same time.

The physical clock hand moves using a stepper motor. A growing plant moves up and down using a DC motor and a telescoping cylinder system. On the screen, a surreal p5.js world shows time moving with colors, waves, particles, and a glowing abstract clock. Everything stays in sync and reacts at the same moment. The goal was to create one experience where movement, gesture, and time feel connected.

Project Interaction 

Interaction description:

  • The viewer stands in front of the clock and plant
  • Two ultrasonic sensors wait for hand gestures
  • Waving on the right makes the clock tick forward and the plant rise
  • Waving on the left makes the clock tick backward and the plant collapse
  • When the viewer steps away, both the clock and plant pause
  • The p5.js visuals shift to match the state: forward, backward, or paused

How the Implementation Works

The system uses two Arduinos, two motors, two sensors, and a p5.js sketch.

Main Arduino

  • Reads the left and right ultrasonic sensors
  • Decides the time state: FORWARD, BACKWARD, or PAUSED
  • Moves the stepper motor to tick the physical clock
  • Sends the state through serial as a single character (F, B, or P)
  • Sends the same data to the second Arduino

Second Arduino

  • Receives F, B, P
  • Moves the DC motor to pull or release fishing wire
  • This grows or collapses a three-layer telescoping plant

p5.js

  • Reads the same serial data from the main Arduino
  • Updates the surreal background
  • Moves particles, waves, arrows, and an abstract glowing clock
  • Lets the viewer see time flowing

Interaction Design

The interaction is very simple. The viewer uses hand gestures to control time.

Right sensor → Time Forward
Left sensor → Time Backward
Both or none → Pause

All outputs reinforce this state:

    • The physical clock hand moves
    • The plant grows or collapses
    • The digital world changes color and motion

Arduino Code

Below is the code for both Arduinos, starting with the main one:

#include <Stepper.h>

const int stepsPerRevolution = 2048;
const int ticksPerRevolution = 12;
const int stepsPerTick = stepsPerRevolution / ticksPerRevolution;

Stepper clockStepper(stepsPerRevolution, 8, 10, 9, 11);

enum TimeState { PAUSED, FORWARD, BACKWARD };
TimeState timeState = PAUSED;
TimeState lastSentState = PAUSED;

const int TRIG_RIGHT = 4;
const int ECHO_RIGHT = 5;
const int TRIG_LEFT  = 6;
const int ECHO_LEFT  = 7;

const int DETECT_THRESHOLD_CM = 40;

unsigned long lastTickTime = 0;
const unsigned long tickInterval = 1000;

long readDistanceCM(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 20000);  // 20 ms timeout
  if (duration == 0) return -1;                   // no echo received
  return duration / 29 / 2;                       // microseconds to cm (round trip)
}

void sendStateIfChanged() {
  if (timeState == lastSentState) return;
  lastSentState = timeState;
  char c = 'P';
  if (timeState == FORWARD) c = 'F';
  else if (timeState == BACKWARD) c = 'B';
  Serial.write(c);
}

void setup() {
  clockStepper.setSpeed(10);
  pinMode(TRIG_LEFT, OUTPUT);
  pinMode(ECHO_LEFT, INPUT);
  pinMode(TRIG_RIGHT, OUTPUT);
  pinMode(ECHO_RIGHT, INPUT);
  Serial.begin(9600);
}

void loop() {
  unsigned long now = millis();

  long distLeft  = readDistanceCM(TRIG_LEFT,  ECHO_LEFT);
  long distRight = readDistanceCM(TRIG_RIGHT, ECHO_RIGHT);

  bool leftDetected  = (distLeft  > 0 && distLeft  < DETECT_THRESHOLD_CM);
  bool rightDetected = (distRight > 0 && distRight < DETECT_THRESHOLD_CM);

  if (leftDetected && !rightDetected) timeState = BACKWARD;
  else if (!leftDetected && rightDetected) timeState = FORWARD;
  else timeState = PAUSED;

  if (now - lastTickTime >= tickInterval) {
    lastTickTime += tickInterval;
    if (timeState == FORWARD) clockStepper.step(-stepsPerTick);
    else if (timeState == BACKWARD) clockStepper.step(stepsPerTick);
  }

  sendStateIfChanged();
}

// ----- Second Arduino: DC motor controller -----

const int ENA = 6;
const int IN1 = 5;
const int IN2 = 4;

enum TimeState { PAUSED, FORWARD, BACKWARD };
TimeState state = PAUSED;

byte motorSpeed = 80;

unsigned long lastChangeTime = 0;
const unsigned long maxRunTime = 10000; // 10 seconds

void setup() {
  pinMode(ENA, OUTPUT);
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
  Serial.begin(9600);
  lastChangeTime = millis();
}

void applyMotorState(TimeState s, byte speed) {
  if (s == PAUSED) {
    digitalWrite(IN1, LOW);
    digitalWrite(IN2, LOW);
    analogWrite(ENA, 0);
  } else if (s == FORWARD) {
    digitalWrite(IN1, HIGH);
    digitalWrite(IN2, LOW);
    analogWrite(ENA, speed);
  } else if (s == BACKWARD) {
    digitalWrite(IN1, LOW);
    digitalWrite(IN2, HIGH);
    analogWrite(ENA, speed);
  }
}

void setState(TimeState newState) {
  if (newState != state) {
    state = newState;
    lastChangeTime = millis();
  }
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == 'F') setState(FORWARD);
    else if (c == 'B') setState(BACKWARD);
    else if (c == 'P') setState(PAUSED);
  }

  unsigned long now = millis();

  // Failsafe: stop if the motor has run in one direction for too long, so the
  // fishing wire cannot fully unspool or pull the plant too far up.
  if (state != PAUSED && (now - lastChangeTime >= maxRunTime)) {
    setState(PAUSED);
  }

  applyMotorState(state, motorSpeed);
}

Circuit Schematic

(Diagram made with https://www.circuit-diagram.org/)

Breakdown of the schematic:

Main Arduino

  • Ultrasonic Sensor Left
    • TRIG to pin 6
    • ECHO to pin 7
    • VCC to 5V
    • GND to GND
  • Ultrasonic Sensor Right
    • TRIG to pin 4
    • ECHO to pin 5
    • VCC to 5V
    • GND to GND
  • Stepper Motor (with driver)
    • IN1 → pin 8
    • IN2 → pin 9
    • IN3 → pin 10
    • IN4 → pin 11
    • VCC → 5V
    • GND → GND
  • Serial Out
    • TX (pin 1) → RX of second Arduino

Second Arduino (DC Motor Controller)

  • DC Motor Driver
    • IN1 → pin 5
    • IN2 → pin 4
    • ENA (PWM) → pin 6
    • Motor output → DC motor
    • Vmotor → 5V
    • GND → common ground with Arduino
  • Serial In
    • RX (pin 0) → TX of main Arduino

p5.js Code

  • Reads the serial state from Arduino
  • Updates the scene
  • Changes background colors
  • Moves particles and waves
  • Animates the digital clock
  • Shows arrows for direction

 

Communication Between Arduino and p5.js

Arduino → p5.js

Sends one character:

  • 'F' — forward
  • 'B' — backward
  • 'P' — paused

p5.js reads this using the Web Serial API.
When p5.js sees one of these characters, it updates the digital world.
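The character-to-state mapping on the p5.js side can be sketched as a small pure function (the name parseTimeState and the state strings are assumptions; the sketch's actual variable names may differ):

```javascript
// Hypothetical handler for one character read from the Web Serial stream.
// Unknown characters leave the state unchanged, so serial noise is ignored.
function parseTimeState(c, current) {
  if (c === "F") return "FORWARD";
  if (c === "B") return "BACKWARD";
  if (c === "P") return "PAUSED";
  return current;
}

console.log(parseTimeState("F", "PAUSED")); // FORWARD
```

Each frame, draw() then branches on this state to pick colors, wave direction, and clock motion.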

What I Am Proud Of

I am proud of how everything stays in sync.
The telescoping plant mechanism was hard to build, but it works well and gives life to the piece.
The gesture-based control also feels natural, and most users understand the idea at once.

How This Was Made

The clock was laser cut and screwed into a cardboard cylinder. I used an ice cream stick for the hand, which was connected using a skewer to the stepper motor. The boxes for the ultrasonic sensors were laser cut, and I got the design from boxes.py. For the fast forward and rewind icons, I designed them in Illustrator and then laser cut them. I got the idea for the telescoping cylinder from a YouTube short (https://www.youtube.com/shorts/99a4RUlUTm0), and I made a much simpler version that I 3D printed. I used another cardboard cylinder that I cut open to place the plant in and attach the DC motor and wheel at the top. I used acrylic paint, with help from an art major friend, to paint a background scene with the plant and sky.

The p5.js code was written through many tests and changes to connect smoothly with the Arduino using Web Serial. The designs for the scene, the clock visuals, and the interaction layout were made in small steps until everything felt right. The writeup was also done in simple clear language to explain the full system. All media was either created by me, painted by a friend, laser cut from my designs, or made using free online tools.

Areas for Future Improvement

The clock could be painted with a wooden-style finish to look more complete. I also still want to explore the original rotary sensor idea. The plan was to let the user manually rewind the clock by hand, and the system would detect this and move backward. I tested this with gears connecting the rotary sensor to the stepper motor, but the motor was too weak or the gears did not line up. I want to try again with stronger parts.

Finally, while the p5.js visuals look good and support the project, I feel there may be more ways to integrate the digital space with the physical movement. This is something I want to improve in the future.

Week 13 – user testing

User Testing: A Walk Through Time

I conducted user testing for the project, and the reactions were very positive. Most users understood the main idea right away. When the clock hand moved forward, they saw it as time moving forward. When it moved backward, they understood that time was reversing. This was a good sign that the core interaction is intuitive and does not need much explanation.

The gesture control also made sense to most people, but a few were unsure at first about which sensor to wave their hand over. To make this clearer, I decided to laser cut simple icons for fast forward and rewind and attach them to the ultrasonic sensors. This small change makes the mapping between gesture and action much more obvious.

One interesting issue that came up during testing was the behavior of the plant mechanism. The DC motor pulls fishing wire that extends a telescoping plant, and it collapses when the motor goes in reverse. Some users kept reversing time for too long, which caused the wire to unwind so far that it started rolling in the opposite direction. This made the plant rise again by mistake. Another related problem was users sending the plant up too high until it almost reached the motor.

To address this, I am adding a failsafe to the DC motor logic. The system will now prevent the motor from spinning in the same direction for too long. This will keep the fishing wire from fully unspooling and will protect the telescoping structure from being pulled too far up. This fix makes the physical system more reliable and safer for open interaction.

Week 12 – Final Proposal

A Walk Through Time

A Walk Through Time is an interactive artwork that combines a physical clock, motion sensors, a DC motor, a stepper motor, and a digital surreal time-scape made in p5.js. The goal is to let the viewer control the flow of time with simple gestures, and watch both the physical world and the digital world respond in sync. When the viewer waves a hand on one side, time moves forward. When the viewer waves a hand on the other side, time moves backward. When no one is interacting, time pauses.

Arduino: Design and Behavior
The hardware side uses:

Two ultrasonic sensors

A stepper motor that drives a physical clock hand

A DC motor that rotates forward or backward depending on the state

A communication link to p5.js through Web Serial

A second Arduino that receives the state and drives the DC motor

Inputs (Arduino)

Left ultrasonic sensor

    • Detects a hand or body close to it
    • If only the left sensor sees something, time moves backward

Right ultrasonic sensor

    • Detects a hand or body on the right side
    • If only the right sensor sees something, time moves forward

Both sensors together or none

    • Time enters a paused state

The Arduino reads both sensors and decides one of three states: FORWARD, BACKWARD, PAUSED

Outputs (Arduino)

Stepper motor movement

    • Moves a physical clock hand
    • A full rotation is broken into 12 “ticks”
    • In FORWARD state the stepper ticks clockwise
    • In BACKWARD state the stepper ticks counterclockwise
    • In PAUSED state it holds still

Serial output sent to p5.js

    • The Arduino sends a single character representing the state:
    • ‘F’ for forward
    • ‘B’ for backward
    • ‘P’ for paused

Serial output to the second Arduino (DC motor controller)

    • The same state characters (F, B, P) are sent out

DC Motor Arduino

The second Arduino receives the state from the first one:

    • ‘F’ → DC motor spins forward
    • ‘B’ → DC motor spins backward
    • ‘P’ → DC motor stops

p5.js: Design and Behavior

The digital part is a surreal, dreamlike time-space. It reacts in real time to the state coming from Arduino. The design uses motion, color shifts, particles, waves, ripples, and a glowing abstract clock.

Inputs (p5.js)

Serial data from the Arduino

    • Reads incoming characters: F, B, or P
    • Updates timeState
    • Applies visual changes based on the state

Keyboard fallback for testing

    • F B P keys switch states if Arduino is not connected

Behavior of the digital scene

The scene changes in several ways depending on the state, reflecting time going forward, going backward, or stopped.

 

KICK ‘N’ SAVE – Final Project

Concept

For my final project in Intro to IM, I created KICK ‘N’ SAVE, an interactive penalty-kick game that combines Arduino hardware, a joystick + arcade button, p5.js animation, and machine-learning hand-tracking using ML5’s Handpose model.

The idea was to create a soccer experience where the player controls the shooter with a physical joystick, while the goalkeeper is controlled by hand gestures picked up by a webcam. This combination of digital and physical interaction makes the gameplay energetic, intuitive, and fun for spectators.

The goal was to build something that feels like an arcade mini-game, is simple to understand immediately, and still feels alive because of the hand-controlled goalkeeper.

Images of the Project

Schematic

User Testing Videos

Video 1

How the Implementation Works

Interaction Design:

KICK ‘N’ SAVE involves two simultaneous interactions:

  • Shooter (player 1):
    Uses a physical joystick

    • Tilt ← → to select shot direction

    • Press the push button to shoot

    • LED flashes green if a goal is scored

  • Goalkeeper (player 2):
    Controlled by hand gestures using a webcam

    • Move hand to left/center/right

    • ML5 Handpose tracks the index fingertip

    • Keeper smoothly lerps toward the detected zone

    • Attempts to block the shot

The design intentionally creates a duel between physical and digital control.

Arduino Code:

The Arduino handles:

  • Joystick left/middle/right detection

  • Button press detection for shooting

  • LED flashing animation when p5.js sends ‘G’

  • Serial communication to send ‘L’, ‘M’, ‘R’, and ‘S’ to p5.js

Code Snippet

// PIN DEFINITIONS

const int xPin = A0;        // Joystick XOUT
const int buttonPin = 2;    // Joystick Button (SEL)

const int GREEN_LED_PIN = 10; // Green LED (NOW: Arcade Button Light)

// JOYSTICK THRESHOLDS

const int thresholdLow = 400;
const int thresholdHigh = 600;

// State variables
String lastDirection = "M";  // Start in middle
int lastButtonState = HIGH;

// LED flash duration (1 second flash)
const int LED_FLASH_TIME = 1000;

void setup() {
  Serial.begin(9600);

  pinMode(buttonPin, INPUT_PULLUP);

  pinMode(GREEN_LED_PIN, OUTPUT);

  digitalWrite(GREEN_LED_PIN, LOW);
}

void loop() {

  // A. HANDLE LED COMMANDS FROM p5.js
  if (Serial.available() > 0) {
    char c = Serial.read();

    if (c == 'G') {
      digitalWrite(GREEN_LED_PIN, HIGH);
      delay(LED_FLASH_TIME);
      digitalWrite(GREEN_LED_PIN, LOW);
    }
    else if (c == 'R') {
      // Do nothing
    }
  }

  // B. READ JOYSTICK X-AXIS

  int xVal = analogRead(xPin);
  String currentDirection;

  if (xVal < thresholdLow) currentDirection = "L";
  else if (xVal > thresholdHigh) currentDirection = "R";
  else currentDirection = "M";

  if (currentDirection != lastDirection) {
    Serial.println(currentDirection);
    lastDirection = currentDirection;
  }

  // C. READ JOYSTICK BUTTON (SHOT)
  int buttonState = digitalRead(buttonPin);

  if (buttonState == LOW && lastButtonState == HIGH) {
    Serial.println("S");
  }

  lastButtonState = buttonState;

  delay(50);
}

Here is the link to the full code on GitHub: Github

p5.js Code:

The p5.js sketch handles:

  • Rendering the game visuals

  • Animating the ball based on the joystick-chosen direction

  • Keeper movement driven by ML5

  • Collision detection

  • Sending ‘G’ or ‘R’ back to Arduino
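The keeper logic described above can be sketched roughly like this (the zone-center positions, the 0.2 smoothing factor, and the function names are assumptions for illustration, not the project's exact code):

```javascript
// Hypothetical goalkeeper logic: classify the Handpose fingertip x into
// three zones, then lerp the keeper toward that zone's center each frame.
const ZONE_X = { L: 100, M: 300, R: 500 }; // assumed zone centers in pixels

function classifyZone(x, width) {
  if (x < width / 3) return "L";
  if (x > (2 * width) / 3) return "R";
  return "M";
}

// Same behavior as p5's lerp(): move a fraction t of the remaining distance.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

let keeperX = ZONE_X.M;
function updateKeeper(fingerX, width) {
  const target = ZONE_X[classifyZone(fingerX, width)];
  keeperX = lerp(keeperX, target, 0.2); // smoothing hides Handpose jitter
  return keeperX;
}
```

Classifying into three zones before lerping is what keeps small fingertip wobbles from translating into keeper jitter: the target only changes when the hand crosses a zone boundary.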

Embedded sketch

Parts of the Project That I’m Proud of

One of the things I’m most proud of in this project is how naturally the hybrid interaction system came together. The combination of a physical joystick for the shooter and ML5 hand tracking for the goalkeeper created a dynamic, two-sided experience that feels genuinely interactive and different from typical p5.js games. I’m also especially proud of the smooth goalkeeper movement—using lerp() to reduce jitter made the keeper feel responsive yet realistic, which dramatically improved gameplay. I’m also pleased with the UI design, especially the cartoon-style intro page, which gives the game a professional and cohesive look. On the technical side, achieving stable serial communication between Arduino and p5.js—with no dropped signals—was a big accomplishment and made the hardware and software feel seamlessly connected. Altogether, these elements make the project feel less like a school assignment and more like an actual mini-game someone might find in an arcade or mobile app.

Link to Resources Used

AI Tools Referenced

I used AI tools in the following ways:

ChatGPT

  • Helped debug Arduino serial issues

  • Helped rewrite and clean p5.js classes

  • Guided merging of joystick logic with animation logic

  • Assisted in writing ML5 code for controlling the goalkeeper in the three directions

Gemini

  • Helped generate images used for the project (intro page image, goalkeeper image, shooter image)

Challenges Faced & How I Overcame Them

1. ML5 Handpose jitter
  • Solved using lerp() for smoother movement

  • Added 3-zone classification instead of direct x-values

2. Joystick sending multiple repeated values
  • Fixed by only sending direction when it changes

3. Arduino errors from invisible characters
  • Caused by stray UTF-8 spaces

  • Solved by rewriting the affected lines manually

4. Serial communication timing issues
  • Added delays + ensured consistent baud rate

  • Verified using the p5 serial monitor

5. Resizing issues
  • Used scaleFactor everywhere based on the window height

  • Updated positions inside windowResized

Areas for Improvement

Looking ahead, there are several exciting improvements I’d love to bring into future versions of this project. One of the biggest upgrades would be enhancing the core game mechanics – especially by adding variable shot power so the ball’s speed and curve depend on how long the shooter holds the button. This small change would instantly add depth and skill to the gameplay. I also want to rethink how the goalkeeper reacts by introducing a realistic “dive” mechanic that forces quick decisions instead of constant tracking, making the challenge more balanced and intense. On the user experience side, adding a subtle aiming line for the shooter and a clear tracking-zone guide for the goalkeeper would solve most of the confusion players currently face. Technologically, expanding the serial communication to include haptic feedback or LED signals would make the hardware feel more alive and connected to the game. And finally, introducing polished animations – like a proper kick sequence or a dramatic save – as well as a slight 3D-style pitch perspective would elevate the visual experience far beyond the current prototype. All together, these improvements could transform the game from a fun demo into a fully immersive, replayable mini-sports experience.

IM Show Documentation

Video 1