The game is a fast-paced, reaction-based experience inspired by the Speed Light Reaction Game, which I transformed into something I’m passionate about: football. The game requires players to quickly respond to an LED light by pressing the correct button. If they react correctly, they score a goal, but if they miss or press the wrong button, the ball hits a mannequin, simulating a failed attempt. The game incorporates visual and auditory feedback, a playlist of music inspired by FIFA, and a timer to create an engaging and immersive experience that challenges players’ speed and reaction time.
The game uses Arduino and p5.js software for an interactive experience.
Hardware (Arduino):
LED buttons serve as the controls: one lights up to signal which button to press to score, and the buttons capture the user's input.
Serial communication sends user responses from the Arduino to P5.js.
Software (p5.js):
Manages game visuals, including the football, goal, and mannequins.
Tracks scores, missed shots, and the countdown timer.
Plays music and sound effects to immerse players in the experience.
INTERACTION DESIGN
Title Screen:
Features:
Displays the game title in a bold font.
Includes buttons for “Start Game”, “Instructions”, and “Shuffle – Play/Pause” for music.
Design Choices:
Used bold colors and a clean layout for clarity and visual appeal.
Gameplay:
Press the matching LED button that lights up to score.
Correct button presses make the ball move toward the goal, simulating a successful shot.
Incorrect or missed presses (not pressed within 1 sec) result in the ball hitting a mannequin, simulating a failed shot.
Additional on-screen buttons allow players to shuffle music or play/pause background tracks.
Feedback: The game uses visual (ball movement) and auditory (goal or miss sounds) feedback. Background music inspired by FIFA enhances the immersive experience.
End Screen:
Displays the final score and missed attempts.
Includes buttons to restart the game or return to the main menu.
DESCRIPTION OF P5.JS CODE:
The p5.js sketch manages the visuals, sounds, and game state.
Key Features:
Dynamic Visuals: Updates scores, displays animations for goals and misses, and tracks time.
Audio Feedback: Plays sound effects for scoring and missing.
Serial Data Handling: Receives and processes data from Arduino.
Code Snippets:
Serial Data Handling:
function readSerial(data) {
if (data === "BUTTON:CORRECT") {
score++;
// Animate football to the goal
} else if (data === "BUTTON:WRONG") {
missedShots++;
// Animate football to the mannequin
}
}
Music Control:
function toggleMusic() {
if (isMusicPlaying) {
backgroundSounds[currentTrackIndex].pause();
isMusicPlaying = false;
} else {
backgroundSounds[currentTrackIndex].play();
isMusicPlaying = true;
}
}
DESCRIPTION OF ARDUINO CODE:
The Arduino code handles LED prompts, button detection, and serial communication with p5.js.
Key Components:
LED Control:
LEDs light up randomly, prompting user action:
void lightUpLED(int index) {
for (int i = 0; i < 3; i++) {
digitalWrite(ledPins[i], (i == index) ? HIGH : LOW);
}
}
LEDs turn off after a button is pressed or the timeout ends:
void turnOffLEDs() {
for (int i = 0; i < 3; i++) {
digitalWrite(ledPins[i], LOW);
}
}
Serial commands from p5.js select which LED to light:
if (Serial.available()) {
String command = Serial.readStringUntil('\n');
if (command.startsWith("LED:")) {
targetLED = command.substring(4).toInt();
lightUpLED(targetLED);
}
}
COMMUNICATION BETWEEN P5 AND ARDUINO:
How It Works:
Arduino sends data about user responses (correct or wrong button presses) to p5.js.
p5.js uses this data to update the game state:
Correct responses move the football to the goal and increase the score.
Incorrect responses move the football to a mannequin and increase missed attempts.
p5.js also sends signals back to Arduino to light up LEDs.
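As a sketch of that last step, p5.js can pick a random LED index and send it in the same "LED:" format the Arduino code above parses. The exact serial-write call depends on the serial library used, so serialWrite() below is a placeholder rather than the project's function:

// Hypothetical helper: prompt the next round by lighting a random LED on the Arduino.
function promptNextLED() {
  let targetLED = floor(random(3));        // LED index 0–2
  serialWrite("LED:" + targetLED + "\n");  // newline matches readStringUntil('\n') on the Arduino
}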
Challenges Overcome:
Initial miscommunication due to overlapping signals was resolved by implementing a debounce delay in Arduino and validation logic in p5.js.
CHALLENGES FACED:
One major challenge was managing communication between Arduino and p5.js. Initially, multiple data packets were sent for a single button press, causing disruptions in the game. To fix this, I added a debounce delay in the Arduino code:
if (magnitudeG > impactThreshold) {
Serial.print("BUTTON:CORRECT");
delay(200); // Debounce to avoid multiple signals
}
This ensured only one signal was sent per button press. I also validated inputs in p5.js by processing only expected packets like "BUTTON:CORRECT", which resolved signal misinterpretations.
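A minimal sketch of that validation step, assuming readSerial() receives one line of serial data at a time as in the snippet above:

function readSerial(data) {
  let packet = data.trim();              // drop stray whitespace and newlines
  if (packet === "BUTTON:CORRECT") {
    score++;
  } else if (packet === "BUTTON:WRONG") {
    missedShots++;
  }
  // anything else (partial or unexpected packets) is simply ignored
}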
Another challenge was ensuring strong soldering connections for the buttons and LEDs. My first attempts were unreliable, but then I secured the connections, improving hardware stability.
WHAT I’M PROUD OF:
I’m proud of successfully integrating Arduino and p5.js to create a smooth and responsive game. Features like the animated football, scoring system, and FIFA-inspired music enhanced the user experience. Solving technical issues, such as serial communication and soldering, was rewarding, as they significantly improved the gameplay experience.
FUTURE IMPROVEMENTS:
One improvement would be adding a game mode where pressing the correct button contributes to building a beat or rhythm for a song. Each correct button press would play a specific musical note or drum beat, gradually creating a complete soundtrack as the player progresses. This mode would combine the fast-paced reaction element with creativity, making the game more dynamic and engaging. By turning gameplay into a musical experience, it would appeal to a broader audience and add a unique layer of interactivity. This feature could also include different difficulty levels, where faster reactions create more complex beats, challenging players’ skills and rhythm simultaneously.
My final project is a pet cat experience. As someone who loves cats but was never fortunate enough to have one because her mother dislikes the idea of having a cat in the house, I want to take this opportunity to give those in the same situation as me the chance to experience the joy of having a cat, without upsetting their mothers.
I hope this cat will provide comfort for cat lovers like me, or even turn non-cat lovers into one!
Pictures and Videos
Some pictures of the physical project:
User testing video from the first version:
Demo of final version:
Implementation
Schematic:
The cat itself is built using cardboard. The Arduino and breadboard are placed inside the body of the cat, accessible by opening the back panel of the body.
The arm of the cat is connected to a servo motor protruding from one side of the cat, and on the bottom of this same side, an electronic sensor also protrudes.
The head of the cat is also attached to the main body of the cat. On the bottom of the head, there is a circular hole to allow the wires connecting the two force sensors on the head to be able to connect to the breadboard and Arduino placed in the main body.
Over on the p5 side, I used a class to easily switch between the different states of the experience (start, play, end). An input box was made for the user to enter a name of their choosing for the cat, along with a submit button to confirm the name and trigger sending it over to Arduino to be displayed on the LCD.
For the gifts, I illustrated each of the gifts on PowerPoint. The gifts are of a bird, a mouse, a yarn ball, a toy fish, and a donut.
I uploaded a meow sound onto p5 that plays when the cat is petted (based on the readings of the force sensors from Arduino).
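A minimal sketch of that trigger (the variable names, threshold, and the way the force values arrive are assumptions, not the project's code):

// meowSound is assumed to be loaded with loadSound() in preload()
function handleForceReadings(force1, force2) {
  let petThreshold = 300;  // tune to the force sensors used
  if ((force1 > petThreshold || force2 > petThreshold) && !meowSound.isPlaying()) {
    meowSound.play();      // meow once per pet rather than on every frame
  }
}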
Interaction Design
The cat is built from cardboard and has a collar on which a name of the user's choosing is displayed. Cats love being petted, and users can pet the cat and have it meow in contentment in response.
When users extend their hand towards the cat, the cat lowers its paw and hands the user a “gift”. This act of gift giving from a cat is “an expression of affection and trust” which “shows they value the user's companionship and consider them part of their social group.” (Cited from here)
However, the “gifts” the cat brings can range from random toys to animals. This is reflected in the p5.js sketch: every time a user extends their hand, the cat gives them a new “gift”.
On the Arduino side, the board was used mainly to take sensor readings and send them to p5. It calculates distance based on the readings from the ultrasonic sensor, changes the servo motor position based on those distance readings, and outputs the name received from p5 onto the LCD.
Here is a snippet of the code that triggers the movement of the cat's arm depending on the distance reading. It initially used delay(), but since delay() stops the whole program, I switched to millis() for timing, defined the interval between each increment or decrement of the servo position, and used the following code:
int servInt = 5;
// check if it's time to update the servo position
if (time - servoTime >= servInt) {
// update the last move time
servoTime = time;
if (distance <= 10 && increasing) {
// increment position
pos++;
if (pos >= 90) {
// change direction
increasing = false;
}
} else if (distance > 10 && !increasing){
// decrement position
pos--;
if (pos <= 0) {
// change direction
increasing = true;
}
}
// move the servo to the new position
myservo.write(pos);
}
And here is the code for writing onto the LCD based on the name sent from the p5 sketch:
if (Serial.available()){
name = Serial.readStringUntil('\n');
name.trim();
// only update LCD if the name has changed
if (name != previousName) {
// update the last displayed name
previousName = name;
// center-align the name
int offset = (16 - name.length()) / 2;
// reset to the first row
lcd.setCursor(0, 0);
// clear the row without lcd.clear()
lcd.print(" ");
// position the cursor to the beginning
lcd.setCursor(offset, 0);
// display the new name
lcd.print(name);
}
}
On p5, I developed a class called Screen with 3 methods that display the different states of the experience (start, play, and end). An input box and submit button were created so that the cat's name can be read from user input and then sent to Arduino. I also used p5 to play the cat's meow when it is petted.
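A rough outline of how that can fit together (the class internals, serialWrite(), and variable names here are assumptions rather than the project's actual code):

class Screen {
  constructor() {
    this.state = "start";
  }
  display() {
    if (this.state === "start") this.start();
    else if (this.state === "play") this.play();
    else this.end();
  }
  start() { /* title, name input box, submit button */ }
  play()  { /* cat, gift, and sensor-driven feedback */ }
  end()   { /* closing screen */ }
}

// On submit: send the chosen name, newline-terminated, to match readStringUntil('\n') on the Arduino
function submitName() {
  let catName = nameInput.value();
  serialWrite(catName + "\n");   // serialWrite() stands in for the sketch's serial write call
  screen.state = "play";
}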
Here is a snippet of the code to randomize the gift and display the gift when the user’s hand is under the cat’s arm and the cat’s hand is lowered.
// in the main sketch
function randomizeGift(){
while (!isDown){
gift = gifts[int(random(0, gifts.length))]
break;
}
}
// in the Screen class, under the play() method
// displaying gift when cat hand is lowered
if (isDown){
image(gift, width/2, height/2+h3)
gift.resize(0,300)
}
Communication between Arduino and p5.js
From Arduino:
Ultrasonic Sensor and Servo Motor
The ultrasonic sensor is placed on the side of the cat, under one of the cat's paws that is raised. The reading from the ultrasonic sensor is used to calculate the distance of the hand from the cat. If the distance is smaller than a certain threshold, this triggers the servo motor to turn, which causes the cat's paw to lower and touch the user's hand. Once the paw is lowered, this triggers the display of the randomized object the cat hands to the user in the p5 sketch. So Arduino sends p5 the value of the calculated distance and a boolean that states whether the paw is down or not.
Force Sensors: Two force sensors are placed on the cat's head so that when a user pets the cat, this force is recorded. Both force readings are sent to p5 to trigger the cat's meow sound to play from p5.
From p5:
LCD Screen
The p5 sketch prompts the user to fill in an input box to give the cat a name of their choosing. This name is sent to the Arduino to be displayed on the LCD screen, which is placed on the cat's neck like a collar.
Highlights
I’m really proud of how everything came together, specifically how I managed to handle the communication between Arduino and p5 for the cat-hand interaction. On the Arduino side, calculating the distance, using that distance value as the condition that triggers the change in servo position so the cat's paw lowers, and sending a boolean stating whether the paw is lowered is something I'm proud of pulling off. On the p5 side, I used that boolean to write a function that randomizes the gift, and then, within a method of the class I created, displayed the gift when the cat's arm is down. Making this whole process and communication work smoothly is something I'm very proud of.
I initially had issues with the delay of the servo motor, since delay() would stop all of the code, so I switched to timing the servo updates with millis() instead.
One of the greatest challenges that I had with this project was finding the perfect pressure/force sensor to use for the petting and meowing experience. I initially made a prototype of a pressure sensor using velostat and copper tape which worked, but when I tried implementing it on a bigger scale, it was not giving me a varying range of readings. I then turned to use a piezo sensor, but the results were similar. It was then that I found the small force sensor that was very sensitive and is exactly what I was looking for, just that it was a tad bit small, so I ended up using two of them to cover a larger area of the cat’s head to trigger the meow sound.
Future Improvements
For the future, as I have been advised by a user I had the pleasure of meeting at the IM Showcase, I would like to implement a more technical approach to the project. I would like to try using a sensor for facial expression to detect the user’s facial expression (and thus emotions) and based on this reading, the cat will have certain reactions (or perhaps facial expressions as well, displayed on a larger LCD screen) as a response to the user’s emotions. I think with this approach, the experience will create a deeper relationship between the user and the cat, just as a real owner and pet relationship.
Fall 2024 Interactive Media Showcase Documentation Tuesday 10 December 2024
At first, creating the game was challenging because I struggled to get the buttons to function the way I wanted them to. Mapping the LED buttons to the gameplay and ensuring they triggered the correct actions was particularly difficult. However, I overcame this by refining the logic for button inputs and testing repeatedly to ensure everything worked as intended. This process required a lot of trial and error, and it was hard to troubleshoot the interactions between the controls and the game logic. Despite the difficulties, resolving these issues was rewarding and helped me improve my understanding of how to create interactions in the game.
During user testing, most users were able to figure out the game easily as the instructions were clear; however, I had to explain to some users that this is a reaction-based game where they need to react to the LED button that lights up and not focus on the defenders on the screen. While the instructions were effective, a few players needed additional clarification about the LED mapping and the corresponding button. Once users understood the concept of reacting to the LED, the gameplay became much smoother. I also need to fix a minor issue where, when users press start, the mannequins don't appear in their intended positions until a few seconds later. Still, the experience worked well overall, with the audio feedback for scoring and missing enhancing the immersion. The visual design also made the game enjoyable, and the fast-paced gameplay kept users engaged. To improve the experience, I plan to add a more detailed instruction page to eliminate the need for further explanation and slightly increase the reaction time to make the game more accessible for new players. Additionally, I was inspired by the FIFA game and would like to incorporate a shuffled playlist for background music to elevate the experience further and create a more engaging atmosphere.
Over the course of the project, I encountered several challenges, particularly with the hardware and data communication between the Arduino and p5.js. Initially, the score wasn't being sent correctly from the Arduino to p5.js. The issue stemmed from the fact that data was sent on every impact, causing the Arduino to send multiple readings in quick succession. This resulted in incomplete or overlapping data being received in p5.js.
To resolve this, I modified the Arduino code to send data in separate packets. It now sends only one score value per impact and ignores any other impacts until the restart button is pressed. This solution ensured that p5.js received clean and complete data, allowing the game to function as intended.
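A sketch of that fix (the variable names, placeholder sensor read, and the "RESTART" command are assumptions; the real sketch may differ):

// Assumed names and values, for illustration only:
float impactThreshold = 2.5;   // g threshold that counts as a punch
int   score = 0;
bool  scoreSent = false;       // has this impact already been reported?

float readImpactMagnitude() {
  return 0.0;                  // placeholder for the real accelerometer read
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  float magnitudeG = readImpactMagnitude();
  if (!scoreSent && magnitudeG > impactThreshold) {
    Serial.print("SCORE:");
    Serial.println(score);     // one clean packet per impact
    scoreSent = true;          // ignore further impacts for now
  }
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();
    if (cmd == "RESTART") scoreSent = false;   // re-arm when restart is pressed in p5
  }
}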
On the hardware side, securing the sensor to the punching bag was also tricky. The sensor kept shifting due to the force of punches, which required me to reattach it multiple times. I also had to redo the wiring and soldering to ensure the connections were stable and durable during testing. Another adjustment involved adding water to the punching bag base, which I had initially removed, as the bag became unstable during heavy use.
The overall p5.js sketch is now complete. It includes a functional title screen, instruction menu, and gameplay with features such as dynamic scoring, a fuel bar, and a restart button. I used an arcade font for all the text to maintain a cohesive theme, and the punching bag’s vibration and animation added a realistic touch to the game. The game also plays sound effects after each punch, adding to the immersive experience.
USER TESTING:
During user testing, the instructions were clear, and players easily understood how to start the game. However, one recurring issue was the instability of the punching bag due to the lack of water in the base. Once I added water, the problem was resolved, and users could play without interruptions.
Another key observation was the need to test the sensor with punches of varying strengths. Users with stronger punches sometimes triggered unexpected behaviors in the scoring system. This helped me fine-tune the sensitivity of the sensor, ensuring accurate calculations regardless of punch strength. I used AI assistance to determine the optimal sensitivity settings and Arduino's built-in math library to calculate the results from the data.
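For reference, the magnitude calculation can look roughly like this, assuming a three-axis accelerometer (the axis readings here are placeholders):

#include <math.h>   // sqrt()

// Overall acceleration magnitude in g, independent of punch direction.
float impactMagnitude(float ax, float ay, float az) {
  return sqrt(ax * ax + ay * ay + az * az);
}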
Feedback from users also highlighted that the gameplay felt smooth, and they appreciated the arcade-like visual and audio elements. The scoring system and the gradual increase of the score display were well-received, as they mirrored the pacing of arcade games. Overall, the changes I implemented addressed the main issues, and the game is now ready for final polishing.
Throughout history, glasses have developed from a tool for eyesight that simply follows its function into a fashion piece that remains essential for people with poor vision. As a person who is prescribed glasses, I am a big fan of the frame that I have, but I choose to wear contact lenses instead, since the lenses for my glasses make my eyes seem small – at least this is what I notice a lot. Lenses for myopia and hyperopia, the two most common vision conditions, distort the look of the eyes by making them appear smaller or bigger, respectively. However, such changes of proportion are often noticed solely by those who wear glasses and not by others – we memorise our own facial features and overlook such minor changes on the faces of others.
“To Gaze is Not To See” is an interactive artwork that invites the user to put on a pair of glasses to begin their experience. The frame is red, which is what allows the visuals to appear on the screen – when the web camera detects red, the ornament is displayed. Unless the user puts on the glasses, they are unable to explore the work.
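A minimal sketch of such a red check (the thresholds and counts below are assumptions, not the values used in the piece):

// cam is a p5 capture created with createCapture(VIDEO)
function glassesInView(cam) {
  cam.loadPixels();
  let redCount = 0;
  for (let i = 0; i < cam.pixels.length; i += 4) {
    let r = cam.pixels[i], g = cam.pixels[i + 1], b = cam.pixels[i + 2];
    if (r > 150 && r > g * 1.5 && r > b * 1.5) redCount++;  // strongly red pixel
  }
  return redCount > 500;  // enough red in frame: show the ornament
}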
The glasses are equipped with two speakers located at the end of both temples, which are connected to Arduino to play sound. The chosen song is “Triangle/Gong/What” by Wolfgang Tillmans – I decided to play it on loop to make the user's experience more immersive and slightly disturbing due to the surreal flow of the melody. In order to attach the speakers, I 3D modelled the glasses from scratch using Rhino, printed them on a Prusa printer, and then spray painted them with the desired shade of red. Several prototypes were made and 6 pairs of glasses were printed in total, but only one was perfected and chosen for the showcase.
The electronic components were hidden in a 3D-printed eyeball that featured an LED arcade button as an iris – when it was pressed, the visuals on the screen changed from the “myopia” mandala to the “hyperopia” mandala. The difference between the two ornaments was in the size of the eyes and their placement on the screen, creating confusing and almost surreal imagery of changing projections of eyes.
During the showcase, more than a dozen people came up to my project to test it. Thanks to planning the setting in advance, I managed to set up proper lighting conditions to make the image captured by the web camera clearer. However, due to the loud conversations and music around, it was difficult to hear the melody playing through the glasses, so I had to adjust the volume on the spot.
All the people who experienced my work were very interested in the concept and surprised by the implementation, as they did not expect to see their own eyes displayed on the screen in real time. The problem behind the concept was relatable to all participants who also wear glasses, as they agreed that they experience the same issue.
I was happy with my concept from the start, and I think I managed to create a good representation of the experience I intended to capture. The most challenging part was connecting the audio, as I was not simply using tones that work directly with the Arduino UNO, but decided to play an mp3 file with an actual song. I did not realise that I would have to use additional electronic components for this until the day before the showcase, since I was so involved in modelling the glasses and perfecting the mandala ornament visuals. As such, I had to make urgent edits to the electronic scheme, adding an MP3 playback shield from Adafruit, exploring and testing the libraries that had to be added to my code, then adding an amplifier for the speakers, and finally soldering a connector from actual headphones to the speakers. Nonetheless, in the end I was satisfied with the quality of the audio, and I am very proud that I did not give up on the idea of including an mp3 file and managed to pull it off.
Future improvements
While I am glad that I managed to find and implement a proper library for eye tracking, I believe that the visual part could be improved even further in terms of image quality and the variety and complexity of the ornaments. I tried using an external web camera to capture video at a higher resolution; however, this made the sketch too heavy for my laptop to run correctly. I wish to develop this project further, as I really enjoyed working on it, and I plan to include it in my portfolio.
I’ve created this project to explore one of Kazakhstan’s cultural traditions: seating around the table (or Dastarkhan in Kazakh). This experience alone unpacks different parts of our culture, from food to music. To expose the user to those different parts, I’ve created multiple levels to guide the user’s journey in understanding what Dastarkhan is. I’ve built a physical room where the table is situated and implemented sensors and buttons around it to support the interactions. Overall, there are 4 levels the user has to complete:
The first level is about interaction with the buttons. I’ve positioned three red buttons around the table to symbolize the different seating placements around the table in Kazakh culture. Depending on your social status and age, your place around the Dastarkhan will be different. Therefore, the user has to press the different buttons to explore the meaning of the places. When the user presses a button, the corresponding image appears in p5.
After pressing all three buttons on the physical project, the button to go to the next level appears on the p5 screen.
The second level is about Kazakh music. The user has to find the Kazakh musical instrument, the dombyra, around the table and press it to trigger a composition in p5. When the user presses the dombyra, the force sensor value from Arduino is sent to p5 and a video of a musician playing the dombyra appears.
The third level introduces the user to the national dishes on the Dastarkhan. By pressing the arrows on the p5 screen, the user scrolls through the different dishes and their explanations.
There is no interaction between p5 and Arduino at this level. The table rotates on the physical project using the servo motor, while the p5 side is guided by the user.
The fourth level holds more sentimental meaning, as it uses the pulse sensor. The user has to place their finger on the pulse sensor and press the heart on the screen. p5 receives the message that the pulse sensor is active, which triggers the appearance of pictures of my family and friends.
The communication between Arduino and p5 is based on the buttons, the pulse sensor, and the force sensor. Values from Arduino are sent directly to p5 and trigger the appearance of certain elements.
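A sketch of how those values can be sent as labelled, line-separated messages so p5 can tell which sensor each one came from (the pins, labels, and thresholds are assumptions, not the project's wiring):

const int BUTTON1_PIN = 2, FORCE_PIN = A0, PULSE_PIN = A1;  // assumed wiring

void setup() {
  Serial.begin(9600);
  pinMode(BUTTON1_PIN, INPUT_PULLUP);
}

void loop() {
  // one labelled message per line; a real sketch would also debounce the button
  if (digitalRead(BUTTON1_PIN) == LOW) Serial.println("BUTTON:1");
  int force = analogRead(FORCE_PIN);
  if (force > 300) { Serial.print("FORCE:"); Serial.println(force); }
  int pulse = analogRead(PULSE_PIN);
  if (pulse > 550) Serial.println("PULSE:ACTIVE");
  delay(50);  // modest send rate so p5 is not flooded
}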
What I’m Proud of
I am really proud of my final outcome. It took a lot of time to work with the buttons, solder them, and figure out how to use them. I spent quite a lot of time building the physical project and connecting both parts. Moreover, I am proud that I figured out how to code the serial communication so that multiple sensors could be connected at the same time while only one of them sends information at a time. I am also proud of the aesthetics of my project, since I feel I really delivered the atmosphere of a Kazakh interior and culture. Many Kazakh students came to me sharing that my project really took them home.
Future Improvements
In the future, I would like to work more on the complexity and depth of the interactions between the project and the user. In this project, I only worked out 4 levels of interaction, but there are so many other ways to do it. I would also like to make the experience more intuitive: I would like to step aside from my main role of communicating the purpose, idea, and workings of the project and leave it to the user to figure out. That way I could really achieve high-level interaction. Additionally, I would explore further the intersection of traditions and technologies and how they are addressed in today's world. For now, I only attempted this interaction, but there are more sensors and immersive experiences to explore.
My final project was an interactive robot which someone can control. I was inspired by the animation Despicable Me, which explains the design of my robot as a minion. The robot is a simple build with 4 tyres arranged in a + manner to allow the robot to move in all directions and also spin. It also has two hands which move up and down to depict human hand movement. On movement of the robot, the hands also move depending on the direction, e.g. when moving left the left hand rises. While creating the robot I had a dilemma about how to control it. However, I was inspired by previous students' work which used the handpose library from ml5.js to control movement using hands, and I went on to implement this to control my robot.
Interaction Design
The interaction uses the ml5.js handpose library, which detects the user's hand position to control the movement of the robot. I divided the screen into 4 sections such that having a hand in each section sends a certain command to Arduino to control the robot: a hand in the top left section moves the robot forward, the top right section moves it back, the bottom left section turns it left, and the bottom right turns it right. To stop the robot, one has to put both hands on the screen or remove all hands from the screen.
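On the p5 side, a sketch of that quadrant mapping might look like this (the command strings are assumptions; handX/handY stand for the average hand position already extracted from the handpose keypoints):

// Returns the drive command for the current frame.
function commandFromHand(handX, handY, handCount) {
  if (handCount === 0 || handCount === 2) return "STOP";   // no hands, or both hands in view
  if (handY < height / 2) {
    return handX < width / 2 ? "FORWARD" : "BACKWARD";     // top-left / top-right
  } else {
    return handX < width / 2 ? "LEFT" : "RIGHT";           // bottom-left / bottom-right
  }
}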
Arduino Code
My Arduino code includes commands to control every movement of the robot: the motors, servos, and LEDs. It receives data from p5 and, depending on the data, executes the corresponding command on the robot.
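A sketch of that dispatch (the command names are assumptions; the motor and servo calls are left as comments since the actual hardware code isn't shown here):

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();
    if (cmd == "FORWARD") {
      // drive both motors forward, raise both arm servos
    } else if (cmd == "BACKWARD") {
      // drive both motors in reverse
    } else if (cmd == "LEFT") {
      // spin left, raise the left arm servo
    } else if (cmd == "RIGHT") {
      // spin right, raise the right arm servo
    } else {
      // "STOP" or anything unexpected: release the motors
    }
  }
}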
p5.js Code
My p5 code picks up data from the user to move the robot. Using the average position of the user's hand keypoints, the code checks where the hands are positioned and stores this as a defined text command which is sent to Arduino. Through serial communication, the Arduino is able to execute commands on the robot based on what it gets from p5.
Circuit Schematic
For the project, I used the Adafruit motor shield for my connections, as I initially wanted to use 4 motors. This component turned out to be very beneficial, as it reduced the number of wires I would have needed had I opted for the small motor shield. I also didn't have to use a breadboard, which helped minimise the size of my robot. I used KiCad to generate the schematic.
Project Reflection and Future Improvements
Working on this project was a worthwhile experience. It was a good opportunity to put into practice what I have learnt in the course. I had a couple of challenges while building it. I initially wanted to use 4 DC motors, one for each wheel; however, this had a lot of issues, as the robot couldn't move in some directions. After spending a lot of time trying to fix this, I finally found a solution by using only two motors and two support wheels, which enabled the robot to move better. I also had an issue with picking up commands from the user. My initial idea was to have the user swipe their hands across the canvas, but this had errors because the handpose library was too sensitive, which resulted in clashing commands being picked up. The project still has space for improvement, especially in terms of the user interaction. The user experience could be enhanced by having both hands control the movement, since many users found it challenging to use only one hand. Everyone naturally started using two hands even after reading the instructions, which might mean that including two-hand control would be better. All in all, I am proud of the whole project and how I was able to implement it.
User Testing Videos
As I began working with p5.js and Arduino, I decided to focus my project on four Armenian symbols: Mount Ararat, the pomegranate, the Armenian rug, and lavash. I removed the duduk (musical instrument) from the project, as recreating it would have taken too much time, and the sounds I could input wouldn’t be similar enough to the actual instrument. Therefore, I chose to focus on these four interactions:
Lavash Baking Session: This interaction references Armenian flatbread (lavash) and imitates its baking process in a traditional clay oven (tonir). Users can cover the 3D-printed oven with both hands to place the lavash inside. When they release their hands, the lavash reappears on the p5 canvas. This interaction is based on the real-world operation of a tonir, which is typically placed in the ground to bake the lavash beneath the surface. Serial communication includes an LDR sensor on the Arduino side, which sends a signal to p5 to manipulate the lavash's position inside the oven. An LED also lights up to indicate when the lavash is being baked.
Breathing in Ararat: This interaction centers on Mount Ararat, the iconic peak visible from much of Armenia. As many Armenians dream of seeing Ararat through the clouds, this interaction lets users blow away the clouds to reveal the mountain. Users are encouraged to blow into the 3D-printed mountain, and with enough force, the clouds on the p5 canvas will move away, revealing the peak. I used a DHT22 sensor to detect humidity levels, which are sent to p5 to control the movement of the clouds.
Coloring the Pomegranate:
This interaction draws inspiration from Sergei Parajanov’s iconic film The Color of Pomegranates. To color the pomegranate, users place their finger on the fruit, where a pulse sensor detects their heartbeat. The pomegranate changes color based on the sensor’s readings: lower heart rates result in blue hues, while higher rates produce red tones. Serial communication reads the pulse sensor’s value on the Arduino side and displays it on the p5 canvas, adjusting the pomegranate’s color accordingly.
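A minimal sketch of that mapping (the BPM range and the exact colours are assumptions, not the project's values):

// Blend from blue at low heart rates to red at high ones.
function pomegranateColor(bpm) {
  let t = constrain(map(bpm, 60, 120, 0, 1), 0, 1);   // 60 BPM -> blue, 120 BPM -> red
  return lerpColor(color(40, 60, 200), color(200, 30, 40), t);
}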
Crafting the Rug:
This interaction is entirely based in p5.js. Users can see themselves integrated into the pattern of an Armenian rug, typically displayed on walls for its aesthetic value. I used the following p5.js code as a reference to capture the pixelated camera feed, and by adjusting the RGB values of four main colors (blue, black, red, and beige), I created an image that blends with the rug pattern.
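The referenced sketch is not reproduced here, but the colour-snapping idea is roughly the following (the palette values are my assumptions, not the project's):

// Snap a sampled camera pixel to the nearest of the four rug colours.
const palette = [
  [30, 40, 120],    // blue
  [20, 20, 20],     // black
  [160, 30, 40],    // red
  [220, 200, 170],  // beige
];

function nearestRugColor(r, g, b) {
  let best = palette[0], bestDist = Infinity;
  for (let p of palette) {
    let d = (r - p[0]) ** 2 + (g - p[1]) ** 2 + (b - p[2]) ** 2;
    if (d < bestDist) { bestDist = d; best = p; }
  }
  return color(best[0], best[1], best[2]);
}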
This project challenged me to work with various types of sensors. I encountered several issues with sensors I had never used before, such as the pulse sensor. Initially, it displayed three-digit values that bore no relation to an actual heartbeat. I eventually discovered that the sensor itself is not very reliable, so I implemented constraints to ensure the displayed values resembled a realistic heartbeat. Fortunately, this approach seemed to work during the IM showcase, as users received values around 90-100. Additionally, I had to constantly calibrate the LDR and DHT22 sensor values to suit the specific environment of the showcase.
I believe the visual aspects of the project could be further refined as well. More than that, the interactions could be recorded in p5.js, allowing users to compare how quickly they could “blow the clouds” or “bake the lavash.” This would introduce an element of competition and encourage users to revisit the project repeatedly.
Overall, I am proud of what I accomplished, particularly the variety of interactions I managed to implement within a constrained timeframe. I learned a great deal from this experience and thoroughly enjoyed engaging with users during the IM showcase – hearing their feedback about their favorite interactions was especially rewarding.
Unfortunately, the videos from user testing were not saved on my phone.
But, here are the main insights that I’ve gained from user testing:
Due to the presence of the physical product, all of the users tended not to pay attention to the p5 screen on the computer, even though it was the component that starts the interaction. When looking at the project, users immediately wanted to touch the buttons and expected some response.
The component that attracted the most attention on the physical project was the heart sensor, which was placed in front of the table and within the closest reach of the user. Because of this placement, people tried to push the sensor and touch it, expecting an instant reaction from the physical project or the computer. Since that didn't occur, they were somewhat disappointed.
Since the user had to go through multiple levels to reach the heart-sensor interaction, I had to explain this to them manually. This led me to think that maybe having levels was not a great decision for building an interactive experience, since users were not intuitively instructed by the physical project that they would have to go through different levels, so they were confused.
I also thought that next time, an interaction with any of the components, such as the buttons and sensors, could trigger the screen that supports that specific interaction. That way there would be no levels, but rather self-guided exploration of the project.
Users also tried to interact with components that they were not supposed to interact with. For example, users tried to push the dishes on the table as if they were buttons. The confusion was reinforced by the level screen in p5, which encouraged users to explore the dishes and the table. In reality, users just needed to press the arrows to go through the different dishes. Users also couldn't understand where to press to switch the dishes; some pressed on the dishes on the screen or anywhere but the arrows.
This led me to the idea that I should make the arrows more intuitive, so users can easily and directly press on them.
Another thing I learned from user testing was that after completing the different levels, users had no clue that the experience had ended. There was just a page with the last level. Therefore, I decided to add a finish page that thanks the user.
After implementing it and conducting another round of user testing, I found out that the user needs to refresh the page every time they want to go through the project again. Therefore, there should be a refresh button to return to the main page.
Some of the users also asked whether there was any interaction with the LED chandelier on top. They tried to touch it or do something with it, but failed. So I had to explain that it was just the lighting.
I also noticed that the text size was quite small for many users, so they had to come closer and lean in to read the text on the screen, which was somewhat inconvenient for them.
The greatest confusion was caused by the pulse sensor. When users reached level 4, they had to place their finger on the pulse sensor and press the heart on the screen. Users expected something big and dramatic to happen, but in my vision I used the pulse sensor for its symbolic and sentimental meaning of listening to your heart, so pictures of my family appeared. The users couldn't believe that this was the only thing meant to happen.
Melody Masters is a fun and interactive project that helps people explore and learn about music. It uses Arduino to let users interact with physical buttons and p5.js to show visual and audio feedback on a screen. By pressing buttons, users can play sounds and see how melodies are made. The idea is to make learning music simple and enjoyable for everyone, whether they are just starting out or love experimenting with music. Melody Masters makes learning about music feel like a fun and creative experience. It challenges the user to collect as many points as they can in a minute, which helps them coordinate the location of each note with the note itself. The game has a tutorial that explains where each letter is located so that the user can learn the position of each note. There is also a free-play option where the user can play whatever music they want.
How does it work?
The interaction design of Melody Masters is simple and engaging. Users interact with physical buttons connected to an Arduino board. Each button corresponds to a different musical note, and pressing a button triggers a sound, and lights up the button, the sound is also visualized through animations on the screen using p5.js. The design focuses on making the experience intuitive, so users of all ages can easily play and create melodies. The interaction bridges the gap between touch, sound, and visuals, making it a multisensory experience.
The Arduino code is written to detect button presses and send corresponding signals to the computer via serial communication. Each button is assigned a unique musical note, and when pressed, the Arduino sends a specific code to indicate the note. The code also includes debouncing logic to ensure smooth and reliable input.
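A sketch of that pattern (the pin numbers, note letters, and debounce window are assumptions, not the project's values):

const int buttonPins[4] = {2, 3, 4, 5};
const char notes[4] = {'C', 'D', 'E', 'F'};
unsigned long lastPress[4] = {0, 0, 0, 0};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) pinMode(buttonPins[i], INPUT_PULLUP);
}

void loop() {
  for (int i = 0; i < 4; i++) {
    // pressed (LOW with INPUT_PULLUP) and outside the debounce window
    if (digitalRead(buttonPins[i]) == LOW && millis() - lastPress[i] > 200) {
      Serial.println(notes[i]);   // one character code per note
      lastPress[i] = millis();    // start a new debounce window
    }
  }
}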
The p5.js code takes input from the Arduino through serial communication and creates visual and audio feedback. The p5 code contains the tutorial, game, and free-play logic. Each mode's logic is different from the others, and each gives the user a different experience.
The Arduino and p5.js communicate via serial communication. When a button is pressed on the Arduino, it sends a unique character (like ‘C’ or ‘D’) over the serial port. The p5.js sketch reads this data, processes it, and triggers the appropriate sound and visual response. The communication ensures a real-time connection, creating an immediate response to user interactions, which makes the experience dynamic and engaging.
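On the p5.js side, the handling can look roughly like this (noteSounds and spawnNoteAnimation() are assumed names, not the project's):

// noteSounds is assumed to map characters to p5.SoundFile objects loaded in preload()
function readSerial(data) {
  let note = data.trim();          // e.g. "C" or "D"
  if (noteSounds[note]) {
    noteSounds[note].play();       // audio feedback for the pressed note
    spawnNoteAnimation(note);      // matching visual feedback
  }
}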
What I am proud of?
The way physical button presses translate into both sound and visuals feels intuitive and satisfying. It’s rewarding to see how users can engage with music in such an accessible way.
What can I improve?
What I want to improve is that when the audio visualization is being drawn, I could change the colors of the visualization depending on which note is being played; for example, if it is an E it would turn yellow, and so on. I want to include this so that the aesthetics of the game match the aesthetics of the board more.
IM Showcase Interactions