Final Project User Testing

User Testing

For the user testing, I think the fact that the same plushy from real life was on the p5 screen hinted at what users were supposed to do, especially since the sensors were not so subtly taped onto the back of the plushy. The physical aspect of the project is still in the making, which is why the sensors were on the back: I'm still trying to figure out the best way to embed them without taking away from the capabilities of the flex sensors. Ideally, I wanted to stuff the flex sensors, the Arduino board, and the breadboard inside the plushy, but I realized how heavy it would be, and the jumper wires could easily disconnect. I think that if I stuck with the plan where the only visible part was the Arduino cable connecting to my laptop, the project would be less self-explanatory and a bit harder to figure out, since the sensors wouldn't be in plain sight.

I think there was some confusion on how exactly the plushy should be squeezed. I don’t think the user knew it was supposed to just be hugs, but he probably hugged the plushy simply because it’s a cozy plushy. If it were a different object, I don’t think his first instinct to interact with it would be to hug it. He ended up squishing the plushy between his palms as well, which was a really interesting way of playing with it.

I think the overall concept is working well: the pressure values and the p5 connection work properly. But there could be major improvements in where the sensor and Arduino components sit inside the plushy, as well as in the p5 display and design, with instructions and maybe some context on the project. I would have to explain its purpose, because knowing your tight hugs make the plushy's day would definitely add more positivity to the interactive experience. I would also have to explain how exactly it works, which is a bit complicated: saying to just hug it is vague, since I only have two flex sensors, and if users hug an area without a sensor, the p5 screen wouldn't reflect anything, which would be a pretty disappointing outcome. Having some brief text on the p5 screen before or during the start would help make sense of the idea.

Final Project Progress: User Testing

Progress

Over the break, I made quite a few changes to the hardware component. I had originally planned to attach four wires to individual guitar strings and use a wired conductive guitar pick to create a controller in which strumming each string triggers a different in-game action. This failed because all the strings were connected to a metal saddle, which made them one conductive body that could not be used for individual actions. I ultimately decided to turn the guitar's metal frets into the controller's buttons. I covered three separate frets with copper tape for better visibility and connected them to wires, doing the same for the metal saddle. When the player presses any string against a taped fret, a circuit is completed and a signal is sent from the Arduino to p5.js to trigger one of three actions (start, jump, and attack) depending on which fret is pressed.
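As a rough sketch of how the p5.js side could interpret these fret signals, assuming the Arduino sends a fret index (0, 1, or 2) over serial (the names and encoding here are placeholders, not the actual project code):

```javascript
// Assumed encoding: Arduino sends the index of the pressed fret.
const FRET_ACTIONS = ["start", "jump", "attack"];

function handleFretPress(fretIndex) {
  // Ignore out-of-range or noisy readings from the serial line.
  if (fretIndex < 0 || fretIndex >= FRET_ACTIONS.length) return null;
  return FRET_ACTIONS[fretIndex];
}
```

Keeping the fret-to-action mapping in one lookup table also makes it easy to re-assign actions later without touching the serial-handling code.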

The overall structure of the p5 game is now complete. The only work left to do is to finish the visuals of the game, as some of the power-ups and enemies are displayed as basic black placeholders. A system also has to be set up to keep track of high scores over the course of multiple games.

User Testing

At first, users found it confusing to operate the guitar controller, as I hadn't created any signage to attach to the frets and indicate their functions. It took them a few seconds of fiddling before getting to the actual gameplay. The gameplay also wasn't as straightforward as I thought: users were confused about how to increase their score (by collecting power-ups to build up attack charge and attacking enemies). I will have to work on the visual components of the guitar controller (adding small labels, etc.) and add an instruction menu before the game starts.

Final Project User Testing

Without clear instructions, the user was confused about how the buttons and notes corresponded to each other, so I had to explain the game before the user tested it. I think everything will be fine once I combine the game and the tutorial, and maybe add an instructions page in case the user is still confused after the tutorial.

Everything in my experience is working well: the buttons correspond to the notes and sounds, a wrong button press deducts 5 points, and pressing the correct button for the note earns 10 points. I might want to add a way to save the highest score so far, so the game becomes competitive.
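The scoring rule above is simple enough to sketch directly, and the high score could be kept without an extra library by using the browser's built-in localStorage (function and key names here are assumptions, not the project's actual code):

```javascript
// Scoring rule described above: +10 for the correct button, -5 otherwise.
function updateScore(score, pressedButton, expectedButton) {
  return pressedButton === expectedButton ? score + 10 : score - 5;
}

// Possible persistence via localStorage (a browser built-in, no library
// needed); guarded so the logic also runs outside a browser.
function saveHighScore(score) {
  if (typeof localStorage !== "undefined") {
    const best = Number(localStorage.getItem("highScore")) || 0;
    if (score > best) localStorage.setItem("highScore", String(score));
  }
  return score;
}
```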

What I want to improve in my experience is the overall look: I want to laser-cut the box that will hold the Arduino, and I want to paint some of the buttons to match the music therapy note colors.

Final project User testing

Video: 

IMG_4223

 

User testing analysis: 

The basics of my project were easily understood by users: without needing many instructions, they all understood that the buttons correspond with the graphics shown in p5. Nothing particularly confused them about the mapping of the controls and the experience. However, the drum selection dropdown was mostly ignored by everyone I showed the project to. I felt the need to tell users that this feature was built in, so I think I can make it clearer that the dropdown menu is available for them to choose different drum kits.


Assignment 13: Final Project User Testing

Progress

I finally finished the hardware aspect of my project, in terms of making the head, body, and arm of the cat (the ears are still missing, though). During the process, I encountered some issues, mainly with the pressure sensor. First, I made my own pressure sensor using Velostat, and it worked, as shown in the final project proposal for Assignment 12. However, it was too small for what I actually need for the final piece, so I tried making it on a larger scale, but the larger sensor did not give a variety of readings: it only gave high values (in the 1000s) and rarely varied. I tried using a piezo sensor instead, but it also does not vary. So at the moment, the pressure sensor is not a functioning part of my project.

Other than that, I have made progress with the interface on the p5 side, finished building the arm and tested that it moves with the servo motor, and implemented the LCD to reflect what is written in the input box in the p5 sketch.

User Testing Video

Feedback and Reflections

For this user testing, I asked one of my friends to test my project. He was able to figure it out because I was debugging right in front of him, so he saw what actions I made and knew what to do. But if he hadn't seen me do it, he said he would have been confused, because I didn't give instructions for how to trigger the cat's hand to move.

There were also parts I couldn't finish implementing in time, namely illustrating the different images I want to pop up in the p5 sketch whenever the cat's hand moves. So he said the cat's hand moving right now seems a bit pointless (and I had to explain what was meant to happen).

After this user testing, I realized there are still a lot of improvements and edits I need to make. One is to give clear instructions in the p5 sketch so that users know exactly what to do. Another is to find a way to get the pressure sensor working. And finally, to finish the illustrations so that I can implement them in my p5 sketch.

Final Project: Progress

Concept

I decided to proceed with my preliminary idea and develop it further. I chose five specific symbols that I will focus on in my project: Mount Ararat, the Armenian rug, lavash (Armenian bread), the duduk (Armenian flute), and the pomegranate.

The project will be divided into five different sections (or elements) that users can explore by clicking the corresponding icon in the p5.js sketch.

Customize Your Armenian Wall Rug:
No Armenian home is complete without a rug on the wall. In this section, users can create their own personalized wall rug by interacting with physical buttons. There will be two buttons: one for switching the patterns and another for fixing them on the rug. When the first button is pressed, patterns will chaotically move and bounce around the rug in the p5.js sketch until the “stop” button is pressed.

Lavash Tonir Simulation:
A tonir is a traditional clay oven used for baking lavash. This section allows users to “bake” their own lavash. By covering the light sensor with their hand, users simulate pulling lavash from the tonir on the digital display. Each time the light sensor is covered, a newly baked lavash piece will be placed on the table, stacking one on top of the other.
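The cover-the-sensor interaction above amounts to detecting the moment the light reading drops, so each cover event bakes exactly one lavash rather than one per frame. A minimal sketch of that edge detection (the threshold value and names are assumptions):

```javascript
// Assumed analog reading below which the sensor counts as "covered".
const DARK_THRESHOLD = 200;

function makeLavashCounter() {
  let covered = false;
  let count = 0;
  // Call this with each new light-sensor reading from the Arduino.
  return function onLightReading(value) {
    if (value < DARK_THRESHOLD && !covered) {
      covered = true;
      count += 1; // a new lavash lands on the stack
    } else if (value >= DARK_THRESHOLD) {
      covered = false; // hand lifted; ready for the next bake
    }
    return count;
  };
}
```

Without the `covered` flag, holding a hand over the sensor would stack a new lavash on every reading instead of once per gesture.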

Breathing in Ararat:
Everyone in Armenia dreams of having a view of the legendary Mount Ararat from their home. However, even when the view is clear, Ararat is often covered by clouds. In this section, users can clear the clouds by blowing on a 3D-printed mountain. A DHT11 sensor will detect breath through changes in temperature and humidity. This sensor was chosen because, unlike sound-based detection, the DHT11 measures subtle shifts, ensuring precise breath recognition.

Virtual Duduk Experience:
The duduk is an iconic Armenian musical instrument traditionally made from apricot wood. This interaction is heavily based on the p5.js sketch: pressing virtual holes on the duduk triggers sounds from three piezo buzzers connected via Arduino.

Heartbeat-Responsive Pomegranate:
Inspired by Sergei Parajanov’s cinematic masterpiece The Color of Pomegranates (1969), this element allows users to discover their “pomegranate color” based on their heartbeat. Data from a heartbeat sensor will be shown in the p5.js sketch as a color-changing pomegranate, with each heartbeat value generating a unique RGB combination.
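One way the heartbeat-to-color mapping could work is a deterministic function from the BPM value to an RGB triple, so the same heartbeat always yields the same pomegranate color; the specific formula below is an illustration, not the project's actual mapping:

```javascript
// Assumed mapping: keep red dominant so the result stays pomegranate-like,
// and derive green/blue from the heartbeat so each BPM gives a unique hue.
function heartbeatToRGB(bpm) {
  const r = 120 + (bpm * 3) % 136; // red stays in the 120-255 range
  const g = (bpm * 7) % 100;
  const b = (bpm * 13) % 100;
  return [r, g, b];
}
```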

Progress

I have designed a rough sketch of how the project will look both on the Arduino circuit and in the p5.js sketch (see below).

As mentioned in my preliminary idea documentation, I plan to create a map within the p5.js sketch to help users navigate through all five elements. The icons will be clickable only in the p5.js interface, leading users to each element’s page, which will include a brief description with interesting facts about the symbol and interaction guidelines.

The physical layout of the elements on the Arduino side will visually resemble the map in the p5.js sketch. To conceal the circuit and sensors, I will build a wooden box that covers everything except the LEDs and buttons. For aesthetic purposes, I will cover the box with a piece of fabric. Additionally, I will place a 3D-printed mountain on the box (with a small hole at the top for the sensor) to make it more intuitive for users to “blow” on it.

Final Project Progress

Concept

I have decided to work on the idea that I initially proposed, but with several alterations that will allow me to focus on the idea of the distorted image. The glasses will be asymmetrical, with one square and one oval frame, and the two lenses will change the eye size in different ways: one making it smaller (for myopia) and the other making it larger (for hyperopia). The web camera will project an image of the user's eye area on the screen via the p5.js code, where the eyes will change size and move, creating an almost hypnotising performance accompanied by the audio coming from the glasses.

Arduino

Electronic components (piezo buzzers) will be embedded into the temples of the glasses so that they act as speakers that let the user hear the melody clearly. The buzzers will be connected to the Arduino with wires, and the board will be hidden in a 3D-printed eyeball that the user can hold.

p5.js

Using the WebGazer.js library, I will track the user's eyes and show a representation of them on the screen. The image of both eyes will change on the screen, creating a likeness of a mandala ornament, which will be done by defining the movement of the eye image extracted from the web camera.

Progress

I have found several examples of other p5.js projects whose creators used the same library and came up with various ways of working with the image of eyes and gaze tracking. This allowed me to understand the logic of working with this library and its possible application to my work.

Final Project Progress

Concept

I decided to settle on creating a simple robot for the final project. My idea was inspired by a robot waiter in one of the restaurants I visited back home. I found it cool that someone could come up with such a thing, and it pushed me to try to emulate it. I will be creating a simple robot that moves around on the user's command. The robot will be able to do a simple dance, spinning around with its hands moving up and down, when the user commands it to. I would also like the robot to be able to do simple writing, and to measure distance and give feedback to the user.

Arduino 

The Arduino will be the basis of the project, as it will receive input from p5 to move the robot. The robot is powered by four Arduino-controlled DC motors that respond to inputs from p5, plus two servo motors that control the robot's hand movements. I will also include a distance sensor that lets the robot read its distance from objects and send the value to p5 for the user, and LED lights that will represent the robot's eyes.

P5

I was conflicted about what my p5 side would do, but I settled on using ml5.js Handpose to let the user control the robot with their hands. Since I will have two modes, one for moving the robot and one for drawing with it, I intend for the robot to write on a whiteboard in draw mode while p5 gives a visual representation of what the robot is drawing. In moving mode, p5 will take input from the user and send it to the Arduino to control the robot.
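A rough sketch of how the two modes might dispatch a detected hand gesture to robot commands (the gesture names and command strings are assumptions; the real Handpose output would first be classified into gestures like these):

```javascript
// Assumed gestures and commands, purely for illustration.
function gestureToCommand(mode, gesture) {
  if (mode === "move") {
    const map = { open: "forward", fist: "stop", point: "turn" };
    return map[gesture] ?? "stop"; // default to a safe command
  }
  // Draw mode: gestures become pen actions on the whiteboard.
  return gesture === "fist" ? "pen_up" : "pen_down";
}
```

Keeping the mode check in one dispatch function means the serial code that forwards commands to the Arduino never needs to know which mode is active.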

Progress

So far I have been able to make the base, which is the wheels controlling the movement of the robot, and I have added the servo motors that will control the hands. I want to create a solid base that will hold the components and make the robot aesthetically pleasing. I also want to 3D print a body for the robot, but I am yet to do that.


FINAL PROJECT – PROGRESS

Concept:

I changed my concept from a mood board to a Punching Bag Game that combines physical activity with digital feedback and humor. Players aim to achieve the highest punch strength and combo streak, all while enjoying dynamic visuals and memes in the P5.js interface based on their performance. A wooden control box houses an LCD, LEDs, and an arcade button, while the P5.js visuals make the game fun and engaging with score-based meme displays.

How to Play the Game:

  1. Start the Game:
    • Press the arcade button on the wooden box to begin.
    • The game resets all scores and displays “Game Start” on the LCD and P5.js interface.
  2. Punch During the Green Light:
    • Wait for the green LED to light up and the P5.js display to say “Punch Now!”
    • Punch the bag as hard as you can during this phase.
    • If you’re too slow and punch during the red light, you lose!
  3. Score Points:
    • Your punch strength determines the points added to your score.
    • Achieve combos by landing consecutive valid punches.
  4. Watch for Memes:
    • As your score increases, P5.js will display funny memes tied to your performance.
  5. End of Game:
    • After a set time or when the player stops, the LCD and P5.js show your final score and highlight your best streak.
    • Press the arcade button to restart.
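The green/red rule in steps 2–3 can be sketched as a single judging function (the names and the strength-to-points rule are assumptions, not the final implementation):

```javascript
// state: { score, combo, lost }
function judgePunch(phase, strength, state) {
  if (phase === "red") {
    // Punching during the red light ends the game.
    return { ...state, lost: true };
  }
  // During the green light, strength becomes points and extends the combo.
  return {
    score: state.score + strength,
    combo: state.combo + 1,
    lost: false,
  };
}
```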

Hardware Components:

  1. Accelerometer sensor (MPU-6050):
    • Measures the strength of punches by detecting acceleration on X, Y, and Z axes.
    • Attached to the back of the punching bag.
    • Sends punch data to the Arduino for calculation and updates.
  2. 16×2 LCD Display:
    • Game Status: Displays “Punch Now!” or “Wait…” depending on the phase.
  3. LEDs:
    • Green LED: Indicates when the player can punch.
    • Red LED: Indicates when punching is not allowed.
    • LED Strip: Shows the power of the punch for aesthetics (like a fuel bar).
  4. Arcade Button:
    • Resets and starts the game.
  5. Wooden Box:
    • Organizes all components in a neat, arcade-style housing.

Arduino Program:

Inputs:

  • Accelerometer: Detects and calculates punch strength.
  • Arcade Button: Starts and resets the game.

Outputs:

  • LCD Display: Shows punch strength, score, and game state.
  • LEDs: Indicate when the player can punch or must wait.
  • Serial Communication: Sends punch data and game updates to P5.js.
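A common way to reduce the MPU-6050's three axes to one punch-strength number is the vector magnitude, with the constant 1 g of gravity subtracted so the value is zero at rest. This would run on the Arduino in practice; it is shown here in JavaScript for illustration, with an assumed scale factor:

```javascript
// ax, ay, az: acceleration in g units from the MPU-6050.
function punchStrength(ax, ay, az) {
  const magnitude = Math.sqrt(ax * ax + ay * ay + az * az);
  // Remove the gravity component and scale; clamp so rest reads as 0.
  return Math.max(0, Math.round((magnitude - 1) * 10));
}
```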

 P5.js Program:

Inputs (from Arduino):

  1. Punch Strength:
    • Dynamically fills a power bar and displays the strength numerically.
  2. Game State:
    • Updates the screen with “Punch Now!” or “Wait…”.
  3. Score & Combo Streak:
    • Updates the scoreboard in real-time.

Outputs (to Arduino):

  1. Start Signal:
    • Resets the game and all scores.
  2. Difficulty Setting:
    • Adjusts thresholds (e.g., punch strength required to score).

Visual Features:

  1. Scoreboard: Tracks current score, best score, and combo streak.
  2. Punch Strength Visualizer
  3. Memes Based on Performance
  4. Arcade-like aesthetic designs

Sound:

I will have sound effects play from p5 on my laptop rather than from the Arduino board.

I still haven't created the code, due to changing my whole concept and idea. However, I have the dimensions of the box created using MakerCase. As for the setup on the Arduino board, I did test the components (LCD, LEDs, etc.); however, I didn't have the sensor or LED strip with me at the time, so I have to redo it with these included.


FINAL PROGRESS – GOAL RUSH

Finalized Concept for the Project

For my final project, I will create Goal Rush, a football reaction game where the player acts as a goalkeeper in a penalty shootout. The goal is to save as many penalty shots as possible by reacting to random signals. The project uses a physical setup with three buttons and corresponding LEDs to simulate the direction of the ball (left, center, or right). Each button press simulates the goalkeeper's dive to save the ball.

The buttons, LEDs, and game logic are connected to and managed by an Arduino. The Arduino communicates with a p5.js program, which mirrors the physical gameplay with a digital goalpost, goalkeeper, and ball animations. The LEDs light up to signal where the ball is “shot,” and the player must press the corresponding button to save the goal. The correct press sends a signal to p5.js, making the goalkeeper dive in the corresponding direction. If the player reacts too late or presses the wrong button, the ball goes into the net, and a missed goal is recorded.

Arduino Program Design

  1. Buttons:
    • Button 1 (Left):
      • Connected to digital pin 2.
      • Represents the goalkeeper diving to the left.
    • Button 2 (Center):
      • Connected to digital pin 3.
      • Represents the goalkeeper staying in the center.
    • Button 3 (Right):
      • Connected to digital pin 4.
      • Represents the goalkeeper diving to the right.

What It Does:
Each button, when pressed, sends a signal to the Arduino. The program identifies which button was pressed and determines if it corresponds to the lit LED.

  2. LEDs:
    • LED 1 (Left):
      • Connected to digital pin 5.
      • Lights up to indicate the ball is being “shot” to the left.
    • LED 2 (Center):
      • Connected to digital pin 6.
      • Lights up to indicate the ball is being “shot” to the center.
    • LED 3 (Right):
      • Connected to digital pin 7.
      • Lights up to indicate the ball is being “shot” to the right.

What It Does:
The LEDs are controlled by the Arduino to light up randomly, signaling where the ball is heading. This gives the player a cue for which button to press.

  3. Wires:
    • Signal Wires: Connect each button and LED to their respective digital pins on the Arduino to transmit input and output signals.
    • Power Wires: Connect all components to the Arduino’s 5V and GND pins to ensure proper power supply.

Arduino Program Workflow:

  1. Random LED activation:
    • The Arduino randomly selects one of the LEDs (left, center, or right) to light up, signaling the ball’s direction.
    • A small delay allows the player time to react.
  2. Button Monitoring:
    • The Arduino continuously checks the state of the buttons using digitalRead.
    • If a button is pressed, the Arduino determines whether it matches the lit LED:
      • Match: Sends a success signal (-1, 0, 1 based on the button pressed) to p5.js.
      • Mismatch or No Press: Sends a failure signal to p5.js.
  3. Serial Communication:
    • Sends the button state (-1, 0, or 1) and save/miss result to p5.js, where the digital game is updated accordingly.
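The match/mismatch decision in step 2 could be sketched like this, using the same -1/0/1 encoding for left/center/right that the workflow describes (the function name and the `null` no-press convention are assumptions):

```javascript
// litLed: -1, 0, or 1 for the LED the Arduino lit.
// pressedButton: -1, 0, 1, or null if no button was pressed in time.
function judgeDive(litLed, pressedButton) {
  if (pressedButton === null) return { dive: null, saved: false };
  return { dive: pressedButton, saved: pressedButton === litLed };
}
```

The `dive` field is what p5.js would use to animate the goalkeeper, while `saved` drives the scoreboard.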

p5.js Program Design

  1. Digital Goalpost:
    • Displays three zones (left, center, right) where the ball can be shot.
  2. Digital Goalkeeper:
    • Moves left, center, or right based on data received from Arduino (-1 for left, 0 for center, 1 for right).
  3. Ball Animation:
    • A ball is animated to travel toward one of the three zones, matching the lit LED in the physical setup.
  4. Scoreboard:
    • Tracks successful saves and missed goals.
  5. Game Timer:
    • Limits the game duration and increases difficulty by speeding up the ball animations over time.

p5.js Program Workflow:

  1. Input from Arduino:
    • Receives data from Arduino indicating which button was pressed and whether it was correct.
    • Updates the goalkeeper’s position (left, center, or right) based on the button data.
  2. Random Shot Generation:
    • Randomly determines the ball’s trajectory (left, center, or right), mirroring the Arduino’s LED activation.
  3. Collision Detection:
    • Checks whether the goalkeeper’s position matches the ball’s trajectory when the ball reaches the goal:
      • Match: Displays a save animation and increases the score.
      • Mismatch: Displays a missed goal animation.
  4. Visual Outputs:
    • Updates the digital display to show the ball’s movement, the goalkeeper’s dives, and the game’s score and timer.
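The difficulty ramp mentioned under the game timer (step 5 of the design) could be a simple function of elapsed time; the base speed and ramp rate below are assumptions for illustration:

```javascript
// Speed up the ball as the game progresses: 5% faster per elapsed second.
function ballSpeed(baseSpeed, elapsedSeconds) {
  return baseSpeed * (1 + 0.05 * elapsedSeconds);
}
```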