User testing

User testing with no instructions:

When two users tried my project without any instructions, both were able to figure out how to navigate it. The p5.js screen made it clear what action was required, so they understood that they had to respond to what they saw on the screen. One user moved through the interaction smoothly and quickly connected the on-screen commands to the physical controls like the buttons and keypad.

The other user showed some hesitation at the beginning. They got confused when instructions appeared quickly or were repeated, especially when multiple inputs were introduced at the same time. This made the experience feel slightly overwhelming in that moment. Even then, they were still able to understand the system after a short time. Both users understood the mapping between the controls and what happens in the experience, but one needed more time to adjust. This shows that the interaction is intuitive but depends on the user's familiarity with fast-paced games.

What worked well and what could be improved

Several parts of the experience worked well. The card interaction, LEDs, and overall input and output system were clear and responsive. The home page and flow of the experience helped guide the user, and the audio added an important layer by giving clear feedback for actions. These elements made the experience feel engaging and easy to follow once the user started playing.

One area that could be improved is the clarity of the instructions. Making them more specific and better paced would help users who are less familiar with this type of interaction. Another important improvement is the stability of the buttons. When users started playing faster, the buttons sometimes shifted or felt less secure, which could interrupt the experience.

Reinforcing the buttons and making them more stable would make the interaction feel more reliable and comfortable. This was clear during testing and also when I disassembled and reassembled the project, which helped me better understand how to improve the physical setup.

What needed explanation and how to improve

Most parts of the project did not need explanation because users were able to understand them through interaction and feedback. The main part that needed explanation was the card. Since it is a separate component, users did not immediately understand how it worked or what action triggered it.

To improve this, I could make the card interaction clearer by adding more detailed instructions or clearer visual cues. For example, showing a clearer message on the screen when the card is needed or adding simple labeling near the sensor would help users understand it faster. This would make the experience more accessible for first-time users without changing the core interaction.

 

User one:

 

 

User two:

https://vimeo.com/1188956184?share=copy&fl=sv&fe=ci

Week 13: User Testing

Documentation:

Reflection:

My project was generally easy for users to figure out, mainly because instructions are provided at each step. One point of confusion was that the experience begins with the tamper selected, which made users unsure whether to start there. However, after referring to the steps, they were able to correct themselves and continue. One user also initially tried to use the joystick instead of the sensor for movement, but quickly understood the correct interaction. Overall, users were able to grasp the mapping between the controls and the actions and complete the process smoothly.

Most parts of the experience are working well, especially since the instructions are clearly displayed within the sketch. However, this also depends on the user taking the time to read them. Areas that could be improved include the clarity of the sprite sheet animations and making the interaction with the ultrasonic sensor more intuitive. This could be addressed by refining the instructions to more clearly indicate that movement in front of the sensor is required.

The main aspects I needed to explain were that users should use the physical button rather than the keyboard, and that the animations are controlled through hand movement using the ultrasonic sensor instead of controls. I believe this will become more intuitive once the physical setup is finalized and the wiring is concealed, allowing users to focus entirely on the interaction without distractions.

User Testing – Week 13

For my project, I created an interactive maze game where the player controls Minnie Mouse using two potentiometers connected to an Arduino. The maze appears on the screen, and the player must guide Minnie through it without touching the walls. The player collects items such as cheese, stars, and hearts, which trigger a yellow LED and a pickup sound. Hitting a wall turns on a red LED and plays a bump sound, and reaching Mickey at the end lights a green LED and plays a happy sound.

For user testing, I asked my sister to try the game. She immediately noticed that Minnie did not move smoothly. Instead of following the potentiometer movements, the character would jump or move to random parts of the maze. Even when she moved the knobs slowly, the character kept moving as if it were lagging, making the maze time-consuming or outright impossible to complete. She described the controls as confusing and inconsistent. However, she said that the start page was clear, connecting the Arduino was easy, and the instructions were clear as well.

After watching my sister play, I realized the problem came from how I mapped the potentiometer values to the maze. The potentiometer sends numbers from 0 to 1023, but the way I converted those numbers into Minnie's position on the screen was wrong. Because the mapping didn't match the actual maze area, Minnie kept appearing in the wrong places. Put simply, the numbers coming from the Arduino did not correspond to the part of the screen where the maze actually is.
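As a sketch of the fix, the raw readings need to be scaled into the maze's pixel bounds rather than the full canvas. The maze coordinates below (`mazeX`, `mazeY`, `mazeW`, `mazeH`) are placeholder values, not my actual layout, and `mapRange` simply mirrors what p5.js's built-in `map()` does:

```javascript
// Linear remap of a value from one range to another (same math as p5.js map()).
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// Assumed maze area on screen, for illustration only.
const mazeX = 100, mazeY = 50, mazeW = 400, mazeH = 400;

// Convert raw potentiometer readings (0-1023) into a position inside the maze.
function potToMazePosition(potX, potY) {
  return {
    x: mapRange(potX, 0, 1023, mazeX, mazeX + mazeW),
    y: mapRange(potY, 0, 1023, mazeY, mazeY + mazeH),
  };
}
```

The key point is that the output range is the maze rectangle, not `0` to `width`; using the full canvas as the output range is exactly what makes the character land outside the maze.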

From this user testing session, I learned that the mapping has to be extremely accurate when you connect real-life controls to movement on the screen. Even a tiny mistake in the numbers can completely break how the game feels. I also realized how helpful it is to test with someone else, because they notice problems that I had gotten used to. Even though I already knew before user testing that the game was not working perfectly, the testing helped me understand exactly what was wrong, since my sister could see the same issue.

User Testing:

User Testing Final Project

I think it was pretty easy for my user to figure out how to use/play the game but I believe that’s mainly because I gave them a glove with foils attached to the fingers, so it was quick to understand what to do. By that, I mean that they understood that touching fingers in the glove would trigger the ballerina to dance.

But they did get confused about the rules of the game. The idea of losing points and the ballerina falling was not fully working yet, so they were confused about how to win or lose. I hadn't created an instructions page yet, so that is part of the reason.

One particular piece of feedback I got was that the glove was difficult to use. Some presses would not register, or they would have to press harder. To fix that, I will need to find a way to attach my aluminum foils to the glove in a neater, more seamless way.

 

User Testing

Week 13 – User Testing

I let my siblings test it out, and they were able to figure out the basic idea pretty quickly: they needed to press the colored arcade buttons when the falling tiles reached the hit line on screen. The colors helped a lot because they made the connection between the physical buttons and the tiles more obvious. But they were confused at first about exactly when they were supposed to press, since there were no instructions during gameplay, so they needed a few seconds of trial and error to understand the timing.
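The timing they had to discover by trial and error can be sketched as a simple hit window: a press counts as a hit when the tile is within some tolerance of the hit line. The `hitLineY` and `windowPx` values below are illustrative, not my game's actual numbers:

```javascript
// Assumed hit-line position and timing tolerance, for illustration only.
const hitLineY = 560;
const windowPx = 40;

// Judge a button press: "hit" if the tile's y position is within
// windowPx pixels of the hit line at the moment of the press.
function judgePress(tileY) {
  return Math.abs(tileY - hitLineY) <= windowPx ? "hit" : "miss";
}
```

A short tutorial animation could simply show a tile crossing into this window with a "press now" cue, which would make the acceptable press moment visible instead of leaving it to guesswork.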

I think the mapping between the controls and the experience worked well because each button matched a lane and color on screen, which made the interaction feel natural. Once players understood that each button controlled a specific lane, they were able to continue without much help. The physical arcade buttons also made the experience feel more immersive compared to using a keyboard.

I think the strongest part of the experience was the atmosphere. The background visuals, projector reveal screen, sound design, and glitch effects made the game feel tense, and it matched the theme and concept well. They especially liked the reveal screen because it felt like a reward after completing each round and made them curious about the story.

The main area that could be improved is clarity at the beginning. One of them was unsure whether to press buttons immediately or wait for the tiles to reach the bottom hit line. I found myself explaining the timing and the goal of “rebuilding the signal” so they could progress. To make this clearer, I could improve the instruction screen by adding a short visual example or animation showing exactly when to press the buttons (like a quick tutorial).

(Also, I noticed I wrote 4 rounds but it’s kind of 5 because you have to play again to get to the final message…I am thinking about just reducing it; maybe it’s too long of a game??)

Week 14 – PawPortion User Testing

Some users were able to understand how the system worked overall, but there was confusion around the IR sensor. A few people tried to press it like a physical button, and they were unsure how hard to press, how close their hand should be, or how many fingers to use. I think this confusion happened because they interpreted it as a button rather than a proximity sensor, so the mapping between the control and the action wasn’t immediately clear.

Despite that, most users understood the general flow of the experience and were able to move between interacting with the machine and observing the outcome. One part that worked especially well was the portion size. The amount of pet food dispensed felt accurate, and the servo motor’s opening and closing motion was smooth and consistent. However, there was a small issue where some pieces of pet food would fly out from the sides, which could be improved with better containment.

The main part I had to explain was how to use the IR sensor properly. For one user, I also had to point out that they needed to place the bowl under the dispenser, even though there was a visual instruction. This might have been specific to that user, but it suggests that the instructions could be made more noticeable or clearer for first-time users.
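One way to make the IR sensor behave less like a button in code is to require the hand to stay within a distance threshold for a short dwell time before dispensing, so a deliberate hover triggers it while a quick pass (or a "press") does not. This is a hedged sketch, not my actual firmware logic, and `thresholdCm` and `dwellMs` are assumed values:

```javascript
// Assumed proximity threshold and dwell time, for illustration only.
const thresholdCm = 10;
const dwellMs = 300;

// Returns an update function that is fed (distance, timestamp) readings
// and returns true exactly once per sustained hover.
function makeDispenseTrigger() {
  let nearSince = null; // timestamp when the hand first came within range
  return function update(distanceCm, nowMs) {
    if (distanceCm < thresholdCm) {
      if (nearSince === null) nearSince = nowMs;
      if (nowMs - nearSince >= dwellMs) {
        nearSince = null; // reset: the next dispense needs another full dwell
        return true;      // dispense now
      }
    } else {
      nearSince = null;   // hand left the range, abandon the dwell
    }
    return false;
  };
}
```

Pairing this with an LED that lights up while the dwell is counting down would also give users the "is it sensing me?" feedback they were missing.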

Conway’s Prism – User Testing

Brother 1 User Testing:

Explained info: Rules of Conway’s Game of Life

Project Walk Through:

Bonus – 9 year old brother test subject:

Conclusion:

My first brother didn’t really know how to navigate the website, and I think this is due to a lack of knowledge of how simulations work in general; I had only briefly explained to him what Conway’s Game of Life is. To be more nitpicky, the paused/running text at the top left of the grid wasn’t very noticeable to either sibling, and I had to tell them both that the simulation was paused while they were just clicking the grid hoping to start something. The rest of the buttons weren’t really used either unless I explicitly told them about it; maybe part of that is just them not knowing what to do at all for the video. But I think the core issue is that most people aren’t familiar with how simulations work.

The experience itself works pretty well; it serves its purpose of showing something cool made entirely by the user. I do wish I had been able to add more external controls, but my potentiometer pins broke while I was trying to add that, and the joystick module just would not fit with my design, so I had to resort to controlling the simulation through p5.js (I did add some keyboard shortcuts).

To solve the lack-of-knowledge issue, if I had time I would add a tutorial overlay when the website first loads: a short explanation of what every button does in a speech-bubble style, having the user complete what the tutorial asks before moving on so they get familiar with the controls. I should also make the paused/running indication more visible, maybe by changing the grid outlines or the background from red to green, or something along those lines, something more “obvious.” A big fat “PAUSED” in the middle of the grid could work too!

The buttons aren’t really intuitive for anyone who isn’t aware of simulations in general; fast, slow, random, clear, and presets were all ignored unless I explicitly mentioned them, so this should also be fixed by my how-to-use overlay tutorial. However, the process of clicking on the grid and seeing things react when the simulation started seemed to make them understand the mapping almost instantly! So I would call this project a success, just lacking some intuitive upgrades to the UI.
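The rules I had to explain verbally are compact enough to state in code. Below is a minimal, generic Game of Life step over a 2D grid of 0/1 cells, with cells outside the edges treated as dead; this is a sketch of the rules themselves, not my project's actual implementation:

```javascript
// One generation of Conway's Game of Life over a rectangular grid of 0/1 cells.
function lifeStep(grid) {
  const rows = grid.length, cols = grid[0].length;
  const next = grid.map(row => row.slice());
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      // Count the live cells among the (up to) 8 neighbours.
      let n = 0;
      for (let dr = -1; dr <= 1; dr++) {
        for (let dc = -1; dc <= 1; dc++) {
          if (dr === 0 && dc === 0) continue;
          const rr = r + dr, cc = c + dc;
          if (rr >= 0 && rr < rows && cc >= 0 && cc < cols) n += grid[rr][cc];
        }
      }
      // Live cell survives with 2 or 3 neighbours; dead cell is born with exactly 3.
      next[r][c] = grid[r][c] ? (n === 2 || n === 3 ? 1 : 0) : (n === 3 ? 1 : 0);
    }
  }
  return next;
}
```

A tutorial overlay could step through exactly this: show a horizontal three-cell "blinker," run one step, and let the user watch it flip vertical, which conveys the birth/survival rules faster than any text.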

Final Project – User Testing

PRESSYR – User Testing Reflection

VDO:

https://youtu.be/GeI6QWZyzsk?si=tPqhPQI3fib59SSb

For my final project, I created an interactive piano-inspired instrument called PRESSYR using Arduino, FSR pressure sensors, a CD74HC4067 multiplexer, an I2C LCD display, an arcade power button, and a p5.js visual and sound interface. The project allows users to press pressure-sensitive pads to play musical notes while also generating sound and visuals on a computer screen.

User Testing Process:

I conducted user testing with four participants. During the testing session, I intentionally did not give them detailed instructions at first because I wanted to observe how naturally they could understand and interact with the project on their own.

One thing I noticed immediately was that almost every participant was initially confused about what the device actually was. However, most of them could somewhat guess that it was related to a piano or music instrument because the cardboard enclosure visually resembled piano keys. In addition, my house already contains a real piano, and everyone who participated already knew that I play the piano, so this context may also have influenced their assumptions.

The main confusion at the beginning was understanding what to press first and how to start interacting with the system. Fortunately, the LCD screen helped guide users by displaying “Press RED Button” when the system was off. This instruction helped participants understand that the red arcade button was the actual starting point for interaction.

After turning the system on, some participants were still slightly confused about what to do next. I think this happened for two reasons. First, the instructions on the LCD were written in English, and second, some participants did not immediately understand what the word “pads” referred to. Even so, after a short moment of exploration, every participant eventually understood that the pressure-sensitive pads were musical inputs that played notes like Do, Re, Mi, Fa, Sol, La, Ti, Do.
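The pad-to-note behaviour described above can be sketched as a simple lookup from pad index to solfège name and frequency. The octave (C4 to C5) and the press threshold below are assumptions for illustration, not necessarily what my code uses:

```javascript
// C major scale, C4-C5, as solfège names with equal-temperament frequencies (Hz).
const NOTES = [
  { name: "Do",  freq: 261.63 },
  { name: "Re",  freq: 293.66 },
  { name: "Mi",  freq: 329.63 },
  { name: "Fa",  freq: 349.23 },
  { name: "Sol", freq: 392.00 },
  { name: "La",  freq: 440.00 },
  { name: "Ti",  freq: 493.88 },
  { name: "Do",  freq: 523.25 },
];

// Assumed analog threshold (0-1023) below which a pad counts as unpressed.
const PRESS_THRESHOLD = 200;

// Map a pad index and its FSR reading to a note, or null if not pressed.
function padToNote(padIndex, fsrValue) {
  if (fsrValue < PRESS_THRESHOLD) return null;
  return NOTES[padIndex] ?? null;
}
```

Since FSRs report continuous pressure rather than just on/off, the same reading could later drive velocity-sensitive volume, which is one of the improvements listed below.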

Observations During Interaction:

One interesting observation from the testing session was that participants rarely looked at the computer screen visualizations while playing. Instead, they focused almost entirely on the physical pads and their hand movements.

I believe this happened mainly because users had to concentrate on pressing the pads accurately and listening to the melody from each key being produced. Since the sound feedback already confirmed that the interaction was working, participants did not feel a strong need to constantly look at the screen. The laptop screen was also positioned slightly farther away from the pads, which may have reduced visual attention even more.

Although it is possible that the visual effects on the screen were not large or attention-grabbing enough, I personally think the stronger reason was that users naturally prioritized the physical interaction and audio feedback over the visual component.

What Worked Well:

Several aspects of the project worked successfully during testing:

– The arcade power button clearly communicated the ON/OFF state of the system.
– The illuminated LED inside the arcade button provided immediate visual feedback.
– The LCD instructions helped guide users through the interaction process.
– The pressure-sensitive pads successfully triggered sound and visuals consistently.
– Users eventually understood the interaction flow without requiring direct explanation.
– The piano-inspired physical design helped communicate the musical concept of the project.

Areas for Improvement:

Some participants commented that the pressure pads felt larger than actual piano keys. This feedback was valuable because it highlighted how the scale of the interaction affected the realism of the experience.

In future versions of the project, I would like to:
– Reduce the size of the pads to better resemble real piano keys.
– Improve the visual prominence of the p5.js animations so users notice them more easily.
– Add clearer visual labels or symbols for first-time users.
– Experiment with different layouts and materials for a more polished interaction experience.
– Explore using velocity-sensitive audio or multiple sound layers for more expressive musical interaction.

Reflection:

This user testing session helped me realize how important clarity and interaction flow are in physical computing projects. Even though the project technically functioned correctly, small design decisions such as wording, placement of instructions, and physical scale strongly affected how users understood the experience.

Overall, the testing process was extremely useful because it revealed areas where the interaction was intuitive and areas where users still needed additional guidance. It also helped me think more carefully about how people naturally approach unfamiliar interactive objects for the first time.

Final Project Blog Post

I will be using this post to write the blog gradually instead of all at once:

PROGRESS:

  1. Arduino wiring: DONE
  2. Arduino code: DONE
  3. UI
  4. p5 code communication

30/04/26

I tried connecting my accelerometer to my breadboard without soldering. I placed my headers into the breadboard, placed the accelerometer on top, and connected the female end of a jumper wire on top. I downloaded the relevant libraries and wrote the code from this page. Unfortunately, the Arduino showed no connection whatsoever: the code I had was designed to print “No ADXL345 sensor detected.” to the serial monitor, and that is exactly what my serial displayed. Now I will be moving on to a different connection method, building on the unusual switch and musical instrument assignments we made. This idea will still follow the same concept of composing music in an opera; if you compose incorrectly, a buzzer will sound, and the ballet dancers will fall. Instead of using an accelerometer to measure the x-y positions of the user’s hand, I will attach foils and wires to each finger on a glove. The connections between the wire combinations will control a relevant aspect of the music, such as pitch, speed, note, and beats. This wiring idea was also heavily inspired by the emerging usage of TouchDesigner.

01/05/26

I decided to go on with my glove idea. I originally wanted to involve all 5 fingers, but then I realized that would complicate things, so I stuck to using three. One is in charge of playing the song, the second speeds it up, and the third slows it down. I tested my wiring by also connecting an LED to the breadboard to confirm the connections, and it worked. I didn’t connect my buzzer today, and honestly, I don’t think I will until the very end, because I got a new buzzer and its legs look extremely delicate. If I’m being honest, I’m not the most patient or gentle with any electronic device, so I don’t want to risk breaking yet another buzzer from the get-go.
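The three-finger logic can be sketched as a pair of small functions: each foil contact acts as a digital switch selecting an action, and the two speed fingers nudge a tempo value up or down. The action names, BPM step, and limits below are all illustrative, not my actual values:

```javascript
// Map the three finger contacts to an action; play takes priority if
// several foils happen to touch at once (an assumed tie-breaking rule).
function gloveAction(playContact, fasterContact, slowerContact) {
  if (playContact) return "play";
  if (fasterContact) return "speedUp";
  if (slowerContact) return "slowDown";
  return "idle";
}

// Keep a tempo in beats per minute, nudged by the speed fingers and
// clamped to an assumed playable range.
function makeTempoController(baseBpm) {
  let bpm = baseBpm;
  return {
    update(action) {
      if (action === "speedUp") bpm = Math.min(bpm + 10, 240);
      if (action === "slowDown") bpm = Math.max(bpm - 10, 40);
      return bpm;
    },
  };
}
```

Treating each finger as an independent switch like this also makes it easy to add the buzzer later: a wrong combination can simply map to a "buzz" action without rewiring anything.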

 

STIPEND USAGE:

  1. Accelerometer: 40 AED: https://www.amazon.ae/AOICRIE-GY-291-ADXL345-Acceleration-Transmission/dp/B09S633YTW/ref=sr_1_1?crid=15F0J7UIXQK35&dib=eyJ2IjoiMSJ9.gDQl-_42Yd4esBPagpxPgPQ1K0jRtEl1-iW409PI-IfKNZ2RDA0hRICe1Ca5CQO83N_NFecaKnxd0M0wb-MUX1e1_oV4-HaxCe8KarHRjisgkRf5YXzLzubvBosNxFh3jF1fxCOIuON14P6KvVqARPChru9DCNpV5LlMncvQg8Ro-_7YjQ8SDpSgTryUpC0bzRT_iYJgE7TdhPJPFwpEfobjkqCHFv6UofjnQWaQvz2MidmSXbQrPRS50HEMsv52y0cumuy0z_MNJD41on7dsMC64UjQqNZ6qDc8AXm-5ME.Jzj2a_U_k_AoU6BZxRHVxhwAFpV0Zcu0PJnDUZjpUM0&dib_tag=se&keywords=GY-291+3-Axis+Accelerometer+ADXL345&qid=1777274584&sprefix=gy-2913-axis+accelerometer+adxl345%2Caps%2C261&sr=8-1
  2. Jumper wires: 18 AED: https://amzn.eu/d/03P5tjOx

AI USAGE:

So far, I have used AI for this project to discuss my ideas and any problems I might face while building it.