(initial) User testing

Testing

So far, my code is able to recognize the user’s different poses for the rock-paper-scissors game. By uploading the training data (images of the different poses) to Teachable Machine, the model is now able to reliably distinguish between the three actions in the game. I tested this out myself, as shown in the video below. I was glad to see that the image recognition model also worked well when users other than myself tested it, even though the training data only included me.

https://youtube.com/shorts/dKZt07FMDyg?feature=share

User testing

First of all, I changed my topic again. Now, it is about creating an “immersive” experience of the nature of my country, Kazakhstan. I just miss home and got inspired after listening to some Kazakh songs. For this project, I am using a potentiometer to change the backgrounds and a ZX detection sensor to move a horse on the screen and change the colour of the LED lights from green to yellow, so that there is an interactive experience with the environment.
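To make the setup concrete, here is a minimal sketch of what the Arduino side of this idea can look like: read the potentiometer and the ZX sensor, and send both values to p5.js over serial. The pin numbers, the SparkFun ZX_Sensor library, and the comma-separated format are assumptions for illustration rather than my exact final code.

```cpp
// Minimal illustrative sketch: read a potentiometer and the ZX sensor,
// then send "pot,x" lines over serial for p5.js to parse.
// Assumes the SparkFun ZX_Sensor library and its default I2C address (0x10).
#include <Wire.h>
#include <ZX_Sensor.h>

const int POT_PIN = A0;           // assumed analog pin for the potentiometer
ZX_Sensor zx = ZX_Sensor(0x10);   // 0x10 is the sensor's default I2C address

void setup() {
  Serial.begin(9600);
  zx.init();                      // initialise the ZX sensor over I2C
}

void loop() {
  int pot = analogRead(POT_PIN);  // 0-1023, used to pick the background in p5.js
  int x = -1;                     // -1 means "no hand detected"
  if (zx.positionAvailable()) {
    x = zx.readX();               // 0-240, horizontal hand position
  }
  Serial.print(pot);
  Serial.print(',');
  Serial.println(x);              // p5.js splits each line on the comma
  delay(50);                      // roughly 20 updates per second
}
```

On the p5.js side, the serial handler would just split each incoming line on the comma and map the first number to the background and the second to the horse’s position.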

Overall, I was able to finish the main functionality of the program and establish the communication between p5.js and Arduino. However, I keep struggling with the LED strip. I tried to solder it many times, but it keeps falling off (and the strip is becoming shorter and shorter). I asked for help, and although the connection seemed good, the strip still does not respond to the code. Over the weekend, I worked with a soldered LED strip, which is much shorter, but one of its wires fell off just yesterday… so now that one does not work either, even though I wrote the code around it. The ZX detection sensor is another story. In theory, it is a really fun and useful sensor with endless possibilities… but it keeps giving me random values, which makes it complicated to control.
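One idea I want to try for the random readings is to smooth them on the Arduino before they reach p5.js, and to drive the strip from the smoothed value. Below is a rough sketch of that, assuming the SparkFun ZX_Sensor library and an Adafruit NeoPixel-compatible strip; the pins, pixel count, and averaging window are placeholders, since my real strip is still not responding.

```cpp
// Illustrative sketch: smooth the ZX sensor's x readings with a running
// average, then fade the LED strip from green to yellow based on the result.
// The NeoPixel library, pins, and pixel count are assumptions.
#include <Wire.h>
#include <ZX_Sensor.h>
#include <Adafruit_NeoPixel.h>

const int LED_PIN = 6;          // assumed data pin for the strip
const int NUM_PIXELS = 20;      // assumed strip length
const int WINDOW = 8;           // number of readings to average

ZX_Sensor zx = ZX_Sensor(0x10);
Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

int readings[WINDOW];           // circular buffer of recent readings
int idx = 0;
long total = 0;

void setup() {
  Serial.begin(9600);
  zx.init();
  strip.begin();
  strip.show();                 // start with all pixels off
  for (int i = 0; i < WINDOW; i++) readings[i] = 0;
}

void loop() {
  if (zx.positionAvailable()) {
    int x = zx.readX();         // raw reading, 0-240, can be jumpy
    total -= readings[idx];     // drop the oldest sample
    readings[idx] = x;
    total += x;                 // add the newest one
    idx = (idx + 1) % WINDOW;

    int smoothX = total / WINDOW;   // running average: much steadier value
    Serial.println(smoothX);        // this is what p5.js would move the horse with

    // Fade green to yellow by raising the red channel with the hand position.
    int red = map(smoothX, 0, 240, 0, 255);
    for (int i = 0; i < NUM_PIXELS; i++) {
      strip.setPixelColor(i, strip.Color(red, 255, 0));
    }
    strip.show();
  }
}
```

Even a small window like this should stop the horse from jittering, and the same smoothed value would keep the strip’s colour from flickering.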

I asked my friend to test the program so that I could get a clear view of what additions and changes it needs. First, I need to add instructions. Second, I need to adjust my sensor. She also asked if I could add more interaction with the sensor; I will think about it once the sensor works normally without creating a messy picture 😀 Still, she loved the design and the music in the background.

User Testing – Cybertruck by Zaeem&Dev

Progress

So far, we have added an obstacle detection system using the ultrasonic sensor and improved the car’s maneuverability. Earlier, the car would only curve slightly when moving left and right. Now, it turns in place to adjust its direction before moving forward or backward.
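For context, turning in place just means spinning the two sides of the car in opposite directions. A rough sketch of that idea is below, assuming an L298N-style dual H-bridge driver; the pin numbers and helper functions are placeholders rather than our actual wiring.

```cpp
// Illustrative sketch: pivot the car in place by driving the left and right
// motors in opposite directions (assumes an L298N-style dual H-bridge).
// Pin numbers are placeholders, not our actual wiring.
const int LEFT_IN1 = 7, LEFT_IN2 = 8, LEFT_EN = 9;      // left motor channel
const int RIGHT_IN1 = 4, RIGHT_IN2 = 5, RIGHT_EN = 3;   // right motor channel

void setMotor(int in1, int in2, int en, int speed) {
  // speed > 0: forward, speed < 0: reverse, 0: stop
  digitalWrite(in1, speed > 0);
  digitalWrite(in2, speed < 0);
  analogWrite(en, abs(speed));          // 0-255 PWM duty cycle
}

void turnRightInPlace(int speed) {
  setMotor(LEFT_IN1, LEFT_IN2, LEFT_EN, speed);          // left side forward
  setMotor(RIGHT_IN1, RIGHT_IN2, RIGHT_EN, -speed);      // right side backward
}

void setup() {
  int pins[] = {LEFT_IN1, LEFT_IN2, LEFT_EN, RIGHT_IN1, RIGHT_IN2, RIGHT_EN};
  for (int p : pins) pinMode(p, OUTPUT);
}

void loop() {
  turnRightInPlace(180);   // pivot on the spot
  delay(500);
  turnRightInPlace(0);     // stop
  delay(1000);
}
```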

Car Movement

After user testing, we have concluded that the car works as intended. The ultrasonic sensor adds the automatic braking functionality that we wanted to include from the start. There are a few problems with the sensor, though.
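For reference, the braking logic itself is simple: take a distance reading and cut the motors when something is closer than a threshold. Here is a simplified sketch, assuming an HC-SR04-style sensor; the pins, the stop distance, and the commented-out motor helpers are placeholders rather than our exact code.

```cpp
// Simplified automatic braking: stop the motors when the ultrasonic sensor
// reports an obstacle closer than STOP_DISTANCE_CM (HC-SR04-style sensor).
// TRIG_PIN/ECHO_PIN and the motor helpers are placeholders.
const int TRIG_PIN = 11;
const int ECHO_PIN = 12;
const int STOP_DISTANCE_CM = 20;        // brake when something is this close

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);          // 10 microsecond trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 25000);  // time out after ~25 ms
  if (duration == 0) return 999;         // no echo: treat as "nothing ahead"
  return duration / 58;                  // convert echo time to centimetres
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  // motor pin setup would go here
}

void loop() {
  if (readDistanceCm() < STOP_DISTANCE_CM) {
    // stopMotors();                      // placeholder: cut power to the wheels
  } else {
    // driveForward();                    // placeholder: keep driving
  }
}
```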

 

We have also figured out how to use the XBee shield to establish a wireless connection between two computers, although we ran into a problem here too. We will need to book office hours with the Professor to figure this out.
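For future reference, the Arduino side of this link can stay very small, because the shield essentially turns the XBee into another serial port. Below is a rough relay sketch, assuming the shield routes the XBee to pins 2 and 3 through SoftwareSerial; the exact pins depend on the shield’s switch setting, so treat them as an assumption.

```cpp
// Rough sketch: relay bytes between the USB serial port (computer / p5.js side)
// and the XBee radio. Assumes the shield routes the XBee to pins 2/3 via
// SoftwareSerial; the actual routing depends on the shield's switch.
#include <SoftwareSerial.h>

SoftwareSerial xbee(2, 3);   // RX, TX (assumed)

void setup() {
  Serial.begin(9600);        // USB serial to the computer
  xbee.begin(9600);          // XBee default baud rate
}

void loop() {
  // Anything the computer sends goes out over the radio...
  if (Serial.available()) {
    xbee.write(Serial.read());
  }
  // ...and anything received over the radio goes back to the computer.
  if (xbee.available()) {
    Serial.write(xbee.read());
  }
}
```

If the link works this way, p5.js would still just talk to one serial port, and the radio hop would happen behind it.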

Challenges

As observed in the video shared above, the car moves forward in intervals. This only happens when the code for the ultrasonic sensor is uploaded, so we have deduced that it must be caused by the delays involved in taking each ultrasonic reading. Furthermore, we cannot figure out how to connect the XBee shield to p5.js. We figured out how to connect the two XBee modules to each other, but we cannot properly connect the link to p5.js and then to the Arduino. Solving this issue will mark the end of our project.
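One direction we want to try for the stuttering is to stop blocking the main loop while waiting for the echo. Below is a sketch of that idea: trigger a reading only every few dozen milliseconds using millis() and cap how long we wait for the echo, so the motor code keeps running in between. The pins, interval, and timeout are assumptions.

```cpp
// Sketch of a less blocking approach: only trigger an ultrasonic reading every
// SENSE_INTERVAL_MS and cap how long we wait for the echo, so the motor code
// in loop() is not stalled. Pin numbers and timings are assumptions.
const int TRIG_PIN = 11;
const int ECHO_PIN = 12;
const unsigned long SENSE_INTERVAL_MS = 60;   // one reading every 60 ms
unsigned long lastSense = 0;
long lastDistanceCm = 999;                    // most recent reading

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  unsigned long now = millis();
  if (now - lastSense >= SENSE_INTERVAL_MS) {
    lastSense = now;
    digitalWrite(TRIG_PIN, LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH);
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN, LOW);
    // Time out after ~12 ms (about 2 m of range) so a missed echo cannot stall the loop.
    long duration = pulseIn(ECHO_PIN, HIGH, 12000);
    if (duration > 0) lastDistanceCm = duration / 58;
  }

  // Motor control runs on every pass through loop(), using lastDistanceCm,
  // instead of waiting on the sensor each time.
  // if (lastDistanceCm < 20) stopMotors(); else driveForward();
}
```

The only blocking left is the single capped pulseIn() call, which should be short enough not to show up as visible stutter.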