Are they able to figure it out? Where do they get confused and why?
My friends were a bit confused about what to do at first. So, I included instructions in the top left corner: making a fist transitions between the different 3D models, and the index finger and thumb manipulate the sketch.
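To make the gestures concrete, here is a minimal sketch of how the fist and pinch detection could work. This is a hypothetical fragment, not the project's actual code: the keypoint indices follow the ml5.js handPose convention (0 = wrist, 4 = thumb tip, 8 = index tip, etc.), and the threshold values are placeholders.

```javascript
// Distance between two 2D keypoints ({x, y} objects).
function dist2d(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Treat the hand as a fist when every fingertip is close to the wrist.
// The threshold (in pixels) would need tuning for the real camera setup.
function isFist(keypoints, threshold = 60) {
  const wrist = keypoints[0];
  const tips = [4, 8, 12, 16, 20]; // thumb, index, middle, ring, pinky tips
  return tips.every((i) => dist2d(keypoints[i], wrist) < threshold);
}

// Map the thumb-index pinch distance to a model scale factor in [0.5, 2.0].
function pinchScale(keypoints, minD = 20, maxD = 200) {
  const d = dist2d(keypoints[4], keypoints[8]);
  const t = Math.min(Math.max((d - minD) / (maxD - minD), 0), 1);
  return 0.5 + t * 1.5;
}
```

In the p5.js draw loop, the sketch would switch to the next model whenever `isFist` flips from false to true, and apply `pinchScale` to the current model every frame.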
Do they understand the mapping between the controls and what happens in the experience?
They do understand it. The controls are already explained in the instructions in the top left corner, so there was no problem with that.
What parts of the experience are working well? What areas could be improved?
I’m still working on adding new 3D models and an explanation of how each one is tied to Kazakh culture. I want people to learn about Kazakh culture and to observe the items up close. I want to add a menu screen, another instructions screen, and three more Kazakh cultural elements. I also want more interactive features in the background of the 3D model, such as oscillating line drawings. In addition, I will be working heavily on the Arduino side from now on: I want the Adafruit NeoPixel to display a beautiful pattern rather than just turn a single color, so that others at the exhibition will come to take a look.
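One way the oscillating background lines could be built is with a small wave helper like the one below. This is a hypothetical sketch of the planned feature, not existing code; the function name and parameter defaults are my own.

```javascript
// Returns the vertical offset of a sine wave at horizontal position x
// and animation time t (e.g. p5's frameCount). Amplitude and wavelength
// are in pixels; speed controls how fast the wave drifts.
function waveY(x, t, amplitude = 30, wavelength = 120, speed = 0.05) {
  return amplitude * Math.sin((2 * Math.PI * x) / wavelength + t * speed);
}

// Inside a p5.js draw() loop, several layered waves could be drawn with:
// beginShape();
// for (let x = 0; x < width; x += 5) {
//   vertex(x, height / 2 + waveY(x, frameCount));
// }
// endShape();
```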
What parts of your project did you feel the need to explain? How could you make these areas more clear to someone that is experiencing your project for the first time?
I think using a fist to transition between the 3D models is a pretty cool idea, and it lets me explore hand gestures further. I will try to keep the instructions visible on the page at all times, so that if a user forgets them, they are always there. In the main menu I will also explain a bit about the project itself.

The project teaches people about Kazakh culture through interactive 3D objects. The user moves their hand in front of the webcam and controls a 3D asyq, kiiz ui, dombyra, or taqiya in real time. Hand rotation and hand openness change the rotation, scale, and animation of the models. The goal is to make cultural objects feel alive and playful.

Handpose in p5.js finds key points on the user’s hand, and the code maps these points to rotation and size values. The 3D models load from external OBJ files and update their rotation and scale every frame based on hand movement. The interaction is simple, so anyone can explore the cultural objects: the user brings their hand into the camera, turning the index finger rotates the model, and a bigger pinch distance between thumb and index makes the model grow. I also want to implement moving the hand forward or backward to change the distance of the models. The idea is to mix physical movement with digital cultural storytelling.

The Arduino sends the arcade button state to p5.js, which reads the values using the Web Serial API and starts the game. Communication from p5.js to the Arduino happens when the user makes a fist: the 3D model changes, and the Adafruit NeoPixel lights up a different color depending on which model is on screen. So far: asyq (blue), kiiz ui (orange), dombyra (red), taqiya (green).

I am proud that the project takes Kazakh cultural objects and makes them interactive, and I like that people can learn about the culture through movement. I am happy that the 3D models work smoothly with hand tracking. I used generative AI to help fix errors, structure the code, and write the description. All cultural objects are based on real Kazakh designs.
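The p5.js-to-Arduino side of this could be sketched roughly as below. This is a hedged illustration, assuming a single-byte message format and names (`MODELS`, `nextModel`, `modelMessage`) that I am inventing here; the real sketch may encode things differently.

```javascript
// The four cultural objects and the NeoPixel color each one maps to
// on the Arduino side (per the description above).
const MODELS = ["asyq", "kiiz ui", "dombyra", "taqiya"];
const COLORS = { "asyq": "blue", "kiiz ui": "orange", "dombyra": "red", "taqiya": "green" };

// On a fist gesture, advance to the next model, wrapping around.
function nextModel(index) {
  return (index + 1) % MODELS.length;
}

// Encode the current model index as a single byte; the Arduino reads it
// and lights the NeoPixel with the matching color.
function modelMessage(index) {
  return new Uint8Array([index]);
}

// Web Serial usage (browser only), roughly:
// const port = await navigator.serial.requestPort();
// await port.open({ baudRate: 9600 });
// const writer = port.writable.getWriter();
// await writer.write(modelMessage(currentIndex));
```

A one-byte index keeps the serial protocol symmetric: the Arduino can send a single byte for the arcade button state, and p5.js sends a single byte back for the model change.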
I found the 3D models online on Sketchfab, and I will credit them. I want to add more gestures for cultural actions, like throwing the asyq, and to add more Kazakh cultural objects. I also want to create a cleaner guide so visitors can learn the meaning of each cultural object.