Hey everyone! 👋
As we’re finishing up our final projects, we also have to conduct user testing to gain valuable feedback and identify potential issues before the showcase.
Here’s one of the short user testing sessions:
As you can see, I did not give any prompt or instructions, and they figured out nearly instantly that this is a music-synthesizing glove, which is a good sign. They were easily able to play around with it and make music, and understood that bending their fingers controlled the sound.
However, one piece of feedback I got was that while it was obvious which finger controlled which bar, it wasn’t clear what the fingers were actually mapped to. I had initially expected the answer to reveal itself after the glove was played with for a while (especially with the addressable LEDs providing hints and feedback on their actions), but clearly this wasn’t the case. Most people could somewhat guess the action of the first finger (controlling the pitch, or more precisely the base frequency), but couldn’t tell much beyond that. To be honest, even I wouldn’t have been able to immediately tell what the last two fingers did. So taking this into account, I added little labels at the bottom of the bars, indicating what each finger controls.
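For the curious, the first finger’s behavior (bend → base frequency) can be sketched as a simple linear mapping from the sensor reading into a frequency range. Here’s a minimal Python sketch — note that the sensor range, frequency bounds, and function name are my own illustrative placeholders, not the glove’s actual calibration:

```python
def flex_to_frequency(raw, raw_min=200, raw_max=800,
                      freq_min=110.0, freq_max=880.0):
    """Map a raw flex-sensor reading to a base frequency in Hz.

    raw_min/raw_max and the frequency bounds are illustrative
    placeholders, not the glove's real calibration values.
    """
    # Clamp the reading to the expected sensor range
    raw = max(raw_min, min(raw_max, raw))
    # Linearly interpolate into the frequency range
    t = (raw - raw_min) / (raw_max - raw_min)
    return freq_min + t * (freq_max - freq_min)


# A fully bent finger maps to the top of the range,
# a relaxed one to the bottom:
print(flex_to_frequency(200))  # → 110.0
print(flex_to_frequency(800))  # → 880.0
```

The other fingers would follow the same pattern, each scaling its reading into whatever parameter range it controls.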
For those curious enough, I also explained how the sound was actually being synthesized. While this does mean there was something left to explain, I think it’s more the theory behind the scenes, so it isn’t vitally important to know. In fact, I think people should be able to play around and enjoy the sound without knowing the technical details behind it, which was indeed the case.
Other than that, I’ll polish up the interface, and while I’d also like to improve the music generation, I doubt I’ll be able to change it much between now and the showcase.