User Testing

While testing my final project, I ran into a few issues:

1. The soundtrack that plays as soon as the user steps in front of the exhibit is longer than the actual interaction.

Solution: Make the soundtracks shorter and test them again.

2. Two users began looking at the exhibit from the left, while one began from the right. This is an issue because the left side is where the “first” soundtrack plays.

Solution: Make the experience direction-agnostic, so the soundtracks play in the right order no matter which side the user approaches from (see the sketch below).
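One way this could work is to let whichever entry sensor fires first decide where the sequence starts. A minimal Processing sketch of that idea, assuming (hypothetically, this isn’t my actual setup) the two entry sensors report over serial as the characters ‘L’ and ‘R’:

```
import processing.serial.*;
import processing.sound.*;

Serial port;
SoundFile trackA, trackB;   // trackA is the "first" soundtrack in the intended order
boolean started = false;

void setup() {
  // Hypothetical wiring: the two entry sensors send 'L' or 'R' over the first serial port
  port = new Serial(this, Serial.list()[0], 9600);
  trackA = new SoundFile(this, "trackA.wav");
  trackB = new SoundFile(this, "trackB.wav");
}

void draw() {
  if (!started && port.available() > 0) {
    char side = (char) port.read();
    // Whichever sensor fires first starts the sequence from that side,
    // so the direction of approach no longer matters
    if (side == 'L')      trackA.play();
    else if (side == 'R') trackB.play();
    started = true;
  }
}
```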

3. Users assume there is something they can touch or play with, whereas the only actual interactions are based on sensors, distance, and body motion.

Solution: Improve the iPhone exhibit and add a more interactive component. The motor and sensor controlling the phone from under a glass cover is not a very straightforward interaction: two users did not realize what was going on, or that their distance from the sensor was controlling the motor.
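For reference, the distance-to-motor mapping at the heart of this interaction could look something like the Processing sketch below. The serial protocol, ranges, and the idea that the Arduino writes the returned value to a servo are all assumptions for illustration, not my actual code:

```
import processing.serial.*;

Serial port;

void setup() {
  // Hypothetical: the Arduino streams one distance byte (in cm) per frame
  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  if (port.available() > 0) {
    int distance = port.read();   // 0-255 cm, hypothetical range
    // Map the user's distance (20-150 cm) onto a servo angle (0-180 degrees),
    // clamping so out-of-range readings don't jerk the motor
    int angle = (int) constrain(map(distance, 20, 150, 0, 180), 0, 180);
    port.write(angle);            // the Arduino side writes this to the servo
  }
}
```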

4. For someone who doesn’t focus on the background soundtrack, it is not clear what exactly is happening, or what the context of the whole “Future Museum” exhibit is. There need to be more visual cues.

Solution: Provide some form of description or instructions? (I’m not sure about this one yet.)

5. The webcam on the ‘Snapchat simulator’ kept lagging, and the program was running slowly. The camera was also flipped and zoomed in a bit too much, so it didn’t feel very natural or selfie-like.

Solution: I think I’ll be able to fix the camera flip with some help (a sketch of the fix is below). However, I was told that Processing doesn’t handle cameras and video very quickly, so I may not be able to significantly improve the speed. I’ll have to ask Jack for help.
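The flip itself should be a small change: mirror the capture image with a negative horizontal scale. A minimal sketch (the resolutions here are assumptions, not my actual numbers):

```
import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  // Capturing at a lower resolution than the window and scaling up
  // gives Processing fewer pixels to move per frame, which may help the lag
  cam = new Capture(this, 320, 240);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  // Mirror horizontally so it feels like a selfie camera:
  // move the origin to the right edge, then flip the x-axis
  pushMatrix();
  translate(width, 0);
  scale(-1, 1);
  image(cam, 0, 0, width, height);  // stretched to the window, now mirrored
  popMatrix();
}
```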

Here’s a video demo:

https://youtu.be/L9w7QQ3ffcQ
