Week 13: Final Project User Testing

User Testing Videos

The first video was taken at the initial stages of development, where the user gave feedback on the basic functions of the project. That feedback was integrated into the work before the second round of user testing. The second user test was done at a more advanced stage to see whether the controls had improved and what steps needed to be taken to improve the user experience and clarify the rules and functions of the project.

User Testing 1

User Testing 2

Are they able to figure it out? Where do they get confused and why? Do they understand the mapping between the controls and what happens in the experience?

They were able to figure it out with minimal instructions, which will be added to an introduction screen at the start of the experience. One user was confused about whether it was necessary to keep pressing the sensor or whether a single press would be enough to reach the work, so this will be clarified in the instructions to avoid confusion. They also noted that the speed difference that comes with applying more pressure to the sensor was not noticeable, so creating a larger range of speeds could help clarify how the project works.

What parts of the experience are working well? What areas could be improved?

The first user was worried that the movement was not smooth during the walk to the artwork, with multiple stops even though they never took their hand off the sensor. This turned out to be an issue with the physical set-up of the wires, which created instability in the readings; it was fixed before the second user test, resulting in smoother transitions between artworks when the sensors are pressed. They also wished there was a way to enter the artworks from the controller itself rather than the keyboard, so I added a toggle switch to the controller for a more compact, fully functional control area that makes the experience easier to navigate. I believe hiding the wires below the board is the next step to further improve the user experience and make it as easy to navigate and as visually appealing as possible.

What parts of your project did you feel the need to explain? How could you make these areas more clear to someone that is experiencing your project for the first time?

With an introduction screen to introduce the user to the controls, I can avoid adding sentences at the top of the screen, which I find distracting. For a more engaging experience, I'd like to integrate an opening screen that holds all the instructions in a concise and digestible form that the user can understand. I am also currently working on improving the artworks to include the different elements discussed in the project plan, integrating sound and visuals into the experience to make it more interesting and meaningful.

The testing process gave me valuable insight into how the project comes across to an outside viewer and has helped me plan my next steps more productively to create the best possible experience for users. I will be integrating the users' feedback into the next development steps along with my original plan.

Week 12: Final Project Proposal

Concept

The concept behind my final project is a simulation of one of my favorite activities: visiting smaller galleries. The project will include two elements: a virtual room that holds the different artworks and that the user can explore, and a physical floorplan board that the user interacts with to move around the virtual room. There will be four artworks the user can interact with, each exploring a different element of a shared overarching theme. One work will be an auditory experience, where the user contributes to the sound being created by the work, while another will be a visual experience where the user controls the flow of the visual. The other two works will be interactive through the floorplan board itself, where once again we explore audio and visuals, though through different elements.

Design & Description of the Arduino 

The Arduino acts as the physical interface between the user and the virtual gallery. It reads interactions from the floorplan board and sends that data to the p5.js program, which controls the virtual room and the artworks inside it. The board will send readings from the pressure sensors to signal both which direction to go and how fast to move in that direction. The pressure sensors will be placed in front of the placeholders for the physical artworks. There will also be LEDs placed at each work that receive a signal from p5; if the current artwork is the one the user is on, its LED will start flashing. Additionally, two of the artworks will receive signals and create an output on the board, one with sound and one with visuals. While it would be more efficient to produce sound from p5, the idea behind the fourth work is the interaction of sound between p5 and the Arduino at the same time.
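As a rough sketch of how this exchange between the board and p5 could work (the pin numbers, noise threshold, and message format below are placeholder assumptions rather than final design decisions):

const int sensorPins[4] = {A0, A1, A2, A3};  // one pressure sensor per artwork
const int ledPins[4] = {2, 3, 4, 5};         // one LED per artwork
int currentWork = -1;                        // artwork the user is at, as reported by p5.js

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) pinMode(ledPins[i], OUTPUT);
}

void loop() {
  // Send "index,pressure" for any pressed sensor so p5.js can pick the
  // direction and map the pressure to movement speed
  for (int i = 0; i < 4; i++) {
    int pressure = analogRead(sensorPins[i]);
    if (pressure > 100) {                    // ignore readings below a noise threshold
      Serial.print(i);
      Serial.print(",");
      Serial.println(pressure);              // e.g. "2,734"
    }
  }

  // Receive the index of the artwork the user has reached from p5.js
  if (Serial.available()) {
    currentWork = Serial.parseInt();
  }

  // Flash only the LED for the current artwork
  bool blinkOn = (millis() / 250) % 2;
  for (int i = 0; i < 4; i++) {
    digitalWrite(ledPins[i], (i == currentWork && blinkOn) ? HIGH : LOW);
  }
  delay(50);
}

Keeping the protocol to simple "index,value" lines leaves room to add the audio and visual signals for the two board-output works later.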

Design & Description of the P5

The p5 sketch will be the main hub of the project, where all the artworks are displayed. The room will be navigable: the signals the user gives through the Arduino are received by p5, which controls the navigation direction and speed. The works will also send a signal when they are pressed by a user; on the p5 side, the work will enlarge on the screen and a signal will be sent back so the corresponding LED starts blinking. Additionally, in the opposite direction to the Arduino input, two of the works will send signals from p5 to control audio and visuals on the floorplan board.

Progress 

Below is a rough draft of the vision I have for the floorplan board that the user can interact with to control their experience. It will be laser-cut acrylic with a pressure sensor below each of the artworks, used to navigate to that work. The light at the bottom of each work will be an LED that blinks if that is the work the user is currently in.

For the p5.js side, I explored possibilities for creating an interactive 3D room and created a draft of what it could look like, using WEBGL, orbitControl(), and camera controls. It was interesting to see the possibilities within p5 that I hadn't yet explored, and it really changed my perspective on what the project could look like.

 

Week 11: Final Project Preliminary Concept

For the final project, I will create an interactive virtual gallery that responds to user input through a variety of sensors that guide them in a direction through the gallery. The works themselves are interactive as well, on a smaller scale, with each exploring a different output, such as sound or visuals. The concept behind it is exploring the intersection between interactivity and the different themes we explored through class and the readings, to provide the viewer with an engaging experience where they get to see a reflection of their actions, giving them control over the work.

Week 11: Reading Response

Design Meets Disability covers an important aspect of assistive technology that is often overlooked by the developers of such devices: design. The design of these devices is built on the societal belief that disability is something to be ashamed of or something that needs to be hidden. Hence discretion takes center stage in the design process, leaving aesthetics behind. Moving away from this belief and prioritizing aesthetics over discretion would contribute to overcoming the stigma around disability not only on an individual level, but on a societal level. The writer highlights the example of glasses, which converted an assistive device from a purely functional object into a fashion piece used as a way of expressing identity.

This shift in perspective, which came from the focus on aesthetics, caused a significant change in the view of visual impairment, and it could be applied to other devices. That can only be achieved when designers begin viewing these devices as an essential part of everyday life rather than medical necessities, focusing on how users can employ them to express their identity, and converting devices such as hearing aids and prosthetic limbs into opportunities for expression rather than reminders of difference. This was an important message: approaching the design of assistive technology with creativity and curiosity would transform the field. Putting aesthetics at the forefront is therefore not merely a matter of appearance. It involves acknowledging the emotional and social aspects of disability and creating designs that respect and promote visibility instead of pushing users into the shadows.

Week 11: Serial Communication

Exercise 1

For this exercise, we decided to use an ultrasonic sensor because we thought it would be interesting to control the ellipse with our hand, somewhat like Wii video games, where objects on the screen move based on your physical hand movement. To proceed with this idea, we first made a connection to the Arduino in p5.js, similar to how we did it in class. Then, to control the x-axis, we read the serial value in p5.js and mapped it to the range 0 to windowWidth. Another thing we noticed while working on this assignment is that the ellipse's movement was very abrupt, so we used the lerp() function. I had previously used this function for one of the p5 assignments and recalled that it can be used to smooth the distance values: it generates a value between the last x position and the current measured distance at a fixed interpolation amount (0.1 in this case), which makes the circle move smoothly. Below I have attached my p5 and Arduino code along with the demonstration video.

P5.js

Arduino

exercise1-demo
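For reference, the Arduino side of this setup is roughly the sketch below; the trigger and echo pin numbers are assumptions rather than the exact wiring we used:

const int trigPin = 9;   // assumed trigger pin
const int echoPin = 10;  // assumed echo pin

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Send a 10 microsecond trigger pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Convert the echo time to a distance in cm (sound travels ~0.034 cm/us, round trip)
  long duration = pulseIn(echoPin, HIGH);
  int distanceCm = duration * 0.034 / 2;

  // p5.js reads this value, maps it to 0–windowWidth, and lerps toward it
  Serial.println(distanceCm);
  delay(50);
}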

Exercise 2

For the second exercise, we chose to create a visual representation of control passing from one side to another, inspired by a chess clock, where time shifts between players during a game. We designed the LEDs so that as one grows brighter, the other dims, symbolizing the passing of control from one side to the other. We started by establishing the serial connection between the Arduino and p5.js, similar to the previous exercise and what we did in class. In p5 we created a slider ranging from 0 to 255, which we used to determine the brightness of LED 1; for LED 2 we set the inverse of that value, so that as one increased, the other decreased. We continuously sent these two values over serial in the format "value1,value2" and read them on the Arduino side to update the LED brightness using analogWrite(). This setup allowed us to control both LEDs simultaneously from the browser and visually see the transition between them. Below are the p5.js and Arduino code along with the demonstration video.

P5.js

Arduino

exercise2-demo
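For reference, the Arduino side described above can be sketched roughly like this; the PWM pin numbers are assumptions, and it assumes each "value1,value2" pair ends with a newline:

const int led1Pin = 9;    // assumed PWM pins
const int led2Pin = 10;

void setup() {
  Serial.begin(9600);
  pinMode(led1Pin, OUTPUT);
  pinMode(led2Pin, OUTPUT);
}

void loop() {
  if (Serial.available()) {
    // parseInt() reads the number before the comma, then the one after it
    int v1 = Serial.parseInt();
    int v2 = Serial.parseInt();
    if (Serial.read() == '\n') {            // only update on a complete line
      analogWrite(led1Pin, constrain(v1, 0, 255));
      analogWrite(led2Pin, constrain(v2, 0, 255));
    }
  }
}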

Exercise 3

To complete this exercise we modified the gravity ball code provided. We first made sure the LED lights up every time the ball bounces; to do so, we maintained a state variable, setting it to HIGH whenever the ball touches the ground and to LOW otherwise. For the second part of this exercise, we reused our concept from Exercise 1, i.e. the ultrasonic sensor, to control the wind acting on the ball. We read the distance from the ultrasonic sensor and set a threshold of 50: if the distance is greater than 50, we set the wind to -3, otherwise to 3. This let us push the ball in different directions and control it with our hand. We have provided the p5.js and Arduino code below for your reference.

P5.js

Arduino

exercise3-demo
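As an illustration of the Arduino side (pin numbers are assumptions): the distance is measured the same way as in Exercise 1, and a single character sent from p5.js drives the bounce LED:

const int ledPin  = 2;   // assumed LED pin
const int trigPin = 9;   // assumed ultrasonic pins
const int echoPin = 10;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // '1' from p5.js means the ball is touching the ground, '0' means it is not
  if (Serial.available()) {
    char state = Serial.read();
    if (state == '1') digitalWrite(ledPin, HIGH);
    else if (state == '0') digitalWrite(ledPin, LOW);
  }

  // Measure the distance and send it; p5.js compares it to the threshold of 50
  // and sets the wind to -3 or 3 accordingly
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  Serial.println(int(duration * 0.034 / 2));

  delay(50);
}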

Week 10: Reading Reflection

Victor’s brief but powerful rant on the future of interaction design opened my eyes to the extent that technology has cut human beings off from the physical world. For him, the so-called “future” of design, with its touchscreens and glossy surfaces, is not a revolution but merely a very limited advance that pays no attention to the human side of things. He turns the issue around, arguing that our hands are among our most capable and expressive tools, yet current interfaces ask almost nothing of them. His descriptions of actions as elementary as turning a page or holding a glass of water made me realize how much feedback and awareness humans get from touch, and that most of the devices we use today are precisely the ones that take this feedback away. His term “Pictures Under Glass” truly resonated with me, since it brilliantly encapsulates how dull and one-dimensional interactions with technology can feel. Reading his rant made me reflect on his point that these soft human qualities should not be neglected by modern technology. In my opinion, he wants us to know that true advancement should keep us physically and emotionally connected to our creations, while current technology, unfortunately, does the opposite.

In his follow-up responses, Victor makes it clear that he was not trying to dismiss current technology but to point the way to future development. He underlines that the iPad and the like are already very important and revolutionary, and still they are not the end. The comparison of the iPad to early black-and-white photography stuck with me: it was good for its time, but that did not mean color photography was not worth pursuing. Victor believes it should be the same with interaction design: we should keep seeking and exploring ways to design technology that is visible, tangible, and truly interactable. What I found most striking was his emphasis on the whole body in interaction rather than just a fingertip. Most of us sit and stare at screens all day long, which creates a total separation from our natural way of moving, feeling, and exploring. This idea resonated with me because it brings technology back to something very human. Reading both articles made me rethink the role of design in either restricting or enlarging our innate capabilities. Victor’s writing is a reminder that the technology of the future should make us feel more alive and connected rather than numb and isolated.

Week 10: Musical Instrument

Concept

The musical instrument that inspired our project for the week is the xylophone, an instrument that is both interesting and easy to navigate, which made it the perfect interactive instrument to recreate using Arduino. We built it using three tinfoil strips that each play a different note when pressed, with a potentiometer controlling the frequency of every note, adding another dimension. The ability to control two different elements, the note played and its frequency, gives the participant the ability to create a range of sounds, diversifying the possible experiences one can create with our instrument.

Demonstration

Schematic

Code Snippet

if (activeIndex >= 0) {
  int freq = int(baseFreqs[activeIndex] * multiplier);
  freq = constrain(freq, 100, 5000);

  tone(buzzerPin, freq);
  Serial.print("Strip ");
  Serial.print(activeIndex + 1);
  Serial.print(" freq=");
  Serial.println(freq);

  delay(50);
} else {
  noTone(buzzerPin);
}

This part of the code was the most challenging, as it controls all the important elements of our instrument. I would say it is where all the magic happens, bringing together both the analog and digital input to create the sound intended by the player of the instrument. It calculates the final tone by multiplying the base frequency by the multiplier, keeps the result within a safe range, and then sends this signal to the buzzer. When no input is detected, the sound stops. Essentially, it acts as the core logic that translates user interaction into audible output.
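The snippet assumes activeIndex (which strip is pressed) and multiplier (from the potentiometer) have already been set. A minimal sketch of how they could be derived is below; the pin numbers, threshold, and base notes are placeholder assumptions rather than the values in our full code:

const int buzzerPin = 8;                    // assumed buzzer pin
const int stripPins[3] = {A0, A1, A2};      // tinfoil strips read as analog inputs
const int potPin = A3;                      // potentiometer for the frequency multiplier
const int baseFreqs[3] = {262, 330, 392};   // example base notes (C4, E4, G4)

void setup() {
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  // Find the first pressed strip, or -1 if none crosses the threshold
  int activeIndex = -1;
  for (int i = 0; i < 3; i++) {
    if (analogRead(stripPins[i]) > 500) {
      activeIndex = i;
      break;
    }
  }

  // Scale the potentiometer reading into a 0.5x–2.0x frequency multiplier
  float multiplier = map(analogRead(potPin), 0, 1023, 50, 200) / 100.0;

  if (activeIndex >= 0) {
    tone(buzzerPin, constrain(int(baseFreqs[activeIndex] * multiplier), 100, 5000));
  } else {
    noTone(buzzerPin);
  }
  delay(50);
}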

Full Code

Github Link

Reflection

This week’s assignment felt like more interactive work, building on our previous projects and adding dimension with more elements. For future improvements, we would like to limit the duration a note plays, similar to an actual xylophone where a note fades out some time after being struck. This would imitate a real instrument and create a more natural, realistic feel; we could even allow multiple notes at once by connecting more foil strips and additional buzzers. For future projects, I’d like to focus more on aesthetics and add more visual elements to the works.

Week 9: Analog Input & Output

Concept

For this week’s assignment, I built a Study Tracker that turns study and rest into clear, color-coded cues so I can keep my attention steady. Each button controls one of the LEDs, making it flash when pressed and stop when pressed again, signaling study or break depending on the button. The red LED flashing means the student is on a break, and the blue LED flashing means the student is focused. The dial sets the brightness so the lights fit the time of day: softer at night and stronger during the day. Over time, the repeated pairing of blue with study and red with rest builds a simple habit, so the colors start to set the mood before the student even begins studying.

Demonstration

IMG_7760

Code Snippet

//Map potentiometer to brightness
  int potValue = analogRead(potPin);
  int brightness = map(potValue, 0, 1023, 0, 255);

  //Button 1
  bool button1State = digitalRead(button1Pin);
  if (button1State == LOW && lastButton1State == HIGH) {
    led1Flashing = !led1Flashing; 
    if (!led1Flashing) {
      analogWrite(led1Pin, brightness); 
    }
    delay(200);
  }
  lastButton1State = button1State;

  //Button 2
  bool button2State = digitalRead(button2Pin);
  if (button2State == LOW && lastButton2State == HIGH) {
    led2Flashing = !led2Flashing;
    if (!led2Flashing) {
      analogWrite(led2Pin, brightness);
    }
    delay(200);
  }
  lastButton2State = button2State;

Bringing together the different elements we learned in class, this part of the code showed me that there are countless ways to combine techniques and create something new. It reminded me of this week’s reading, which discussed the value of building on existing works rather than worrying about creating something that did not exist before. Creating something functional and different from the simple parts given to us is possible.
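The snippet above toggles the led1Flashing and led2Flashing flags but does not show the blinking itself. One non-blocking way it could be handled, as a sketch (the pin numbers and the 250 ms interval are assumptions):

const int led1Pin = 9;    // assumed PWM pins
const int led2Pin = 10;
bool led1Flashing = false;
bool led2Flashing = false;

unsigned long lastToggle = 0;
bool flashState = false;

// Called from loop() after brightness is read from the potentiometer
void updateFlashing(int brightness) {
  if (millis() - lastToggle >= 250) {   // flip the blink state every 250 ms
    lastToggle = millis();
    flashState = !flashState;
  }
  if (led1Flashing) analogWrite(led1Pin, flashState ? brightness : 0);
  if (led2Flashing) analogWrite(led2Pin, flashState ? brightness : 0);
}

Using millis() instead of delay() for the blink keeps the buttons responsive while an LED is flashing.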

Complete Code

Github Link

Schematic

Reflection

During this assignment, exploring both digital and analog input was interesting, especially discovering how they can work together to add depth and dimension to the project. The difficult part for me was coming up with a way to creatively merge these elements without having them clash with each other. In future work, I’d like to explore more sensors and switches and integrate them into my projects; there were several we discussed in class that I would like to try.

Week 9: Reading Response

Most modern discoveries are built on foundational concepts that have been around for years; we are nothing without the knowledge of those who came before us. That is why it is insightful to dive into previous works in the field of physical computing, going through both the best and the not-so-great inventions and creations. Everyone is so fixated on creating the next new thing that we forget the importance of building on and improving previous works and ideas. Reading about these works undoubtedly inspired me for future projects on how to integrate physical computing. It brought to my attention crucial elements I might otherwise have overlooked, such as the importance of balancing aesthetics and functionality. Projects such as the video mirrors had aesthetics as their central focus, forgoing any structured interaction. While the lack of meaningful interaction is a valid concern, it got me thinking about the importance of interaction and whether there is value in aesthetics that makes setting interaction aside the right move. This ties in with themes from the previous readings that discussed the balance between aesthetics and functionality, a concept I believe was put to the test in these projects.

“Set the Stage, Then Shut Up and Listen” is a harsh message to the makers of interactive art, yet an important one. Creating interactive art is a collaborative effort, like a puzzle where the artist only has half the pieces and needs to guide the participants so they can place the remaining ones. The work is a reflection of its creator as well as of every person who chooses to interact with it: a dynamic conversation where the artist becomes primarily a listener, following the lead of the people experiencing the work. It is tempting to share the message behind a work, especially when you have spent a long time developing it and embedding a message within it. But how can you conclude and solidify a message when the work has not yet been interacted with and spoken to by outside contributors? On the other hand, I do think it is important to find a balance, where there is a foundational message the work is built on, one that can be altered and developed by others. Without a strong foundation, a project risks being void of any message for participants to build on. Once again, we are in a position where balance is crucial to building a meaningful and engaging work.

Week 8: Unusual Switch

Concept

Creating an unusual switch was an interesting task, considering I have built simple LED circuits before, though never with anything beyond what is given in our kit. I decided to use a piano key as the switch, adding an element of sound that gives the work another dimension. Through this I wanted to create an engaging yet simple piece that helps ground my understanding. To add another layer, I wrote code for the light to blink while the key is pressed, which adds a visual to the existing auditory element.

Video Demonstration

IMG_7568

Code Snippet

void loop() {
  //Read the state of the switch
  int switchState = digitalRead(A2);

  //Loop to blink if switch is ON
  if (switchState == HIGH) { 
    digitalWrite(13, HIGH); //Turn the LED ON
    delay(250);  // Wait 250 ms
    digitalWrite(13, LOW);   //Turn the LED OFF
    delay(250);  // Wait 250 ms         
  } else {
    digitalWrite(13, LOW); //Keep the LED OFF
  }

}

During this assignment I wanted to combine two different things we learned in class to add a little more interactivity to the switch, so I decided to make the LED blink while the switch is connected. After prototyping with a regular switch, I added the part of the code that handles the blinking and noticed a difference in how engaging the switch became, which could be useful for game interactions or for drawing a viewer’s attention to a specific part of an interactive artwork.

Complete Code

Github link

Reflection

This assignment was a great starting point for learning Arduino; though a simple task, it helped me make sure I am well acquainted with the basics, and I am now excited to learn more. There is a lot that could be done to improve and develop this work, including adding more LEDs, more switches, or an even more complex visual display when the switch is turned on. Further, this taught me the potential of hardware and software interacting and what kind of work can come out of using these tools together, which inspired me and got me thinking about future projects and how I could make use of this dynamic, or how I could have used it in my previous projects.