Week #13 – “To Gaze Is Not To See”, Final Project User Testing

Concept

Throughout history, glasses have evolved from a tool that simply corrects eyesight into a fashion piece that nonetheless remains essential for people with poor vision. As a person who is prescribed glasses, I am a big fan of the frame I own, but I choose to wear contact lenses instead, since the lenses in my glasses make my eyes seem small – at least this is what I notice a lot. Lenses for myopia and hyperopia, the two most common vision conditions, distort the look of the eyes behind them, making them appear smaller or bigger, respectively. However, such changes of proportion are often noticed solely by those who wear the glasses and not by others – we memorise our own facial features but overlook such minor changes on the faces of others.

“To Gaze Is Not To See” is an interactive artwork that invites the user to put on a pair of glasses to begin their experience. The frame is red, which is what triggers the visuals on the screen – when the web camera detects red, the ornament is displayed. Unless the user puts on the glasses, they cannot explore the work.
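The red-detection step can be sketched as a simple scan over the camera frame. This is a minimal illustration rather than the actual showcase code: it assumes the RGBA byte layout of the p5.js `pixels` array, and the channel thresholds and pixel count are placeholder values.

```javascript
// Count pixels where the red channel clearly dominates, using the
// R, G, B, A byte layout of a p5.js capture.pixels array.
// Thresholds are illustrative assumptions, not the showcase values.
function countRedPixels(pixels) {
  let count = 0;
  for (let i = 0; i < pixels.length; i += 4) {
    const r = pixels[i], g = pixels[i + 1], b = pixels[i + 2];
    if (r > 150 && r > g * 1.5 && r > b * 1.5) count++;
  }
  return count;
}

// The ornament is shown only once "enough" red is visible,
// i.e. the red frame is in front of the camera.
function glassesDetected(pixels, minRedPixels = 50) {
  return countRedPixels(pixels) >= minRedPixels;
}
```

In the sketch this check would run once per frame on the captured video, gating whether the mandala is drawn.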

The glasses are equipped with two speakers at the end of each temple, connected to an Arduino to play sound. The chosen song is “Triangle/Gong/What” by Wolfgang Tillmans – I decided to play it on loop to make the user’s experience more immersive and slightly disturbing due to the surreal flow of the melody. To attach the speakers, I 3D-modelled the glasses from scratch in Rhino, printed them on a Prusa printer, and spray-painted them the desired shade of red. Several prototypes were made and six pairs of glasses were printed in total, but only one was perfected and chosen for the showcase.

The electronic components were hidden in a 3D-printed eyeball featuring an LED arcade button as the iris – when it was pressed, the visuals on the screen changed from the “myopia” mandala to the “hyperopia” mandala. The two ornaments differed in the size of the eyes and their placement on the screen, creating a confusing, almost surreal imagery of changing projections of eyes.
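The mode switch driven by the arcade button can be sketched as a tiny edge-triggered state machine. The serial protocol assumed here (the Arduino sending a "1"/"0" string per reading) is illustrative, not the exact code used in the project.

```javascript
// Toggle between the two mandala modes each time the arcade button reports
// a new press. Assumes the Arduino sends "1" while the button is held and
// "0" otherwise - an illustrative protocol, not the showcase one.
const MODES = ["myopia", "hyperopia"];

function makeModeToggle() {
  let index = 0;          // start in the "myopia" mandala
  let lastPressed = false;
  return function onSerial(data) {
    const pressed = data.trim() === "1";
    // switch only on the rising edge, so holding the button toggles once
    if (pressed && !lastPressed) index = (index + 1) % MODES.length;
    lastPressed = pressed;
    return MODES[index];
  };
}
```

The edge detection matters: without it, holding the button for a few frames would flip the mandala back and forth every frame.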

Figure 1: testing the placement of speakers

Figure 2: testing the first printed frame, based on a model downloaded online

Figure 3: testing the frame modelled by me

Figure 4: spray painting the frame

User Interaction

During the showcase, more than a dozen people came up to my project to test it. Because I had planned the setting in advance, I managed to set up proper lighting conditions to make the image captured by the web camera clearer. However, due to the loud conversations and music around, it was difficult to hear the melody playing through the glasses, so I had to adjust the volume on the spot.

Everyone who experienced my work was very interested in the concept and surprised by the implementation, as they did not expect to see their own eyes displayed on the screen in real time. The problem behind the concept resonated with all participants who also wear glasses, as they agreed that they experience the same issue.

IMG_9653

Figure 5: User Interaction video

Figure 6: showcase setup

Sketch and code

GitHub Repository: https://github.com/am13870/FinalProjectIM

p5.js sketch:

What I am proud of

I was happy with my concept from the start, and I think I managed to create a good representation of the experience I intended to capture. The most challenging part was connecting the audio: I was not simply using tones that work directly with the Arduino UNO, but decided to play an mp3 file with an actual song. I did not realise I would need additional electronic components for this until the day before the showcase, since I was so involved in modelling the glasses and perfecting the mandala ornament visuals. As such, I had to make urgent edits to the circuit: adding an MP3 playback shield from Adafruit, exploring and testing the libraries that had to be added to my code, adding an amplifier for the speakers, and finally soldering a connector from actual headphones to the speakers. Nonetheless, in the end I was satisfied with the quality of the audio, and I am very proud that I did not give up on the idea of including an mp3 file and managed to make it work.

Future improvements

While I am glad that I managed to find and implement a proper library for eye tracking, I believe the visual part could be improved further in terms of image quality and the variety and complexity of the ornaments. I tried using an external web camera to capture video at a higher resolution; however, this made the sketch too heavy for my laptop to run correctly. I wish to develop this project further, as I really enjoyed working on it and plan to include it in my portfolio.

 

Final Project Progress

Concept

I have decided to work on the idea I initially proposed, but with several alterations that allow me to focus on the idea of the distorted image. The glasses will be asymmetrical, with one square and one oval frame, and the two lenses will change the eye size in different ways – one making it smaller (for myopia) and the other making it larger (for hyperopia). The web camera will project an image of the user’s eye area on the screen through the p5.js code, where the eyes will change their size and move, creating an almost hypnotising performance accompanied by audio coming from the glasses.

Arduino

Electronic components – piezo buzzers – will be embedded into the temples of the glasses so that they act as speakers that let the user hear the melody clearly. The buzzers will be connected to the Arduino with wires, and the board will be hidden in a 3D-printed eyeball that the user can hold.

p5.js

Using the WebGazer.js library, I will track the user’s eyes and show a representation of them on the screen. The image of both eyes will change on the screen, creating the likeness of a mandala ornament, which will be done by defining the movement of the eye image extracted from the web camera.
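One way to turn a single tracked gaze point into a mandala-like arrangement is to mirror it around the canvas centre. The sketch below is a hypothetical illustration of that geometry; the function and parameter names are mine, and in the real project the (x, y) input would come from WebGazer’s gaze predictions.

```javascript
// Turn one gaze point into a ring of rotated copies around the canvas
// centre (cx, cy) - a simple way to get mandala-style symmetry from a
// single tracked eye position. Illustrative sketch, not the project code.
function mandalaPositions(x, y, cx, cy, copies) {
  const dx = x - cx, dy = y - cy;
  const r = Math.hypot(dx, dy);        // distance from centre
  const base = Math.atan2(dy, dx);     // angle of the original point
  const points = [];
  for (let i = 0; i < copies; i++) {
    const a = base + (2 * Math.PI * i) / copies;
    points.push({ x: cx + r * Math.cos(a), y: cy + r * Math.sin(a) });
  }
  return points;
}
```

Each returned position would then receive a copy of the eye image, so the whole ornament follows the user’s gaze.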

Progress

I have discovered several examples of other p5.js projects whose creators used the same library and came up with various ideas for working with the image of eyes and gaze tracking. This allowed me to understand the logic of working with this library and its possible application to my work.

Final Project Concept

After reading the chapter from Graham Pullin’s book Design Meets Disability and thinking about my personal experiences related to this topic, I decided to work on an unconventional model of eyeglasses. My final project will be a synthesis of an interactive artwork and an assistive device, inviting the audience to think about the stigma around certain types of products for disabled people.

Drawing inspiration from Meta Ray-Ban smart glasses, I want to combine audio with visuals by adding small speakers to the ends of the temples, so that the user can hear the sound clearly only when wearing the product. In terms of visuals, I want to use the camera mode in p5.js to detect the user’s eye area, aided by glasses designed specifically to facilitate this. I will model the glasses myself in Rhino and then print them on a 3D printer to create a precise structure that fits the desired electronic components. The speakers will be connected to an Arduino, which in turn will be connected to the p5.js sketch via serial communication. Depending on the user’s gazing sequences, the visuals on the screen will change, depicting eyes of changing sizes. The WebGazer.js library will be used for eye tracking.

From my individual experience and observations, one of the key reasons people feel awkward about wearing glasses is the way the lenses distort the size of their eyes depending on the prescription, altering the familiar look of the face. Such minor things are often invisible to strangers, but people wearing glasses can become extremely conscious of them. By providing an exaggerated illustration of this distortion, I wish to draw attention to the way people with disabilities perceive themselves.

Assignment 9 – Serial Communication

Task 1

Prompt: Make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on Arduino is controlled by p5.

We decided to approach all the tasks in a Christmas style.

P5 Code:
let positionX = 0; 
function setup() {
  createCanvas(600, 600);
  noFill();
}
function draw() {
  background("#bb010b");
  stroke("#ffcf00");
  // mapping the values from the light sensor to the width of the canvas
  ellipse(map(positionX, 0, 1023, 0, width), height / 2, 100, 100);
  if (!serialActive) {
  // the initial screen of p5, when it is not connected to arduino, this helps to start serial communication
    background("#2d661b");
    stroke("white");
    textSize(50);
    text("Press Space Bar to select Serial Port", 20, 30, width - 30, 200);
  }
}
function keyPressed() {
  // starting serial communication when space bar is pressed
  if (key == " ") setUpSerial(); 
}
function readSerial(data) {
  if (data != null)
    // ensuring there is actual data, then storing it in a variable
    positionX = int(data);
}
Arduino Code
int lightPin = A0;

void setup() {
  Serial.begin(9600);
  pinMode(lightPin, INPUT);
}

void loop() {
  // read the light sensor and send its value to p5 over serial
  int sensorValue = analogRead(lightPin);
  Serial.println(sensorValue);
  delay(5);
}

Task 2

Prompt: Make something that controls the LED brightness from p5

For this assignment, we decided to control the LED brightness using four circles/buttons in the p5 interface. Pressing a different circle sets a different brightness level for the LED.

P5 Code 
let LEDbrightness = 0;
function setup() {
  createCanvas(600, 600);
  background('#2d661b');
  stroke('white');
  noFill();
}
function draw() {
  if (serialActive) {
    // circles acting as buttons
    circle(150, height / 2, 50);
    circle(250, height / 2, 50);
    circle(350, height / 2, 50);
    circle(450, height / 2, 50);

    // to detect whether the user pressed a specific circle, we used the
    // distance function (25 is each circle's radius); the LED brightness
    // has 4 variations
    if (mouseIsPressed) {
      if (dist(mouseX, mouseY, 150, height / 2) <= 25) LEDbrightness = 60;
      if (dist(mouseX, mouseY, 250, height / 2) <= 25) LEDbrightness = 120;
      if (dist(mouseX, mouseY, 350, height / 2) <= 25) LEDbrightness = 180;
      if (dist(mouseX, mouseY, 450, height / 2) <= 25) LEDbrightness = 255;
    }
  } else {
    // shown while serial communication has not started yet
    textSize(50);
    text("Press Space Bar to select Serial Port", 20, 30, width - 30, 200);
  }
}
function keyPressed() {
  // starting the serial connection with a space bar press
  if (key == " ") setUpSerial();
}
function readSerial(data) {
  // sending the brightness value to the Arduino
  writeSerial(LEDbrightness);
}

 

Arduino Code
void setup() {
  Serial.begin(9600);
  pinMode(5, OUTPUT);
}

void loop() {
  // set the LED brightness to the value received from p5
  analogWrite(5, Serial.parseInt());
  Serial.println();
}

IMG_9258

Task 3

Prompt: Take the gravity wind example and make it so every time the ball bounces one led lights up and then turns off, and you can control the wind from one analog sensor.

We used a light sensor as the analog sensor; its readings controlled the wind in p5, which affected the movement of the ball.

P5 Code
let position, velocity, acceleration, gravity, wind;
let drag = 0.99,
  mass = 50,
  hasBounced = false;
function setup() {
  createCanvas(640, 360);
  noFill();
  position = createVector(width / 2, 0);
  velocity = createVector(0, 0);
  acceleration = createVector(0, 0);
  gravity = createVector(0, 0.5 * mass);
  wind = createVector(0, 0);
  textSize(18);
}
function draw() {
  background(255);
  if (!serialActive) {
    background("#2d661b");
    fill("white");
    text("Press F to select Serial Port", 20, 30, width - 30, 200);
  } else {
    applyForce(wind);
    applyForce(gravity);
    velocity.add(acceleration);
    velocity.mult(drag);
    position.add(velocity);
    acceleration.mult(0);
    if (hasBounced) {
      fill("red");
    } else {
      fill("white");
    }
    ellipse(position.x, position.y, mass, mass);
    if (position.y > height - mass / 2) {
      velocity.y *= -0.9; // dampening
      position.y = height - mass / 2;
      if (!hasBounced && abs(velocity.y) > 1) {
        hasBounced = true;
        setTimeout(() => (hasBounced = false), 100);
      }
    }
  }
}
function applyForce(force) {
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}
function keyPressed() {
  if (key == " ") {
    mass = 20;
    position.set(width / 2, -mass);
    velocity.mult(0);
    gravity.y = 0.5 * mass;
    wind.mult(0);
  } else if (key == "f") {
    setUpSerial();
  }
}
function readSerial(data) {
  if (data != null) {
    let value = int(trim(data)); 
    wind.x = map(value, 0, 1023, -1, 1);
    let bounceState = 0;
    if (hasBounced) {
      bounceState = 1;
    }
    writeSerial(bounceState + "\n");
  }
}

 

Arduino Code
const int ledPin = 5;
const int lightPin = A0;

void setup() {
    Serial.begin(9600);
    pinMode(ledPin, OUTPUT);
    pinMode(lightPin, INPUT);
    // keep sending a value until p5 responds, establishing the handshake
    while (Serial.available() <= 0) {
        Serial.println("0");
    }
}

void loop() {
    while (Serial.available()) {
        int ledState = Serial.parseInt();

        if (Serial.read() == '\n') {
            digitalWrite(ledPin, ledState);
            Serial.println(analogRead(lightPin));
        }
    }
}

 

IMG_9260

Reading Reflection – Week #11

Design Meets Disability

While reading the chapter from Graham Pullin’s book, I caught myself thinking about a recent case from my friend’s life – she is a student at Tandon working on a project in collaboration with Langone Hospital. The project’s theme is the development of products for an accessible environment, specifically a belt that helps blind people orient themselves in space. My friend’s immediate suggestion was to move the sensors from the belt to the hands, e.g. to gloves, so that the calculation of the distance to objects would be more precise – makes sense, right? However, her idea was met with criticism from the professor – she had forgotten about that very discretion.

“Our goal is to make sure that their limitation not only does not interfere with their lives, but also remains absolutely unnoticeable” – although I do understand the point of my friend’s professor, I would formulate this objective differently. To my understanding, discretion is not simply about hiding the product, but rather about its seamless integration into one’s life. Balancing both engineering and design aspects in one person’s mind is extremely complicated, since the “perfect” solutions in the two categories of tasks can be mutually exclusive.

As a person who wears glasses and contacts, and who used to wear braces and orthopaedic insoles, I have been concerned with the image of such products since I was a child – when I was five, I refused to wear Birkenstock sandals because I was convinced they were ugly, even though a doctor had prescribed them to me. That fear of looking stupid and funny is relatable to many people, so I believe there is huge room for improvement in the field of design for disability.

Week 10: Musical Instrument – Radio Perhaps?

Concept:

After playing with different songs from the GitHub folder, we decided to include several of them at once in our interpretation of a musical instrument. One way to do so was to switch between three songs back and forth, creating a version of a radio with changing stations. Three classical compositions were chosen: Ode to Joy, Für Elise, and Badinerie. However, this idea could be realised with just two digital switches, so we decided to push it further by adding photoresistors that would let us work with a spectrum of values.

In what ways can a melody be altered while it plays? Its volume or playback speed can be changed – so we decided to take values from one photoresistor to set how loud the song played, and from the other to set the speed. The combination of all these electrical components helped us create an advanced version of a radio, resembling a music player that lets the user modify certain qualities of the sound.
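The two mappings can be sketched as plain range conversions. This is an illustrative sketch, assuming 10-bit analog readings (0–1023) and made-up output ranges; the real project tuned these on the Arduino side, and `mapRange` reimplements the behaviour of p5’s `map()`.

```javascript
// Reimplementation of p5's map() for a plain linear range conversion.
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// Convert the two photoresistor readings (0-1023) into playback settings.
// Output ranges are illustrative assumptions.
function radioSettings(volumeReading, speedReading) {
  return {
    volume: mapRange(volumeReading, 0, 1023, 0, 1), // 0 = silent, 1 = full
    tempo: mapRange(speedReading, 0, 1023, 0.5, 2), // 0.5x to 2x speed
  };
}
```

Covering one sensor slows and quiets the song; full light pushes both toward their maximum.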

Highlight of the code:

:^)

Demonstration:

IMG_9132

Update

After a week of troubleshooting and changing the Arduino code and circuit, we still couldn’t figure out a way to make the project work as we intended. Therefore, we introduced a new idea: a player that reflects the mood of the room based on its lighting. We integrated a photoresistor and a button. If the setting was moderately to highly lit, the piezo buzzer played a major (happy) melody, and if the lighting was low, a minor (sad) melody played. The button was used to stop playback completely: when it is pressed, the melody is silenced.

Here is a link to the Arduino code.

As with the previous idea, we think the most complex part of this new version was working with the button, since it had to stop the melody from playing. We finally resolved this by introducing boolean variables and setting them throughout the code.

void loop() {
  int lightValue = analogRead(lightPin);
  buttonState = digitalRead(buttonPin);
// checking if button is pressed
  if (buttonState == LOW && lastbuttonstate == HIGH) { 
    isPlaying = false;
    noTone(buzzerPin); // stop playing when button is pressed
  }
  lastbuttonstate = buttonState;
  if (!isPlaying) {
    if (lightValue > 600) {
      playMelody(melody1, duration1, melody1Length); //playing happy melody when sensor value is higher than 600
    } else {
      playMelody(melody2, duration2, melody2Length); //playing sad melody when sensor value is lower than 600
    }
  }
  delay(100);
}
void playMelody(int melody[], int duration[], int length) {
  isPlaying = true; // setting playing state of the melody
  for (int i = 0; i < length; i++) {
    if (digitalRead(buttonPin) == LOW) { 
      noTone(buzzerPin);
      isPlaying = false;
      return;
    }
    tone(buzzerPin, melody[i], duration[i]);
    delay(duration[i] * 1.3); //adding delay between notes
  }
  isPlaying = false; 
}

 

Here is the video of how it works:

IMG_9268

Reflection:

Although we did not expect this idea to be so complex, we faced serious difficulties implementing the switch functions – the audio kept repeating without any response to changes in the digital or analog values. After troubleshooting the whole code using Serial.print(), we decided to ask the professor for help to finally tackle the issue.

Reading Reflection – Week #10

The idea of developing immersive technology that goes beyond the limits of the 2D screen has concerned me for a while, so I got very involved in the topic of the rant by Bret Victor. Interactive design was an ambiguous term back in 2011, when the text was published, and it remains often misunderstood now. How can we call “pictures under glass” truly interactive if we do not engage with them physically?

Haptics are one way to push the boundaries of human-computer interaction further. When our fingers get some sort of response, even a slight vibration, our experience with gadgets becomes more engaging on a cognitive level. Here is an example from my life: my phone is always in silent mode, so when I type, I do not hear the artificial sound of keys clicking. At some point I stopped sensing whether I was actually typing – so I turned on the “haptic response” feature. It makes the keys vibrate slightly every time I press them, which creates a more realistic experience of using a physical keyboard.

Nonetheless, I agree with Victor that interactivity can be pushed even beyond haptics. At the same time, it is still difficult to come up with a clear solution, if there is one. Reading Victor’s ideas almost 15 years later was interesting, considering the latest inventions in AI which, to some extent, contradict his points about the use of voice – it turns out voice can be used to generate something beyond linguistics.

Temperature Sensor

Concept

Using both a digital and an analog input, I came up with a circuit that changes the colour and intensity of an RGB LED depending on the state of the button and the values captured by the temperature sensor.

When the button is pressed, the RGB LED turns off; otherwise it is on – this is an example of working with a digital switch, which deals with high-low states. When the temperature sensor captures a value higher than 25 °C, the RGB LED lights up red, and the light becomes more intense as the temperature rises. When the sensor captures a value lower than 25 °C, the LED lights up blue, and the light becomes more intense as the temperature drops – this is an example of working with an analog sensor, which provides a spectrum of values.
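The colour logic described above can be captured in one small function. This is a sketch of the mapping in JavaScript for clarity, not the Arduino code itself; the 5 °C span used to reach full brightness is an illustrative assumption.

```javascript
// Compute the RGB LED channels from a temperature reading: red above the
// 25 degree baseline, blue below it, brighter the further the reading is
// from the baseline. The 5-degree span to full brightness is an assumption.
function temperatureToRGB(temperature, baseline = 25, span = 5) {
  const clamp = (v) => Math.max(0, Math.min(255, Math.round(v)));
  if (temperature > baseline) {
    // hotter -> brighter red
    return { r: clamp(((temperature - baseline) / span) * 255), g: 0, b: 0 };
  }
  // colder -> brighter blue (exactly at the baseline, the LED stays dark)
  return { r: 0, g: 0, b: clamp(((baseline - temperature) / span) * 255) };
}
```

On the Arduino side the same shape is achieved with `map()` and `constrain()` feeding `analogWrite()` on the three colour pins.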

Highlight of the code

It took me a long time to come up with a baseline temperature that would make the switch from red to blue light possible in room-temperature conditions. In the end, I used my own hands to make the sensor feel warmth, and then a frozen thermos wrapped in tissues to make the value drop below 25 degrees.

if (switchState == LOW) {

  if (temperature > baselineTemp) {
    // red grows brighter as the temperature rises above the baseline
    int redBrightness = map(temperature, baselineTemp, 30, 0, 255);
    redBrightness = constrain(redBrightness, 0, 255);
    analogWrite(redPin, redBrightness);
    analogWrite(greenPin, 0);
    analogWrite(bluePin, 0);
  }

  else {
    // blue grows brighter as the temperature drops below the baseline
    int blueBrightness = map(temperature, baselineTemp, 20, 0, 255);
    blueBrightness = constrain(blueBrightness, 0, 255);
    analogWrite(redPin, 0);
    analogWrite(greenPin, 0);
    analogWrite(bluePin, blueBrightness);
  }
}

 

Demonstration

Amina_Temperature

Reflection

For future improvements of this idea, I could play with more shades of RGB LED light that would be associated with different values on the spectrum, since this electrical component allows to work with almost any colour of the light.

Reading Reflection – Week #9

Physical Computing’s Greatest Hits (and misses)

As I scrolled through Tigoe’s article before reading it, I immediately recognised several works I had read about or seen before. As I started reading in depth, I understood why – these categories of physical computing projects are indeed very popular. Nonetheless, some of them were still unexpected; for example, I had never thought that the popular Dance Dance Revolution arcade game is in fact an example of physical computing. This illustrates how many systems we interact with actually belong to one or several of the outlined types of projects, even though we tend to overlook this. I plan to experiment with these categories further, possibly creating something on the edge of several ideas.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

Coming from a Visual Arts background, I am used to providing a description for each of my works, explaining the idea behind the piece and sometimes how to look at it. After reading Tigoe’s article, my perception of the explanatory part changed – I had not thought about the importance of open interpretation for interactive works before. From the creator’s perspective, I often feel that I have to define the ways the audience should observe my art; otherwise they might overlook something. As a viewer, I do feel that installations and other immersive works in museums and galleries can sometimes seem too predefined. As the author mentions, the basics should be outlined and the rest of the job left to the interactor – and I agree with this balance.

Reading Reflection – Week #8

Norman,“Emotion & Design: Attractive things work better”

Don Norman, famous for outlining the principles of user-friendly design, argues that aesthetics play an important role in the functionality of designs, directly impacting not only the usability of a product but also the user’s emotional response. An attractive object evokes positive emotions simply by being looked at, which further motivates the user to explore and engage. This is explained through the concept of emotional design, which highlights not only the decorative but also the functional role of aesthetics in design.

I agree that it is important to think about the way your work looks from the point of view of aesthetics – in both physical and digital works, “pretty” things catch the user’s attention, which is then carefully guided toward functionality. Going back to the famous manifesto “form follows function”, in the context of Norman’s ideas I agree with it – attractive things do tend to work better, especially when the aesthetics and functionality of the product are intertwined.

Her Code Got Humans on the Moon

Margaret Hamilton is an incredibly important figure in computer science, and I am glad that I learned about her work back in middle school. She is a motivating example of a person who managed to combine her work and home duties at a time when opportunities for women to enter technical fields were extremely limited.

Hamilton’s approach to error management intrigues me, since she highlighted the importance of simulating all scenarios before bringing them to life. Planning for potential errors in advance is crucial for inventions as big as Apollo. The example of tracking down an accidental error and then resolving it under pressure says a lot about the importance of paying attention to every detail and planning for everything that can go wrong in advance.

In my projects, I wish to learn to pay more attention to the minor things that can potentially go wrong, especially now that we have started working with physical electronic models – the risks are higher here compared to purely digital simulations.