Final Project: Emotionally Reactive Room

IM Showcase Gallery:

Concept

For the final project of Introduction to Interactive Media, I was presented with the challenging task of connecting software (p5.js) and hardware (Arduino). To achieve this, I decided to build a room that promotes sustainability and also reacts to human emotions.

Motivation

The project was inspired by my roommate, an active member of our campus’s sustainability student interest group. She constantly urges me to be more mindful of energy usage, especially my tendency to leave lights on unnecessarily. She also pointed out that I could be more emotionally present. That got me thinking: why not design a room that not only conserves energy but also tunes into human emotions? To achieve this, I incorporated a feature where both the music and lighting adapt to the mood of the person inside, creating an environment that feels truly responsive.

P5.js Part

In the p5.js segment of my project, I began by integrating the camera setup. I utilized ml5.js along with face-api.js for emotion detection. Face-api.js is particularly well-suited for this task due to its ability to analyze specific facial points to ascertain emotions. The library offers a range of emotions including neutral, happy, angry, sad, disgusted, surprised, and fearful. However, for my project’s scope, I’m focusing on neutral, happy, sad, disgusted, and surprised.

I designed the system to handle the analysis for just one individual at a time. Although the camera captures all faces within its view, it processes emotions only for the first face detected. To guide the user, I placed text on the top left of the canvas that displays the detected emotion and its corresponding probability percentage. This not only informs the user about the types of expressions to test but also enhances the interactive aspect of the project.

To make the experience more engaging, I created specific graphics for each emotion using Canva. These graphics dynamically appear and float around the canvas as the detected emotion changes. Additionally, I’ve incorporated adaptive music into the p5.js environment; the music alters based on the detected emotion, thus varying the room’s ambiance to match the user’s current emotional expression. I also added a fullscreen feature that activates when the user presses ‘f’, allowing both the canvas and video to fill the entire screen.

Graphics Used

Happy, Sad, Surprised, Disgusted, Neutral (images omitted)

Arduino Part

For the Arduino component of my project, I’ve integrated three RGB LEDs and three pressure sensors. Each pressure sensor is linked to a corresponding LED, such that pressing a sensor activates its associated LED.

In p5.js, I analyze five expressions and convert the detection results into binary values, represented as either 1 or 0. These values are then transmitted to the Arduino in a fixed order: neutral, happy, sad, disgusted, surprised. A confidently happy face, for example, goes out as “0,1,0,0,0”. Based on the received data, if an expression is represented by a ‘1’, the corresponding RGB LED changes its color to match the detected expression.

User Testing:

Hardware and its Pictures:

This prototype of my room features a study table, a bed, and a sofa, each equipped with pressure sensors. Above, there are three LED chandeliers hanging from the ceiling.

How it Works:

First, we initiate serial communication between the Arduino and the p5.js sketch. The music and camera start together with the sketch. Based on the expressions detected by the camera, the graphics on the display and the music dynamically change to match your current mood.

When a user presses a pressure sensor on the bed, study table, or sofa, the RGB LED positioned above that sensor lights up. Then, as your facial expressions change, the color of the LED changes accordingly: happiness triggers a green light, neutrality a blue light, sadness a red light, disgust a yellow light (red and green together), and surprise a pink light (red and blue together). This creates a responsive environment that visually and audibly reflects your emotions.

Code:

Here’s the logic for activating each LED based on its sensor value and changing its color based on the expression detected by the p5.js sketch.

// Light up the corresponding LED only
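// Each color pin goes HIGH when any of its listed expressions is 1,
// so the mixed colors come out as: sad = red, happy = green,
// neutral = blue, disgusted = red + green (yellow),
// surprised = red + blue (pink)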
if (sensor1Value < 30 ) {
  digitalWrite(redLED1, Sad || disgusted || surprised);
  digitalWrite(greenLED1, Happy || disgusted);
  digitalWrite(blueLED1, Neutral || surprised);
} else if (sensor2Value < 40) {
  digitalWrite(redLED2, Sad || disgusted || surprised);
  digitalWrite(greenLED2, Happy || disgusted);
  digitalWrite(blueLED2, Neutral || surprised);
} else if (sensor3Value < 40) {
  digitalWrite(redLED3, Sad || disgusted || surprised);
  digitalWrite(greenLED3, Happy || disgusted);
  digitalWrite(blueLED3, Neutral || surprised);
}
P5.js Display

Surprised, Disgusted, Happy, Sad, Neutral (screenshots omitted)

Demonstration:

Part of the Project that I take the most pride in:

The part I’m most proud of is how I mapped the expression values to 0 and 1, based on the percentage of the emotion detected, and then stored them in an array. This simplification made it easier to send binary values to the Arduino. However, figuring out this code took some time, as I initially tried storing emotions and their associated values in a dictionary, which didn’t work.

// maps detected expressions to a set of predefined categories and assigns a binary value based on a threshold

function mapExpressions(expressions) {
    const expressionOrder = ['neutral', 'happy', 'sad', 'disgusted', 'surprised'];
    let expressionValues = [];

    expressionOrder.forEach(expression => {
//       if the value detected is more than 50%, make it 1; otherwise 0
        let value = expressions[expression] > 0.5 ? 1 : 0; 
        expressionValues.push(value);
    });

    return expressionValues;
}
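
For example, a frame where happy is detected at 90% and everything else stays low maps to [0, 1, 0, 0, 0], which is exactly what readSerial() later joins into the comma-separated message for the Arduino.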

Difficulties and Future Improvements:

The most challenging aspect of this project was establishing serial communication between p5.js and the Arduino, which took a solid two days of trial and error. Despite trying numerous approaches, nothing worked until I created a duplicate file, which then functioned flawlessly. (The handshake itself is simple: the Arduino keeps printing a starting message until p5.js opens the port, and from then on each newline-terminated line from the Arduino prompts p5.js to reply with the five expression flags.) Another significant challenge was the coding itself. Although the code was not particularly complex, integrating music and graphics with face-api.js was time-consuming and required updates to the HTML file.

Additionally, I encountered difficulties with the pressure sensors. Initially, I used piezo sensors, but they produced inconsistent readings. I switched to force sensors, which gave more reliable results, although they required recalibration every five minutes, adding another layer of complexity to the project. I wrote two additional Arduino scripts to calibrate the sensors, which let me watch the pressure sensor values in the serial monitor.
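
Those calibration scripts aren’t reproduced here, but the idea is simple. Here’s a minimal sketch of that kind of helper (illustrative, not my exact script; it assumes the first force sensor on A2, as in the main program):

// Minimal calibration helper: stream raw readings from one force sensor
// so a threshold can be picked by watching the Serial Monitor
int sensorPin = A2;  // first force sensor, same pin as the main sketch

void setup() {
  Serial.begin(9600);
}

void loop() {
  int value = analogRead(sensorPin);  // raw 0-1023 reading
  Serial.println(value);              // press the sensor and watch the values
  delay(100);                         // about ten readings per second
}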

For future improvements, I would consider investing in better pressure sensors. Additionally, instead of relying on my computer’s speakers, I’d like to integrate an external speaker directly connected to the Arduino. This setup would enhance the overall functionality and user experience.
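
As a rough sketch of that direction (not part of the current build), a piezo buzzer on a spare pin could beep according to the same comma-separated flags the Arduino already receives. Pin 3 here is a hypothetical choice, since pins 4 through 12 are taken by the LEDs:

// Illustrative only: beep a buzzer when the incoming "happy" flag is set
const int buzzerPin = 3;  // hypothetical spare pin

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available()) {
    Serial.parseInt();              // skip the neutral flag
    int happy = Serial.parseInt();  // read the happy flag
    // (the remaining flags would be parsed the same way)
    if (happy == 1) {
      tone(buzzerPin, 440, 200);    // short A4 beep while happy
    } else {
      noTone(buzzerPin);
    }
  }
}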

Code:

P5.js:
// Initializing variables and arrays

let faceapi;
let detections = [];
let expressionValues=[];
let video;
let canvas;
let happyEmojis=[];
let vx;
let vy;
let songs = [];
let currentSongIndex = -1; 

// loading graphics and music

function preload() {

    // graphics 1.png through 23.png, one per floating emoji
    for (let i = 0; i < 23; i++) {
        happyEmojis[i] = loadImage((i + 1) + '.png');
    }

    // songs song1.mp3 through song5.mp3, one per mood
    for (let i = 0; i < 5; i++) {
        songs[i] = loadSound('song' + (i + 1) + '.mp3');
    }
}

// Setting up the canvas and video settings
function setup() {
    canvas = createCanvas(windowWidth, windowHeight);
    canvas.id('canvas');
    video = createCapture(VIDEO);
    video.size(windowWidth, windowHeight);
    video.id('video');
  
//   initializes the face detection
  const faceOptions = {
    withLandmarks: true,
    withExpressions: true,
    withDescriptors: true,
    minConfidence: 0.5
  };

  //initialize the model: 
  faceapi = ml5.faceApi(video, faceOptions, faceReady);
  
  image1 = new Emoji(happyEmojis[0],random(0,width-250),0,1,1);
  image2 = new Emoji(happyEmojis[1],random(0,width-250),0,0.5,1);
  image3 = new Emoji(happyEmojis[2],random(0,width-250),0,0.5,1);
  image4 = new Emoji(happyEmojis[3],random(0,width-250),0,1,1.5);
  image5 = new Emoji(happyEmojis[4],random(0,width-250),0,1,0.5);
  image6 = new Emoji(happyEmojis[5],random(0,width-250),0,1,1);
  image7 = new Emoji(happyEmojis[6],random(0,width-250),0,1,1.5);
  image8 = new Emoji(happyEmojis[7],random(0,width-250),0,1,0.5);
  image9 = new Emoji(happyEmojis[8],random(0,width-250),0,2,1);
  image10 = new Emoji(happyEmojis[9],random(0,width-250),0,1,1.5);
  image11 = new Emoji(happyEmojis[10],random(0,width-250),0,1,0.5);
  image12 = new Emoji(happyEmojis[11],random(0,width-250),0,1,1.5);
  image13 = new Emoji(happyEmojis[12],random(0,width-250),0,2,1);
  image14= new Emoji(happyEmojis[13],random(0,width-250),0,1,2);
  image15= new Emoji(happyEmojis[14],random(0,width-250),0,1,1.5);
  image16= new Emoji(happyEmojis[15],random(0,width-250),0,1,1.5);
  image17 = new Emoji(happyEmojis[16],random(0,width-250),0,1,1);
  image18 = new Emoji(happyEmojis[17],random(0,width-250),0,1,1);
  image19 = new Emoji(happyEmojis[18],random(0,width-250),0,1,1.5);
  image20 = new Emoji(happyEmojis[19],random(0,width-250),0,1,0.5);
  image21 = new Emoji(happyEmojis[20],random(0,width-250),0,1,1.5);
  image22 = new Emoji(happyEmojis[21],random(0,width-250),0,1,0.5);
  image23 = new Emoji(happyEmojis[22],random(0,width-250),0,1,0.5);
}

// adjust canvas and video size when window is resized
function windowResized() {
    
    resizeCanvas(windowWidth, windowHeight);
    video.size(windowWidth, windowHeight);
}


function draw(){

  clear();
//   drawing the expression percentages and floating graphics based on the detected emotion
  drawExpressions(detections, 20, 20, 14);
  if (expressionValues.length > 1 && expressionValues[1] === 1) { // happy
    image1.display();
    image1.update();
    image2.display();
    image2.update();
    image3.display();
    image3.update();
    image4.display();
    image4.update();
    image5.display();
    image5.update();
    
  
  }
  if (expressionValues.length > 1 && expressionValues[4] === 1) { // surprised
    image11.display();
    image11.update();
    image12.display();
    image12.update();
    image13.display();
    image13.update();
    image14.display();
    image14.update();
    
  
  }
  if (expressionValues.length > 1 && expressionValues[3] === 1) { // disgusted
    image15.display();
    image15.update();
    image16.display();
    image16.update();
    image17.display();
    image17.update();
    image18.display();
    image18.update();
    image23.display();
    image23.update();
    // playSong(2);
  }
  if (expressionValues.length > 1 && expressionValues[2] === 1) { // sad
    image7.display();
    image7.update();
    image8.display();
    image8.update();
    image9.display();
    image9.update();
    image10.display();
    image10.update();
    
  }
  if (expressionValues.length > 1 && expressionValues[0] === 1) { // neutral
    image6.display();
    image6.update();
    image19.display();
    image19.update();
    image20.display();
    image20.update();
    image21.display();
    image21.update();
    image22.display();
    image22.update();
    
  }

//   playing songs based on the detected emotion
  if (expressionValues.length > 1 && expressionValues[1] === 1) {
    // happy: play song 4
    playSong(3);
  } else if (expressionValues.length > 1 && expressionValues[4] === 1) {
    // surprised: play song 1
    playSong(0);
  } else if (expressionValues.length > 1 && expressionValues[3] === 1) {
    // disgusted: play song 2
    playSong(1);
  } else if (expressionValues.length > 1 && expressionValues[2] === 1) {
    // sad: play song 3
    playSong(2);
  } else if (expressionValues.length > 1 && expressionValues[0] === 1) {
    // neutral: play song 5
    playSong(4);
  }
  
}

function playSong(index) {
  //stop any currently playing song
  for (let i = 0; i < songs.length; i++) {
    if (i !== index && songs[i].isPlaying()) {
      songs[i].stop();
    }
  }

  // play the selected song
  if (!songs[index].isPlaying()) {
    songs[index].play();
  }
}

// class to handle the graphics
class Emoji {
    constructor(img,x,y,vx, vy) {
        this.img = img;
        this.x = x;
      this.y = y;
        this.vx = vx;
        this.vy = vy;
    }
    
    update() {
        this.x += this.vx;
        this.y += this.vy;
        // check for canvas boundaries
        if (this.x < -130 || this.x > width -200) this.vx *= -1;
        if (this.y < -110 || this.y > height -150) this.vy *= -1;
    }
// display the graphics
    display() {
        image(this.img, this.x, this.y, 500, 500);
    }
}

function keyTyped() {
  // Note: on Chrome/Mac you may have to press f twice to toggle; works correctly on Firefox/Mac
  if (key === 'f') {
    toggleFullscreen();
  }
}

// Toggle fullscreen state. Must be called in response
// to a user event (i.e. keyboard, mouse click)
function toggleFullscreen() {
  let fs = fullscreen(); // Get the current state
  fullscreen(!fs); // Flip it!
}

// Start detecting faces
function faceReady() {
  faceapi.detect(gotFaces);
}

// Got faces
function gotFaces(error, result) {
  if (error) {
    console.log(error);
    return;
  }
// now all the detection data is in detections
  detections = result; 
  
// make the background transparent
  clear();
  

  storeExpressions(detections); 
  faceapi.detect(gotFaces);
}

// maps detected expressions to a set of predefined categories and assigns a binary value based on a threshold

function mapExpressions(expressions) {
    const expressionOrder = ['neutral', 'happy', 'sad', 'disgusted', 'surprised'];
    let expressionValues = [];

    expressionOrder.forEach(expression => {
//       if the value detected is more than 50%, make it 1; otherwise 0
        let value = expressions[expression] > 0.5 ? 1 : 0; 
        expressionValues.push(value);
    });

    return expressionValues;
}

// store expressions 
function storeExpressions(detections) {
    if (detections.length > 0) {
//   for the first person in the list, map expressions
        let expressions = detections[0].expressions;
        expressionValues = mapExpressions(expressions);
        // console.log(expressionValues);

        
    }
}

// draws the percentage of each detected emotion in the top left corner of the canvas
function drawExpressions(detections, x, y, textYSpace){
  if(detections.length > 0){
    
    let {neutral, happy, angry, sad, disgusted, surprised, fearful} = detections[0].expressions;
    textFont('Helvetica Neue');
    textSize(14);
    noStroke();
    fill(44, 169, 225);
// uses nf(value, left, right) to format numbers
    text("Neutral:       " + nf(neutral*100, 2, 2)+"%", x, y);
    text("Happiness: " + nf(happy*100, 2, 2)+"%", x, y+textYSpace);
    text("Sad:            "+ nf(sad*100, 2, 2)+"%", x, y+textYSpace*2);
    text("Disgusted: " + nf(disgusted*100, 2, 2)+"%", x, y+textYSpace*3);
    text("Surprised:  " + nf(surprised*100, 2, 2)+"%", x, y+textYSpace*4);
  
  }else{
    text("Neutral: ", x, y);
    text("Happiness: ", x, y + textYSpace);
    text("Sad: ", x, y + textYSpace*2);
    text("Disgusted: ", x, y + textYSpace*3);
    text("Surprised: ", x, y + textYSpace*4);
  }
}

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////
  console.log(expressionValues);
  if (data != null) {
    

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = expressionValues[0] + "," + expressionValues[1] + "," + expressionValues[2] + ","  + expressionValues[3] + ","  + expressionValues[4] + "\n";
    writeSerial(sendToArduino);
  }
}
Arduino:
// Define LED pin constants
int redLED1 = 12;   // First RGB LED, red pin
int greenLED1 = 11; // First RGB LED, green pin
int blueLED1 = 10;  // First RGB LED, blue pin

int redLED2 = 9;    // Second RGB LED, red pin
int greenLED2 = 8;  // Second RGB LED, green pin
int blueLED2 = 7;   // Second RGB LED, blue pin

int redLED3 = 6;    // Third RGB LED, red pin
int greenLED3 = 5;  // Third RGB LED, green pin
int blueLED3 = 4;   // Third RGB LED, blue pin

// Define sensor pin constants
int sensor1 = A2;  // First sensor
int sensor2 = A3;  // Second sensor
int sensor3 = A4;  // Third sensor

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);

  // Set LED pins to output mode
  pinMode(redLED1, OUTPUT);
  pinMode(greenLED1, OUTPUT);
  pinMode(blueLED1, OUTPUT);

  pinMode(redLED2, OUTPUT);
  pinMode(greenLED2, OUTPUT);
  pinMode(blueLED2, OUTPUT);

  pinMode(redLED3, OUTPUT);
  pinMode(greenLED3, OUTPUT);
  pinMode(blueLED3, OUTPUT);

  // Start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // LED on while receiving data
    int sensor1Value = analogRead(sensor1);  // Read first sensor
    int sensor2Value = analogRead(sensor2);  // Read second sensor
    int sensor3Value = analogRead(sensor3);  // Read third sensor

    int Neutral = Serial.parseInt();
    int Happy = Serial.parseInt();
    int Sad = Serial.parseInt();
    int disgusted = Serial.parseInt();
    int surprised = Serial.parseInt();

    if (Serial.read() == '\n') {
      // Reset all LEDs
      digitalWrite(redLED1, LOW);
      digitalWrite(greenLED1, LOW);
      digitalWrite(blueLED1, LOW);
      digitalWrite(redLED2, LOW);
      digitalWrite(greenLED2, LOW);
      digitalWrite(blueLED2, LOW);
      digitalWrite(redLED3, LOW);
      digitalWrite(greenLED3, LOW);
      digitalWrite(blueLED3, LOW);

      // Light up the corresponding LED only
      if (sensor1Value < 30 ) {
        digitalWrite(redLED1, Sad || disgusted || surprised);
        digitalWrite(greenLED1, Happy || disgusted);
        digitalWrite(blueLED1, Neutral || surprised);
      } else if (sensor2Value < 40) {
        digitalWrite(redLED2, Sad || disgusted || surprised);
        digitalWrite(greenLED2, Happy || disgusted);
        digitalWrite(blueLED2, Neutral || surprised);
      } else if (sensor3Value < 40) {
        digitalWrite(redLED3, Sad || disgusted || surprised);
        digitalWrite(greenLED3, Happy || disgusted);
        digitalWrite(blueLED3, Neutral || surprised);
      }

      Serial.print(sensor1Value < 30);
      Serial.print(',');
      Serial.print(sensor2Value < 40);
      Serial.print(',');
      Serial.println(sensor3Value < 40);
    }
  }
  // Optional: Use built-in LED to indicate the system is running
  digitalWrite(LED_BUILTIN, HIGH);
  delay(50);
  digitalWrite(LED_BUILTIN, LOW);
  delay(300);
}

 
