For my final project, I am planning to develop an “Emotion-Based Room.” The setup will use a webcam to capture the user’s facial expressions and a machine learning model to recognize emotions such as happiness, sadness, and surprise. I will use the ml5.js and face-api.js libraries to implement the emotion detection.
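As a rough sketch of how the detection loop might work, the snippet below polls face-api.js from inside a p5.js sketch (the ./models path, the polling interval, and the canvas size are placeholder assumptions, and ml5’s wrapper around the same models would look similar):

```javascript
// Sketch: detect the dominant facial expression from the webcam with face-api.js.
let video;
let currentEmotion = 'neutral';

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO); // webcam feed
  video.size(640, 480);
  video.hide();

  // Load a lightweight face detector plus the expression classifier,
  // then start polling for emotions twice a second.
  Promise.all([
    faceapi.nets.tinyFaceDetector.loadFromUri('./models'),
    faceapi.nets.faceExpressionNet.loadFromUri('./models'),
  ]).then(() => setInterval(detectEmotion, 500));
}

async function detectEmotion() {
  const result = await faceapi
    .detectSingleFace(video.elt, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();
  if (result) {
    // result.expressions looks like { happy: 0.91, sad: 0.01, ... };
    // keep the label with the highest score.
    const [label] = Object.entries(result.expressions)
      .sort((a, b) => b[1] - a[1])[0];
    currentEmotion = label;
  }
}

function draw() {
  image(video, 0, 0);
  fill(255);
  text(`Detected: ${currentEmotion}`, 10, 20);
}
```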
Based on the detected emotion, the room’s environment will adjust dynamically to enhance the user’s experience. Using p5.js, the system will play music that matches the user’s current mood, and an Arduino will control the room’s lighting to complement the emotional atmosphere.
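A minimal sketch of how the detected mood could drive music and lighting, assuming p5.sound for playback and the browser’s Web Serial API for talking to the Arduino; the audio file names and the single-character lighting commands are made up for illustration:

```javascript
// Sketch: map the current emotion to a music track and a lighting command.
// Requires the p5.sound library; runs in a browser that supports Web Serial.
let tracks = {};
let serialWriter;

function preload() {
  tracks.happy = loadSound('audio/upbeat.mp3');
  tracks.sad = loadSound('audio/calm.mp3');
  tracks.neutral = loadSound('audio/ambient.mp3');
}

async function connectArduino() {
  // Web Serial: the user picks the Arduino's port from a browser prompt.
  const port = await navigator.serial.requestPort();
  await port.open({ baudRate: 9600 });
  serialWriter = port.writable.getWriter();
  return port;
}

function applyMood(emotion) {
  // Stop whatever is playing and start the track for the new mood.
  Object.values(tracks).forEach(t => { if (t.isPlaying()) t.stop(); });
  const track = tracks[emotion] || tracks.neutral;
  track.loop();

  // One-byte lighting commands the Arduino sketch would listen for,
  // e.g. 'H' = warm bright, 'S' = dim blue, 'N' = default white.
  const command = { happy: 'H', sad: 'S' }[emotion] || 'N';
  if (serialWriter) {
    serialWriter.write(new TextEncoder().encode(command));
  }
}
```

connectArduino() would have to be called from a user gesture such as mousePressed(), since browsers only expose serial ports after an explicit click.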
The physical space will be modeled with a sofa, a bed, a study table, and a bathroom. I plan to incorporate custom-made pressure sensors into these pieces to further personalize the room’s response. For example, if a person sits on the sofa, the system will interpret this as a studying mood and switch the lighting to a more focused setting. Similarly, lying on the bed will turn the lights off and stop the music, creating an environment conducive to sleep.
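The furniture logic could ride on the same serial connection, with the Arduino reporting which pressure sensor is active. The sketch below assumes the port, serialWriter, and tracks from the snippets above, and a made-up line-based protocol in which the Arduino prints “SOFA”, “BED”, or “NONE”:

```javascript
// Sketch: read furniture pressure-sensor states back from the Arduino and
// let them override the emotion-driven behaviour.
async function listenForFurniture(port) {
  const decoder = new TextDecoderStream();
  port.readable.pipeTo(decoder.writable);
  const reader = decoder.readable.getReader();

  let buffer = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    let newlineIndex;
    while ((newlineIndex = buffer.indexOf('\n')) >= 0) {
      const state = buffer.slice(0, newlineIndex).trim();
      buffer = buffer.slice(newlineIndex + 1);
      handleFurnitureState(state);
    }
  }
}

function handleFurnitureState(state) {
  if (state === 'SOFA') {
    // Sitting on the sofa: switch to focused lighting ('F' is a placeholder command).
    if (serialWriter) serialWriter.write(new TextEncoder().encode('F'));
  } else if (state === 'BED') {
    // Lying on the bed: stop the music and turn the lights off ('O').
    Object.values(tracks).forEach(t => { if (t.isPlaying()) t.stop(); });
    if (serialWriter) serialWriter.write(new TextEncoder().encode('O'));
  }
}
```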
This project aims to create a responsive living space that adapts not only to the expressed emotions of its occupant but also to their physical interactions within the room.