We know.
We know music can alter the state of the human mind. When you’re happy, a sad song can dampen that happiness; when you’re sad, a joyful song can lift it. Music can make you feel alone; it brings out parts of your personality that might otherwise stay hidden.
Music can shape us, but what if it could shape the environment around us?
“you have no idea how ALONE you are” is the title of my midterm project.
*Graphics & Memory intensive. 4GB RAM is advised.
Space is so unfathomably vast that, amid it, we are as atoms beside mountains. It is only natural, then, for this alien world beyond our blue horizons to stir solitude and loneliness. That is exactly the emotion this project targets: a loneliness we have all felt at some point in our lives.
And what could accentuate that feeling, or perhaps wake it from its slumber? One avenue is certainly music. The music I chose for this project spans science fiction, eerie yet upbeat tracks, and pieces of solace and emotion.
I was inspired by the timelessness of the Temple of Karnak (Egypt), with towering pillars making us feel small and very much alone.
Favorite Code Snippet:
// Sphere-vs-AABB overlap test (p5.js constrain() clamps each axis)
function sphereIntersectsBox(spherePos, radius, boxPos, boxSize) {
  // Half-extents of the box along each axis
  let half = {
    x: boxSize.x / 2,
    y: boxSize.y / 2,
    z: boxSize.z / 2
  };
  // Closest point on (or inside) the box to the sphere's center
  let closest = {
    x: constrain(spherePos.x, boxPos.x - half.x, boxPos.x + half.x),
    y: constrain(spherePos.y, boxPos.y - half.y, boxPos.y + half.y),
    z: constrain(spherePos.z, boxPos.z - half.z, boxPos.z + half.z)
  };
  // Comparing squared distances avoids an unnecessary sqrt() per check
  let dx = spherePos.x - closest.x;
  let dy = spherePos.y - closest.y;
  let dz = spherePos.z - closest.z;
  let distanceSq = dx * dx + dy * dy + dz * dz;
  return distanceSq <= radius * radius;
}
As the game is 3D, I decided to implement collision detection (wouldn’t be much of a game otherwise).
This uses axis-aligned bounding boxes (AABBs), as described in the book Real-Time Collision Detection.
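To sketch how a test like this might run each frame, here is a self-contained plain-JavaScript version. The names (`sphereHitsBox`, `hitsAnyPillar`, `pillars`) are my own illustrations, not the project's code, and `clamp` stands in for p5.js's `constrain`:

```javascript
// Stand-in for p5.js's constrain()
const clamp = (v, lo, hi) => Math.min(Math.max(v, lo), hi);

// Sphere-vs-AABB test: same logic as the snippet above,
// written as a loop over the three axes
function sphereHitsBox(sphere, radius, boxPos, boxSize) {
  let sq = 0;
  for (const axis of ['x', 'y', 'z']) {
    const half = boxSize[axis] / 2;
    const closest = clamp(sphere[axis], boxPos[axis] - half, boxPos[axis] + half);
    sq += (sphere[axis] - closest) ** 2;
  }
  return sq <= radius * radius;
}

// Per-frame check against every box in the scene, stopping at the first hit
function hitsAnyPillar(player, radius, pillars) {
  return pillars.some(p => sphereHitsBox(player, radius, p.pos, p.size));
}

// Two hypothetical pillars, 100 units wide
const pillars = [
  { pos: { x: 0, y: 0, z: 0 },   size: { x: 100, y: 400, z: 100 } },
  { pos: { x: 300, y: 0, z: 0 }, size: { x: 100, y: 400, z: 100 } }
];

hitsAnyPillar({ x: 60, y: 0, z: 0 }, 15, pillars);  // → true (overlaps first pillar)
hitsAnyPillar({ x: 180, y: 0, z: 0 }, 15, pillars); // → false (between pillars)
```

Using `some()` gives an early exit on the first overlap, which matters when the scene has many pillars.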
I really loved how simple yet effective this part of the logic was. It helped save resources, especially compared to my earlier, failed attempts.
The Toughest Part:
I had absolutely no idea what the final project would look like until three days before submission. The reason: before I could build anything in the map, I had to script my own game engine, graphics rendering pipeline, shaders, and so on. That meant placing a significant amount of trust in my own capabilities, and I’m glad I did. The final result wasn’t what I initially planned, but it looks awesome!
Known problems:
Unfortunately, this project came with a lot of problems. First, every graphics overlay and interface I attempted to implement kept failing, so I was unable to add any tutorial interfaces.
Furthermore, the collision detection works properly, but I use a simplified version of collision handling. This keeps the browser from consuming egregious amounts of memory and reduces lag spikes. There are some points where things glitch, but those are few and far between.
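The post doesn’t detail the simplified handler, but one cheap strategy in that spirit (purely illustrative; `tryMove` and `collides` are hypothetical names, not the project’s actual code) is to cancel any move that would end inside a box, rather than computing a push-out or slide response:

```javascript
// Cheap collision handling (illustrative only): instead of resolving
// penetration, simply reject a move whose destination is blocked.
// `collides(pos, radius)` is assumed to run the sphere-vs-AABB test
// against every box in the map.
function tryMove(playerPos, step, radius, collides) {
  const next = {
    x: playerPos.x + step.x,
    y: playerPos.y + step.y,
    z: playerPos.z + step.z
  };
  // Blocked: stay put. No push-out vector, no sliding along walls.
  return collides(next, radius) ? playerPos : next;
}
```

This trades physical accuracy for a constant, tiny per-frame cost, which fits the goal of avoiding memory blow-ups and lag spikes.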
Development Process:
I worked on the entirety of the project myself (unless otherwise stated). I wrote the code, sourced PBR textures from PolyHaven (free textures!) and audio from Pixabay (free with credit!). I used Google to search for collision-detection algorithms, which is how I came across the book mentioned above. AI was used only for sourcing textures and audio files; I didn’t fancy it much beyond that.