Overall Concept:
My midterm project, The Polyglot Galaxy, is an interactive generative text artwork that visualizes multilingual greetings as floating stars in a galaxy environment. The project expands my Week 6 text generator into a more immersive interactive media system that combines text, sound, animation, state-based interaction, and computer vision.
Each time the user clicks on the canvas, a greeting phrase from a different language is stamped onto the screen. Over time, these phrases accumulate into an interstellar, galaxy-like constellation. The piece plays four different voices within the frame: I changed the canvas from 400 × 400 to 600 × 600 so the work feels larger, and I split the frame into four quadrants (upper left, upper right, lower left, and lower right).
The visual aesthetic focuses on glow, floating motion, and cosmic space imagery, representing languages as stars in a shared universe. The project also includes a webcam frame that reacts to movement and brightness in the camera view: when the user moves or dances in front of the camera, the brightness changes and the stars twinkle more strongly, making the interaction more playful. Sound is also integrated to create an immersive environment, with clicks producing different audio effects and ambient music playing during the interaction.
Progress Made:
During this spring break, I made improvements to both the visual interaction and the system structure. Firstly, I implemented a blinking glow effect using sin(frameCount) to animate the brightness of the instruction text and the star-like greetings. This creates a subtle pulsating effect that helps reinforce the galaxy atmosphere in the frame.
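The pulsing math can be sketched in plain JavaScript. This is a minimal illustration, not the sketch's actual code: mapRange stands in for p5's map(), and the glowAlpha name, the 0.05 speed, and the 100..255 alpha range are my own choices.

```javascript
// Re-map a value from one range to another (same idea as p5's map()).
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// sin(frameCount * speed) oscillates between -1 and 1;
// mapping it onto 100..255 gives a gentle alpha pulse for the glow.
function glowAlpha(frameCount, speed = 0.05) {
  return mapRange(Math.sin(frameCount * speed), -1, 1, 100, 255);
}
```

In a p5 draw() loop, a value like this could be passed to fill() as the alpha of the instruction text so it brightens and dims smoothly over time.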
Secondly, I added bursts of eight tiny sparkles that appear in the galaxy when the user clicks on the canvas, an idea adapted from a Daniel Shiffman video on The Coding Train. These small particles spread outward like tiny dwarf planets, or a little like dancing stars. This gives the interaction a more dynamic, lively, and playful feel.
Furthermore, I introduced some state-based interaction using a start screen and play state. When the project first loads, a start screen appears with instructions. After clicking, the user enters the interactive galaxy mode where phrases can be stamped.
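The state-based flow can be reduced to a tiny two-state machine. This is a hedged sketch: the state names, handleClick(), and the stubbed stampGreeting() are my own, not necessarily the variables used in the sketch.

```javascript
// Two-state interaction: a start screen until the first click, then play mode.
let state = "start";
let stamped = 0;

// Stand-in for the real stamping logic (drawing a greeting phrase).
function stampGreeting() {
  stamped++;
}

// Called on each mouse press: the first click dismisses the instructions,
// later clicks stamp greetings onto the galaxy.
function handleClick() {
  if (state === "start") {
    state = "play";
  } else {
    stampGreeting();
  }
}
```

In p5, handleClick() would simply be the body of mousePressed(), and draw() would branch on state to show either the start screen or the galaxy.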
Interactive sketch: https://editor.p5js.org/po2127/full/LyMPRYzi8
Another major improvement is a deeper integration of webcam computer vision. The camera frames the player of the game, and its brightness is found by sampling pixels from the webcam feed. This brightness value then controls the speed and intensity of the interaction, so the stars react to movement or lighting changes in the camera frame.
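One way to turn an average brightness (0..255) into an animation parameter is a simple linear mapping. This is an illustrative sketch only: the twinkleSpeed name and the 0.5..3.0 output range are assumptions, not the sketch's real values.

```javascript
// Convert average camera brightness (0..255) into a twinkle-speed factor.
// Brighter scenes (or more movement under changing light) twinkle faster.
function twinkleSpeed(camBrightness) {
  const clamped = Math.min(255, Math.max(0, camBrightness));
  return 0.5 + (clamped / 255) * (3.0 - 0.5);
}
```

The clamp guards against out-of-range readings so the animation speed always stays within a sensible band.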
Lastly, I improved the layout and interface to make the piece more readable, adjusting the position of the instruction text so it fits nicely within the frame. I also set the background music to play continuously during the play state, choosing music that evokes a galaxy in space to create an atmospheric soundscape.
Code
Below is the code I am particularly proud of, and the core logic used to capture webcam data and calculate brightness for interaction:
// inside setup():
cam = createCapture(VIDEO); // use the computer's webcam
cam.size(160, 120);         // low resolution keeps pixel analysis cheap
cam.hide();                 // the feed is used as data, not shown directly

function updateCamBrightness() {
  cam.loadPixels();
  let sum = 0;
  // Step through the RGBA array 40 entries (10 pixels) at a time
  // so the loop stays fast enough for real-time updates.
  for (let i = 0; i < cam.pixels.length; i += 40) {
    let r = cam.pixels[i];
    let g = cam.pixels[i + 1];
    let b = cam.pixels[i + 2];
    sum += (r + g + b) / 3; // brighter pixels -> higher r, g, b -> higher sum
  }
  let samples = cam.pixels.length / 40;
  camBrightness = sum / samples; // average brightness, 0..255
}
Sampling every tenth pixel (a stride of 40 RGBA array entries) helps reduce computational load while maintaining responsive interaction. This allows the program to run smoothly even while performing real-time visual updates.
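The same averaging logic can be exercised without a webcam by running it over a plain RGBA array. This is a standalone restatement of the idea, with avgBrightness as my own helper name:

```javascript
// Average brightness over a flat RGBA pixel array, sampling at a stride
// of 40 entries (every tenth pixel), as in the sketch.
function avgBrightness(pixels, stride = 40) {
  let sum = 0;
  let samples = 0;
  for (let i = 0; i + 2 < pixels.length; i += stride) {
    sum += (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
    samples++;
  }
  return samples > 0 ? sum / samples : 0;
}

// A uniform mid-gray frame averages to exactly 128, whatever the stride.
const gray = new Uint8ClampedArray(160 * 120 * 4).fill(128);
```

Counting samples inside the loop (rather than dividing the array length by the stride) also keeps the average exact when the stride does not divide the array length evenly.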
I am also proud of the eight-spark burst effect, which gives immediate visual feedback when users interact. Despite its simple implementation as a lightweight particle system, it significantly improves the sense of energy and responsiveness in the artwork while maintaining good performance.
let bursts = [];

class Spark {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.vx = random(-2, 2);   // random outward drift
    this.vy = random(-2, 2);
    this.size = random(3, 7);
    this.alpha = 255;          // fully opaque at birth, fades out over time
    this.col = color(random(180, 255), random(180, 255), random(255));
  }
  update() {
    this.x += this.vx;
    this.y += this.vy;
    this.alpha -= 8;           // fade a little each frame
  }
  show() {
    noStroke();
    fill(red(this.col), green(this.col), blue(this.col), this.alpha);
    ellipse(this.x, this.y, this.size);
  }
  finished() {
    return this.alpha <= 0;
  }
}

// in draw(): update, show, and remove faded sparks (iterating backwards
// so splice() does not skip elements)
for (let i = bursts.length - 1; i >= 0; i--) {
  bursts[i].update();
  bursts[i].show();
  if (bursts[i].finished()) {
    bursts.splice(i, 1);
  }
}

// in mousePressed(): spawn a burst of eight sparks at the click position
for (let i = 0; i < 8; i++) {
  bursts.push(new Spark(mouseX, mouseY));
}
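The fade arithmetic fixes each spark's lifetime: starting at alpha 255 and subtracting 8 per frame, a spark is removed after ceil(255 / 8) = 32 updates. This can be checked with a draw-free version of the logic (SparkLife and framesUntilGone are my own names for this illustration):

```javascript
// Minimal, drawing-free version of the spark's fade-out behavior.
class SparkLife {
  constructor() {
    this.alpha = 255;
  }
  update() {
    this.alpha -= 8;
  }
  finished() {
    return this.alpha <= 0;
  }
}

// Count how many updates a spark survives before finished() fires.
function framesUntilGone() {
  const s = new SparkLife();
  let frames = 0;
  while (!s.finished()) {
    s.update();
    frames++;
  }
  return frames;
}
```

At 60 fps that is roughly half a second per burst, which is why the array of sparks never grows unboundedly even with rapid clicking.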
Challenges and Areas for Improvement
One challenge I encountered involved browser permissions and webcam access. In some environments, the camera simply does not activate unless the page runs in a secure context or the user explicitly grants permission. To avoid interface issues, I chose to hide the raw camera feed and use it primarily as a data source for interaction.
Another challenge was balancing visual complexity with performance. Since the project involves multiple animated objects and real-time pixel analysis, I needed to optimize certain processes, such as sampling pixels at intervals instead of processing the entire image frame.
In the future, the user interface could be improved further with clearer interaction prompts and more refined visual transitions.
Things to Improve for the Future
Although the project works well, there are several areas I would like to improve in the future.
Firstly, I would like to expand the number of languages and phrases in the dataset. The phrases currently come from a JSON file, and increasing the diversity of languages would make the galaxy feel richer and more global.
Moreover, I want to improve the visual design of the stars and glow effects; adding stronger particle systems, gradients, or shader effects could make the galaxy feel deeper and more immersive.
In addition, I would like to refine the interaction between the webcam and the visuals. Right now the brightness only affects twinkle speed, but in the future it could also influence star size, color, or particle behavior.
Last but not least, the sound design could be expanded. Currently, clicking produces different sound effects depending on the screen quadrant, but I would like to develop a more reactive sound system where the music evolves as more languages appear in the galaxy.
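The quadrant-to-sound dispatch can be expressed as a small helper. This is a sketch under my own assumptions: the quadrant name, the 0..3 indexing (upper left, upper right, lower left, lower right), and the 600 × 600 defaults are illustrative, not necessarily how the sketch indexes its sounds.

```javascript
// Map a click position to one of four quadrants:
// 0 = upper left, 1 = upper right, 2 = lower left, 3 = lower right.
function quadrant(x, y, w = 600, h = 600) {
  const right = x >= w / 2 ? 1 : 0;
  const bottom = y >= h / 2 ? 2 : 0;
  return right + bottom;
}
```

A sound array indexed by this value (e.g. sounds[quadrant(mouseX, mouseY)].play()) keeps the per-quadrant dispatch to a single line in mousePressed().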
Overall, this project really helped me explore how generative text, animation, sound, and computer vision can combine into a playful interactive media experience.
References
• Daniel Shiffman. (2019). The Coding Train: p5.js Tutorials.
https://thecodingtrain.com/
These tutorials helped me understand concepts such as webcam capture using createCapture(), particle systems, and generative animation techniques used in this project.
• p5.js. (n.d.). p5.js Reference.
https://p5js.org/reference/
The p5.js documentation was used as a reference for functions such as loadJSON(), sin(), map(), createCapture(), and frameCount that are used throughout the project.
• Casey Reas and Ben Fry. (2014). Processing: A Programming Handbook for Visual Designers and Artists. MIT Press.
• Daniel Shiffman. Coding Challenge 78: Simple Particle System. The Coding Train.