Final Project – MINI DISCO

User testing:

Reflecting on the user testing of my Mini Disco project with my friend, I found that the experience was quite self-explanatory and intuitive. The design, featuring an arrow pointing towards the sound sensor, seemed to effectively guide the user without the need for additional instructions. My friend was able to figure out the interaction – singing or making noise to trigger the light show – quite easily.

From this testing, I realized the strengths of my project lie in its simplicity and the immediate engagement it offers. Users can interact naturally without needing a detailed explanation, which I believe is a key aspect of a successful interactive design.

However, I see an opportunity for improvement in the visual aspect of the project. Initially, I used cotton to diffuse the light from the RGB LED, but a dome might have been the better option; due to a lack of materials, I wasn’t able to make the swap. My aim is to enhance the visual impact of the light display, and I envision that a dome could better reflect and spread the light, creating a more immersive and expansive effect that more closely mimics the vibrant atmosphere of a disco.

This change, I believe, could elevate the overall experience, making it not just an interactive piece but also a more captivating visual spectacle. The challenge will be to integrate the dome in a way that complements the existing design while also enhancing the interplay of light and sound.

Concept:

Originally, I envisioned creating a dynamic light source that would visually respond to sound. My plan was to set up a microphone/sound sensor (analog signal) to capture the surrounding audio vibes, where different frequencies and volumes would trigger varied light displays in an RGB LED. For instance, high-pitched sounds would shift the LED towards blue hues, while deep bass notes would turn it red.

I intended to use P5.js for color mapping, transforming the intensity and frequency of the captured sound into dynamic, responsive color schemes. The idea was to have the visuals come alive with vibrant colors and gradients, creating a visually harmonious representation of the audio.

Despite a minor adjustment to my original plan, the essence of the project remains intact. Initially, I intended to use a frequency-sensitive sound sensor, but due to its malfunction, I had to opt for a readily available sensor that operates on a digital signal. This new sensor doesn’t detect varied sound frequencies; it simply registers whether sound is present above a volume threshold, so the LED’s color transitions now respond to the loudness or softness of the surrounding sounds.

 

How does the implementation work?

Arduino Implementation:

In my Arduino setup, I began by establishing serial communication at a baud rate of 9600, a crucial step for enabling data exchange between the Arduino and my computer. I configured pin 8 as an input to connect my digital sound sensor, which serves as the project’s primary interactive element. Additionally, pins 9, 10, and 11 were set as outputs for controlling the red, green, and blue channels of an RGB LED, allowing me to create a wide range of colors. In the loop function, I constantly read the state of the sound sensor. If no sound is detected (soundData == LOW), the RGB LED emits a blue light; when sound is detected, it glows red. This immediate visual feedback is achieved by manipulating the LED’s color through the changeLEDColor function, using analogWrite to adjust each color channel. Alongside controlling the LED, I also send the sound sensor’s state as serial data to my computer, where it’s utilized in the p5.js sketch for a corresponding visual display.

p5.js Sketch Implementation

In parallel with the Arduino setup, I developed a p5.js sketch to create a digital visual representation corresponding to the physical inputs from the sound sensor. The sketch initializes by creating a canvas and populating it with a series of particles, each represented by an instance of the Particle class. These particles are given random positions across the canvas, along with properties for size, color, and movement speed. The heart of the sketch lies in the readSerial function, responsible for reading and processing the serial data sent from the Arduino. This data, indicating the presence or absence of sound, is used to dynamically alter the behavior of the particles on the canvas. In the draw function, I update the background and set the text properties. If the serial connection is not yet established, the sketch prompts the user to initiate the connection. Once connected, the sketch confirms this with a display message and starts animating the particles based on the sensor data. The particles grow in size and move smoothly across the canvas when sound is detected, creating a visually engaging and responsive digital environment that mirrors the physical inputs from the Arduino.

Schematic

Description of Arduino code

Arduino code:
void setup() {
  Serial.begin(9600);
  pinMode(8, INPUT); // Sound sensor input

  // RGB LED pins
  pinMode(9, OUTPUT); // Red
  pinMode(10, OUTPUT); // Green
  pinMode(11, OUTPUT); // Blue
}

void loop() {
  int soundData = digitalRead(8); // Read the sound sensor
  Serial.println(soundData);      // Send sound data to serial for debugging

  if (soundData == LOW) {
    // Sound not detected - change LED to one color
    changeLEDColor(0, 0, 255); // Blue
  } else {
    // Sound detected - change LED to another color
    changeLEDColor(255, 0, 0); // Red
    delay(50);
  }
}

void changeLEDColor(int redValue, int greenValue, int blueValue) {
  analogWrite(9, redValue);   // Red channel
  analogWrite(10, greenValue); // Green channel
  analogWrite(11, blueValue);  // Blue channel
}

 

Setup Function:

void setup() {
  Serial.begin(9600);
  pinMode(8, INPUT); // Sound sensor input
  pinMode(9, OUTPUT); // Red
  pinMode(10, OUTPUT); // Green
  pinMode(11, OUTPUT); // Blue
}
  • Initializes serial communication at a baud rate of 9600. This link is used both for sending debug output to the serial monitor of the Arduino IDE and for passing the sensor state on to the p5.js sketch.
  • Configures the pin connected to the sound sensor (pin 8) as an input.
  • Sets up the RGB LED pins (pins 9, 10, and 11) as outputs. Each pin controls one color component of the RGB LED (red, green, and blue, respectively).

Loop Function:

void loop() {
  int soundData = digitalRead(8); // Read the sound sensor
  Serial.println(soundData);      // Send sound data to serial for debugging

  if (soundData == LOW) {
    // Sound not detected - change LED to one color
    changeLEDColor(0, 0, 255); // Blue
  } else {
    // Sound detected - change LED to another color
    changeLEDColor(255, 0, 0); // Red
    delay(50);
  }
}

  • Continuously reads the state of the digital sound sensor.
  • If sound is detected, the LED changes to red by calling changeLEDColor with (255, 0, 0), the RGB values for red.
  • If no sound is detected (soundData is LOW), the RGB LED is set to blue by calling changeLEDColor with (0, 0, 255), the RGB values for blue.
  • A short delay (delay(50)) in the sound-detected branch keeps the LED red briefly and limits how quickly the sensor is re-read.

changeLEDColor Function:

void changeLEDColor(int redValue, int greenValue, int blueValue) {
  analogWrite(9, redValue);   // Red channel
  analogWrite(10, greenValue); // Green channel
  analogWrite(11, blueValue);  // Blue channel
}
  • A helper function that takes three parameters: redValue, greenValue, and blueValue, each representing the intensity of the respective color channel of the RGB LED.
  • The analogWrite function is used to set the brightness of each color channel. For example, analogWrite(9, redValue); sets the brightness of the red channel.

Description of the p5.js Sketch

p5.js Sketch:
let serial;
let latestData = "waiting for data";
let particles = [];
let cols, rows;
let particleCount = 100; // Adjust for more/less particles
function setup() {
  createCanvas(windowWidth, windowHeight);

  // Create randomly positioned particles
  for (let i = 0; i < particleCount; i++) {
    let x = random(width);
    let y = random(height);
    particles.push(new Particle(x, y));
  }
}

function readSerial(data) {
  console.log(data);
  latestData = data.trim();
}

function draw() {
  background('#00003f');
  textSize(30);
  textFont('Courier New');
  textAlign(CENTER, CENTER);

  if (!serialActive) {
    fill(0, 102, 153);
    text("Press Space Bar to select Serial Port", width / 2, height / 2);
  } else {
    text("Connected", 20, 30);

    let sensorValue = parseInt(latestData);
    particles.forEach(p => {
      p.update(sensorValue);
      p.display();
    });
  }
}

function keyPressed() {
  if (key === ' ') {
    setUpSerial();
  }
}
class Particle {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.baseSize = 10; // Base size of the circle
    this.size = this.baseSize;
    this.color = color(random(255), random(255), random(255));
    this.xSpeed = random(-1, 1);
    this.ySpeed = random(-1, 1);
  }

  update(sensorValue) {
    // Resize based on sensor value
    this.size = sensorValue === 1 ? 30 : 10;

    // Update position for smooth floating
    this.x += this.xSpeed;
    this.y += this.ySpeed;

    // Bounce off edges
    if (this.x > width || this.x < 0) {
      this.xSpeed *= -1;
    }
    if (this.y > height || this.y < 0) {
      this.ySpeed *= -1;
    }
  }

  display() {
    fill(this.color);
    noStroke();
    ellipse(this.x, this.y, this.size, this.size);
  }
}

// Resize canvas when the window is resized
function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
}
  1. Setup and Particle Creation:
    • The setup() function initializes the canvas to cover the entire window. Within this function, I create multiple particles, each represented by an instance of the Particle class. The number of particles is determined by particleCount.
    • Each particle is randomly positioned across the canvas. This is done by assigning random x and y coordinates within the canvas’s dimensions.
  2. Serial Data Handling:
    • The readSerial(data) function is responsible for processing incoming serial data from the Arduino. This data represents the state of the sound sensor. The function trims any whitespace from the received data and stores it in latestData for further processing.
  3. Drawing and Animation:
    • In the draw() function, the background is set to a dark blue color ('#00003f').
    • The sketch checks the serialActive flag to determine if the serial connection is established. If not, it prompts the user to activate the serial port. Once connected, it displays “Connected” on the canvas.
    • The particle behavior is updated based on the parsed sensor value (sensorValue). Each particle’s size and position are adjusted accordingly.
  4. Particle Class:
    • The Particle class defines the properties and behaviors of individual particles. Each particle has its position (x, y), base size, color, and speed (xSpeed, ySpeed).
    • The update(sensorValue) method adjusts the particle’s size based on the sound sensor input. It also updates the particle’s position to create a floating effect. If a particle reaches the edge of the canvas, it bounces back, creating a dynamic, contained animation within the canvas boundaries.
    • The display() method draws each particle as an ellipse with its respective properties.
  5. Interactivity:
    • The keyPressed() function listens for a spacebar press to initiate the serial connection setup, a key part of the interaction between the Arduino and the p5.js sketch.
  6. Responsive Design:
    • The windowResized() function ensures that the canvas size adjusts appropriately when the browser window is resized, maintaining the integrity of the visual display.

Description of Interaction Design

  • Engaging Invitation:
    • Users are greeted with an inviting message on the project box: “Sing for the Party People.” This message sets the tone and clearly communicates what is expected from the user.
  • Sound Trigger:
    • As soon as the user starts singing or making noise, the embedded digital sound sensor within the box detects this audio input. The sensor is finely tuned to respond to a range of sounds from soft humming to loud singing.
  • Responsive Light Display:
    • Upon detecting sound, the sensor triggers a colorful light show from the RGB LED, which switches hues to create a mini disco effect that transforms the space with vibrant color.
    • The volume and duration of the user’s singing directly influence the light patterns, making each experience unique and personal.
  • Visual Feedback:
    • The LED serves as immediate visual feedback for the user’s actions. This feedback loop encourages continued interaction and exploration of different volumes of sound.
    • The changing colors of the LED create a playful and immersive environment, enhancing the joyous atmosphere of a disco.

Description of communication between Arduino and p5.js

Arduino to p5.js Communication:
  1. Serial Communication Setup:
    • On the Arduino side, I initialize serial communication in the setup() function using Serial.begin(9600);. This sets up the Arduino to send data over the serial port at a baud rate of 9600 bits per second.
    • In the main loop (void loop()), the Arduino reads data from the digital sound sensor connected to pin 8 using digitalRead(8);. This sensor detects the presence or absence of sound, returning either a HIGH or LOW signal.
  2. Sending Data from Arduino:
    • Depending on the state of the sound sensor, the Arduino sends this information to the connected computer via the serial port using Serial.println(soundData);. The data sent is a simple numerical value (0 or 1) representing the absence or presence of sound.
  3. Receiving Data in p5.js:
    • On the p5.js side, the sketch establishes a serial connection to receive data from the Arduino. This is done using the p5.SerialPort library, which facilitates serial communication in a web environment.
    • The readSerial(data) function in the p5.js sketch is responsible for reading incoming serial data. It processes the data received from the Arduino, trims any whitespace, and stores it in the latestData variable.
p5.js Processing and Visualization:
  1. Data Interpretation:
    • The p5.js sketch interprets the received data (latestData) as the state of the sound sensor. This data is then used to influence the behavior of visual elements within the sketch, such as the size and movement of particles.
    • The draw() function continuously updates the canvas, where each particle’s appearance and behavior are adjusted based on the sensor data. For instance, the presence of sound might cause the particles to increase in size or change position, creating a dynamic and responsive visual effect.
  2. Feedback Loop:
    • The seamless exchange of data between the Arduino and the p5.js sketch creates an interactive feedback loop. Physical input from the sound sensor directly influences the digital visualization, making the experience responsive to real-world interactions.

What are some aspects of the project that you’re particularly proud of?

Reflecting on my project, I feel a deep sense of pride, particularly in the creation of the physical component – the mini sound-activated disco club. This aspect of the project was not only a challenge but a testament to my creativity and technical skills. The process of bringing a conceptual idea to life, blending interactive technology with artistic design, was immensely fulfilling. Another aspect I’m especially proud of is my adaptability and problem-solving skills. When faced with the unexpected challenge of the original sensor breaking, I quickly adapted, demonstrating resilience and quick thinking, hallmarks of a true interactive media student. Utilizing a different sensor and modifying my project accordingly, I managed to preserve the essence of my initial concept. This ability to think on my feet and craft a functional and engaging project with the available resources, even though it diverged from my original plan, is something I take great pride in. It underscores my capacity to innovate and create meaningful interactive experiences, regardless of the obstacles encountered.

What are some areas for future improvement?

Reflecting on my project, I recognize several areas for future improvement, particularly influenced by the challenges and lessons learned during its development. One key area is the need for contingency planning in hardware-based projects. The unexpected malfunction of my original sensor forced me to significantly simplify my original idea, mainly due to time constraints and the limitations of the replacement sensor. This experience taught me the importance of having spare parts and tools readily available. It’s a lesson that will influence my approach to future projects, ensuring I’m better prepared for unforeseen setbacks.

Additionally, the limitations imposed by the replacement sensor, which could only read binary values (0s and 1s), restricted my ability to create a more complex and visually appealing p5.js sketch. This constraint became particularly evident in my efforts to craft a visually aesthetic sound visualizer. The binary input didn’t allow for the nuanced interpretation of sound that I had initially envisioned. Moving forward, I aim to explore more advanced sensors and input methods that offer a wider range of data. This will enable me to create more intricate and engaging visualizations in my p5.js sketches, aligning more closely with my original vision of an interactive and visually rich experience.

IM SHOWCASE

Week 11 Exercises

 

(Mudi & Mariam)

Exercise 1

In this exercise, a potentiometer connected to an Arduino controls the horizontal movement of an ellipse in p5.js. As you turn the potentiometer, the ellipse moves smoothly from side to side.
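The p5.js side of this exercise can be summed up in a minimal sketch like the one below. It assumes the same setUpSerial()/readSerial()/serialActive helpers used in the final project, and an Arduino that prints the potentiometer reading (0–1023) over serial once per loop:

let latestData = "0"; // most recent line received from the Arduino

function setup() {
  createCanvas(640, 240);
}

// called by the serial helper whenever a full line arrives
function readSerial(data) {
  latestData = data.trim();
}

function draw() {
  background(30);
  fill(255);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, height / 2);
    return;
  }

  // map the 10-bit potentiometer reading to the canvas width
  let potValue = parseInt(latestData);
  let x = map(potValue, 0, 1023, 0, width);

  fill(0, 150, 255);
  noStroke();
  ellipse(x, height / 2, 50, 50);
}

function keyPressed() {
  if (key === ' ') {
    setUpSerial(); // open the serial port, as in the final project sketch
  }
}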

Exercise 2:

Controlling the brightness of an LED using a slider in p5.js

Concept:

The left side of the slider (owl) dims the LED, and the right side of the slider (sun) makes the LED brighter. We used PWM pin 5 for the LED, along with a 10 kΩ resistor and two jumper wires to make the connection.
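On the p5.js side, here is a minimal sketch of how the slider could drive the LED. It assumes the same setUpSerial()/readSerial()/serialActive helpers used in the final project, plus a writeSerial() helper for sending data (an assumption about the serial template); the slider value (0–255) is sent as a number followed by a newline, which is what Serial.parseInt() and the '\n' check in the Arduino code below expect:

let brightnessSlider;

function setup() {
  createCanvas(400, 140);
  // owl (dim) on the left, sun (bright) on the right
  brightnessSlider = createSlider(0, 255, 127);
  brightnessSlider.position(20, 60);
}

function draw() {
  background(240);
  fill(0);
  text("owl", 20, 40);
  text("sun", 170, 40);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 120);
    return;
  }

  // send the slider value (0-255) followed by a newline, matching
  // Serial.parseInt() and the '\n' check in the Arduino loop
  let brightness = brightnessSlider.value();
  writeSerial(brightness + "\n"); // assumed helper from the serial template
}

// the Arduino prints "a moment" / "ON"; we don't use the contents,
// but the serial helper expects this callback to exist
function readSerial(data) {
}

function keyPressed() {
  if (key === ' ') {
    setUpSerial();
  }
}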

 

Arduino Code:

int redled = 5; // PWM pin

void setup() {
  pinMode(redled, OUTPUT);
  Serial.begin(9600);

  // initializing handshake
  while (Serial.available() <= 0) {
    Serial.println("a moment");
    delay(200); // wait 0.2 seconds
  }
}

void loop() {
  // wait for data to arrive from p5 before continuing
  while (Serial.available()) {
    int brightness = Serial.parseInt();
    if (Serial.read() == '\n') {
      analogWrite(redled, brightness); // turn on the LED and adjust its brightness
      Serial.println("ON");
    }
  }
}

 

Week 11 Final Concept Draft 1

Soundscapes in Light

Concept:

A dynamic light source that responds to the captured sound, adding a visual dimension to the auditory experience.

Soundscapes in Light is all about immersing users in an environment where the surrounding ambient sounds play the lead role in shaping a captivating visual atmosphere.

I stumbled across a video that sort of matches the idea of what I want to display in p5.

Description:

I’m thinking of setting up a microphone to capture the audio vibes around. Different frequencies and amplitudes would trigger variations in the light display; for example, a high-pitched sound nudges the RGB LED towards the blues, while a deep bass note shifts it to the reds.

Color Mapping: P5.js takes charge, translating the intensity and frequency of the captured sound into dynamic color schemes. Visuals come alive with vibrant colors and subtle gradients responding to every nuance of the audio, creating a visually harmonious representation.

Light Intensity: The volume and intensity of the sound guide the brightness and saturation of the RGB LED. A crescendo of sound could lead to a fully illuminated, vibrant display, while moments of silence might dim the lights, creating a tranquil ambiance.
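To make the color-mapping and light-intensity ideas concrete, here is a rough p5.js sketch of the on-screen side, assuming the p5.sound library is loaded; the frequency and level ranges are placeholder choices, not measured values:

let mic, fft;

function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100);

  mic = new p5.AudioIn();
  mic.start();

  fft = new p5.FFT();
  fft.setInput(mic);
}

function draw() {
  fft.analyze(); // refresh the spectrum before reading the centroid

  // spectral centroid (Hz) as a rough stand-in for pitch:
  // bass-heavy sounds map to reds, high-pitched sounds towards blues
  let centroid = fft.getCentroid();
  let hue = map(centroid, 100, 4000, 0, 240, true);

  // overall loudness drives brightness: silence dims, a crescendo glows
  let level = mic.getLevel();
  let brightness = map(level, 0, 0.3, 10, 100, true);

  background(hue, 80, brightness);
}

function mousePressed() {
  userStartAudio(); // browsers need a user gesture before the mic can start
}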

Challenges:

  • Noise Interference:
    • The microphone may capture unintended ambient noise, leading to inaccurate or unwanted visual responses. Implementing effective noise reduction techniques is crucial to maintain the integrity of the sound-to-light transformation.
  • Color Mapping Aesthetics:
    • Determining the appropriate color mapping for different sound frequencies and amplitudes involves both artistic and technical considerations. Striking a balance between aesthetic appeal and meaningful representation is a challenge.

 

Week 11 Reading Reflection

Thinking about what I picked up from the reading, it’s pretty clear that finding the right balance in designing for people with disabilities is key. Leckey’s take on furniture for kids with disabilities nails it: keeping things visually cool without making them stand out too much. The bit about radios adding screens and throwing a wrench into accessibility for visually impaired folks hits home. It’s a reminder that sometimes less is more, especially when simplicity is what makes things work for everyone. Digging into multimodal interfaces, like beeping buttons and flashing lights, sounds like a game-changer for folks with sensory issues, giving them more ways to interact.

And then there’s the reminder that everyone’s different. The story about two folks with visual impairments wanting totally different things in their devices shows we can’t do a one-size-fits-all deal. It’s not just about functionality; it’s about personal vibes and choices.

The questions the reading left me with are pretty cool too. Like, how do designers juggle making things accessible without getting too complex? And what’s the deal with fashion designers in the accessibility game? Plus, prosthetics—it’s not just about how they work, but how they fit people’s attitudes and styles. The reading opened up this whole world of thinking about design and accessibility that goes way beyond just ticking boxes. It’s making me look at things with a whole new lens.

Week 10 – Reading response

I get where the author is coming from about how our current way of interacting with tech might be a bit limiting. Using just our fingers on touchscreens seems a tad one-dimensional. But here’s the thing: the touch-and-swipe tech we’ve got now is pretty complex and convenient as is. It’s taken us a long way, and I’m all for making things better. However, I think there’s a sweet spot. We don’t necessarily need more complexity for the sake of it; we’ve got a good thing going. What we really need is to make tech simpler and more accessible, especially for humans with disabilities. Let’s not complicate things for everyone; instead, let’s focus on tech that works for everyone, regardless of their physical abilities. That’s where the real magic lies.

However, this is where I partially agree. As one person stated in the follow-up article, “My child can’t tie his shoelaces, but can use the iPad.” I’m with the idea that designers should step up their game, tapping into the full potential of our grown-up minds and bodies. Referring to tools dumbed down for kids as “toys”? The analogy comparing channeling all interaction through a single finger to limiting literature to Dr. Seuss’s vocabulary was a lightbulb moment. Sure, the one-finger tech is more accessible, but I also believe we adults deserve far more sophisticated technological interfaces that go beyond simplicity for the sake of it.

Week 10 Assignment (Mariam & Mudi)

Mariam & Mudi’s Magical Musical Instrument.

Concept :

For our musical instrument, we decided to craft an innovative one using an ultrasonic sensor, a button, and a buzzer. To kick off the musical vibes, just gently hold down the button. Now, here’s where it gets interesting: when you wave your hand in front of the ultrasonic sensor at varying distances, it unveils a different array of notes!

int trig = 10;
int echo = 11;
int buttonPin;
long duration;
long distance;

void setup() {
  pinMode(echo, INPUT);
  pinMode(trig, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  digitalWrite(trig, LOW); // trigger on/off and then read data
  delayMicroseconds(2);
  digitalWrite(trig, HIGH);
  delayMicroseconds(10);
  digitalWrite(trig, LOW);
  duration = pulseIn(echo, HIGH);
  distance = (duration / 2) * .0344; // 344 m/s = speed of sound; converting to cm

  int notes[7] = {261, 294, 329, 349, 392, 440, 494}; // several notes in an array: mid C D E F G A B

  buttonPin = analogRead(A0);

  if (distance < 0 || distance > 50 || buttonPin < 100) { // if not pressed or nothing in front
    noTone(12); // don't play music
  }
  else if (buttonPin > 100) { // if pressed
    int sound = map(distance, 0, 50, 0, 6); // map distance to the array of notes
    tone(12, notes[sound]); // play a certain note depending on distance
  }
}

 

Challenges:

I wouldn’t call this one a challenge but more of a hiccup, really: we found ourselves repeatedly unplugging and replugging the connections due to connectivity issues, and the Arduino kept giving errors.

 

Neil Leach Alien Intelligence – BONUS

Prof. Neil did quite a good job in shedding light on the stark differences between artificial and human smarts. From ChatGPT’s unexpected brilliance to AlphaGo Zero’s mind-boggling moves, he painted a picture of AI’s vastness, urging us to hit the brakes on development. It’s not just cool tech; it’s a bit scary. Neil’s vibe warned us: we might be steering into unknown territory, where AI, per Geoffrey Hinton, starts thinking for itself. The lecture left us with a quirky truth—aliens aren’t zipping down from space; they’re brewing up in labs. AI’s the new-age extraterrestrial, and we better handle it with care. However, it did stir up some thoughts I had.

While the talk was interesting, it felt like a déjà vu of AI basics, and I craved more than the usual rundown. Intriguing as it was, I couldn’t shake the feeling that it might have missed the mark for our crowd. He covered many AI fundamentals, but it felt like old news for most of us. I was hoping for more profound insights or a deeper dive into the future.

Week 9 – Assignment

Ready.. Set.. Go..

Concept:

“Ready Set Go” is a project that lets you control 4 LEDs with just one button. You press the button, and the LEDs light up one after the other like a countdown – it’s like creating your own mini light show!

IMG_1096

Challenges:

The most challenging part of the code was probably the button debouncing mechanism. Even when I pressed the button only once, the LEDs would rapidly change because there was no debounce or delay.

if (digitalRead(5) == HIGH) {
  newcount = count + 1;
  if (newcount != count) {
    // ... (LED control logic)
    count = newcount;
  }
}

Then I had to find the right timing. The delay between consecutive state checks (delay(100)) provides a basic form of debouncing. However, finding the right delay duration was a bit hard: too short and it wouldn’t effectively debounce, too long and the button felt unresponsive.

 

Areas for Improvement:

For future projects, I’d like to design the code and circuit in a way that makes it easy to add more LEDs or buttons, or extend the idea so that pressing the button a certain number of times triggers a mini light show.

 

Week 9 – Reading Reflection

Making Interactive Art: Set the Stage, Then Shut Up and Listen

This article really struck a chord with me. The title alone, “Set the Stage, Then Shut Up and Listen,” immediately caught my attention, even before I read the rest of the article. It got me thinking about interactive art as an ongoing conversation rather than a scripted statement. I completely agree that we should set the stage, provide some hints, and then let the audience take the lead. It’s like directing a play where the audience becomes part of the performance. I’ve always believed that art is a shared experience, and this article reinforced that: letting the audience guide the narrative and listening to their reactions. Interactive art isn’t a finished product; it’s a collaborative, evolving performance. This perspective inspires me to create art that’s not just seen but experienced and shaped by those who engage with it.

As for Physical Computing’s Greatest Hits (and misses)

I would say the Multitouch Interfaces project was my favorite because of the cool variety in sensing touch points, whether through infrared light, distance sensors, or capacitive touch. The challenges, like maintaining sensors and the lack of tactile feedback, felt real and relatable. It wasn’t just about tech; it was about blending human touch with digital finesse. Addie Wagenknecht and Stefan Hechenberger’s CUBIT, using infrared light and cameras, showcased this fusion perfectly, making Multitouch Interfaces my top pick.

Week 8 Assignment

 

Magic Mouse

Concept:

For my “Magic Mouse” assignment, the aim is to create a capacitive touch sensing system using my Arduino board. The core idea revolves around a touch-sensitive surface, represented by a simple foil, that triggers an LED to illuminate each time the surface is touched by the mouse.

How It Works:
The foil functions as a capacitive touch sensor. When I touch the foil with the mouse, it induces a change in capacitance, which the Arduino detects through the connected digital pin. The Arduino then activates the LED.

Completed project:

IMG_0974

Areas for Improvement:

Looking at the project, I feel like I could’ve dialed up the complexity a bit. It’s good as it is, but adding some extra layers could take it to the next level. Maybe weave in more interactive elements or toss in some sensors. Just a touch more sophistication to keep things interesting and elevate the challenge.