Final Concept: Motion-Activated Mini Garden Visualizer
My final project is the Motion-Activated Mini Garden Visualizer—an interactive installation that uses an ultrasonic sensor and individual LEDs embedded in a miniature garden that I built. The project simulates how a garden might come to life in response to presence and movement. Instead of using a microphone to detect sound, the system now detects distance and motion. As a person approaches, LEDs light up in various colors, mimicking the way plants might react to human presence.
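To make the interaction concrete, here is a minimal Arduino sketch of the kind of logic involved: read the ultrasonic distance, then turn on more of the individual LEDs the closer a visitor stands. It assumes an HC-SR04-style sensor and five LEDs on digital pins; the pin numbers, distance range, and the serial line to the P5.js sketch are placeholders rather than the installation's exact code.

```cpp
// Placeholder pin assignments — adjust to match the actual wiring.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int LED_PINS[] = {3, 4, 5, 6, 7};   // five individual LEDs
const int NUM_LEDS = 5;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  for (int i = 0; i < NUM_LEDS; i++) {
    pinMode(LED_PINS[i], OUTPUT);
  }
  Serial.begin(9600);   // distance readings can also be streamed to the P5.js sketch
}

// Read the distance in centimetres from the ultrasonic sensor.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // time out if nothing is in range
  if (duration == 0) return 999;                   // treat a timeout as "far away"
  return duration / 58;                            // convert echo time to cm
}

void loop() {
  long distance = readDistanceCm();

  // The closer the visitor, the more LEDs light up.
  int ledsOn = map(constrain(distance, 5, 100), 100, 5, 0, NUM_LEDS);
  for (int i = 0; i < NUM_LEDS; i++) {
    digitalWrite(LED_PINS[i], i < ledsOn ? HIGH : LOW);
  }

  Serial.println(distance);   // send the reading out (e.g., to the P5.js sketch)
  delay(50);
}
```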
This visual experience is enhanced by a P5.js sketch that features lava-lamp-inspired blobs. These blobs move and shift color based on proximity, creating a unified physical-digital display whose mood changes depending on how close someone is to the garden.
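On the software side, the sketch below gives a rough sense of how that P5.js layer could work: soft noise-driven blobs whose hue warms and whose drift speeds up as the distance value drops. In the actual installation the distance would arrive from the Arduino over serial; here the mouse position stands in for it so the example runs on its own, and all of the numbers are illustrative rather than the project's exact values.

```javascript
// Lava-lamp style blobs whose hue warms as the "distance" value decreases.
let blobs = [];
let distance = 100;   // stand-in for the ultrasonic reading, in cm

function setup() {
  createCanvas(600, 400);
  colorMode(HSB, 360, 100, 100, 100);
  noStroke();
  for (let i = 0; i < 8; i++) {
    blobs.push({
      x: random(width),
      y: random(height),
      r: random(40, 90),
      seed: random(1000)
    });
  }
}

function draw() {
  background(230, 40, 15);

  // Placeholder input: move the mouse up and down to mimic walking closer.
  distance = map(mouseY, 0, height, 20, 150);

  // Closer visitors shift the palette from cool blues toward warm pinks
  // and make the blobs drift a little faster.
  let hue = map(distance, 150, 20, 200, 330, true);
  let speed = map(distance, 150, 20, 0.002, 0.01, true);

  for (let b of blobs) {
    b.x += map(noise(b.seed + frameCount * speed), 0, 1, -2, 2);
    b.y += map(noise(b.seed + 500 + frameCount * speed), 0, 1, -2, 2);
    b.x = constrain(b.x, 0, width);
    b.y = constrain(b.y, 0, height);
    fill(hue, 70, 90, 40);
    ellipse(b.x, b.y, b.r * 2);
  }
}
```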
Why These Changes Were Made
Originally, the project was a mood tree visualizer using an LED strip and a microphone sensor. However, due to hardware limitations (the Arduino could only power 5 LEDs reliably), I shifted to using individual LED lights. At the same time, I replaced the microphone sensor with an ultrasonic sensor to create a more responsive and stable interaction system. These changes also allowed me to design a mini garden setup that feels more visually integrated and conceptually clear.
Real-World Relevance & Future Use Cases
This concept has real potential for garden and farm environments:
In gardens, it could act as a calming, responsive lighting system.
On farms, the same setup could help detect animals like foxes approaching livestock (e.g., sheep). In the future, it could be upgraded with:
Sirens or alerts when something comes too close (a rough sketch of this idea follows the list).
Automatic light deterrents to scare animals away.
Wireless notifications sent to a user’s phone.
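As a rough illustration of the alert idea, a future version could reuse the same distance reading and trigger a buzzer once something crosses a threshold. The fragment below is a hypothetical Arduino extension, not part of the current build: the buzzer pin and the 30 cm threshold are invented for the example, and checkForIntruder() would be called from loop() with the ultrasonic reading from the sketch shown earlier.

```cpp
// Hypothetical future extension: beep a piezo buzzer when something comes too close.
// BUZZER_PIN and ALERT_DISTANCE_CM are placeholder values, not part of the current build.
const int BUZZER_PIN = 8;
const int ALERT_DISTANCE_CM = 30;

void checkForIntruder(long distanceCm) {
  // Ignore zero readings (sensor timeouts) so the alarm doesn't fire on noise.
  if (distanceCm > 0 && distanceCm < ALERT_DISTANCE_CM) {
    tone(BUZZER_PIN, 2000, 500);   // 2 kHz beep for half a second
  }
}
```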
User Testing Reflection
I conducted user testing without providing any instructions, observing how people interacted with the installation:
What worked well:
Users naturally moved closer to the garden and quickly noticed that proximity activated the lights. The more they explored, the more they connected movement with the garden’s glowing response.
What caused confusion:
Users were unsure what each light’s colour meant and where exactly the sensor was located.
What I had to explain:
I explained that the system was using motion sensing rather than sound. This was a bit confusing at first, but once clarified, users became more engaged.
How I’ll improve the experience:
To make it clearer, I plan to add a small written instruction that says:
“Walk closer to bring the garden to life.”
This subtle cue will help guide new users and enhance intuitive interaction.
Final Thoughts
Changing from sound-based input to motion-based sensing not only solved technical challenges, but also made the experience smoother and more immersive. The use of single LEDs in a mini garden created a more grounded and intimate installation. These changes—while unplanned—ultimately led to a more thoughtful, future-facing project that bridges creative expression with real-world functionality.
The video:
https://drive.google.com/drive/u/0/folders/1Kk2lkQgoAyybXSYWVmY2Dog9uQVX_DMq