With YEVA SYNTH V1.0, I wanted to create a device that felt fun to play with, responded instantly to human input, and was built from the ground up using just an Arduino Uno, a few buttons, LEDs, and some imagination.
After sketching a few interface ideas, I settled on a layout using two buttons to trigger sound effects, an analog potentiometer to switch between modes, and two small LCD screens: one for control feedback and one for visual flair. The FX selector (the potentiometer) lets the user scroll between sound modes like “Laser,” “Melody,” “Wobble,” “Echo,” and more. Pressing a button instantly triggers the selected effect through a piezo buzzer. One LCD shows the current FX name, while the second displays an animated visualizer that bounces in response to sound activity. The LEDs tied to the Arduino’s analog pins light up during sound playback, giving a simple but satisfying burst of light that makes the synth feel alive.
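The mode-scrolling itself boils down to a single map() call on the pot reading. Here is that logic pulled out as plain, off-hardware C++ (`arduinoMap` and `fxFromPot` are illustrative names; `arduinoMap` just mirrors the formula Arduino's built-in `map()` uses). One quirk worth knowing: with this mapping, only a reading of exactly 1023 selects the last FX, so most of the pot's top travel lands on the second-to-last mode.

```cpp
// Off-hardware sketch of the FX selector's pot-to-index mapping.
const int NUM_FX = 8;

// Same integer formula as Arduino's built-in map().
long arduinoMap(long x, long inMin, long inMax, long outMin, long outMax) {
  return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// raw: 0..1023, as returned by analogRead() on the selector pin.
int fxFromPot(int raw) {
  return (int)arduinoMap(raw, 0, 1023, 0, NUM_FX - 1);
}
```

With a linear pot this gives a roughly even split of the dial across the first seven modes, which in practice feels fine for scrolling.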
Building it was both straightforward and occasionally frustrating. Wiring two LCDs in parallel required careful pin management to avoid conflicts, and the Uno’s limited pin count meant repurposing analog pins as digital outputs. The buzzer was the trickiest part: some FX produced no audible sound at all until I discovered I had to hardcode appropriate pitch and modulation values and remove interrupt logic that was prematurely cutting playback short.
One major success was making the sound effects interruptible and responsive. Early versions of the code locked the device into one sound effect until it finished; I rewrote the logic so a new press registers immediately, letting users mash buttons and get instant feedback, which makes the instrument feel far more playful.
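The core of that rewrite is simple: instead of one long blocking loop, playback advances in short steps and polls the button between steps. A minimal off-hardware sketch of the idea (`playSteps` and the poll callback are illustrative names; on the Uno each step would wrap the actual `tone()`/`delay()` calls):

```cpp
#include <functional>

// Polled between playback steps; returns true when a new press should
// cut the current effect short.
using ButtonPoll = std::function<bool()>;

// Plays up to `total` short tone steps, checking the button between steps.
// Returns how many steps actually played before being interrupted.
int playSteps(int total, ButtonPoll pressed) {
  int played = 0;
  for (int i = 0; i < total; ++i) {
    if (pressed()) break;  // new press: abandon the rest of the effect
    // tone(buzzerPin, freqForStep(i)); delay(10);  // hardware side
    ++played;
  }
  return played;
}
```

Keeping each step around 10 ms means the worst-case lag between a press and the synth reacting stays well below what a player can perceive.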
Of course, there are limitations. The piezo buzzer is not exactly a high-fidelity speaker, and while it’s great for beeps and bleeps, it can’t produce anything resembling full-range audio. I also wanted the visualizer to respond to actual audio signal amplitude, but without analog audio input or FFT analysis, I had to simulate that based on pitch values and FX activity. That said, the effect is convincing enough to match the synth’s character. Another improvement would be to allow the synth to send commands to a computer so that real sound files could be played through the laptop’s speakers instead of the buzzer. I’ve already prototyped this using a Python script listening over serial.
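That serial bridge only needs a line-based protocol: the Uno already prints lines like "Triggering FX: Laser", and the host just parses out the FX name and plays a matching file. My actual prototype did this in Python over serial, but the parsing is the same in any language; here is a hedged C++ sketch of the host side (`parseFxName` and `soundFileFor` are illustrative names, and the `.wav` mapping is hypothetical):

```cpp
#include <string>

// Extracts the FX name from a debug line such as "Triggering FX: Laser",
// or returns "" if the line doesn't match the expected prefix.
std::string parseFxName(const std::string& line) {
  const std::string prefix = "Triggering FX: ";
  if (line.rfind(prefix, 0) != 0) return "";  // not at start of line
  return line.substr(prefix.size());
}

// Maps an FX name to a (hypothetical) sound file on the host.
std::string soundFileFor(const std::string& fx) {
  return fx.empty() ? "" : fx + ".wav";
}
```

Because the firmware already emits these lines for debugging, the bridge needs no changes on the Arduino side at all.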
```cpp
#include <LiquidCrystal.h>

LiquidCrystal lcd1(12, 11, 5, 4, 3, 2);
LiquidCrystal lcd2(8, 7, 6, A4, A3, A2);

// Pins
const int fxSelector = A5;
const int button1 = 9;
const int button2 = 10;
const int buzzerPin = 13;
const int led1 = A0;
const int led2 = A1;

// FX Setup
int pitch = 440; // A4
int mod = 50;
int fxIndex = 0;
const int NUM_FX = 8;
String fxNames[NUM_FX] = {"Laser", "Melody", "Alarm", "Jump",
                          "Sweep", "Wobble", "Echo", "Random"};

void setup() {
  lcd1.begin(16, 2);
  lcd2.begin(16, 2);
  pinMode(button1, INPUT_PULLUP);
  pinMode(button2, INPUT_PULLUP);
  pinMode(buzzerPin, OUTPUT);
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  Serial.begin(9600); // Debug

  lcd1.setCursor(0, 0);
  lcd1.print("YEVA SYNTH V1.0");
  lcd2.setCursor(0, 0);
  lcd2.print("MAKE SOME NOISE");
  delay(1500);
  lcd1.clear();
  lcd2.clear();

  randomSeed(analogRead(A3));
}

void loop() {
  fxIndex = map(analogRead(fxSelector), 0, 1023, 0, NUM_FX - 1);

  lcd1.setCursor(0, 0);
  lcd1.print("FX: ");
  lcd1.print(fxNames[fxIndex]);
  lcd1.print(" ");
  lcd1.setCursor(0, 1);
  lcd1.print("Pitch:");
  lcd1.print(pitch);
  lcd1.print(" M:");
  lcd1.print(mod);
  lcd1.print(" ");

  if (buttonPressed(button1)) {
    triggerFX(fxIndex);
  }
  if (buttonPressed(button2)) {
    triggerAltFX();
  }

  drawVisualizer(0);
}

bool buttonPressed(int pin) {
  if (digitalRead(pin) == LOW) {
    delay(10); // debounce
    return digitalRead(pin) == LOW;
  }
  return false;
}

void showFXIcon(int index) {
  lcd2.setCursor(0, 0);
  lcd2.print("FX: ");
  switch (index) {
    case 0: lcd2.print(">>>>"); break;
    case 1: lcd2.print("♫♫");   break;
    case 2: lcd2.print("!!");   break;
    case 3: lcd2.print(" ↑");   break;
    case 4: lcd2.print("/\\");  break;
    case 5: lcd2.print("~");    break;
    case 6: lcd2.print("<>");   break;
    case 7: lcd2.print("??");   break;
  }
}

void drawVisualizer(int level) {
  lcd2.setCursor(0, 1);
  int bars = map(level, 0, 1023, 0, 16);
  for (int i = 0; i < 16; i++) {
    if (i < bars) lcd2.write(byte(255));
    else lcd2.print(" ");
  }
}

void triggerFX(int index) {
  lcd2.clear();
  showFXIcon(index);
  digitalWrite(led1, HIGH);
  Serial.println("Triggering FX: " + fxNames[index]);

  if (index == 7) {
    int randFX = random(0, NUM_FX - 1);
    triggerFX(randFX);
    return;
  }

  switch (index) {
    case 0: // Laser
      for (int i = 1000; i > 200; i -= (10 + mod / 20)) {
        tone(buzzerPin, i);
        drawVisualizer(i);
        delay(10);
      }
      break;
    case 1: { // Melody
      int notes[] = {262, 294, 330, 392, 440, 494, 523};
      for (int i = 0; i < 7; i++) {
        digitalWrite(led2, HIGH);
        tone(buzzerPin, notes[i] + mod);
        drawVisualizer(notes[i]);
        delay(200);
        digitalWrite(led2, LOW);
        delay(50);
      }
      break;
    }
    case 2: // Alarm
      for (int i = 0; i < 5; i++) {
        tone(buzzerPin, 400 + mod);
        drawVisualizer(600);
        delay(150);
        noTone(buzzerPin);
        delay(100);
      }
      break;
    case 3: // Jump
      tone(buzzerPin, pitch + 200);
      drawVisualizer(800);
      delay(150);
      break;
    case 4: // Sweep
      for (int i = pitch - mod; i <= pitch + mod; i += 5) {
        tone(buzzerPin, i);
        drawVisualizer(i);
        delay(5);
      }
      for (int i = pitch + mod; i >= pitch - mod; i -= 5) {
        tone(buzzerPin, i);
        drawVisualizer(i);
        delay(5);
      }
      break;
    case 5: // Wobble
      for (int i = 0; i < 15; i++) {
        int wob = (i % 2 == 0) ? pitch + mod : pitch - mod;
        tone(buzzerPin, wob);
        drawVisualizer(wob);
        delay(80);
      }
      break;
    case 6: { // Echo
      int echoDelay = 200;
      for (int i = 0; i < 5; i++) {
        int toneFreq = pitch - i * 20;
        tone(buzzerPin, toneFreq);
        drawVisualizer(toneFreq);
        delay(echoDelay);
        noTone(buzzerPin);
        delay(echoDelay / 2);
        echoDelay -= 30;
      }
      break;
    }
  }

  noTone(buzzerPin);
  digitalWrite(led1, LOW);
  drawVisualizer(0);
}

void triggerAltFX() {
  lcd2.clear();
  lcd2.setCursor(0, 0);
  lcd2.print("FX: BLIP");
  for (int i = 0; i < 3; i++) {
    tone(buzzerPin, 600 + mod);
    digitalWrite(led2, HIGH);
    drawVisualizer(600);
    delay(100);
    noTone(buzzerPin);
    digitalWrite(led2, LOW);
    delay(100);
  }
  drawVisualizer(0);
}
```
Reading response
“A Brief Rant on the Future of Interaction Design” by Bret Victor made me rethink how we use technology today. He argues that all these futuristic concept videos we see, where everything is controlled by touchscreens or voice commands, are actually super boring. Not because they’re unrealistic, but because they’re unimaginative. We’re just slightly upgrading what already exists instead of rethinking how we interact with tech in the first place.
Victor’s main point is that our current interfaces like the iPad might feel revolutionary now, but they’re still pretty limited. Everything is flat, behind glass, and designed for a single finger. It works, sure, but it’s kind of like if all literature were written at a Dr. Seuss level: accessible, but not exactly fulfilling for a fully grown adult. He’s asking, “Why aren’t we building tools that take advantage of the full range of human abilities: our hands, our spatial awareness, our sense of touch?”
What I found really interesting is that he’s not anti-technology. He actually says the iPad is good for now, kind of like how black-and-white film was great in the early 1900s, but eventually color took over because people realized something was missing. He’s trying to get people, especially researchers and funders, to realize what might be missing in today’s tech and explore new directions, like dynamic tactile interfaces or haptic environments.
He also talks about how voice and gesture controls aren’t the answer either. Voice is fine for simple commands, but it doesn’t help if you want to build something or deeply explore a system. Same with waving your hands in the air. It’s cool in theory, but weird and disorienting in practice, especially without any physical feedback. His whole point is that we learn and create best when we can physically engage with things.
One thing that really stuck with me is this quote he includes from a neuroscientist about how important our fingers are for brain development. Like, if kids grow up only using touchscreens and never really using their hands, they miss out on a whole layer of understanding (physically and conceptually). That spoke to me. It’s not just about functionality, it’s about how tech shapes the way we think and grow.
So yeah, it’s not a rant in the sense of being angry for no reason. It’s more like a wake-up call. He’s saying, “We can do better. We should do better.” And honestly, I agree.