Week 11: Musical Instrument

Concept

My concept is heavily inspired by the show Squid Game: I wanted to make a musical instrument (well, in this case more of a music box/player) where light controls the mood of the music, so I used one of the songs from a game in the show. I used a photoresistor (LDR) so that the darker it is, the slower the tempo, to create that “creepy vibe,” while in bright light the normal tempo plays. I also light a red LED when the creepy version plays and a green LED when the normal one plays; these colors are actually used in the show in another one of the games.

Full Code | Video Demo | Schematics

Code that I’m proud of

ldrValue = analogRead(LDR_PIN);
isDark   = (ldrValue < DARK_THRESHOLD);
tempoMult = isDark ? CREEPY_TEMPO : NORMAL_TEMPO;

int duration = (int)((1000.0 / durations[note]) * tempoMult);

tone(BUZZER_PIN, melody[note], duration);

I’m proud of this because I had to figure out how to make the LDR play a real role in the song without it just turning something on and off. The first thing I had to consider was that the LDR doesn’t hand me LOW or HIGH; it gives me a raw number between 0 and 1023, so I had to find my own threshold by covering the sensor with my hand and watching the Serial Monitor until I landed on 420. For the actual song, I decided to use a tempo multiplier because I didn’t want the notes themselves to change, just the feel of them. Multiplying each duration by 2.5 in the dark makes every note hang longer, so the whole melody sounds heavier and more unsettling without changing a single pitch.

How this was made

I started with the slide switch wired to pin 2 using INPUT_PULLUP, which meant I didn’t need an external resistor. When the switch is open, the buzzer stops and both LEDs go off straight away. For the LDR I had to build a voltage divider with a 10kΩ resistor, because the Arduino can only read voltage, not resistance, directly. One leg of the LDR goes to 5V; the other connects to A0 and to one leg of the 10kΩ resistor, whose other leg goes to GND. In bright light the LDR’s resistance drops and A0 reads high; in darkness the resistance rises and A0 reads low. I landed on 420 as my threshold after some testing with the Serial Monitor (which I temporarily added and later removed). The buzzer plays the melody using tone(), with the melody taken from code on GitHub (referenced below), and the duration of each note gets multiplied by 2.5 in dark conditions to make it creepy. The green LED lights up in bright conditions and the red LED lights up when it goes dark. I also made sure to re-check the LDR and switch inside the melody loop on every single note, so the instrument reacts to light changes mid-song rather than waiting for the whole melody to finish before checking again.

Reflection & Future Improvements

Honestly this assignment gave me more trouble than I expected, but in a good way. I went in thinking the LDR would be straightforward as it’s something that I’ve learned from the analog sensor lesson and it kind of was at the hardware level, but making it do something musically interesting took some more thought. The Squid Game inspiration kept me motivated too because I had a clear vision of what I wanted it to feel like, which made the troubleshooting feel worth it rather than frustrating. If I kept going I’d want to include more songs from the different games and have more “modes.” I’d also want to try changing the actual pitch in the dark, dropping notes lower to make it sound even more sinister, which feels very on brand for the Squid Game theme. But for what it is, I’m genuinely happy with how the concept came through.

References

https://docs.arduino.cc/language-reference/en/functions/advanced-io/tone/

https://docs.arduino.cc/built-in-examples/digital/toneMelody/

For the melody: https://github.com/hibit-dev/buzzer/blob/master/src/movies/squid_game/squid_game.ino

Final Project Draft

For the final project, I want to make a room ambience controller. I have an RGB light ribbon that runs around my study table, and its lights are controlled by a remote: there are different colors, blinking modes, and brightness controls. I will communicate with it using an infrared LED. First, I’ll map those remote controls to p5.js. Then I will define different settings for different ambiences, and the user will be able to choose from them. I will also add music to it, though I am still thinking about where to get the sound output from. Next, I will have a song mode that makes the lights blink in time with the song’s tempo, amplitude, etc. This is the core concept of my project.

I want to add a gesture to pause everything, and I plan to use the distance sensor for it. I also want to incorporate the LCD, but I am not sure whether it will fit, since the whole experience is light-dependent.

Week 11 – Reading Response Zere

I resonate with the argument made by the author: the notion of touchscreens being the “ending point” of a technological revolution seems a little odd to me too. I believe there is much more to the technological development of the future than an object that resembles a phone.
What I found most interesting was the part about hands. The author makes a great point about how useful our touch senses are; human evolution can be strongly tied to the senses at the tips of our fingers. Touch is an extremely powerful “tool,” if we could call it that, and we’ve basically designed it out of our tools entirely.

This reading reminded me of our time in class, when the professor showed us simple motion detectors and control using p5js. This technology is not touch-related directly, but still presents a more futuristic/alternative concept rather than touchscreens. This transitions into my thought that interactions that involve your whole hand, your movement, your body, basically, other additional parts of our body (also including touch), are much more interesting and feel more “alive”, compared to just sliding your finger across a surface.

To conclude, I think that it’s easier to make the argument than to solve the problem. It’s one thing to say “we should use our hands more” and another to design something that’s actually as convenient and as accessible as a smartphone. He kind of admits this on the responses page: he doesn’t have the answer, just the problem. Which is fair, but also frustrating at the same time.

Reading Reflection

The readings made me rethink what interaction design is and what it could become. Bret Victor’s main idea, that computers should help people think, not just react, stood out to me the most. It made me realize that many of my own projects focus on simple interactions, like clicking or triggering events, rather than supporting deeper understanding. This distinction between reactive systems and thinking tools feels important, especially for someone studying interactive media.

Another key idea is visibility in design. Victor argues that systems should show how they work instead of hiding their processes. I strongly relate to this from a learning perspective: when I can see changes happen in real time, I understand concepts much faster. This connects interaction design with education, suggesting that good design is not just about usability, but also about helping users learn and explore. At the same time, I found myself questioning some of Victor’s ideas, especially about using the full human body in interaction. While designing for human capabilities is powerful, it can also exclude people who are not able-bodied. I agree with my peer’s point that accessibility should be central to design. Technology exists to expand access, not limit it. For example, devices like the Meta Quest offer adjustable and inclusive features, showing that immersive design and accessibility can coexist.

I was also interested in discussions about touch and sensory interaction. Technologies like haptic feedback and VR controllers show how physical sensation can enhance digital experiences. The example of devices that simulate textures or resistance demonstrates how interaction can go beyond screens and become more embodied. However, these ideas still feel experimental, and it is unclear how widely they can be applied in daily life.

Week 11 Assignment

The Concept

My idea for this stemmed from a desire to have an instrument that is easy to operate, tangible to play around with, and produces tones similar to those of a flute, which is actually remarkably difficult to reproduce digitally, due in part to how the actual flute material reverberates and deforms on a micro level with every finger placement and note produced.
For this task, a force sensor seemed to be the closest analogue to finger placements over a flute. Its ability to vary the voltage across it based on applied force made it possible to vary the sound produced depending on the pressure applied.

Since we are talking about electricity, I decided that adding an oscilloscope across the piezo would let the user ‘see’ the sound they are producing. I also added a NeoPixel ring of lights around the pressure sensor, which subtly shifts from a red hue to a blue/purple hue as pressure increases.

How It’s Made

Decorations aside, the force sensor connects from the 5V rail to the analog pin A0, with a resistor across it. The program checks whether the force (the voltage across the sensor) has changed, and then emits a sound based on the measured voltage, allowing for a varied frequency.

The output from digital pin 8 to the piezo passes first through a potentiometer, to allow for volume control, and then through two capacitors that smooth the varying voltage, producing a more cohesive sound instead of something jumpy and discordant.

#include <Adafruit_NeoPixel.h>

// 12-pixel NeoPixel ring on pin 7
Adafruit_NeoPixel pixels(12, 7, NEO_GRB + NEO_KHZ800);

float force = 0;     // current force-sensor reading (0-1023)
float oldForce = 0;  // previous reading, to detect changes

int curve = 50;      // scales how steeply pitch rises with force
float reduce = 100.0;

int frame = 0;       // animation counter for the ring

void setup()
{
  pinMode(8, OUTPUT);   // piezo
  pinMode(A0, INPUT);   // force sensor
  Serial.begin(9600);
  pixels.begin();
}

void loop()
{
  frame++;
  force = analogRead(A0);

  // Shift the ring from red toward blue/purple as force increases;
  // the frame counter keeps the colors rotating around the ring.
  for (int i = 0; i < 12; i++) {
    int hueStep = (int(force / 914 * 255) + i * frame) % 255;
    pixels.setPixelColor(i, pixels.Color(hueStep, 0, 255 - hueStep));
  }
  pixels.show();

  // Only retrigger the tone when the force actually changes.
  if (oldForce != force) {
    tone(8, 440 * pow(2.0, (force * curve / 914) / reduce), 50);
    oldForce = force;
  }
  delay(50); // delay a little to improve simulation performance
}

What I Like

I must confess, my favorite part is definitely the oscilloscope. Being able to see the sound one is producing as a wave is a cool feature I would love to expand in the future.

What I Could Do Better

I would like to make this in the real world, using real components, instead of a simulator. For one, the simulator has audio issues and disruptions. Secondly, being able to actually feel and interact with the components would better capture my initial concept.

https://www.tinkercad.com/things/kso8VMi5L4r-e-flute-thingy?sharecode=jQ0HXUzojiu1iO5e9L2bvm4cXdhaP65zPHE8GPSiJww

Week 11 – Assignment

I wanted to make a non-traditional music instrument that feels like a game that could be played around with for hours, even though it is very simple. It has different modes, and it gives not only audio feedback but also visual feedback through the LCD screen and LED lights. It reminds me of a very simplified and modest version of a music instrument attached to a pedal that gives different music effects; the LCD screen showing the hertz and BPM reminds me of a pedal.

Hand-Drawn Schematics

 

Simulator

Again, I used the simulator to make sure I wouldn’t accidentally burn any components. When building a circuit, I take it step by step: I test the LEDs first, then add the photoresistor to control them and test again. After that, I add the piezo and repeat the process until I reach the LCD. I build each part in the simulation first, then immediately try it on the physical board, which helped me realize the order of the wires connecting to the LCD display was flipped.

 

 

Video

How the code works

This code implements a light-controlled theremin with three distinct musical modes on Arduino, using an LDR as the primary input. The core structure reads the analog light value, smooths it with a 20-sample circular buffer, and maps it to different musical parameters depending on the active mode.

Mode 0 (Theremin) produces continuous pitch with glide and vibrato, generates a pulsing heartbeat animation on the LCD, and sweeps the RGB LED through a color gradient based on frequency. Mode 1 (Scale) quantizes the light reading to 15 discrete C major notes, displays rainbow colors per note, and shows VU-meter bars on the LCD using custom characters. Mode 2 (Pac-Man) maps light intensity to game speed, runs a side-scrolling Pac-Man game on the LCD with ghosts and dots, and plays the classic sound.

The button handling supports a short press to cycle modes and a long press to enter/exit sleep mode, while the RGB LED fades smoothly between target colors using a step-based transition. The LCD uses custom character sets loaded on demand and tracks dirty rows to minimize redraws. A sine lookup table generates vibrato and LED pulsing, and the audio output on the piezo uses tone() with frequency modulation. The code is organized into modular functions for each mode, character loading, LED fading, and button debouncing, with global state variables tracking everything from heart-rate BPM to ghost positions.

Future improvements and Reflection

In the future, I would like to turn this prototype into a PCB and add more components and sensors to transform it into a more realistic musical instrument.

I struggled mainly with connecting the LCD screen. After working for long hours, I started to lose focus and couldn’t fully debug what was going wrong. Eventually, I realized that the two breadboards were not connected to each other, which fixed part of the issue. However, I still faced problems—the display would turn on but only showed strange white boxes.

I then checked the V0 pin on the LCD and noticed it was connected to the potentiometer but not properly connected to ground and power. After correcting the wiring and adjusting the potentiometer, the display sometimes still showed weird shapes and white boxes. I removed the LCD to inspect it and realized the wiring was flipped, since I was using the original LCD from the Arduino starter kit. The characters appeared as numbers at first, and some were reversed.

After fixing the wiring orientation and connections, everything started working properly.

Week 11 – Reading Response

I really enjoyed this week’s readings, especially the author’s replies to people’s responses, as they were quite entertaining to read.

Firstly, I did find myself agreeing with the initial arguments. I personally think it’s sad to see technology taking over everything. For instance, in the video, the person’s glasses translate the announcements. While this is convenient, of course, I think it takes away from the normal human experience of asking someone for directions, struggling to understand one another, but still finding that human bond. This might be a bit nit-picky, but I feel like if humans have lived until now without all this technology, then maybe not everything needs to be changed.

Another thing I thought of, which can kind of be used for both sides of the argument, is disability and accessibility. Blind and deaf people heavily rely on their sense of touch for most everyday tasks; for example, when pouring water they can feel when the cup is getting lighter. On a similar note, though, technological advancements have also been essential for other disabilities: someone with limited hand dexterity can find it easier to use their voice for some tasks rather than a keyboard or a screen. This isn’t an argument the author brought up; however, it’s something that immediately came to mind for me, and I was surprised it didn’t come up.

Overall, despite literally being a Computer Science major whose whole career path is probably going to be linked to technological advancements and AI taking over the world, I still really feel it’s important to take a step back and ask whether something really needs to be digitalized and technologized and AI-fied. I might have strayed from the topic of the reading, but I feel it’s all strongly linked.

Reading Reflection Week 11 – Kamila Dautkhan

Honestly, the Victor reading was kind of a trip because he’s basically saying our iPhones are kind of a step backward. It’s wild to think that we have all these nerve endings in our fingers but we’re stuck just swiping on flat glass all day. He calls it “Pictures Under Glass,” and it made me realize how much better it feels to use actual physical tools where you can feel the weight and the edges of things. It definitely makes me want to build something that isn’t just another touchscreen.

Connecting that to the BlinkWithoutDelay thing actually makes a lot of sense now. If you’re trying to build a cool, responsive tool like Victor is talking about, you can’t have your code stuck on a delay() command. It’s like trying to have a conversation with someone who randomly freezes for two seconds every time they blink. Using millis() is basically the only way to make sure the hardware is actually “awake” enough to feel what the user is doing in real-time.

One thing I’m still stuck on is how to actually build the 3D stuff he’s talking about. Like, it’s easy to code a button, but how do you code something that feels like “opening a jar” or sensing weight? I also definitely need to practice the if (currentMillis - previousMillis >= interval) logic more because it’s way less intuitive than just typing delay(1000). It feels like a lot of extra math just to keep the light blinking while doing other stuff.

Week 11: Reading Response

A Brief Rant on the Future of Interactive Design + Follow-up

The first article is essentially arguing that the dominant vision of future technology, which is everything being a flat glassy touchscreen you slide your finger across, is not actually visionary at all. It is just a timid extension of what already exists, and what already exists ignores almost everything that makes human hands remarkable. His point is that our hands do two things extraordinarily well: they feel things and they manipulate things, and touchscreens strip both of those capabilities away in exchange for a visual interface that he calls Pictures Under Glass. I reflected on his example of making a sandwich. He asks you to pay attention to how many tiny adjustments your fingers make without you even thinking about it, switching grips, sensing weight, feeling texture, and then he asks whether we are really going to accept a future interface that is less expressive than that. That question reminded me of the time I tried learning drums through one of those tablet apps, and the difference between that and sitting in front of a real kit is almost laughable. On a real drum the stick bounces back after you hit it, and that rebound produces important information. Your wrist reads it and adjusts the next stroke automatically, and I could feel even as a beginner that my hands were supposed to be learning something from that response. On the app there is nothing. You tap a flat surface, it makes a sound, and that is the entire relationship. I was learning the pattern but I was not learning to actually play, and from what I can understand, that distinction is what the author is getting at.

As for his response to the pushback, I actually found it more interesting than the original rant. I especially liked the part where someone asked about voice interfaces and he said he has a hard time imagining a painter telling his canvas what to do. That, again, reminded me of the drums. There is no way to describe with a voice, or replicate on a screen, the feeling of a snare cracking back against your stick, or the way a cymbal responds differently depending on where and how hard you hit it. That knowledge lives in your hands, built up over time, and I genuinely felt the absence of it every time I went back to the app and realized my fingers were learning nothing they could transfer to a real instrument. It felt like I was practicing the appearance of playing drums without any of the physical intelligence that actually makes someone a drummer.

Reading Reflection Week 11: The Touchy Touchscreen

Bret Victor makes a really good point about the so-called “Pictures Under Glass” concept of technology. As a bit of a fun fact, in my elementary school our textbooks were so in need of an update that the computer science ones mentioned touchscreens as the latest technology. It’s quite laughable thinking about it. But I do have to agree: in a generation and era where we’re living a touchscreen-oriented life, the magic does get lost along the way.

But the thing is, I think right now we have at least one example of trying to make technology adapt to fit our human body: the reMarkable paper tablet. Initially, when I was looking to buy some new technology, I had one goal in mind: it had to have a touchscreen for me to draw and write on. I was primarily looking at laptops with touchscreens when I was introduced to the reMarkable. At first I thought it was nothing special, given what it lacked compared to a conventional laptop. But after testing it out, it was astounding. It felt so natural to write on it, like I was writing in a notebook. I didn’t get that artificial feel that iPads give me, and that is genuinely what sold me on buying the reMarkable in the end.

At some point, we do have to ask ourselves: at what point is our life simulated by a screen, or our reality literally a headset we can’t take off? (Obviously I’m joking, but it is getting quite dystopian, no?)