Final Project – Finalized Concept and Progress

 

Pressure-Based Digital Musical Instrument – Progress Update

For my final project, I am developing a physically interactive digital musical instrument that combines Arduino and p5.js. The main idea is to simulate a small piano-like interface using pressure-sensitive sensors (FSRs) as keys. Each key corresponds to a musical note; when pressed, it triggers both sound and visual feedback on the computer.

The system is designed to feel closer to a real instrument than to a simple button interface. Instead of traditional push buttons, I chose Force Sensitive Resistors (FSR406 for the white keys and FSR402 for the black keys) so that placing fingers on the surface feels more natural.

The project also includes a visual component in p5.js, where each note triggers a graphical response (such as expanding shapes or bursts) aligned with the keyboard layout.

 

Arduino System Design (Input & Output)

On the Arduino side, the main responsibility is to read input from multiple FSR sensors and send that data to p5.js through serial communication.

Since I am using 13 keys (8 white + 5 black), I needed a way to expand the number of analog inputs. To solve this, I am using a CD74HC4067 multiplexer, which allows me to read multiple sensors using a single analog pin.

Inputs

* 8 × FSR406 (white keys)
* 5 × FSR402 (black keys)

Each FSR is connected through the multiplexer channels (C0–C12). The Arduino cycles through each channel, reads the analog value, and determines whether a key is being pressed based on a threshold.
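The scanning logic described above can be sketched as a couple of plain functions. This is a sketch only: the function names and the sample threshold are illustrative assumptions, not the project's actual code.

```cpp
#include <cstdint>

// The CD74HC4067 selects one of its 16 channels via four select lines
// (S0-S3). The binary representation of the channel number gives the
// state of each select pin: bit i of the channel drives Si.
bool selectPinState(uint8_t channel, uint8_t selectBit) {
    return (channel >> selectBit) & 1;
}

// A key counts as "pressed" when its analog reading (0-1023 on a
// 10-bit Arduino ADC) exceeds a pressure threshold. The threshold
// itself has to be found experimentally for each FSR.
bool isPressed(int analogValue, int threshold) {
    return analogValue > threshold;
}
```

In the actual sketch, the Arduino would write these four select-pin states with `digitalWrite()`, wait briefly for the signal to settle, then `analogRead()` the single shared analog pin.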

Processing

* Scan each multiplexer channel
* Detect press events (based on pressure threshold)
* Map each sensor to a note index

Outputs (to p5.js)

* Sends serial data representing which key is pressed
(e.g., key index or note trigger)
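One way the serial messages could be encoded is one line per key event; this exact format is an assumption for illustration, not necessarily the project's protocol.

```cpp
#include <string>

// Hypothetical wire format: "keyIndex,state\n", e.g. "7,1\n" for
// key 7 pressed and "7,0\n" for key 7 released. A line-based format
// is easy to parse on the p5.js side by splitting on the comma.
std::string encodeKeyEvent(int keyIndex, bool pressed) {
    return std::to_string(keyIndex) + "," + (pressed ? "1" : "0") + "\n";
}
```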

 

p5.js System Design (Sound & Visual Output)

The p5.js side handles both sound synthesis and visual feedback.

Input from Arduino

* Receives serial data from Arduino (note index or trigger)
* Interprets incoming data as note events

Sound Output

* Uses p5.sound to generate tones
* Each note is mapped to a frequency (C4 → C5 range)
* Oscillator is used to produce sound (triangle waveform for smoother tone)
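The key-to-frequency mapping can be sketched as below. The project does this on the p5.js side before setting the oscillator frequency, but the equal-temperament math is the same in any language; the key numbering (0 = C4 up to 12 = C5, matching the 13 keys) is an assumption.

```cpp
#include <cmath>

// Equal-temperament frequency for a key index, assuming key 0 = C4
// (MIDI note 60) through key 12 = C5 (MIDI note 72):
//   f = 440 * 2^((midi - 69) / 12), with A4 (MIDI 69) = 440 Hz.
double keyToFrequency(int keyIndex) {
    int midi = 60 + keyIndex;  // C4 is MIDI note 60
    return 440.0 * std::pow(2.0, (midi - 69) / 12.0);
}
```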

Because of browser audio restrictions, the user must click once on the screen before sound can play. This is an important part of the interaction design.

Visual Output

* Background image of a piano keyboard is displayed
* When a note is triggered:
* A visual “burst” or circle appears at the corresponding key position
* The animation fades out over time
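The fade-out bookkeeping could look something like this sketch. The `Burst` struct and `fadeStep` name are illustrative assumptions; the actual project draws these shapes in p5.js.

```cpp
#include <vector>
#include <algorithm>

// Each burst remembers where it appeared and how opaque it still is.
struct Burst {
    float x, y;     // key position on screen
    float alpha;    // 255 = fully opaque, fades toward 0
};

// Once per frame: reduce every burst's alpha, then drop the bursts
// that have fully faded so the list doesn't grow without bound.
void updateBursts(std::vector<Burst>& bursts, float fadeStep) {
    for (auto& b : bursts) b.alpha -= fadeStep;
    bursts.erase(std::remove_if(bursts.begin(), bursts.end(),
                                [](const Burst& b) { return b.alpha <= 0.0f; }),
                 bursts.end());
}
```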

This creates a synchronized audio-visual experience.

 

System Interaction Flow

Overall system flow:

1. User presses an FSR key
2. Arduino reads the sensor via multiplexer
3. Arduino sends note data through serial
4. p5.js receives the data
5. Sound is generated
6. Visual feedback appears on screen

This creates a continuous loop of input → processing → output.

 

Current Progress

So far, I have:

* Set up the basic circuit with some FSR sensors on a breadboard
* Soldered the connections for the CD74HC4067 multiplexer and tested it
* Implemented initial Arduino code to read multiple inputs
* Built the keyboard mapping, sound generation, and visual feedback system in p5.js
* Tested interaction using keyboard input as a temporary stand-in for the hardware
* Made a cardboard box to cover and protect the wiring

Video of Current Progress:

On the Arduino side, I have only tested a voltage divider on two FSRs to verify the circuit and find their thresholds; on the p5 side, I have only tested the sound oscillator and the visual effects using computer keyboard input.

https://youtu.be/MlDVwkXKKx0?si=0aQA0kxDGdNnhAR8

GitHub link for the FSR test:

https://github.com/skyorachorn/Intro-to-IM/blob/b0862f67a54b640df9c8442db31b8cd9b5a48c83/Test_FSR.ino

p5:

I have also started integrating real sensor input, though I am still refining the stability of readings and mapping.

 

Diagram:

Next Steps

For the next stage of development, I plan to:

* Fully integrate all 13 FSR inputs with stable readings
* Improve serial communication between Arduino and p5.js
* Add the arcade switch as a system control (on/off interaction)
* Refine visual feedback to better align with physical key positions
* Shorten component leads (such as resistor legs) to avoid short circuits

 

Reflection

One challenge so far has been managing multiple inputs with limited Arduino pins, which is why the multiplexer became an important part of the design. Another challenge is balancing hardware reliability with software responsiveness.

At the same time, combining Arduino and p5.js has made the project more interesting, since it allows both physical interaction and digital visualization.

 

References:

https://youtu.be/b_MMGJiUcbM?si=z7di2mc-pPGQBt1R

https://youtu.be/XAOo8DEbvck?si=oO7VKZ046IhJgHwh

https://kjh.ypj.mybluehost.me/glossary/triangular-wave/

Week 12- Reading Response

The author notes that assistive devices have traditionally been designed to enable people while drawing the least possible attention: “The priority for design for disability has traditionally been to enable, while attracting as little attention as possible.” The author’s main argument is that assistive devices could be fashionable rather than discreet, using eyeglasses as an example that could be applied to other assistive devices. But I disagree; not everyone wants to make their disability visible. I also don’t think eyeglasses are the best example, because if they were, in my opinion, we would not have contact lenses. People also wear sunglasses, which lessens the stigma around medical eyeglasses, not to mention that an estimated 4 billion people worldwide wear glasses, many of whom do so due to age; as we grow older, most of us will wear glasses. I would make the same argument about canes and walkers: we will all eventually age. I also disagree with treating eyeglasses as a “fashion statement,” because they sit right on my face; they should be comfortable and should suit my eye color and facial features due to their necessary placement, and not everyone can replace them with contacts. I think the same applies to hearing aids. But when it comes to prosthetics, I understand making them fashionable, because we should make them as fashionable as clothing.

I agree with the author that having options is a good idea, but with these options comes a problem: the cost increases, making people feel more left out than necessary. Imagine two elementary school students with hearing aids, eyeglasses, wheelchairs, etc. One has a colorful, cute, girly pink assistive device and the other has a plain black one. We do have the choice to accessorize, but imagine how it might feel for the student who could not afford to choose. Honestly, not everything needs to be designed to look great, fashionable, and trendy. This is where social media comes into play. In the TV era, mainly the early 2000s, people did what was affordable, convenient, and suitable for them when it came to clothing, children’s toys, etc. Social media now constantly pushes ads, making people feel they are not enough. So my question is: should we stick to simplicity if it brings us happiness, regardless of what it is?



Reading Reflection Week 12: Reinventing our Hearing and Seeing of Society

Reading Graham Pullin, I found he makes some interesting arguments about how design can actually enhance and create identity, even around the disabilities we have. I think the best case is made with glasses. It’s something I see in many of my family members who require glasses to see better. Some don’t really say anything about them, while others ask around about how the glasses they’re wearing make them look. If I’m honest, I think glasses can very much enhance the stylistic choices of a person’s outfit. You could have, say, glasses that match your clothes or highlight your personality. Even though I personally have never judged anyone (I mean, why would you?) for wearing glasses, I really like it when some people just go all out and choose a pair that makes them stand out, instead of hiding them the way the medical world would prefer.

Another interesting case is hearing aids. Recently my grandfather started wearing them because he had trouble hearing, and honestly I’ve not noticed them at all. I believe the main reason is that they sit flush in his ears and are a skin-like color that blends in. If I’m honest, some of the points the book makes are partially outdated, as the book itself was published in 2009. Now we live in a world where everyone wears earbuds or some sort of earphones, most likely wireless. They mostly serve our listening needs for music or podcasts, and you can see them everywhere. And since we’ve normalized their use, some now even have features that act just like hearing aids. Again, this helps a lot in making earbuds acceptable for everyone to wear. In short, now that everyone is wearing a “hearing aid,” we can definitely build on making our world much more connected.

Both products were at one point stigmatized. Now they’re normalized. This is exactly how we can achieve a better world: not just by reinventing meaning, but by treating it as a design choice. Narrative does a lot to influence public perception of a given stigma or bias. And honestly, kudos to everyone designing a more acceptable world for all to take part in; this is what we need more of. Small achievements like this really can amount to big changes.

Week 11: Musical Instrument

Concept

My concept is heavily inspired by the show Squid Game. I wanted to make a musical instrument (well, I guess more of a music box/player) where light controls the mood of the music, so I used one of the songs from a game in the show. I used a photoresistor (LDR) so that the darker it is, the slower the tempo, creating that “creepy vibe,” while in brighter conditions the normal tempo is played. I also used a red LED when the creepy version plays and a green LED when the normal one plays; these colors are actually used in the show in another one of the games.

Full Code | Video Demo | Schematics

Code that I’m proud of

// Re-check the light level on every note so the tempo reacts mid-song
ldrValue = analogRead(LDR_PIN);                      // raw reading, 0–1023
isDark   = (ldrValue < DARK_THRESHOLD);              // below 420 counts as dark
tempoMult = isDark ? CREEPY_TEMPO : NORMAL_TEMPO;    // 2.5 in the dark, normal otherwise

// Stretch each note's duration by the tempo multiplier
int duration = (int)((1000.0 / durations[note]) * tempoMult);

tone(BUZZER_PIN, melody[note], duration);

I’m proud of this because I had to figure out how to make the LDR play a role in the song without just turning it on and off. The first thing I had to consider was that the LDR doesn’t hand me “LOW” or “HIGH”; it just gives me a raw number between 0 and 1023, and I had to find my own threshold by covering the sensor with my hand and watching the Serial Monitor until I landed on 420. For the actual song, I decided to use a tempo multiplier because I didn’t want the notes themselves to change, just the feel of them. I multiply the duration by 2.5 in the dark so that every note hangs longer and the whole melody sounds heavier and more unsettling without needing to change the notes.

How this was made

I started with the slide switch wired to pin 2 using INPUT_PULLUP, which meant I didn’t need an external resistor. When the switch is open, the buzzer stops and both LEDs go off straight away. For the LDR I had to build a voltage divider using a 10kΩ resistor, because the Arduino can only read voltage, not resistance, directly. One leg of the LDR goes to 5V, the other connects to A0 and one leg of the 10kΩ resistor, and the other resistor leg goes to GND. In bright light the LDR resistance drops and A0 reads high, and in darkness the resistance rises and A0 reads low. I landed on 420 as my threshold after some testing with the Serial Monitor (which I temporarily added and later removed). The buzzer plays the melody using tone(); I used code from GitHub for the melody (referenced below), and the duration of each note gets multiplied by 2.5 in dark conditions to make it creepy. The green LED lights up in bright conditions and the red LED lights up when it goes dark. I also made sure to re-check the LDR and the switch inside the melody loop on every single note, so the instrument reacts to light changes mid-song rather than waiting for the whole melody to finish before checking again.
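The voltage-divider behavior described above can be captured in a small helper function. This is just a sketch of the math; the resistance values in the test are illustrative, since an LDR's actual resistance range depends on the part.

```cpp
#include <cmath>

// With the LDR between 5V and A0, and a fixed 10kΩ resistor from A0
// to GND, the divider gives:
//   Vout = Vcc * Rfixed / (Rldr + Rfixed)
// and the 10-bit ADC reading is Vout / Vcc * 1023. Bright light lowers
// the LDR's resistance, so the reading goes up; darkness raises it,
// so the reading goes down.
int expectedAdcReading(double rLdrOhms, double rFixedOhms = 10000.0) {
    double ratio = rFixedOhms / (rLdrOhms + rFixedOhms);
    return (int)std::round(ratio * 1023.0);
}
```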

Reflection & Future Improvements

Honestly this assignment gave me more trouble than I expected, but in a good way. I went in thinking the LDR would be straightforward as it’s something that I’ve learned from the analog sensor lesson and it kind of was at the hardware level, but making it do something musically interesting took some more thought. The Squid Game inspiration kept me motivated too because I had a clear vision of what I wanted it to feel like, which made the troubleshooting feel worth it rather than frustrating. If I kept going I’d want to include more songs from the different games and have more “modes.” I’d also want to try changing the actual pitch in the dark, dropping notes lower to make it sound even more sinister, which feels very on brand for the Squid Game theme. But for what it is, I’m genuinely happy with how the concept came through.

References

https://docs.arduino.cc/language-reference/en/functions/advanced-io/tone/

https://docs.arduino.cc/built-in-examples/digital/toneMelody/?_gl=1*yqplbf*_up*MQ..*_ga*MTM3MDIzMDA5Ni4xNzc2ODY2MzM5*_ga_NEXN8H46L5*czE3NzY4NjYzMzckbzEkZzAkdDE3NzY4NjYzMzckajYwJGwwJGgxNjY2NjUxMjk0

For the melody: https://github.com/hibit-dev/buzzer/blob/master/src/movies/squid_game/squid_game.ino

Final Concept

I want to create a calming, meditation-like interface where the user’s physical actions are recorded using Arduino sensors and shown as the physical state of a bonsai in the p5 sketch. Essentially, the user will be able to take care of the plant by caring for themselves.

Inspiration

My original thought was only to make something that feels calming, because I have been staying indoors too much lately, and that is emotionally frustrating. When I was thinking about exactly how to do it, I realized I am not feeling my best because I am inside and not getting enough sunlight, air, and hydration, which is also what plants need to thrive. It’s also spring outside, which gives people a sense of life and flourishing growth, making everything feel hopeful. These ideas just clicked, and I thought: why not use the growth of a plant to create the calming, meditative effect? I first thought of using a flower, but then the blossoming of the flower would feel like a “win” ending, which is not exactly what I am aiming for. So I looked for plants related to meditation and found the bonsai, a specially cultivated miniature tree from Japan. Gardening and taking care of plants in Japan is considered a calming, meditative process, so I decided to use that as the final design.

Conceptual Interface sketch

(conceptual image generated by gemini)

My current idea is that the user’s breathing or heart rate can be monitored through certain methods. These can then be used to assess stress levels and reflected in the plant’s growth. If possible, I would want the plant itself to move its leaves or wave in sync with the user’s breath. Watering the plant can be done by tilting a tilt sensor (like when actually watering plants) or by the user themselves drinking water; I am still deciding between the two ideas. The plant’s light conditions are directly linked to the user’s light conditions, since staying in a sunlit room is probably more emotionally beneficial to the user than staying in a dark room. Focus might be assessed through fidgeting or other factors, which may involve flex or distance sensors; I haven’t decided yet.

References

https://greengoddess.com/the-zen-of-bonsai-cultivating-tranquility-in-miniature-landscapes/

Google Gemini for image generation.

Tutorials of interest

Arduino breathing monitor:

https://www.youtube.com/watch?v=WDRokF_ZW9A

Arduino heart rate monitor:

https://www.youtube.com/watch?v=aKus0FV4deU

flex sensor:

https://learn.sparkfun.com/tutorials/flex-sensor-hookup-guide/all

tilt sensor:

https://lastminuteengineers.com/ball-tilt-switch-sensor-arduino-tutorial/

Week 11: Final Project Preliminary Concept

Concept:

For my final project, I want to create an interactive game that smoothly connects the p5 sketch and the Arduino and provides a realistic experience for users. I chose to create an interactive restaurant game/experience where the user is set in a kitchen with multiple sections, such as a food station, a dessert station, and a coffee machine, and gets to choose a section and actually experience the cooking as if it were real.

Inspiration:

I actually had multiple inspirations that helped me come up with this idea. Firstly, one of my options for my midterm project was an Emirati kitchen on p5, but I ended up choosing another one, so I thought it would be a good idea to use it for my final project. I also thought of many cooking games I used to play, such as Cooking Mama from my childhood, and Cooking Fever, which is more recent. Then, because I wanted it to feel more realistic, I came up with more interactive ideas.

Vision:

I have a specific vision for my final project that I wish to create if possible. Starting with the p5 sketch, I will have an aesthetic but colorful kitchen setup with around 3 different sections, and in each section there will be animations of the actions the user can perform. On the Arduino side, I would like to create a simple controller, such as arrow buttons or a joystick, to move between selections, and a push button for confirmation. I also want to include an ultrasonic sensor and create kitchen utensils where players can move in front of the sensor to finish the activity. I might also add LED lights of different colors to indicate the state of completion, such as red before starting, yellow during the process, and green when it is done.

References: 

For P5, along with recapping what we learned:

https://youtu.be/b2s8yZ06waQ?si=Su3dHMMqvrrCNXIz  
https://youtu.be/HfvTNIe2IaQ?si=EhtGjs7IjOrHAGLF

For Arduino, along with recapping what we learned:

https://youtu.be/vo7SbVhW3pE?si=ZUV6hNZY7ecwRSva
https://youtu.be/wTfSfhjhAU0?si=So-vFN7DNnjQD3hn
https://youtu.be/a37xWuNJsQI?si=uddzGXgVkkTvSW1k

Final Project Draft

For the final project, I want to make a room ambience controller. I have an RGB light strip that runs around my study table; its lights are controlled by a remote, with different colors, blinking modes, and brightness controls. I will communicate with it using an infrared LED. First, I’ll map those remote controls to p5. Then I will define different settings for different ambiences, which the user will be able to choose from. I will also add music, though I am still thinking about where to get the sound output from. Next, I will have a song mode that makes the lights blink in correspondence with the song’s tempo, amplitude, etc. This is the core concept of my project.

I want to add a gesture to pause everything, and I plan to use a distance sensor for it. I also want to incorporate the LCD, but I am not sure it will fit, because the whole experience is light-dependent.

Week 11 – Reading response

This article was really interesting in its discussion of physical implements and how they affect the user. It goes back to the recurring theme of designing interactive systems where the user should be at the center of the design and of user feedback. The idea of “pictures under glass,” in my opinion, creates a pseudo-realistic feeling where everything is meant to imitate actual objects without giving the user the actual feel of them. New design principles are becoming more and more virtual, which makes me question how much humans will be involved with tools or implements in the coming period; we may actually lose touch with what we deem as reality now. The reading itself was very self-explanatory, so I do not have much to say, but it was a simple reminder to keep in touch with physical things when designing interactive systems.

Week_11_Reading Response

The most interesting idea raised by the article is shifting assistive devices from hiding the disability to embracing design. Glasses were used as an example of a good design that became a fashion trend rather than a clinical tool. But apart from their simple structure and variability in shapes and forms, I would tend to believe that another reason glasses are such a success is that nearsightedness is increasingly common, and the variety of designs came from the variety of needs. This idea could also inspire designers to look at what disabled people actually want from their assistive devices.

The other idea mentioned, about not overburdening the device with features, relates to one of the other readings from past weeks about the relationship between mood, design, and usability. Keeping the design simple would make it easy and good to use, but it would lack emotional function. Adding design to the devices would add emotional support, which would also help bring a more positive view of disabilities, making them less of a thing to be ashamed of. This would in turn generate more interest in the design of disability aids and actively catalyze and inspire broader design culture.

Week 11 – Reading Response Zere

I resonate with the argument made by the author, as in my opinion, the notion of touchscreens being the “ending point” of a technological revolution seems a little odd to me. I believe that there is much more to the technological development of the future than an object that resembles a phone.
What I found most interesting was the part about hands. The author makes a great point about how useful our touch senses are. Human evolution can be strongly tied to the senses that are on the tip of our fingers. Touch is an extremely powerful “tool”, if we could call it that, and we’ve basically designed it out of our tools entirely.

This reading reminded me of our time in class, when the professor showed us simple motion detectors and control using p5js. This technology is not touch-related directly, but still presents a more futuristic/alternative concept rather than touchscreens. This transitions into my thought that interactions that involve your whole hand, your movement, your body, basically, other additional parts of our body (also including touch), are much more interesting and feel more “alive”, compared to just sliding your finger across a surface.

To conclude, I think it’s easier to make the argument than to solve the problem. It’s one thing to say “we should use our hands more” and another to design something that’s actually as convenient and as accessible as a smartphone. He kind of admits this on the responses page: he doesn’t have the answer, just the problem. Which is fair, but also frustrating at the same time.