Week 12 – Final Proposal Megan

Concept

My project is called Music Producer. It’s an interactive system where you take something super simple, like your voice saying “hello”, and turn it into a full song and a visual artwork.

You start by recording your voice directly in p5.js. After that, everything becomes about experimenting and building on top of it. The idea is that you’re not just editing sound; you’re playing with it. You can change how it sounds, slow it down, make it higher or lower, and then start adding layers like beats, piano, or bass.

What I find interesting is that the same controls can mean different things depending on what mode you’re in. So instead of having a million controls, you have a simple setup that changes function depending on what you’re trying to do. At the same time, everything you add or change shows up visually, so by the end you don’t just have music; you have something that looks like a living artwork.

Mediums

This project uses p5.js and Arduino, but they do very different things.

  • p5.js is where everything actually happens. It records the audio, plays it back, lets you edit it, and creates all the visuals. It is split into different “sections” or modes, like pitch, tempo, piano, beats, and bass.
  • Arduino is just the controller. It doesn’t deal with sound directly. Instead, it sends inputs using buttons and a potentiometer. The buttons are used to trigger things like adding beats or notes, and the potentiometer controls continuous changes like pitch or tempo.

So basically, p5.js is the brain, and Arduino is the hands.

Process

First, I’m going to make sure I can record and loop audio in p5.js, since that’s the base of everything.

Then I’ll build the Arduino circuit with the buttons, LEDs, and potentiometer, and make sure it sends clean data through serial.

After that, I’ll connect both systems and start mapping inputs. The important part here is that the controls change depending on the mode you’re in. For example:

  • in pitch mode, the potentiometer controls pitch
  • in tempo mode, it controls speed
  • in piano mode, the buttons trigger notes
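The mode-dependent mapping above can be sketched as a couple of small functions. This is only an illustration of the idea, not the final code: the function names, note choices, and value ranges here are all placeholders I made up, and the real ranges would be tuned by ear.

```javascript
// Sketch of mode-dependent controls: the same potentiometer reading
// (0-1023 from the Arduino) means different things per mode.
function applyPot(mode, potValue) {
  const t = potValue / 1023; // normalize the raw analog reading to 0..1
  switch (mode) {
    case "pitch":
      return { pitchShift: -12 + t * 24 };    // semitones, -12..+12
    case "tempo":
      return { playbackRate: 0.5 + t * 1.5 }; // 0.5x..2x speed
    default:
      return {};                               // pot unused in other modes
  }
}

// Buttons also change meaning per mode: notes in piano mode, beat
// slots in beats mode, nothing elsewhere.
function applyButton(mode, buttonIndex) {
  if (mode === "piano") {
    const notes = ["C4", "D4", "E4", "F4"];    // one note per button
    return { playNote: notes[buttonIndex] };
  }
  if (mode === "beats") {
    return { addBeat: buttonIndex };           // button toggles a beat slot
  }
  return {};
}
```

The payoff of this structure is exactly what the concept describes: one physical setup, with the active mode deciding what each input does.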

Then I’ll add different sound layers like beats, bass, and simple melodies.

At the same time, I’ll design the visuals so they match each element:

  • voice looks like waveforms moving and changing size
  • piano shows points that grow and shrink
  • beats look like stars or pulses
  • bass looks like slower waves
  • different notes have different colors
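For the last mapping, notes to colors, one simple option is to spread the twelve pitch classes evenly around the hue wheel so related notes get related colors. This is just a sketch of one possible palette, not a decision from the proposal:

```javascript
// Hypothetical note-to-color mapping: each pitch class gets its own hue.
const NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];

function noteToHue(noteName) {
  // Strip the octave number ("C#4" -> "C#") and map the pitch class to 0-360.
  const pitchClass = noteName.replace(/\d+$/, "");
  const index = NOTE_NAMES.indexOf(pitchClass);
  if (index < 0) return null;  // unknown note name
  return index * (360 / 12);   // 30 degrees of hue per semitone
}
```

In p5 this hue could feed directly into `fill()` in HSB color mode when drawing the piano points.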

Finally, I’ll focus on making everything feel smooth, responsive, and easy to understand.

Final Goal and Interaction

The goal is that you can take something super basic, like one word, and turn it into a full composition.

The interaction is very simple but keeps going:

  • you record your voice
  • you choose a mode
  • you use the physical controls to change or add things
  • the system responds instantly with sound and visuals
  • you keep building

By the end, you have your own song, and at the same time a visual piece that represents everything you created.

Expectations for the Project

I want this project to feel fun and intuitive, not confusing. It should be clear what you’re controlling, and every action should have an immediate result.

I also want people to feel like they’re actually creating something, not just pressing buttons. Even if the input is simple, the output should feel complex and expressive.

Overall, I’m expecting a system where sound, visuals, and interaction all feel connected, and where the user can experiment freely and end up with something that feels like their own piece of art.

WEEK 12 – Reading Response

Design Meets Disability

This article made me rethink how I see design and disability. At first, I used to think of design for disability as something separate or special, but the article shows that it actually influences mainstream design in powerful ways. For example, ideas that started from solving disability-related problems have inspired everyday products and furniture. This made me realize that designing for disability is not a limitation. It can actually lead to more creative and useful solutions for everyone.

One idea that really stood out to me was the tension between hiding disability and expressing it. Some designs try to make disabilities invisible, like hearing aids that are hard to notice. Others, like stylish glasses or expressive prosthetics, embrace visibility. This made me think about how design can shape how people feel about themselves. If something is always hidden, it can feel like it is something to be ashamed of. But if it is designed in a beautiful or expressive way, it can become something empowering instead.

Inclusive design should not feel like an extra feature. It should be part of normal design thinking. Going forward, I would want to design things that include everyone and make people feel confident using them, not just functional.

WEEK 11 – Reading Response

A Brief Rant on the Future of Interaction Design

This article argues that current ideas about future technology are not truly innovative. Instead of creating new ways for people to interact with technology, many designs simply improve what already exists. The author criticizes “pictures under glass,” meaning touchscreens, because they ignore how humans naturally use their hands. I found this argument interesting because I had never questioned touchscreens before. They feel modern, but the article shows they may actually limit human ability.

A key idea in the article is that tools should match human capabilities. The author explains that our hands are powerful because they can feel and manipulate objects in rich ways. In contrast, sliding a finger on a flat screen is very limited. This made me reflect on how much we lose when we move from physical interaction to digital screens. The examples, like holding a book or a glass of water, helped me understand how important touch and feedback are in everyday actions.

This article made me rethink what “innovation” really means. True progress is not just about better technology, but about better interaction between humans and tools. The future should not ignore the human body but should work with it. In my own thinking, especially when building products, I would try to design systems that use natural human abilities instead of reducing them.

Responses: A Brief Rant on the Future of Interaction Design

This article responds to criticism of the original rant and clarifies its purpose. The author explains that the goal was not to give a solution, but to highlight a problem. I found this honest and realistic because not all problems have immediate answers. Instead, the author encourages more research and exploration into better interaction methods. This shows that progress often starts with asking the right questions, not having perfect solutions.

Another important idea is that current technologies, like the iPad, are good but not final. The author compares them to early black-and-white photography, which was useful but later improved. This perspective helped me understand that we should not become too comfortable with current technology. Just because something works well now does not mean it cannot be improved. Innovation requires continuing to question and push boundaries.

This article emphasizes the importance of long-term thinking. It reminds us that technology should evolve to better match human abilities, not replace them. I also liked the discussion about the human body, showing that interaction should involve more than just a finger or voice. This reflection encourages me to think more critically about design and to aim for solutions that fully use human potential rather than limiting it.

WEEK 10 – Reading Response

Physical Computing’s Greatest Hits (and Misses)
This article talks about common ideas used in physical computing projects. It shows that many students often repeat similar themes, like musical instruments, gloves, or interactive mirrors. At first, I thought repeating ideas meant a lack of creativity. However, the article explains that even common ideas can still be original if you add your own variation. This changed my thinking. I now see that creativity is not always about doing something completely new, but about improving or reimagining existing ideas.

Another important point is that not all projects offer meaningful interaction. Some projects look beautiful but do not give users much to do. For example, video mirrors are interesting to look at, but they often lack depth in interaction. This made me realize that good design should not only focus on appearance, but also on how users engage with the system. A successful project should balance aesthetics and interaction, making sure the user feels involved and not just entertained.

This article made me reflect on how I approach building projects. Instead of trying to avoid common ideas, I should focus on making them better and more meaningful. It also reminded me to think about the user experience, not just the technical side. In the future, I would aim to design projects that are both creative and interactive, where users can clearly understand and enjoy their role in the system.

WEEK 9 – Reading Response

Reflection on “Attractive Things Work Better”

The article explains that design is not only about how something works, but also how it feels. The author shows that attractive things can actually work better because they make people feel good. When people feel positive, they think more creatively and are more patient with small problems. I found this idea interesting because it challenges the common belief that function is more important than appearance. The example of the teapots helped me understand how design, usability, and beauty can all matter in different ways.

Another important idea is how emotions affect behavior. The article explains that when people are stressed, they focus more but become less flexible. In contrast, when people are relaxed and happy, they can think more broadly and solve problems better. This made me realize that good design should consider the emotional state of the user. For example, tools used in stressful situations should be simple and clear, while tools used in relaxed settings can focus more on enjoyment and beauty.

The article made me think differently about design. I used to think that beauty was just extra, but now I see that it plays an important role in how people interact with things. Good design is a balance between usability and aesthetics. In my own work, especially in building products, I would try to create designs that are both functional and pleasant to use.

Reflection on “Her Code Got Humans On The Moon – And Invented Software Itself”
This article tells the story of Margaret Hamilton and her important role in the Apollo space program. I was impressed by how she worked in a time when women were not encouraged to join technical fields. Despite these challenges, she became a leader in software engineering and helped make the moon landing possible. Her dedication and courage stood out to me, especially how she balanced her work and her role as a mother.

One key idea from the article is how new and uncertain software engineering was at the time. There were no clear rules, and Hamilton and her team had to figure things out as they went. This shows how innovation often comes from stepping into the unknown. I also found it interesting how software, which was once not taken seriously, became critical to the success of the mission. It highlights how important behind-the-scenes work can be.

This article inspired me to think about persistence and impact. Margaret Hamilton’s work not only helped humans reach the moon but also shaped modern software development. It reminds me that important contributions are not always recognized immediately. Her story encourages me to take risks, work hard, and believe that my efforts can have a lasting impact.

Final Proposal

1. Finalized concept for the project

My project is a simple Arduino interactive LED indicator system. I will use an Arduino board, a sound sensor, and regular LED lights to make a basic interactive device. When the sound sensor detects clapping or a loud noise, the LED will turn on, flash, or change state; when it is quiet, the LED will turn off automatically. The whole idea is beginner-friendly: no complex game logic, just basic sensor input and LED output control, suitable for a freshman Arduino practice project.

2. Design and description of Arduino program function
  1. Read real-time signal value from the sound sensor.
  2. Set a simple sound threshold: if the detected sound is louder than the threshold, the program will turn on the LED and let it blink several times.
  3. If there is no loud sound for a few seconds, the LED will turn off and stay in standby mode.
  4. The program runs in a loop all the time, continuously detecting sound and controlling the LED reaction.
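The actual sketch will be Arduino code, but the decision logic in steps 1–4 can be sketched as a small pure function, which makes the threshold and standby behavior easy to reason about before wiring anything. The THRESHOLD and STANDBY_MS values here are placeholders to be tuned against the real sensor (step 5 of my progress below is exactly that tuning):

```javascript
// Sketch of the detect/standby logic: loud sound turns the LED on,
// a few quiet seconds turn it back off. Values are placeholders.
const THRESHOLD = 600;   // analog reading above this counts as a clap
const STANDBY_MS = 3000; // quiet time before the LED goes to standby

function updateLed(state, soundLevel, nowMs) {
  if (soundLevel > THRESHOLD) {
    // Loud sound: turn the LED on and remember when we last heard it.
    return { ledOn: true, lastLoudMs: nowMs };
  }
  if (state.ledOn && nowMs - state.lastLoudMs >= STANDBY_MS) {
    // Quiet long enough: back to standby.
    return { ledOn: false, lastLoudMs: state.lastLoudMs };
  }
  return state; // no change
}
```

On the Arduino, the same logic would run inside `loop()` with `analogRead()` and `millis()` supplying `soundLevel` and `nowMs`.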
3. Overall project work & progress documentation
  1. First, I confirmed my project idea: I chose a sound sensor + LED as the main hardware and decided to make a simple sound-activated light.
  2. I prepared all the components: Arduino UNO, sound sensor, LED, resistor, jumper wires, and a breadboard.
  3. I learned the basic Arduino wiring rules and finished the circuit connection step by step.
  4. I wrote simple test code, uploaded it to the Arduino, and checked that the sensor and LED worked normally.
  5. I adjusted the sound threshold value many times to make the sensor respond stably to claps.
  6. Now I am optimizing the code so the LED blinks smoothly and the standby mode works correctly.
4. Most challenging part + small demo
The most challenging and unknown part for me was how to correctly read the sound sensor value, set a stable threshold, and make sure the sensor only reacts to claps instead of background noise. I was not familiar with analog sensor reading at first.
I made a small separate demo: I connected only the sound sensor and used simple demo code to print the sensor data to the Serial Monitor. I clapped near the sensor many times to observe how the values changed, then picked a suitable number as my threshold. This small demo let me understand how the sound sensor works and solved the hardest part of my project.
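The same calibration I did by eye in the Serial Monitor can also be expressed as a tiny helper: record some readings while the room is quiet, then set the threshold a margin above the loudest one. This is a hypothetical helper I wrote to mirror that process, not code from my actual demo:

```javascript
// Pick a clap threshold safely above the background noise floor,
// given analog readings captured while the room was quiet.
function pickThreshold(quietSamples, margin = 100) {
  if (quietSamples.length === 0) return null; // nothing to calibrate from
  const noiseFloor = Math.max(...quietSamples);
  return noiseFloor + margin; // only claps should cross this
}
```

A larger margin means fewer false triggers from background noise, at the cost of needing louder claps.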
Reference Resources
  1. Arduino Official Documentation: https://www.arduino.cc/en/Reference
  2. Arduino Sound Sensor Tutorial (YouTube & Instructables)
  3. Instructables Basic Arduino Sensor Projects: https://www.instructables.com/circuits/arduino/
  4. W3Schools Arduino Basic Syntax Learning
  5. Course lecture slides and in-class Arduino basic example codes
  6. AI for inspiration and some coding help
  7. The book that the store gave me when I bought the Arduino

Week 12 – Final Project Proposal

The Grove

Overview

The Grove is a cozy resource management and crafting game built in P5.js for the midterm. The player moves between five locations — a Map, River, Forest, Pottery Studio, and Greenhouse — collecting resources, making pottery, and growing plants. For the final, the project is being expanded into a physical table installation. All mouse and keyboard input is replaced with custom props and sensors, so every in-game action has a corresponding physical one.

What’s Already Built

The full P5.js game is complete from the midterm. This includes:

    • All five scenes with backgrounds, navigation, and per-scene background music
    • A full inventory and resource system (clay, soil, water, seeds, pots)
    • The complete pottery workflow: place clay, shape on wheel, fire in furnace, collect
    • A plant growth system with three growth stages, timers, and harvest logic
    • Sprite sheet animations for plants, pots, and the watering can
    • Sound effects for every interaction and background music per scene
    • A custom cursor, backpack inventory overlay, main menu, pause menu, and instructions screen
Physical Components
Joystick — Navigation

The joystick is the main input device and replaces the mouse entirely. It handles all navigation and selection across every scene.

    • Left/Right on the Map cycles through locations (Studio, Greenhouse, Forest, River). Button enters.
    • In scenes, Up moves focus to the upper HUD (Back to Map / Menu). Down moves to the lower HUD (inventory). Left/Right cycles between options in the current zone. Button confirms.
    • In the Studio, Left/Right switches focus between the pottery wheel and furnace.
    • In the Menu, Up/Down cycles options and Button selects.
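All of the joystick behavior above reduces to one pattern: cycling a focus index through the options of whichever zone is active, wrapping at the ends. A minimal sketch of that core (the zone contents here are illustrative, not the final scene layout):

```javascript
// Example zone definitions; the real game defines these per scene.
const zones = {
  upperHud: ["backToMap", "menu"],
  lowerHud: ["slot0", "slot1", "slot2", "slot3"],
};

// Left/Right moves focus within the current zone, wrapping both ways.
function cycleFocus(zoneName, focusIndex, direction) {
  // direction: -1 for Left, +1 for Right
  const n = zones[zoneName].length;
  return (focusIndex + direction + n) % n;
}
```

Up/Down would then switch `zoneName`, and the joystick button would trigger whatever `zones[zoneName][focusIndex]` currently points at.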
Ultrasonic Proximity Sensor — Pottery Wheel

An HC-SR04 sensor is mounted face-up at the Studio zone. To shape a pot, the player holds both hands above it with palms facing down, mimicking cupping clay on a wheel. The closer the hands, the faster the pot shapes. Pulling hands away pauses progress. This replaces the original click-and-hold mechanic.
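The "closer hands = faster shaping" rule can be sketched as a clamped inverse mapping from the HC-SR04 distance reading to a per-frame progress rate. The range limits here are guesses that would need calibrating against the real mounting height:

```javascript
// Hypothetical mapping from hand distance (cm) to shaping speed.
// Near = fast, far = slow, out of range = paused.
const MIN_CM = 5;  // closest useful reading (full speed)
const MAX_CM = 40; // beyond this, hands count as "away" and shaping pauses

function shapingRate(distanceCm) {
  if (distanceCm <= 0 || distanceCm > MAX_CM) return 0; // paused
  const clamped = Math.max(distanceCm, MIN_CM);
  // Linear falloff: MIN_CM -> 1.0, MAX_CM -> 0.0
  return (MAX_CM - clamped) / (MAX_CM - MIN_CM);
}
```

Each frame, P5 would add `shapingRate(proximity)` (scaled) to the pot's shaping progress, so pulling hands away naturally pauses it.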

Potentiometer — Furnace

A rotary dial at the Studio zone controls the furnace. Turning it up starts firing. The player watches the pot sprite on screen and turns the dial down at the right moment to retrieve the pot. Leaving it too long burns or destroys it.

Digging Mechanic — Forest

Five fixed plot positions in the forest scene each map to a physical point on the table. On every spawn and respawn (after a 30-second cooldown), each plot is randomly assigned as clay or soil, readable from the sprite on screen.

The current plan is to use a conductive shovel prop with an aluminum foil tip and five corresponding foil contact points on a flat board (four corners and a center), each wired to its own Arduino pin. Touching the shovel to a point completes the circuit and triggers that plot. Contact must be held for 200ms to avoid false triggers.

An alternative being considered is a keypad, where each key corresponds to one of the five plot positions. This would be simpler to wire and more reliable, but less immersive than the shovel prop given the physical metaphor of the rest of the installation.

Water Sensor — Greenhouse

A moisture sensor sits inside a small cup with drainage holes at the bottom. After placing a seed, the player physically pours water into the cup to water the plant in-game. A high analog threshold distinguishes a real pour from residual moisture, and a short debounce timer prevents repeated triggers while the cup drains.
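The pour-detection rule (high threshold plus a suppression window while the cup drains) can be sketched as a small state function. The threshold and timing values are placeholders to be calibrated against the real sensor:

```javascript
// Sketch of pour detection: a reading must cross a high threshold to
// count, and further triggers are suppressed while the cup drains.
const POUR_THRESHOLD = 700; // high analog value = real pour, not residue
const SUPPRESS_MS = 3500;   // ignore the sensor while the cup drains

function detectPour(state, reading, nowMs) {
  const suppressed = nowMs - state.lastPourMs < SUPPRESS_MS;
  if (!suppressed && reading > POUR_THRESHOLD) {
    return { pour: true, lastPourMs: nowMs };   // one watering event
  }
  return { pour: false, lastPourMs: state.lastPourMs };
}
```

With this shape, residual moisture after a pour never re-triggers watering: it is both below the threshold and inside the suppression window.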

Arduino

The Arduino reads all sensors and sends a single comma-separated line over serial each loop:

joyX, joyY, joyBtn, proximity, potValue, waterValue, dig0, dig1, dig2, dig3, dig4

All game logic stays in P5. The Arduino only handles reading and transmitting. Debounce and sampling logic is handled on the Arduino side: 150ms for the joystick, 200ms hold for shovel contacts, 3-4 second suppression after a water trigger, and a 5-sample rolling average for the proximity sensor.
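On the P5 side, each incoming serial line gets parsed into the `sensorState` object that the rest of the game reads every frame. A sketch of that parser, following the field order of the protocol line above (the exact object shape is my guess, not final code):

```javascript
// Field names, in the order the Arduino sends them each loop.
const FIELDS = ["joyX", "joyY", "joyBtn", "proximity", "potValue",
                "waterValue", "dig0", "dig1", "dig2", "dig3", "dig4"];

// Parse one comma-separated serial line into a sensor state object.
// Returns null on a malformed line so the caller keeps the previous state.
function parseSensorLine(line) {
  const parts = line.trim().split(",").map(Number);
  if (parts.length !== FIELDS.length || parts.some(Number.isNaN)) {
    return null;
  }
  const state = {};
  FIELDS.forEach((name, i) => { state[name] = parts[i]; });
  return state;
}
```

Rejecting malformed lines matters in practice: the first read after opening a serial port often catches a partial line, and silently parsing it would produce garbage input for one frame.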

P5 Changes

The existing code is being modified, not rewritten. The main changes are:

    • input.js: Web Serial API replaces all mouse and keyboard handlers. Incoming serial data is parsed into a global sensorState object read every frame.
    • globals.js: New sensor state variables added alongside existing ones.
    • screens.js: A focus system is layered into each scene. The joystick moves focus between defined zones per scene. The button triggers whatever is currently focused. A pulsating glow and highlight are drawn on the focused element.
    • classes.js: ResourcePlot updated to re-randomise its type on each respawn.
    • sketch.js: Minimal changes.
Table Layout

Each prop sits in a zone on the table that corresponds to its location in the game. The joystick is at the center. The proximity sensor and potentiometer are grouped at the Studio zone. The digging board and shovel are at the Forest zone. The watering cup is at the Greenhouse zone. The idea is that as the player moves between locations in the game, they also shift their focus and hands to a different part of the table.

Progress
    • Done: Full P5.js game from midterm
    • Done: Physical interaction design for all five components
    • Done: Joystick navigation model defined for all scenes
    • In progress: Arduino wiring and serial protocol
    • Pending: Web Serial integration in P5
    • Pending: Focus and highlight system in P5
    • Pending: Sprite changes in P5
    • Pending: Sensor threshold calibration
    • Pending: Physical prop construction and table setup

Reading Week 12 (Design Meets Disability)

Among all the readings, I would categorize this as one of the readings I enjoyed a lot because it talks about design, which is something I am very interested in (I think that shows in all my past projects, and soon I will major in it, in sha Allah). What I liked most was how it showed that design is not just about fashion or making things look nice, because I feel like in today’s world most people don’t understand design, or maybe confuse it with just interior design or graphic design, when it actually starts from human needs. The reading included some nice examples like glasses, which started more as medical appliances in the 1930s and later became part of fashion, personality, and self-expression. I think that transformation is beautiful because it shows how something made for function can also become something meaningful and personal. At the same time it kinda reminds me of how products were made and advertised before: people mostly focused on functionality and not design. I also found it interesting how design connects to the military, because before learning about design or art I never really linked them with things like war; they felt so different. Looking at art too, I understand that a lot of digital art and design tools, for example, were connected to the military and first used by governments, engineers, and scientists, especially during the world wars.

I got so excited when I saw Aimee Mullins because I think she is the perfect example to include. I watched her TED Talks, The Opportunity of Adversity and My Twelve Pairs of Legs, and I found them really powerful, especially when she talks about looking up the word “disabled” in the thesaurus and finding words like “crippled,” “helpless,” “useless,” and “mangled.” She says: “my voice broke, and I had to stop and collect myself from the emotional shock.” That part stood out because I personally went and did my own research, and unfortunately these words have not changed; the meaning still connects to the same negative or strange language. It is sad because these words shape the way people see disability. I also like how the UAE uses the term “People of Determination” because it feels much more respectful and empowering instead of focusing on weakness, especially when most people were simply born this way </3. Going back to Aimee, I honestly think that what she did was amazing! She changed something people usually viewed as negative into something creative, expressive, and even fashionable. Her idea of having a “wardrobe of legs” completely changes the question from “what is missing?” to “what can a leg do, and what can it be?” as she says. I also think that she is such a strong example of how design can challenge the way society thinks.

Reading reflection

I think this article is really interesting because it shows disability design in a different way. Before reading it, I thought disability design was mostly about helping people move, hear, see, or live more easily. But this article shows that design is also about beauty, identity, and confidence.

One idea I found important is that disabled people should not always have to hide their disability. For example, glasses used to be seen as medical objects, but now many people wear them as fashion. So I wonder why hearing aids, prosthetic legs, or wheelchairs cannot also be designed in a stylish way.

This article also makes me think that “normal” should not always be the final goal. A good design does not need to make everyone look the same. Instead, it can help people feel comfortable and proud of who they are.

My question is: how can designers create products that are both useful and beautiful, while also giving disabled people the choice of how they want to present themselves?

Final Project Process

Finalized Concept

My concept will still follow the opera idea I initially had, with a few tweaks. The user will be a composer trying to compose the required music; if the user composes the wrong pitch, an LED will flicker and the buzzer will make a sound.

I originally had the idea of using the composer’s batons, but then I realized that would be extremely difficult for me to map out, so I decided to settle on one glove. The user moves the glove, and that is how the music is composed.

Arduino:

I will map the glove’s x and y positions to octave and note. The Arduino will read this data from an accelerometer on the glove and send it to p5, which will translate it into audible music.
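As a rough sketch of that mapping (the note set, octave range, and the assumption that the raw accelerometer values arrive already scaled to roughly -1..1 per axis are all mine, not decided yet): tilt on one axis picks the note, tilt on the other picks the octave.

```javascript
// Hypothetical glove mapping: x tilt selects one of 7 natural notes,
// y tilt selects octave 3, 4, or 5. Inputs assumed scaled to -1..1.
const NOTES = ["C", "D", "E", "F", "G", "A", "B"];

function tiltToNote(x, y) {
  const clamp = (v) => Math.min(1, Math.max(-1, v));
  // Shift -1..1 into 0..1, then bucket into 7 notes / 3 octaves.
  const noteIndex = Math.min(6, Math.floor(((clamp(x) + 1) / 2) * 7));
  const octave = 3 + Math.min(2, Math.floor(((clamp(y) + 1) / 2) * 3));
  return NOTES[noteIndex] + octave;
}
```

p5 could then compare the returned note name against the expected one in the score and tell the Arduino whether to flicker the LED.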

p5

p5 will run the UI and send information to the Arduino on whether the right note is being played, which will then cause the LED to flicker. I was also thinking of adding ballet dancers who fall out of their dance if the music is composed incorrectly.

STIPEND USAGE:

I will be using the stipend to buy this ADXL345 (GY-291) accelerometer module from Amazon:

https://www.amazon.ae/AOICRIE-GY-291-ADXL345-Acceleration-Transmission/dp/B09S633YTW/ref=sr_1_1?crid=15F0J7UIXQK35&dib=eyJ2IjoiMSJ9.gDQl-_42Yd4esBPagpxPgPQ1K0jRtEl1-iW409PI-IfKNZ2RDA0hRICe1Ca5CQO83N_NFecaKnxd0M0wb-MUX1e1_oV4-HaxCe8KarHRjisgkRf5YXzLzubvBosNxFh3jF1fxCOIuON14P6KvVqARPChru9DCNpV5LlMncvQg8Ro-_7YjQ8SDpSgTryUpC0bzRT_iYJgE7TdhPJPFwpEfobjkqCHFv6UofjnQWaQvz2MidmSXbQrPRS50HEMsv52y0cumuy0z_MNJD41on7dsMC64UjQqNZ6qDc8AXm-5ME.Jzj2a_U_k_AoU6BZxRHVxhwAFpV0Zcu0PJnDUZjpUM0&dib_tag=se&keywords=GY-291+3-Axis+Accelerometer+ADXL345&qid=1777274584&sprefix=gy-2913-axis+accelerometer+adxl345%2Caps%2C261&sr=8-1 

I also consulted ChatGPT to assess the difficulty of building my project, and it told me it’s possible but will be a bit tricky. I’ve also used a previous student’s model (Shota Matsumoto’s) to understand how to use the accelerometer and map it to my p5 sketch.