Week 11 — Reading Response

Reading about the intersection between design and disability really got me thinking about many facets of design that I often ignore in my day-to-day life as an able-bodied person. For example, when I purchase makeup, say a face cream, I look at the lid or container and judge whether it seems easy to pack, carry, and open (while also being hygienic). Recently, though, I saw a video about how a Rare Beauty blush lid is disability-friendly: its circular grip makes it much easier to open.

Similarly, the reading talks about the trickle-down effect, where designs are expected to eventually find their way to disabled people in a form that is accessible to them. It then mentions that the pipeline from the splint (a medical device) to chair furniture challenges this idea. Usually we think innovation flows from mainstream products down to specialized disability products, like how smartphone tech might eventually make its way into accessible devices years later, though usually in watered-down versions. Although I understand that this is also a matter of making a profit, the reading suggests this is actually a missed opportunity, since looking at things from a different angle can also birth innovation for the masses in unexpected ways.

This makes me wonder whether we approach software design in a very constrained way. For instance, we’re often taught to build for the “average user” first, then maybe add accessibility features later if we have time (which, let’s be honest, most student projects never get to). But what if we started with accessible design principles from the beginning? Would we end up with more innovative solutions that actually work better for everyone? That’s the question I was left with, and I still don’t have an answer to it.

Week 11 — Homework

Before diving into the first task, we began by loading the sample Arduino and p5.js code from the previous class. We then read through each line to see how Arduino connects and communicates with p5. This served as a helpful foundation to jumpstart our workflow.

Task 1:

After reviewing the code and identifying the necessary components, we built the circuit using a sliding potentiometer. Using analogRead() on pin A0, we captured the potentiometer’s value and sent it to p5. The readings ranged from 0 to 900, so we divided them by 2.25 to map them onto the x-position of the 400-pixel-wide canvas, ensuring smooth and accurate movement. A global variable ‘pos’ is updated with this mapped value and used as the x position of the ellipse.

Here is the p5.js code:

let pos = 0;
function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(220);
  ellipse(pos,200,100,100);
}

function keyPressed() {
  if (key == " ") {
    setUpSerial(); // setting up connection between arduino and p5.js
  }
}

function readSerial(data) {

  if (data != null) {
    let fromArduino = trim(data);
    pos = fromArduino / 2.25; // map the 0–900 sensor range onto the 400-pixel-wide canvas

    let sendToArduino = "\n"; // nothing to send, but reply so the Arduino's handshake completes
    writeSerial(sendToArduino);
  }
}

and the Arduino code:

int sensor = A0;

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);

  // Blink it so we can check the wiring
  digitalWrite(LED_BUILTIN, HIGH);
  delay(200);
  digitalWrite(LED_BUILTIN, LOW);

  // starting the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  digitalWrite(LED_BUILTIN, LOW);
  int sensor = analogRead(A0);
  Serial.println(sensor); // sending sensor information to p5.js
  
}

Here’s the video of it in action:

https://drive.google.com/file/d/1kT32H353kkMX_5HeKBphHf4Cxy-xhMF_/view?usp=sharing

 

Task 2:

We decided to create an input box where, if the user enters a number between 0 and 255 and presses enter, the corresponding brightness is reflected on the blue LED on the breadboard. It was a relatively simple implementation that required minimal code changes.

Here’s the p5.js code:

let ledval = 0;
let input;

function setup() {
  createCanvas(400, 400);
  input = createInput('');
  input.position(120, 100);
}

function draw() {
  background(220);
}

function keyPressed() {
  if (key == " ") {
    setUpSerial(); // setting up connection
  }
}

function readSerial(data) {
  if (data != null) {
    let fromArduino = trim(data);
    let sendToArduino = input.value() + "\n";  
    writeSerial(sendToArduino);
  }
}

and the Arduino code:

int led = 3;

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(led, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(led, HIGH);
  delay(200);
  digitalWrite(led, LOW);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int ledVal = Serial.parseInt();
    if (Serial.read() == '\n') {
      analogWrite(led, ledVal);
      delay(5);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
  
}

and finally, the video of it in action:

https://drive.google.com/file/d/1eMi1d_3H6abYxtYwyEpybCnZB7-fTVXF/view?usp=sharing

Task 3:

For the last task, we first opened up and examined the provided gravity wind example code. We identified two key things we could alter to complete the task at hand: the “wind.x” variable and the “(position.y > height-mass/2)” IF statement. We could map the analog value read from pin A0 to wind.x to move the ball along the x axis, and since the aforementioned IF statement fires when the ball touches the ground, we could simply sneak in a line there that sets a boolean flag to true, send that flag to the Arduino, and perform a digitalWrite with it (replacing the earlier analogWrite from the input-box task).

Here’s how we did it in p5.js:

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;
let floor = false;

function setup() {
  createCanvas(640, 360);
  noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  wind = createVector(0,0);
}

function draw() {
  background(255);
  applyForce(wind);
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x,position.y,mass,mass);
  if (position.y > height-mass/2) {
      velocity.y *= -0.9;  // A little dampening when hitting the bottom
      position.y = height-mass/2;
      floor = true; // light up the LED!
    }
  
}

function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed(){
  if (key == " ") {
    setUpSerial(); // setting up serial connection
  }
  
  if (keyCode==LEFT_ARROW){
    wind.x=-1;
  }
  if (keyCode==RIGHT_ARROW){
    wind.x=1;
  }
  if (key=='s'){ // changed from space to 's' since SPACEBAR is used to initiate serial connection pairing to arduino
    mass=random(15,80);
    position.y=-mass;
    velocity.mult(0);
  }
}

function readSerial(data) {
  if (data != null) {
    let fromArduino = trim(data);
    wind.x = map(fromArduino, 0, 912, -2, 2); // mapping sensor's analog value to ball's wind x axis value

    let sendToArduino = Number(floor) + "\n";
    
    writeSerial(sendToArduino);
    floor = false; // turning off blue LED
  }
}

*We used the Number() function to convert the boolean flag to an integer, since we initially ran into issues where it was not actually being sent as a numeric value that could turn on the LED via digitalWrite.

and the Arduino code:

int sensor = A0;
int led = 3;

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(led, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(led, HIGH);
  delay(200);
  digitalWrite(led, LOW);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int ledVal = Serial.parseInt();
    if (Serial.read() == '\n') {
      digitalWrite(led, ledVal);
      delay(5);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
  int sensor = analogRead(A0);
  Serial.println(sensor);
  
}

Finally, here’s the video of the final product (we have two videos, to demonstrate both the analog and digital capabilities):
1. https://drive.google.com/file/d/1TcwYwz7HcyUobzH0MwLQ3P1pbf2rw8BR/view?usp=sharing

2. https://drive.google.com/file/d/1Ydz9OjHuqt8VPypLTQhBYDtB-ShDBGmk/view?usp=sharing

VOILÀ!

Week 11: Reading Response

I really appreciated this reading, the way the author juxtaposes design with disability and underscores the problems faced by people with disabilities due to, at times, arbitrary design constraints. For instance, the discussion of hearing was a bit shocking, especially the point that hearing aids could be significantly better for the people wearing them if they could just be a bit larger.

This phenomenon underscored a couple of problems for me. The first is that, as a society, we have arguably stigmatized the need for hearing aids, so smaller is treated as better: they should be subtle and unnoticeable at all costs, even at the cost of their primary purpose. At that point, hearing aids are prioritizing the preferences of broader society over the needs of the person who is hard of hearing.

As the article later explores how to “keep the design in design for disability,” I think it’s crucial that these engineering teams be made up of diverse groups of people who can provide additional perspectives and ensure that products are usable by a broader range of people. I also appreciate how it emphasizes that tools designed for those with disabilities are in no way less in need of design than products for those without disabilities.

Final Project Idea

For the final project, I am leaning towards creating an interactive experience/artwork rather than a game. The user would interact with the physical components on the Arduino, which would trigger changes/feedback in the p5.js sketch. I want my project to have a coherent theme that ties the physical and visual components together, and though I haven’t settled on this idea completely, this is what I’m currently thinking:

    • theme: human connection / friendship / bonding
      • 2 users each put one hand on a surface that measures temperature (temperature sensor) and use the other hand to hold each other’s hand, with an object containing a force-sensing resistor (FSR) in between
      • the warmth of the hands is used to change the color of the visualization, and the 2 temperatures make the colors lerp
      • how strongly the 2 are holding hands, measured by the FSR, changes the stroke width of the illustration (tbd) or how big/small it is, etc.
      • the resulting sketch would represent the 2 users’ connection; their body warmth dictates the color changes and the illustration dynamically changes depending on how strongly the 2 are holding hands (see the rough Arduino sketch after this list)
    • resources: temperature sensor, force-sensing resistor bundled up inside a flat/thin object that 2 people can hold between their palms while holding hands
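
Since I haven’t built anything yet, here is only a rough sketch of what the Arduino side could look like. Everything in it is an assumption rather than a decision: the analog pins, the choice of two analog temperature sensors, and the plain comma-separated message format. The idea is simply that the board streams the two warmth readings and the force reading to p5.js, where the two temperatures would drive lerpColor() and the force value would drive strokeWeight().

// Hypothetical wiring: two analog temperature sensors on A0 and A1,
// and a force-sensing resistor (with a pulldown resistor) on A2.
const int tempPinA = A0;
const int tempPinB = A1;
const int fsrPin   = A2;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int tempA = analogRead(tempPinA);  // raw warmth reading, user 1
  int tempB = analogRead(tempPinB);  // raw warmth reading, user 2
  int force = analogRead(fsrPin);    // how firmly the two hands are squeezing

  // Send one "tempA,tempB,force" line per reading for p5.js to parse
  Serial.print(tempA);
  Serial.print(",");
  Serial.print(tempB);
  Serial.print(",");
  Serial.println(force);

  delay(50);  // ~20 updates per second is plenty for a slow, calm visual
}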

I want the p5.js visuals to be minimalistic and calming, with a mild/muted color palette.

[Image: 12 Seasons in Colour Analysis, Examples of Colour | Roberta Lee – The Sustainable Stylist]

I am currently looking for inspiration for the actual illustration/art in p5.js, and here are some I really liked after searching the web and revisiting some of our past class readings:

[Image: Generating Art — d3.js vs. p5.js | playgrdstar, creative coding space, Medium]
[Image: 12 - Perlin Noise Wave | vorillaz.com]

Reading Reflection – Week 11

Design Meets Disability

This reading was probably my favorite out of all the pieces we’ve read so far. It shifted how I thought about design and its relationship to disability; before this, I had mostly seen assistive devices as purely functional tools, but Pullin’s writing made me realize that this really isn’t the case. He reframes assistive devices (e.g., hearing aids, prosthetic hands and legs, wheelchairs) as opportunities for self-expression and avenues for celebrating individuality, much like how eyeglasses have evolved into fashion statements. So much of design for disability has been shaped by a desire to hide and minimize differences and has prioritized functionality over aesthetics. It has also often been suboptimally served by universal design, where a product is overburdened with mediocre-at-best features that attempt to accommodate everyone’s needs, a concept perfectly summarized by the expression “flying submarine” in the piece. To bridge this disconnect between design and disability, the author argues that disabled people should participate in the design process as collaborators alongside engineers, designers, and medical practitioners, to realize truly inclusive design for assistive devices that balances function with aesthetics and embraces individuality over invisibility. The idea that inclusion is not just about access and compliance, but also deeply involves choice and self-expression, is one that I will remember and apply moving forward.

Week 10 – Reading Response

A Brief Rant on the Future of Interaction Design

I feel like this piece hit hard, not because it presented some radical concept I’d never considered, but because it made me realize how “numb” we’ve become to bad interaction design. The author’s rant about Pictures Under Glass being a transitional technology really stuck with me; I hadn’t consciously thought about how much feedback we’re missing when we interact with screens, but now I can’t unsee it. It’s like we’ve accepted a flattened version of reality just because it looks sleek and futuristic. The author’s praise for the hands also reminded me of how intuitive and rich real-world interactions are; simple examples like turning the pages of a book or making a sandwich made it feel so obvious. Our fingers do a million things completely automatically, and we’ve built an entire tech world that barely acknowledges that. The ending made me feel slightly hopeful, though: I love the idea that the future is a choice, and that inspired people, like us students, can help push interaction design in new directions.

As for the second reading, which was a response to the first, what I took away is that if we don’t push for more sensory-rich interaction design, we risk narrowing our creative and cognitive possibilities. I felt there was a subtle warning telling designers and everyone else not to let convenience trap us in shallow interaction, as we deserve better tools that can really challenge and extend the full capabilities of our bodies and minds.

Week 10 – Reading Response

Through the reading “A Brief Rant on the Future of Interaction Design,” I was convinced that the future of interaction should not be a single finger and should have good feedback, and I think big innovations need to raise the usefulness of current things. It is problematic that most Future Interaction Concepts completely ignore parts of the body we rely on. If Pictures Under Glass had been invented before the iPad or touch-screen phones existed, then perhaps it would be considered a good advancement (one that still needs further improvement in interaction). Pictures Under Glass looks similar to the current phone or iPad, and I found myself imagining a different way of seeing and interacting with the things on a phone or iPad in the future, like content you can swipe and rearrange up in the air (is this a good thing or not?). While watching the video “Microsoft – Productivity Future Vision,” I felt I was looking for something useful, something more. For example, the translation using what you see through your glasses looks very useful. If you could follow your recipes while reading what you need up in the air, that could be useful because you would just need to look up from the food. If things look futuristic simply for the sake of looking futuristic, is that really progress, or moving backward?

However, I do disagree with one definition used for a tool (“A tool addresses human needs by amplifying human capabilities”), as I think a tool can address human needs in another way, such as by opening up new possibilities for humans rather than amplifying existing capabilities. Another definition of a ‘tool’ strikes me well: “That is, a tool converts what we can do into what we want to do. A great tool is designed to fit both sides.” I had never thought of it that way, and I sure do agree with this one.

As for the second reading, I was reminded of my experience with virtual reality, when I tried to touch objects I saw in the air but felt no feedback. I was still amazed by what I experienced, but imagine if two people were learning fencing in a virtual world without feeling the weight of the weapon and its impact on the opponent’s weapon; I really don’t think that would work. Also, while virtual reality is cool, like the author I have a problem with a future where people can and will spend their lives completely immobile because they spend all their time on computers that are not part of the physical environment. That is unhealthy, and these inventions should be used to help people and encourage them to take good actions.

Week 9 – Reading Response

Physical Computing’s Greatest Hits (and misses)

I used to believe that something could only be considered unique if it was entirely new or never seen before. That made me overlook the value of familiarity. But now, I’ve come to understand that familiarity isn’t a limitation—it can actually deepen meaning and make an idea more impactful. From reading about the musical instrument examples and others too, I realized how something as common as an instrument can still be a powerful space for creativity and interaction. It showed me how familiar forms can be reimagined in playful, thoughtful ways. This has shifted how I think about uniqueness, and I’m now considering using the concept of a musical instrument as inspiration for my final project—not because it’s entirely new, but because it holds potential for creative reinterpretation.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

This was such an eye-opening and exciting read—it completely shifted how I view and approach interactive art and media. Before engaging with the article, I believed it was essential not only to guide the audience on how to interact with my work but also to explain how they should feel while doing so. I struggled with finding the boundary between offering direction and allowing personal interpretation. Now, I understand that interactive art is less about delivering a fixed message and more about creating an open-ended environment—one that invites the audience to explore, interpret, and respond in their own unique way. It’s not about scripting every possible outcome but rather about designing the space, the elements, and the cues in a way that encourages genuine engagement and discovery. Going forward, I aim to embrace this approach in my future projects. I want to focus more on creating spaces that speak for themselves—spaces that offer intuitive cues, invite interaction, and allow viewers to craft their own interpretations. I’m excited to stop over-explaining and start listening more, letting the dialogue between artwork and audience unfold naturally.

 

Week 10 – Assignment

Concept: Music Box

Our inspiration for this project was a music box, a pretty nostalgic instrument that plays melodies when you wind it up. Music boxes are usually small, mechanical devices that play a set of tunes with a simple winding mechanism and have a sweet, tinkling sound.

Our version of this is more interactive and customizable. Instead of just one melody, our “music box” allows users to choose between two pre-programmed songs: Twinkle Twinkle Little Star and Frère Jacques. In addition, they can increase or decrease the tempo of the song with a button and adjust the pitch of the melody with the potentiometer, and as the music plays, the LEDs flash and change, synchronized to the rhythm and notes.

Code Snippet/Difficulties

One aspect we had difficulties with was getting the LEDs to align with the music and the notes played; after a few tries, however, we arrived at this code. In summary, to get this “visualizer” effect, we used the flashVisualizer() function and called it inside the playSong() function, right after each note is played. The i variable, which is the index of the current note in the song, is passed to flashVisualizer(), so as the song progresses the i value increments, causing the 3 LEDs we used to cycle through in sequence. In general, every time a note is played, flashVisualizer() is called, which flashes an LED matching the timing of the note. The flashing LED thus visualizes the rhythm of the music, and since the song is made up of an array of notes, the LEDs change with each note.

// Function to play the melody
void playSong(int *melody, int *durations, int length) {
  for (int i = 0; i < length; i++) {
    int baseNote = melody[i];
    int noteDuration = tempo * (4.0 / durations[i]);  // Calculate note duration

    // Read potentiometer and calculate pitch adjustment
    int potVal = analogRead(potPin); // range: 0–1023
    float pitchFactor = 0.9 + ((float)potVal / 1023.0) * 0.4;  // pitch range: 0.9–1.3
    int adjustedNote = baseNote * pitchFactor;

    if (baseNote == 0) {
      noTone(buzzer); // Pause
    } else {
      tone(buzzer, adjustedNote, noteDuration); // Play adjusted tone
      flashVisualizer(i); // Flash LEDs
    }

    delay(noteDuration * 1.3); 
  }

  // Turn off all LEDs after song ends
  digitalWrite(led1, LOW);
  digitalWrite(led2, LOW);
  digitalWrite(led3, LOW);
}

// LED Visualizer Function
void flashVisualizer(int index) {
  // Turn off all LEDs first
  digitalWrite(led1, LOW);
  digitalWrite(led2, LOW);
  digitalWrite(led3, LOW);

  // Turn on one LED based on index
  if (index % 3 == 0) digitalWrite(led1, HIGH);
  else if (index % 3 == 1) digitalWrite(led2, HIGH);
  else digitalWrite(led3, HIGH);
}
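
The snippet above only covers the pitch potentiometer and the LED visualizer; the song-selection and tempo buttons mentioned in the concept aren’t shown in it. As a rough illustration of how that part could work (the pin numbers, variable names, and tempo steps here are placeholders, not our exact wiring), the button handling might look something like this:

const int songButtonPin  = 4;   // hypothetical pin for the song-select button
const int tempoButtonPin = 5;   // hypothetical pin for the tempo button

int currentSong = 0;            // 0 = Twinkle Twinkle, 1 = Frère Jacques
int tempo = 400;                // base length (ms) of a quarter note

void setup() {
  pinMode(songButtonPin, INPUT_PULLUP);
  pinMode(tempoButtonPin, INPUT_PULLUP);
}

void loop() {
  // Buttons are wired to ground, so LOW means pressed
  if (digitalRead(songButtonPin) == LOW) {
    currentSong = 1 - currentSong;   // flip between the two songs
    delay(250);                      // crude debounce
  }
  if (digitalRead(tempoButtonPin) == LOW) {
    tempo -= 50;                     // speed the song up one step...
    if (tempo < 150) tempo = 600;    // ...and wrap back around to the slowest setting
    delay(250);
  }
  // playSong() would then be called with the melody and duration arrays
  // for currentSong, using this shared tempo value.
}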

Reflections/Improvements

Overall, this project was very fun and engaging, and we are very happy with how it turned out as we were able to implement most of the ideas we brainstormed. That said, there are a few things we’d improve. For one, we could expand the number of songs it can play. Also, the current LED visualizer is pretty simple and linear, so adding more LEDs or creating more complex patterns based on pitch, tempo, or things like that, could make it feel more like a true light show.