Reading Reflection — Week 11

After reading Design Meets Disability, I noticed how it sheds light on an ongoing issue with how society views disabilities. The author explains that when design and disability meet, the outcome is often intricate and interesting: it’s an opportunity for designers to exercise both creativity and functional skill.

I loved the section where he criticizes the convention of designing many assistive technology (AT) devices, like hearing aids and prosthetics, to be hidden. I also believe that disabilities should not be shamed, but embraced. By prioritizing the concealment of these devices, we make disabled individuals feel like outsiders rather than empowering them. By making AT more visible and beautifully designed, I believe designers can help change how people perceive disabilities.

The idea of bringing engineers and designers together (‘solving’ and ‘exploring’) to create AT devices is brilliant, because the result will be assistive technologies that are both useful and inspiring. I also think designers will be very useful in making these devices comfortable, since users wear or use them all day (sometimes all night).

I absolutely agree with the push for universal design, meaning designing products that are usable by as many people as possible, which of course includes people with disabilities. The author suggests that rather than making a device for one very specific disability, we could make devices simple enough to be broadly adapted yet smart enough to serve people with different needs. Overall, I believe this approach to design makes devices more accessible and more human.

Week 11 – Group Exercises

Group Members: Joy, Yiyang

(Arduino code is included as comments at the bottom of each p5 sketch)

1. Make something that uses only one sensor on the Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen; nothing on the Arduino is controlled by p5.

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/63Yg60k8D
Video Demonstration: IMG_9638
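The core of exercise 1 can be sketched as a simple mapping from the sensor reading to an x position. This is my own minimal sketch of the logic, assuming a standard 0–1023 analog reading and a canvas width of your choosing (the actual sketch is at the link above):

```javascript
// mapRange mirrors p5's map(); sensorToX converts one analog reading
// (0–1023) into the ellipse's x coordinate. Canvas width is an assumption.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

function sensorToX(sensorValue, canvasWidth) {
  // Clamp so noisy serial readings can't push the ellipse off-screen
  const clamped = Math.min(1023, Math.max(0, sensorValue));
  return mapRange(clamped, 0, 1023, 0, canvasWidth);
}
```

In the real sketch, the ellipse would be drawn at `(sensorToX(latestReading, width), height / 2)` each frame, keeping it vertically centered.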

2. Make something that controls the LED brightness from p5

p5 & Arduino Code: https://editor.p5js.org/yiyang/sketches/dtftbIzaK
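For exercise 2, the essential step is scaling a p5 value (here I assume the mouse's x position) down to a single 0–255 byte that the Arduino can pass to `analogWrite`. A hypothetical sketch of that conversion:

```javascript
// Scale mouseX (0..canvasWidth) to a PWM brightness byte (0..255).
// The choice of mouseX as the control input is my assumption.
function mouseToBrightness(mouseX, canvasWidth) {
  const clamped = Math.min(canvasWidth, Math.max(0, mouseX));
  return Math.round((clamped / canvasWidth) * 255);
}
```

The resulting byte would be written over serial each frame; on the Arduino side, `analogWrite(ledPin, value)` sets the LED brightness.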

3. Take the gravity-wind example and make it so that every time the ball bounces, one LED lights up and then turns off, and you can control the wind from one analog sensor.

p5 & Arduino Code: https://editor.p5js.org/joyzheng/sketches/v77Sd41K4
Video Demonstration: IMG_9640
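The bounce detection at the heart of exercise 3 can be sketched as follows. This is a simplified stand-in for the gravity-wind example's physics, with my own damping factor; the point is that a bounce is the frame where the ball hits the floor while moving downward, which is when a message would be sent to the Arduino to flash the LED:

```javascript
// Returns true on the frame the ball bounces. On a bounce the velocity
// reverses with damping (the 0.9 factor is an assumption) and, in the
// real sketch, a "1" would be written over serial to light the LED.
function stepBall(ball, floorY) {
  if (ball.y >= floorY && ball.vy > 0) {
    ball.vy *= -0.9; // reverse and damp
    ball.y = floorY; // snap back onto the floor
    return true;     // serial write to flash the LED would go here
  }
  return false;
}
```

The wind simply adds a small sensor-scaled value to the ball's horizontal velocity each frame, so the analog sensor steers the ball left or right between bounces.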

Week 11: final project idea

For my final project, I plan to build an interactive installation called “Discover Nepal: Interactive Cultural Storyboard.” The goal is to create a small tabletop experience that teaches people about Nepali culture, history, geography, and fun facts in a playful, visual way. The project will use both Arduino and p5.js and will function like a tiny museum exhibit about Nepal.

Physically, I will create a flat board or “map” of Nepal with five labeled zones: Mountains, Festivals, Music & Dance, Food, and History & Fun Facts. Each zone will be connected to an Arduino input using either pushbuttons or simple touch pads made from aluminum foil. I will also add a single knob (potentiometer) that lets the user scroll through multiple facts within a category. When the user touches or presses one of the zones, the Arduino will detect which category was selected and read the knob position, then send this information as serial data to a p5.js sketch running on my laptop.

In the p5 sketch, I will draw a stylized interface representing Nepal, with icons or scenes for each category. Based on the messages from the Arduino, p5 will highlight the active category, display a short piece of text (such as a fun fact about the Himalayas, a description of Dashain or Tihar, information about Nepali instruments or foods like momo and dal bhat, or a historical event), and play a short sound that matches the theme (wind in the mountains, festival bells, a madal rhythm, etc.). Turning the physical knob will move through different facts inside that category, so the user can explore more than one piece of information in each area.
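One way to sketch the serial protocol described above: the Arduino could send lines like `"2,512"` (zone index, then knob reading), and p5 would parse each line into a category and a fact index. The message format, field order, and fact-index mapping here are all my assumptions, not a finished spec:

```javascript
// Category names taken from the five zones described above.
const CATEGORIES = ["Mountains", "Festivals", "Music & Dance", "Food", "History & Fun Facts"];

// Parse one serial line "zone,knob" into { category, factIndex }.
function parseMessage(line, factsPerCategory) {
  const [zoneStr, knobStr] = line.trim().split(",");
  const zone = parseInt(zoneStr, 10);
  const knob = parseInt(knobStr, 10);
  if (Number.isNaN(zone) || Number.isNaN(knob) || zone < 0 || zone >= CATEGORIES.length) {
    return null; // ignore malformed lines
  }
  // Divide the 0–1023 knob range into equal bands, one per fact
  const factIndex = Math.min(factsPerCategory - 1, Math.floor((knob / 1024) * factsPerCategory));
  return { category: CATEGORIES[zone], factIndex };
}
```

Dividing the knob range into bands means turning the physical knob steps smoothly through the facts without any extra state on the Arduino side.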

This project satisfies the course requirements because it is a physically interactive system that relies on a multimedia computer for processing and display. The Arduino “listens” to the user’s physical actions on the board (pressing zones and turning the knob), and the p5.js program “thinks” by interpreting those inputs and deciding what cultural content to show. The system “speaks” back to the user through animated graphics, text, and sound. My intention is to make the interaction feel like a fun, hands-on way to discover Nepal’s culture and history, rather than just reading a static page of information.

Final Idea

I want to create an immersive experience of space. Users come to an empty canvas and use pressure sensors to add planets, meteors, and stars. Once a minimum of five stars has been added, a constellation is created, and it can grow as the user adds more stars. These objects will float around or soar by (the meteors), so it feels like you’re in space. I want to use a projector to project the p5 screen so users can really feel the grandeur of it.

The Interaction

I will have eight stepping stones on the floor; when the user steps on them, they do different things:

Star Makers (Stones 1, 4, 6):

  • Quick tap: Adds one small white star to that stone’s constellation
  • Each stone creates its own independent star collection
  • When 5 stars are placed, they automatically connect with glowing lines to form a constellation
  • Can expand up to 9 stars, making the constellation more intricate and complex
  • Three unique constellations can exist simultaneously in different regions of the sky

Planet Makers (Stones 2, 5, 7):

  • Hold for 3+ seconds: Materializes a planet in space
  • Random planet type appears: gas giants, ringed planets, rocky worlds, ice planets, or mysterious alien worlds
  • Planets drift randomly through space, floating past your constellations
  • Creates a living, breathing solar system

Meteor Makers (Stones 3, 8):

  • Quick tap: Shoots one meteor across the sky in a random direction
  • Hold for 3+ seconds: Triggers a meteor shower with 5 meteors streaking across space
  • Adds dynamic movement and excitement to the scene

Atmosphere Control (Potentiometer dial):

  • Physical dial near the stepping area
  • Controls both the visual intensity and audio volume
  • Low: Clear, minimal space with silence
  • High: Rich nebula clouds, cosmic dust, and immersive ambient soundscape
  • Gives users creative control over the mood of their universe

The Experience

Users approach a darkened floor projection showing empty space. As they explore the stepping stones, they discover they can build their own universe, star by star, constellation by constellation. The moment when 5 stars connect into a glowing constellation creates a magical sense of achievement. Planets drift through the creations, meteors streak across the sky, and the atmosphere can be dialed from stark and minimal to rich and dramatic.
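The constellation rule above (connect at 5 stars, grow to 9) can be sketched as follows. The connection scheme, a simple chain joining stars in placement order, is my own assumption; any connection pattern that reads as a constellation would do:

```javascript
// Each Star Maker stone keeps its own star list. Once MIN_STARS exist,
// consecutive stars are joined by glowing lines; the chain grows with
// each new star up to MAX_STARS.
const MIN_STARS = 5;
const MAX_STARS = 9;

function addStar(stars, x, y) {
  if (stars.length < MAX_STARS) stars.push({ x, y }); // cap at 9 stars
  return stars;
}

function constellationLines(stars) {
  if (stars.length < MIN_STARS) return []; // not yet a constellation
  const lines = [];
  for (let i = 0; i < stars.length - 1; i++) {
    lines.push([stars[i], stars[i + 1]]); // connect consecutive stars
  }
  return lines;
}
```

Keeping one independent star array per stone gives the three simultaneous constellations described above for stones 1, 4, and 6.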

Week 11 – Final Project Prompt

My final project is an interactive installation titled “The Snail’s Journey to School.” The idea is inspired by the opening scenes of Monsters University, where a small character makes their way toward school. In my version, the user helps a robot snail travel through a physical obstacle course until it reaches its destination.

Arduino will control the physical movement of the robot snail using motors and buttons for directional input. It may also include an optional sensor, such as a photoresistor or distance sensor, to detect obstacles or the finish line. The snail robot and the obstacle course will be fabricated by me, either through 3D printing or hand-built materials.

p5.js will act as the narrative and feedback layer. It will display an introduction, instructions, and story elements that respond to the user’s physical interactions. As the user presses buttons to move the snail, Arduino will send messages to p5.js, which will update the visuals, play small animations or sounds, and react to progress in real time. When the snail reaches the “school” area, p5 will display the final scene.

The interaction loop centers on listening (reading the user’s button presses), thinking (moving the snail and sending corresponding data), and speaking (p5.js reacting immediately with feedback).
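The "thinking" step could be sketched as a small message dispatcher on the p5 side. The one-word message vocabulary here ("left", "right", "forward", "finish") is purely my assumption for illustration:

```javascript
// React to one serial message from the Arduino, returning updated
// story state. Unknown messages are ignored rather than crashing.
function reactToSnail(message, state) {
  switch (message) {
    case "left":
    case "right":
    case "forward":
      return { ...state, lastMove: message, steps: state.steps + 1 };
    case "finish":
      return { ...state, atSchool: true }; // trigger the final p5 scene
    default:
      return state;
  }
}
```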

Week 11 Reading Response

Reading Design Meets Disability made me think about how deeply design shapes the way we see ability and difference. I liked how the book challenged the traditional idea that design for disability should be purely functional. When I saw examples like the stylish hearing aids or the eyeglasses that evolved from medical tools to fashion accessories, I realized how design doesn’t just solve problems—it tells stories about people and identity. I found myself really appreciating how the author described the tension between discretion and expression, especially in how disability devices can either hide or celebrate difference. It made me think about how design always carries a message, even when it pretends to be neutral.

What I liked most was the way the book connected design with emotion and culture. I loved seeing how something as simple as a prosthetic leg could become a work of art, or how Apple’s minimalist products fit into the same conversation about accessibility and beauty. The idea that “good design on any terms” can come from understanding human diversity really stayed with me. I felt inspired by the thought that inclusive design isn’t about charity but creativity—it’s about expanding what we consider beautiful and functional at the same time. This reading made me want to look more closely at everyday objects and think about the values they quietly express.

Week 11 – Reading Response

In Design Meets Disability, Graham Pullin challenges the way society frames disability by questioning why assistive devices are often treated as purely functional rather than expressive. He argues that design for disability should not be limited to medical necessity; it should also include aesthetics, identity, and personal preference. What stood out to me is how Pullin highlights the quiet power imbalance in design: mainstream objects like glasses or smartphones have endless variations and styles, while many assistive tools remain clinical and uniform. This difference reveals how disability is still seen as something to “fix” instead of a natural part of human diversity.

Pullin pushes the reader to consider that assistive devices could be opportunities for creativity rather than reminders of limitation. For example, he discusses the possibility of hearing aids becoming fashionable accessories instead of devices people feel pressured to hide. His argument reframes disability not as a deficit but as a design space full of potential innovation.

Overall, the reading invites designers to rethink their assumptions. Instead of designing for disabled people, Pullin encourages designing with them, treating disability as a source of insight and richness. The book ultimately suggests that inclusive design is not just ethical; it also expands the possibilities of design itself.

Lily pad

For this project, I wanted to recreate a peaceful scene of a frog sitting on a lily pad in a pond. Since I hadn’t worked much with sensors yet, I thought this would be the opportunity to incorporate one. I decided to use a distance sensor to control the ripples in the water: the closer your hand gets to the sensor, the more frequently the frog hops and creates ripples.

The Arduino code simply measures how far away your hand is from the sensor and sends that data to the p5.js code. It measures distance by sending an ultrasonic pulse and timing how long it takes to bounce back, then converts that time into centimeters using the speed of sound (0.034 cm/microsecond), divided by 2 since the sound travels to the object and back.

The p5.js code then uses that distance data to dictate how often the ripples should occur. The p5 code also hosts the visuals/art element of the project.
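The two conversions can be sketched in JavaScript. The microseconds-to-centimeters math mirrors the Arduino code at the end of this post; the distance-to-ripple-interval mapping is my own guess at reasonable numbers, not the values in the actual sketch:

```javascript
// Ultrasonic pulse round-trip time (microseconds) to distance (cm),
// using the speed of sound, 0.034 cm/us, halved for the round trip.
function pulseToCm(durationMicros) {
  return (durationMicros * 0.034) / 2;
}

// Closer hand -> shorter interval -> more frequent hops/ripples.
// The 5-50 cm range and 40 ms/cm scale are illustrative assumptions.
function rippleIntervalMs(distanceCm) {
  const clamped = Math.min(50, Math.max(5, distanceCm));
  return clamped * 40; // e.g. 5 cm -> 200 ms, 50 cm -> 2000 ms
}
```

For example, a 582-microsecond echo works out to about 9.9 cm, which would put the frog in a fairly rapid hopping rhythm.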

Next time, I think it would be fun to use a pressure sensor and have users actually jump up and down at different strengths and show that on p5.js.

// ARDUINO CODE
const int trigPin = 9;
const int echoPin = 10;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // measure distance
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2;
  
  // send distance to p5.js
  Serial.println(distance);
  
  delay(50);
}

IMG_2539

Week 11 – Reading Response

Design Meets Disability treats constraint as more than a deficit. When a flow fails under limited vision or one-hand use, the failure is a measurement of hidden demand (e.g., precision, memory, timing). Fixing those loads tends to make the product faster and safer for everyone.

The shift from accommodation to inclusive design is a key change in the design workflow. An add-on ramp or a screen reader retrofit treats accessibility as a patch. Building large hit targets, consistent structure, multimodal signals, and undo by default treats accessibility as a core requirement. The second approach reduces support costs and broadens use. Captions prove the point, because they serve deaf users first but also help viewers in noisy spaces, language learners, and anyone scanning a video on mute. Curb cuts started for wheelchairs, then made cities usable for strollers, carts, and bikes. The spillover is not an accident; it is the direct result of designing for constraints that generalize. As Charles Eames believed, “design depends largely on constraints.” I agree.

Inputs, perception, and cognition form a clear framework. Products often assume hands with fine motor control, perfect vision and hearing, and uninterrupted attention. Alternatives (e.g., switch controls, eye-tracking, high contrast, scalable text, haptics, chunked steps) lower friction and error rates across contexts. Voice control demonstrates its utility along this path: essential for limited mobility, valuable hands-free while cooking or driving. Predictive text began as assistive technology; now it is an everyday efficiency feature, especially in coding, where it quietly saves real effort. That is part of why tools like Cursor dominate.

I do have questions. Are there cases where inclusive design adds complexity that hurts clarity? For instance, voice, haptic, and visual signals can compete. The answer may be progressive disclosure: default to one strong signal, then let users layer or swap modes. But that raises another concern. How do we balance among options, and must we spend resources to develop all of them? How do teams budget for inclusive testing without turning it into a checkbox? We must tie success to task completion under constraint, not to the presence of features. If two flows pass under one-hand use, low contrast, and divided attention (or even narrower constraints for a specific use case), they are ready.

The actionable stance: Write constraints into requirements. Set minimum hit area sizes, enforce semantic structure, and require undo and confirmation for destructive actions. Prefer multimodal capability, but choose a clear default per context. Measure success as completion with minimal precision under time pressure.

Shahram Chaudhry – Final Project Brainstorm

I don’t know why I’m so obsessed with memories, even my midterm project was memory-themed. I guess that’s what happens when you don’t get to major in neuroscience but end up majoring in computer science instead.

For my final project, I want to create a physically interactive memory-sequence game that plays with the idea of “recovering a forgotten memory.”  I’ve always liked memory games, and I thought it would be interesting to turn that mechanic into a metaphor: every correct sequence helps restore a blurry image on the screen, as if the player is trying to remember something long lost.

The physical side of the project is intentionally minimal. I’m planning to use four LEDs (different colours) paired with four corresponding buttons, wired to an Arduino. The Arduino will flash sequences of LEDs, starting easy and growing in complexity, and the user has to repeat them by pressing the buttons in the same order. When the user gets a sequence right, the p5 interface will respond instantly by revealing more detail in the image, e.g., decreasing the blur. If they get it wrong, the image becomes more distorted, symbolizing the memory slipping further away. Only if the player successfully completes all three levels does the final clear image appear. Otherwise, the memory remains lost. I’m also considering having a different image each game, so even if the user replays the game, they can’t recover a memory they “failed,” reinforcing the idea that some memories can be lost forever. (Life is unfair, I know.)
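The core check, comparing the player's presses against the flashed sequence, could be sketched like this. The state shape (progress counter, wrong/complete flags) is my assumption about how I'd structure it:

```javascript
// sequence: array of LED indices (0-3) flashed by the Arduino.
// progress: how many presses the player has gotten right so far.
// button:   the index of the button just pressed.
function checkPress(sequence, progress, button) {
  if (button === sequence[progress]) {
    const next = progress + 1;
    return { progress: next, complete: next === sequence.length, wrong: false };
  }
  // Wrong press: reset progress; p5 would increase the blur here
  return { progress: 0, complete: false, wrong: true };
}
```

Checking one press at a time (rather than buffering the whole attempt) lets p5 react instantly, which fits the goal of smooth feedback.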

On the p5 side, I want to focus on smooth feedback and atmosphere. The screen will always show the partially recovered image, and p5 will handle visualization, sound feedback (a buzzer for wrong sequences), tracking correctness, and level progression. The project feels manageable for my current skill level, but I think it is still creative and expressive.