Week 11 Reading

This week’s reading gave me new perspectives on everyday objects. Pullin explores the space between function and feeling, noticing how assistive devices are tied to identity, pride, and possibility. For example, a hearing aid could become jewelry, and a prosthetic has the potential to become sculpture.

He inspired me to consider how disability can become a kind of creative catalyst. Instead of treating physical difference as something to hide or correct, he treats it as an invitation to rethink what objects can be. His approach reminded me how narrow my own assumptions have been, since I was heavily influenced by a world that often prioritizes discretion over expression.

Week 2 – When Planets Align

This post documents my digital artwork, “When Planets Align,” a looping animation about cosmic harmony and human potential. The artwork shows planets orbiting a sun, eventually falling into a rare, perfect alignment. The core message is simple: if planets can align, what can’t you achieve?

Concept

“When Planets Align” uses a celestial event as a metaphor for achieving the improbable. Eight planets dance around a sun in a rhythmic, constant motion. The key moment is their unlikely, simultaneous alignment—a fleeting instance of perfect order. This moment is meant to inspire viewers to reconsider their own perceived limitations.

Code Highlight

I’m particularly proud of the simple logic that governs the planetary alignment. By setting each planet’s speed as a multiple of a base speed, I could guarantee they would all align periodically.

JavaScript

const alignmentInterval = 600;
const baseSpeed = (Math.PI * 2) / alignmentInterval;

// Inside the planet creation loop
let harmonicSpeed = baseSpeed * (i + 1);
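Why this works: because each harmonicSpeed is an integer multiple of baseSpeed, after alignmentInterval frames planet i has turned through baseSpeed · (i + 1) · alignmentInterval = 2π(i + 1) radians, a whole number of revolutions, so all eight planets return to the same angle at once. A quick standalone sanity check of that arithmetic (plain JavaScript, runnable outside p5):

```javascript
// Standalone check: with harmonicSpeed = baseSpeed * (i + 1), planet i
// completes exactly (i + 1) full revolutions every alignmentInterval
// frames, so all eight planets realign together.
const alignmentInterval = 600;
const baseSpeed = (Math.PI * 2) / alignmentInterval;
const TWO_PI = Math.PI * 2;

for (let i = 0; i < 8; i++) {
  const angle = baseSpeed * (i + 1) * alignmentInterval; // angle after one cycle
  const offset = angle % TWO_PI; // ~0 (up to float error) means aligned
  console.log(`planet ${i}: ${Math.round(angle / TWO_PI)} revolutions, offset ${offset.toExponential(2)}`);
}
```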

To make the alignment impactful, the animation briefly pauses, allowing the viewer to absorb the moment before the cycle continues.

JavaScript

// Pause for 1 second on perfect alignment
if ((frameCount + startOffset) % alignmentInterval === 0) {
  noLoop();
  setTimeout(loop, 1000);
}

Embedded Sketch

Here is the embedded p5.js sketch of “When Planets Align.”

Reflection

Creating this piece was a fulfilling journey of turning a simple idea into a meaningful visual. “When Planets Align” is about the beautiful interplay between order and chaos. It’s a reminder that extraordinary moments can and do happen. My hope is that it inspires others to believe in their own potential to achieve the unlikely.

Week 11 – Reading Reflection

This reading was very insightful. I always viewed medical devices through a lens of pure function, so this focus on design and identity was a new perspective for me. I found the glasses example to be the most compelling. They successfully shed any stigma by embracing fashion. People now wear them as a style choice. This proves that a device for the body does not need to be invisible to be accepted.

The conflict between discretion and technology is a difficult one. The hearing aid case shows how the goal of hiding the device can actually limit its performance. I believe the author is correct. The problem is not the user’s desire to hide, but a lack of well-designed alternatives that people would feel confident displaying.

I also agree that the “one fits all” model is flawed. The book shows that preferences vary greatly, even among people with the same disability. Some may want a realistic prosthetic, while others prefer a sleek tool or even an artistic statement. A single design cannot meet all these needs.

The concepts of “appliances” and “platforms” were new to me. A simple, focused appliance like the iPod shuffle can be more inclusive and delightful than a complex platform trying to do everything. I was also struck by the note on dementia. Learning that it can involve a heightened artistic sense makes thoughtful design even more critical. Beautiful, intuitive objects can significantly improve a person’s daily experience.

In my opinion, the core idea is to invite different kinds of creators into this field. Fashion designers and artists can bring new values. They can help create devices that feel like a part of a person’s identity, not just a tool they have to use.

Week 10 Reading Reflection

Reading Bret Victor’s “A Brief Rant on the Future of Interaction Design” made me think differently about how I use technology. The idea that a tool has “two sides”—one that fits the person and one that fits the solution—really stuck with me. It made me realize that most of the digital tools I use today, like phones or tablets, seem to fit the solution well but not necessarily the person. They make things efficient, but they don’t always feel natural or engaging to use. It made me question whether convenience has come at the cost of a deeper, more human form of interaction.

The part about the hands also made me pause. Victor reminds us that our hands aren’t just for manipulation—they’re for feeling. That distinction hit home because I often forget how much information comes through touch. When I’m drawing, typing, or even cooking, my hands constantly sense texture, pressure, and movement. But when I’m on a screen, all that disappears. His point that “touch does the driving and vision helps from the backseat” made me see how backwards our relationship with technology has become. I depend entirely on my eyes, while my hands just tap glass.

I connected most with his frustration about “pictures under glass.” It made me realize how passive I’ve become when using digital devices—how I only swipe, scroll, or pinch, instead of actually making or feeling something. It reminded me how satisfying it is to sketch on paper or build something physical, where every motion feels responsive. Victor’s ideas made me want to look for tools and experiences that let my hands do more than just point. It’s a call to design and use technology that reconnects me to my body, not one that distances me from it.

Reflection

When I read “A Brief Rant on the Future of Interaction Design” by Bret Victor and its follow-up article, I began to think differently about how technology shapes the way we interact with the world. Victor argues that designers often focus too much on screens and touch gestures, forgetting that humans are physical beings with hands meant to explore, build, and create. He believes that the future of design should go beyond flat screens and should instead give people tools that let them use their bodies, senses, and creativity more naturally.

This idea really connected with me because, as someone interested in computer science and interactive media, I often think about how to make technology feel more human. I realized that many modern interfaces, such as phones, tablets, and laptops, limit our movements and creativity, even though they feel advanced. Victor’s point made me reflect on my own projects and how I can design in ways that allow people to move, touch, and engage with technology more freely.

The second article deepened this idea by exploring how designers and engineers are beginning to create more physical and immersive experiences. It reminded me that innovation isn’t just about new technology but about how it connects with human experience. Reading both pieces made me want to think more carefully about how design can make people feel present and creative, not just efficient.



Week 10 — Reading Response

Reading 1

I don’t completely agree with the author’s criticism that modern technologies like touchscreens and digital interfaces make interaction feel “glassy” and disconnected. While it’s true that they remove some tactile feedback, I think their purpose isn’t to replace physical touch but to make life more efficient and accessible. For example, instead of physically going to a post office to send a letter, I can send an email or message in seconds. This doesn’t make the experience meaningless; it just reflects how technology has evolved to save us time and effort. These interfaces have also allowed people with disabilities to communicate, work, and interact in ways that might not have been possible with traditional physical tools. In that sense, “pictures under glass” might actually expand human capability rather than limit it.

However, I understand the author’s point about how important our sense of touch is and that certain interactions lose something when they become purely digital. For example, learning to play a real piano or sculpting clay engages the hands in ways that a touchscreen keyboard or 3D modeling app never could. I think the balance lies in knowing where tactile interaction matters and where digital convenience should take over. For creative or learning experiences, keeping physical feedback is valuable; it builds skill, emotion, and memory. But for communication, organization, and quick access to information, digital tools are the smarter choice. So rather than rejecting “pictures under glass,” I think the future of interaction should combine both worlds, using technology to simplify life without losing the richness of real, physical touch.

Reading 2

After reading Bret Victor’s responses, I actually liked his explanation a lot more because it helped me understand his real point. At first, I thought he wanted us to go “old school” and reject technology completely, like he was against modern progress. That’s why I used the example of sending an email instead of going to the mailbox: I thought he didn’t appreciate how technology saves time and makes life easier. But after reading his clarification, I realized he’s not saying we should stop using touchscreens or digital tools; he’s saying we should build on them and make them better. I especially liked his comparison of the iPad to black-and-white film before color came along; that made so much sense. He wants us to advance technology even further, but in a way that brings back the richness of physical touch and real interaction. I still think that won’t be possible for everything, because the future is definitely digital, but if we can find ways to blend technology with physical sensations, that would be amazing; it would make our interactions more natural, creative, and human.

Analog Sensor

Concept

For this project, I used one analog sensor and one digital sensor (switch) to control two LED lights.

The analog sensor I used was a photoresistor (light sensor). It changes how much electricity passes through it depending on how bright the light in the room is. The Arduino reads this change and adjusts the brightness of one LED: when it’s dark, the LED gets brighter, and when it’s bright, the LED becomes dimmer.

For the digital sensor, I used a pushbutton connected to a digital pin. When I press the button, it turns the second LED on or off.

To make it different from what we did in class, I added a “night light” feature. When the photoresistor detects that the room is very dark, the button-controlled LED automatically turns on, like a small night light. When the light comes back, the button goes back to working normally.

This made my project more interactive and closer to how real sensors are used in everyday devices.
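The project combines two behaviors, so it helps to see the control logic by itself: an inverted map() from the light reading to LED brightness, plus a darkness threshold that overrides the button. Below is that logic modeled in plain JavaScript so it can run anywhere (Arduino's integer map() is reimplemented by hand for illustration; the 0–1023 ADC range, the 300 darkness threshold, and the inverted 255→0 brightness come from my Arduino code shown later):

```javascript
// Hand-rolled version of Arduino's integer map() for illustration.
function arduinoMap(x, inMin, inMax, outMin, outMax) {
  return Math.trunc((x - inMin) * (outMax - outMin) / (inMax - inMin)) + outMin;
}

// Inverted mapping: a dark room (low ADC reading) gives a bright LED.
function ledBrightness(lightValue) {
  return arduinoMap(lightValue, 0, 1023, 255, 0);
}

// Night-light override: below the darkness threshold the second LED is
// forced on; otherwise it follows the button's toggled state.
function secondLedOn(lightValue, ledState) {
  return lightValue < 300 ? true : ledState;
}

console.log(ledBrightness(0));        // darkest reading -> full brightness
console.log(ledBrightness(1023));     // brightest reading -> LED off
console.log(secondLedOn(100, false)); // dark: forced on regardless of button
console.log(secondLedOn(800, false)); // bright: follows the button
```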

Schematic of my circuit
It shows the Arduino connected to:

  • A photoresistor and 10 kΩ resistor forming a voltage divider to read light levels.

  • A pushbutton connected to a digital pin.

  • Two LEDs, one controlled by the light sensor and the other controlled by the button.
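To put numbers on the voltage-divider bullet: assuming the photoresistor sits between 5 V and the analog pin, with the 10 kΩ resistor from the pin to ground (the orientation consistent with my code, where a dark room gives a low reading), the pin voltage is Vout = 5 V · 10k / (10k + R_LDR), which the 10-bit ADC turns into a 0–1023 value. Here is a small illustrative model in JavaScript; the example LDR resistances are typical ballpark figures, not measurements from my circuit:

```javascript
// ADC value for the divider: photoresistor (rLdr, in ohms) on the 5V side,
// fixed 10 kOhm resistor (rFixed) on the ground side of the analog pin.
function adcReading(rLdr, rFixed = 10000) {
  const vOut = 5.0 * rFixed / (rFixed + rLdr); // divider output voltage
  return Math.round((vOut / 5.0) * 1023);      // 10-bit quantization
}

// More light -> lower LDR resistance -> higher reading.
console.log(adcReading(1000));   // bright room, LDR ~1 kOhm
console.log(adcReading(10000));  // dim room, LDR ~10 kOhm
console.log(adcReading(100000)); // dark room, LDR ~100 kOhm (below the 300 threshold)
```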

Final Results

When I tested the circuit:

  • The first LED smoothly changed its brightness depending on how much light the photoresistor sensed.

  • The second LED turned on and off with the button as expected.

  • When the room got dark, the second LED automatically turned on, working like a night light.

It was a simple but satisfying project, and the extra feature made it stand out from the class example.

Video: video-url

Arduino Code

Part of the Code I am proud of

void loop() {
  // --- Read photoresistor ---
  int lightValue = analogRead(lightPin); // 0–1023
  int brightness = map(lightValue, 0, 1023, 255, 0);
  analogWrite(ledAnalog, brightness);

  // --- Button toggle ---
  if (digitalRead(buttonPin) == LOW) {
    ledState = !ledState;
    delay(200);
  }

  // --- Night light feature ---
  if (lightValue < 300) { // If it's dark, auto turn on LED
    digitalWrite(ledDigital, HIGH);
  } else {
    digitalWrite(ledDigital, ledState ? HIGH : LOW);
  }

  // --- Print readings ---
  Serial.print("Light: ");
  Serial.print(lightValue);
  Serial.print(" | Brightness: ");
  Serial.print(brightness);
  Serial.print(" | LED State: ");
  Serial.println(ledState ? "ON" : "OFF");

  delay(200);
}

Github url: Github

Challenges and Further Improvements

While I was able to make both the analog and digital sensors work, I struggled a bit with arranging all the wires and resistors neatly on the breadboard. It took a few tries to get everything connected correctly.

I also had to test different threshold numbers for the night light feature to decide when the LED should automatically turn on. Once I found the right value, it worked well.

For my next project, I want to try using other kinds of sensors, like sound or temperature sensors, and make the circuit respond in new ways. I’ll also practice reading the code line by line to understand how each part works better before adding new features.

Week 9 – Reading Reflection

Physical Computing’s Greatest Hits (and Misses)

Reading Physical Computing’s Greatest Hits (and Misses) really made me reflect on the role of human presence and movement in interactive design. The article explores recurring motifs in physical computing, such as theremin-style sensors, body-as-cursor interfaces, tilty tables, and mechanical pixels, and evaluates which ideas succeed and which fall short. What struck me most was the idea that interaction alone is not meaningful unless it is framed with intention or context.  I found this insight particularly relevant because gestures, motion, and bodily engagement only carry meaning when integrated into a space or narrative. The article also emphasizes that even commonly used ideas can be made fresh through variation and creativity.

The discussion of emotionally responsive projects, like “remote hugs”, also inspired me to think about the potential of physical computing to create connection and presence. It made me consider designing experiences where participants’ actions are not only triggers for a response but also carriers of meaning, emotion, or narrative. I found myself imagining interactive installations or performance spaces where movement, gesture, and proximity could communicate emotion or tell a story, giving participants a sense of agency and contribution. Overall, the article reinforced the importance of centering human input and intention over technical complexity. It motivated me to experiment more boldly with interactive media, blending technology, space, and human engagement in ways that feel purposeful, immersive, and emotionally resonant.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

When I was reading Making Interactive Art: Set the Stage, Then Shut Up and Listen, I noticed a perspective that treats the audience or participant as a co-creator, with the creator’s role being to design opportunities for exploration and discovery. The article encouraged setting the stage, providing affordances, and then stepping back to let participants engage on their own terms. This concept resonated deeply with me because I often feel the need to over-explain or control how people interact with my work, whether in interactive media projects, installations, or themed environments. Learning to trust participants’ curiosity and creativity is both challenging and exciting, and it made me rethink how I approach design: sometimes the most compelling experiences arise when the creator resists guiding every step and instead observes how others explore, interpret, and respond.

I also liked the idea of “listening” to participant interaction. Observing how people engage, adapt, or even misuse an interactive installation can reveal insights the creator never intended, and these discoveries can guide future iterations. This connects to my interests in performance and immersive storytelling because, in both cases, the audience’s reactions shape the experience. It also made me reflect on how I design spaces and experiences outside of class projects, including themed parties or interactive setups, where I can experiment with encouraging participation rather than prescribing behavior. The article inspired me to embrace unpredictability, co-creation, and emergent experiences, reminding me that interaction is not just about technology or novelty; it is about creating a dynamic relationship between the participant, the space, and the narrative. Now, I want to apply this mindset to my projects, designing experiences where participants’ actions carry weight and meaning, and where discovery becomes a central part of engagement.

Week 8 – Unusual Switch

For my project, I created an Arduino switch that activates through physical contact, specifically, a hug. Instead of using hands, I built a simple “hug switch” using two pieces of aluminum foil connected to the Arduino. One piece was taped onto my sleeve, and the other onto a plush toy sitting on my chair. When I hugged the toy, the foil pieces touched and completed the circuit, turning on an LED on the breadboard.

This setup used digitalRead() to detect when the circuit closed, lighting up the LED as a visual indicator. It’s a very basic circuit, just two foil pads, a resistor, and an LED, but it demonstrated how the human body can act as a conductor to trigger digital inputs. I liked how small physical gestures could translate into electronic signals. The process reminded me how interaction design can make technology feel more human, even with something as simple as a hug that lights up a tiny LED.

Schematic Diagram

Week 8 – Unusual Switch

For this recent assignment I decided to build a breath-activated switch.

 

How It Works

The concept is straightforward. I took a flexible tube and placed a small metal ball inside it. At one end of the tube, I positioned two disconnected wires from my circuit. When I blow into the open end of the tube, the force of my breath pushes the metal ball along the tube until it makes contact with both wires simultaneously.

This contact completes the circuit, and—voila!—a connected blue LED lights up to signal that the switch is “on.” When I stop blowing, the ball rolls back, the circuit breaks, and the light turns off. It’s a hands-free switch that’s both fun and functional.

The Build and Components

As you can see in the photo, the setup is quite simple. Here’s what I used:

  • An Arduino as a power source.

  • A breadboard for easy prototyping.

  • One blue LED.

  • Resistors to protect the LED.

  • Jumper wires to connect everything.

  • A flexible plastic tube.

  • A small metal ball.

The circuit itself is a basic LED setup connected to the development board. The magic is all in the custom-built switch. The tube, the ball, and the carefully placed wires are what make this an “unusual” solution that perfectly met the project’s requirements.

This was a fantastic exercise in creative problem-solving and a great way to apply basic circuit principles in a new and interesting way. It proves that with a little ingenuity, you don’t always need a traditional button or switch to make things work.

Video Demo

IMG_8195