Week 2 – Post Response

Chance Operations and Ordered Randomness

Watching Casey Reas talk about chance operations changed how I see art. It is not full control. It is not total randomness. It is something in between.

Determinism is boring because everything is already decided. There is no surprise. Chaos is also boring because nothing has meaning when anything can happen.

Ordered randomness is different. The artist sets simple rules. The system creates the result. The final work is not chosen by hand. It grows on its own. It feels natural and alive.

I think the best balance is when the user knows how their input affects the system, but not the exact result. If the output is predictable with total certainty, the art becomes flat. On the other hand, if the output is completely random, it feels like TV noise. There is nothing to hold on to.

We see this balance in nature. Small rules create big patterns. Clouds form. Birds move in groups. No one controls every part, yet it still makes sense.

Chance operations show that we do not need to choose between control and chaos. We can guide the process without forcing it. When we create space for the unexpected, the outcome becomes more interesting.

Week 12 – Documentation on Final Project

Finalized Concept: “The Snail’s Journey to School”

Project Summary
The Snail’s Journey to School is an interactive physical–digital storytelling installation.
The player controls a small snail robot navigating a handmade obstacle course, helping it reach “school.” As the player moves the snail using physical buttons, a connected p5.js sketch narrates the adventure, updates animations, and reacts to the snail’s progress in real time through serial communication with Arduino.

1. Finalized Detailed Concept

Physical World (Arduino + Snail Robot)

The snail robot sits on a small motorized base (the base will be the motorized robot that can be built from the Arduino Uno kit). The user controls the snail with three buttons:

  • FORWARD
  • LEFT
  • RIGHT

A “finish line” sensor (photoresistor or IR distance sensor) detects when the snail reaches the school.

The obstacle course includes:

  • A puddle (painted blue)
  • Small pebbles
  • A small cardboard ramp
  • A school gate with the sensor hidden inside

Digital World (p5.js Storytelling Screen)

The p5 screen visually tells the story through 4 scenes:

  1. Intro Scene
    “The snail is late for school! Guide it through the obstacle course.”
  2. Instructions Scene
    Shows button directions + images.
  3. Live Story Mode
    Reacts to every button press and displays animations such as:
  • “Snail moves forward!”
  • “Turning left…”
  • Little movement animations or sound effects.
  4. Ending Scene
    When the sensor triggers the finish, p5 displays:
    “You made it! The snail is in class now!” with a cute animation.
    If the player runs out of time (the limit is 2.5 minutes) or fails to cross more than one obstacle, the screen instead shows “Oh no, you’re late to school!”

Project Interaction Loop (Final Version)

  1. Player presses physical button
  2. Arduino moves snail robot + sends message to p5
  3. p5 receives message and updates story animation
  4. Snail reaches sensor
  5. Arduino sends “finish” to p5 → p5 plays the ending
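The p5 side of this loop boils down to routing short serial messages to story updates. Here is a minimal sketch of that routing in plain JavaScript; the message names ("F", "L", "R", "finish") are placeholders I chose for illustration, not a finalized protocol:

```javascript
// Hypothetical message router for the p5 side of the interaction loop.
// Message strings are assumptions; the final Arduino protocol may differ.
function handleMessage(msg) {
  switch (msg.trim()) {
    case "F":      return "Snail moves forward!";
    case "L":      return "Turning left…";
    case "R":      return "Turning right…";
    case "finish": return "You made it! The snail is in class now!";
    default:       return null; // ignore unknown or partial messages
  }
}
```

Keeping the router in one function makes it easy to add obstacle events later without touching the drawing code.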

Arduino Program Design

Final Project Idea

I am developing a small, battery-powered Arduino device that displays the room’s status and simple pixel-style emojis on a 16×2 LCD. The device is controlled wirelessly from my computer using either an nRF24L01 radio module or a Bluetooth serial module, depending on what hardware becomes available. The device will mount on a wall and serve as a minimal, visually friendly indicator of what’s happening inside the room.

Development Stages

Stage 1 — Crude Functional Prototype

My first goal is to build the simplest version of the system:

  • Connect an Arduino board, a 16×2 LCD, and whichever wireless module I choose.

  •  Load basic firmware that listens for simple incoming messages and updates the LCD with status text and a small emoji.

  • Test commands from a computer program.

  •  Focus on verifying communication and display logic, without worrying about wiring neatness, battery life, or enclosure design.

The objective of this stage is to prove that the device concept works end-to-end.

Stage 2 — Improved Prototype and Physical Enclosure

Once the first prototype is working, I will move on to making it usable in a real space:

  • Tidy the wiring and make the device compact.

  • Design a simple case in a 3D modeling tool such as Fusion 360 or Tinkercad.

  • 3D-print the enclosure so the LCD is visible from the front, the electronics fit securely inside, and the device can mount flat against a wall.

  • Refine battery placement so the device can be opened or recharged easily.

Stage 3 — Final Visual and Interaction Refinement

After the device is physically assembled:

  • Adjust the display layout so the text and emoji look balanced and readable.

  • Refine how the device reacts to incoming commands (such as smoothing updates, adding small transitions, or improving clarity).

  • Add small visual improvements such as backlight changes for attention or custom character tweaks for better emoji expression.

This stage is about making the device feel polished and pleasant.

Project Architecture

Device Side:

The device contains three main elements:

  1. Microcontroller — the Arduino runs the core program that listens for wireless messages and updates the display.

  2. Display System — the 16×2 LCD shows both text and custom emoji characters.

  3. Wireless Module — either an nRF24L01 or a Bluetooth serial module receives commands from my computer.

Internally, the Arduino software is structured around:

  • A small message handler that receives text commands wirelessly.

  • A display manager that decides what to show based on the message.

  • A custom character bank for emoji graphics.

This architecture keeps the device simple, efficient, and easy to maintain.
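The display manager's job is essentially a lookup from incoming status command to screen contents. The real version will be Arduino C++, but the decision shape can be sketched in plain JavaScript; the status texts and emoji names here are illustrative assumptions, not a fixed command set:

```javascript
// Sketch of the display manager's decision logic (real firmware would be
// Arduino C++). Status texts and emoji names are placeholder examples.
function displayFor(message) {
  const screens = {
    BUSY:      { text: "Do not disturb", emoji: "busyFace" },
    AVAILABLE: { text: "Come on in!",    emoji: "smile" },
    MEETING:   { text: "In a meeting",   emoji: "neutralFace" },
  };
  // Unknown commands fall back to showing the raw text with a neutral face,
  // so a typo on the computer side never blanks the display.
  return screens[message] || { text: message, emoji: "neutralFace" };
}
```

This fallback behavior is one reason to keep the message handler and display manager separate: the handler stays dumb, and all "what should the screen say" decisions live in one place.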

Computer Side

On my computer, I run a lightweight program that:

  • Opens a wireless communication link (either through a paired Bluetooth COM port or through a USB radio dongle for the nRF24L01).

  • Sends simple text commands such as “BUSY”, “AVAILABLE”, “MEETING”, or an emoji instruction.

  • Lets me manually choose the room status using a small interface or a command-line tool.

The computer-side software remains minimal because all visual work happens on the Arduino.

Emoji Design Approach

Since the 16×2 LCD uses a 5×8 pixel character grid, I design emojis as tiny pixel icons:

  • Create simple patterns — smiling, neutral face, busy face, resting face, or symbols like checkmarks or caution icons.

  • Define each pattern using the LCD’s built-in custom character feature.
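One convenient way to prototype these 5×8 patterns before typing them into the Arduino sketch is to draw them as strings and convert each row to the 5-bit number the LCD's custom character feature expects. A JavaScript preview of that conversion (the pattern itself is just an example smiley):

```javascript
// A 5×8 smiley drawn as strings of 0s and 1s, then converted to the
// per-row byte values that an LCD custom character is defined with.
const smiley = [
  "00000",
  "01010", // eyes
  "01010",
  "00000",
  "10001", // mouth corners
  "01110", // smile
  "00000",
  "00000",
];
const rowBytes = smiley.map(row => parseInt(row, 2));
```

The resulting eight numbers can be pasted straight into the Arduino sketch as the byte array for one custom character slot.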

Week 11 Production (Ling and Abdelrahman)

Conceptualization:

The central idea was to build a simple connection between physical and digital worlds.

Step 1: Single-Sensor and p5.js Movement
Using only one analog sensor (a potentiometer), the Arduino continuously reads values and streams them to p5.js over serial. p5.js interprets those readings and moves an ellipse along the horizontal axis, keeping it vertically centered.

Step 2: LED Brightness Controlled by p5.js
Next, I reversed the flow. Instead of only reading from Arduino, I sent numerical values from p5.js back to the board so it could adjust LED brightness using PWM.

Step 3: Gravity + Wind Integration
Finally, I modified the p5.js gravity wind sketch. Each time the ball hits the “ground,” p5 sends a signal to Arduino, turning an LED on briefly before switching back off. Meanwhile, an analog sensor feeds continuous data to p5.js to influence the wind force acting on the falling ball.
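The "ball hits the ground" moment in Step 3 reduces to a crossing test between frames: the signal should fire once, on the frame where the ball's bottom edge passes the ground line. A minimal sketch of that check (variable names are illustrative, not taken from the actual gravity wind sketch):

```javascript
// Returns true only on the frame where the ball's bottom edge crosses
// the ground line — the moment the sketch would write a '1' to serial.
// prevY/y are the ball's center on the previous and current frame.
function hitsGround(prevY, y, radius, groundY) {
  return prevY + radius < groundY && y + radius >= groundY;
}
```

Testing the crossing rather than the position avoids sending the LED signal on every frame while the ball rests on the ground.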

Video Demonstration:

https://drive.google.com/file/d/1Morf2y7cxIAgYLHKVnitsadjr813cX4Z/view?usp=sharing

Schematic:

Code Highlight:

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);  // LED indicates serial activity
  // wait for p5 to connect
  while (Serial.available() <= 0) {
    Serial.println("0,0");
    delay(300);
  }
}

void loop() {
  // wait for data from p5
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH);
    Serial.read();                    // consume the incoming handshake byte
    int sensorValue = analogRead(A0); // read the potentiometer
    Serial.println(sensorValue);      // send the sensor value to p5
  }
  digitalWrite(LED_BUILTIN, LOW);
}

And the p5.js side:
// serial variables
let port;
let connectBtn;
let sensorValue = 0;

function setup() {
  createCanvas(640, 360);
  
  port = createSerial();   // create serial connection
  
  // create connect button
  connectBtn = createButton("Connect to Arduino");
  connectBtn.position(10, 10);
  connectBtn.mousePressed(connectToArduino);
}

function draw() {
  background(220);
  
  // read from Arduino
  let str = port.readUntil("\n");
  if (str.length > 0) {
    sensorValue = int(str);
  }
  
  port.write("\n");   // send handshake to Arduino
  

  let xPos = map(sensorValue, 0, 1023, 0, width);   // map sensor to horizontal position
  
  // draw ellipse in middle vertically
  fill(0);
  ellipse(xPos, height/2, 50, 50);
  
  // show sensor value
  fill(0);
  noStroke();
  text("Sensor: " + sensorValue, 10, 50);
  text("Turn potentiometer to move circle", 10, 70);
}

// connect to Arduino
function connectToArduino() {
  if (!port.opened()) {
    port.open(9600);
  }
}

Reflection:

This project helped me practice bidirectional serial communication between Arduino and p5.js.

Week 11 Reading

This week’s reading gave me new perspectives on everyday objects. Pullin questions the space between function and feeling and notices how assistive devices are tied to identity, pride, and possibility. For example, a hearing aid could become jewelry, and a prosthetic has the potential to become sculpture.

He inspired me to consider how disability can become a kind of creative catalyst. Instead of treating physical difference as something to hide or correct, he treats it as an invitation to rethink what objects can be. His work reminds me how narrow my own assumptions have been, since I was heavily influenced by a world that often prioritizes discretion over expression.

Week 2 – When Planets Align

This post documents my digital artwork, “When Planets Align,” a looping animation about cosmic harmony and human potential. The artwork shows planets orbiting a sun, eventually falling into a rare, perfect alignment. The core message is simple: if planets can align, what can you not achieve?

Concept

“When Planets Align” uses a celestial event as a metaphor for achieving the improbable. Eight planets dance around a sun in a rhythmic, constant motion. The key moment is their unlikely, simultaneous alignment—a fleeting instance of perfect order. This moment is meant to inspire viewers to reconsider their own perceived limitations.

Code Highlight

I’m particularly proud of the simple logic that governs the planetary alignment. By setting each planet’s speed as a multiple of a base speed, I could guarantee they would all align periodically.

JavaScript

const alignmentInterval = 600;
const baseSpeed = (Math.PI * 2) / alignmentInterval;

// Inside the planet creation loop
let harmonicSpeed = baseSpeed * (i + 1);

To make the alignment impactful, the animation briefly pauses, allowing the viewer to absorb the moment before the cycle continues.

JavaScript

// Pause for 1 second on perfect alignment
if ((frameCount + startOffset) % alignmentInterval === 0) {
  noLoop();
  setTimeout(loop, 1000);
}
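Because every planet's speed is an integer multiple of `baseSpeed`, after `alignmentInterval` frames planet `i` has completed exactly `i + 1` full turns, so all eight planets return to the same angle together. A quick numerical check of that claim:

```javascript
// Verify the harmonic-speed alignment: at frame 600, each planet's angle
// is a whole number of full turns, so all planets line up simultaneously.
const alignmentInterval = 600;
const baseSpeed = (Math.PI * 2) / alignmentInterval;

const aligned = [...Array(8).keys()].every(i => {
  const angle = baseSpeed * (i + 1) * alignmentInterval; // angle at frame 600
  const turns = angle / (Math.PI * 2);                   // should be i + 1
  return Math.abs(turns - Math.round(turns)) < 1e-9;     // whole turns, up to float error
});
```

The same reasoning shows the alignment repeats at every multiple of `alignmentInterval`, which is what makes the pause-and-resume moment reliable.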

Embedded Sketch

Here is the embedded p5.js sketch of “When Planets Align.”

Reflection

Creating this piece was a fulfilling journey of turning a simple idea into a meaningful visual. “When Planets Align” is about the beautiful interplay between order and chaos. It’s a reminder that extraordinary moments can and do happen. My hope is that it inspires others to believe in their own potential to achieve the unlikely.

Week 11 – Reading Reflection

This reading was very insightful. I always viewed medical devices through a lens of pure function, so this focus on design and identity was a new perspective for me. I found the glasses example to be the most compelling. They successfully shed any stigma by embracing fashion. People now wear them as a style choice. This proves that a device for the body does not need to be invisible to be accepted.

The conflict between discretion and technology is a difficult one. The hearing aid case shows how the goal of hiding the device can actually limit its performance. I believe the author is correct. The problem is not the user’s desire to hide, but a lack of well-designed alternatives that people would feel confident displaying.

I also agree that the “one fits all” model is flawed. The book shows that preferences vary greatly, even among people with the same disability. Some may want a realistic prosthetic, while others prefer a sleek tool or even an artistic statement. A single design cannot meet all these needs.

The concepts of “appliances” and “platforms” were new to me. A simple, focused appliance like the iPod shuffle can be more inclusive and delightful than a complex platform trying to do everything. I was also struck by the note on dementia. Learning that it can involve a heightened artistic sense makes thoughtful design even more critical. Beautiful, intuitive objects can significantly improve a person’s daily experience.

In my opinion, the core idea is to invite different kinds of creators into this field. Fashion designers and artists can bring new values. They can help create devices that feel like a part of a person’s identity, not just a tool they have to use.

Week 10 Reading Reflection

Reading Bret Victor’s “A Brief Rant on the Future of Interaction Design” made me think differently about how I use technology. The idea that a tool has “two sides”—one that fits the person and one that fits the solution—really stuck with me. It made me realize that most of the digital tools I use today, like phones or tablets, seem to fit the solution well but not necessarily the person. They make things efficient, but they don’t always feel natural or engaging to use. It made me question whether convenience has come at the cost of a deeper, more human form of interaction.

The part about the hands also made me pause. Victor reminds us that our hands aren’t just for manipulation—they’re for feeling. That distinction hit home because I often forget how much information comes through touch. When I’m drawing, typing, or even cooking, my hands constantly sense texture, pressure, and movement. But when I’m on a screen, all that disappears. His point that “touch does the driving and vision helps from the backseat” made me see how backwards our relationship with technology has become. I depend entirely on my eyes, while my hands just tap glass.

I connected most with his frustration about “pictures under glass.” It made me realize how passive I’ve become when using digital devices—how I only swipe, scroll, or pinch, instead of actually making or feeling something. It reminded me how satisfying it is to sketch on paper or build something physical, where every motion feels responsive. Victor’s ideas made me want to look for tools and experiences that let my hands do more than just point. It’s a call to design and use technology that reconnects me to my body, not one that distances me from it.

Reflection

When I read “A Brief Rant on the Future of Interaction Design” by Bret Victor and its follow-up article, I began to think differently about how technology shapes the way we interact with the world. Victor argues that designers often focus too much on screens and touch gestures, forgetting that humans are physical beings with hands meant to explore, build, and create. He believes that the future of design should go beyond flat screens and should instead give people tools that let them use their bodies, senses, and creativity more naturally.

This idea really connected with me because, as someone interested in computer science and interactive media, I often think about how to make technology feel more human. I realized that many modern interfaces such as phones, tablets, and laptops limit our movements and creativity, even though they feel advanced. Victor’s point made me reflect on my own projects and how I can design in ways that allow people to move, touch, and engage with technology more freely.

The second article deepened this idea by exploring how designers and engineers are beginning to create more physical and immersive experiences. It reminded me that innovation isn’t just about new technology but about how it connects with human experience. Reading both pieces made me want to think more carefully about how design can make people feel present and creative, not just efficient.



Week 10 — Reading Response

Reading 1

I don’t completely agree with the author’s criticism that modern technologies like touchscreens and digital interfaces make interaction feel “glassy” and disconnected. While it’s true that they remove some tactile feedback, I think their purpose isn’t to replace physical touch but to make life more efficient and accessible. For example, instead of physically going to a post office to send a letter, I can send an email or message in seconds. This doesn’t make the experience meaningless; it just reflects how technology has evolved to save us time and effort. These interfaces have also allowed people with disabilities to communicate, work, and interact in ways that might not have been possible with traditional physical tools. In that sense, “pictures under glass” might actually expand human capability rather than limit it.

However, I understand the author’s point about how important our sense of touch is and that certain interactions lose something when they become purely digital. For example, learning to play a real piano or sculpting clay engages the hands in ways that a touchscreen keyboard or 3D modeling app never could. I think the balance lies in knowing where tactile interaction matters and where digital convenience should take over. For creative or learning experiences, keeping physical feedback is valuable; it builds skill, emotion, and memory. But for communication, organization, and quick access to information, digital tools are the smarter choice. So rather than rejecting “pictures under glass,” I think the future of interaction should combine both worlds, using technology to simplify life without losing the richness of real, physical touch.

Reading 2

After reading Bret Victor’s responses, I actually liked his explanation a lot more because it helped me understand his real point. At first, I thought he wanted us to go “old school” and reject technology completely, like he was against modern progress. That’s why I used the example of sending an email instead of going to the mailbox, because I thought he didn’t appreciate how technology saves time and makes life easier. But after reading his clarification, I realized he’s not saying we should stop using touchscreens or digital tools; he’s saying we should build on them and make them better. I especially liked his comparison of the iPad to black-and-white film before color came along; that comparison made so much sense. He wants us to advance technology even further, but in a way that brings back the richness of physical touch and real interaction. I still think that won’t be possible for everything, because the future is definitely digital, but if we can find ways to blend technology with physical sensations, that would be amazing; it would make our interactions more natural, creative, and human.