Final Project Progress

Finalised Concept for the Project

My final project is a dual-player, wheel-controlled dance survival game that uses both Arduino and p5.js to create a physically interactive digital experience. A physical wheel attached to a rotary encoder or potentiometer acts as the primary input device. When the player rotates the wheel, the Arduino reads real-time angle data and sends it to p5, which controls the horizontal position of two characters, represented by circles, inside a circular arena.

Random obstacles fall from the top of the p5 canvas, and the player must rotate the wheel to move both characters and avoid collisions. If either character touches a falling object, the game ends.

Arduino Program:

Inputs:

  1. Potentiometer: measures the angle of the physical wheel. The analogue reading is mapped to degree values and sent to p5 over serial.

Outputs:

  1. Red and green LEDs for a disco lighting effect, flashing in time with the song's beats.

P5 Program:

    1. Draw two player circles on the arena perimeter or surface.
    2. Spawn falling objects at random intervals and positions.
    3. Detect collisions between objects and characters.
    4. Read wheel angle data from the Arduino.
    5. Smoothly rotate the two characters horizontally based on the mapped value.
    6. Track the best score.
    7. Play a song in the background and send instructions to the Arduino to light up the LEDs in time with the beats.
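Since the project is still at the planning stage, here is a rough sketch of two pieces of the p5 logic above: the collision check (step 3) and deriving both characters' positions from a single wheel angle (steps 4–5). All names are illustrative, not final code.

```javascript
// A falling object hits a character when the distance between their
// centres is less than the sum of their radii (circle-circle collision).
function hits(obj, player) {
  return Math.hypot(obj.x - player.x, obj.y - player.y) < obj.r + player.r;
}

// One wheel angle drives both characters. In this sketch they sit on
// opposite sides of the arena, so dodging with one can endanger the other.
function characterPositions(angleDeg, cx, cy, arenaR) {
  const a = (angleDeg * Math.PI) / 180;
  return [
    { x: cx + arenaR * Math.cos(a), y: cy + arenaR * Math.sin(a) },
    { x: cx - arenaR * Math.cos(a), y: cy - arenaR * Math.sin(a) },
  ];
}
```

In the real sketch these would run inside `draw()`, with the angle arriving over serial and the positions fed to `ellipse()`.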

Assignment Exercises

Exercise 1

For this exercise, we decided to use an ultrasonic sensor because we thought it would be interesting to control the ellipse with our hands, a bit like Wii video games, where objects on screen move in response to your physical movement. To proceed with this idea, we first established a serial connection between the Arduino and p5.js, similar to how we did it in class. To control the x-axis, we read the distance value sent over serial in p5.js and mapped it to the range 0 to windowWidth. We also noticed that the ellipse's movement was very abrupt, so we used the lerp() function. I had previously used this function in one of the p5 assignments and recalled that it can smooth motion: each frame it moves the ellipse a fixed fraction (0.1 in this case) of the way from its last x position toward the newly measured target. This ensures the circle moves smoothly. Below I have attached my p5 and Arduino code along with the demonstration video.
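The map-then-lerp idea can be sketched in plain JavaScript like this. The p5 helpers are reimplemented so the snippet is self-contained, and the sensor range (0–100 cm) and canvas width are assumptions, not the exact values in our sketch.

```javascript
// p5-style helpers, reimplemented so the logic runs standalone
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) / (inMax - inMin)) * (outMax - outMin);
}
function lerp(a, b, t) {
  return a + (b - a) * t;
}

const windowWidth = 800; // assumed canvas width
let xPos = 0;            // last drawn x position of the ellipse

// Called whenever a new ultrasonic reading arrives over serial
function onSerialData(distanceCm) {
  // Map the raw distance (assumed 0-100 cm) onto the canvas width
  const targetX = mapRange(distanceCm, 0, 100, 0, windowWidth);
  // Move only 10% of the way toward the target each frame: smooth motion
  xPos = lerp(xPos, targetX, 0.1);
  return xPos;
}
```

Calling `onSerialData(100)` repeatedly walks the ellipse toward x = 800 in ever-smaller steps, which is exactly the easing effect we saw on screen.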

P5.js

Arduino

assignment1-demo

Exercise 2

For the second exercise, we chose to create a visual representation of control passing from one side to another, inspired by a chess clock, where time shifts between players during a game. We designed the LEDs so that as one grows brighter, the other dims, symbolising the handover of control. We started by establishing the serial connection between the Arduino and p5.js, as in the previous exercise and in class. In p5 we created a slider ranging from 0 to 255, which determines the brightness of LED 1; for LED 2 we set the inverse of that value, so that as one increased, the other decreased. We continuously sent these two values over serial in the format “value1,value2” and parsed them on the Arduino side to update the LED brightnesses with analogWrite(). This setup allowed us to control both LEDs simultaneously from the browser and watch the transition between them. Below are the p5.js and Arduino code along with the demonstration video.
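The p5-side message logic can be sketched like this (the function name is illustrative). The slider value drives LED 1 and LED 2 gets its inverse, so the pair always sums to 255, like time shifting between the two sides of a chess clock.

```javascript
// Build the "value1,value2" serial message from the slider value
function ledMessage(sliderValue) {
  const v1 = Math.min(255, Math.max(0, Math.round(sliderValue))); // clamp to PWM range
  const v2 = 255 - v1;                                            // inverse brightness
  return `${v1},${v2}\n`; // newline-terminated so the Arduino can split readings
}
```

On the Arduino side, the string is split on the comma and each half is passed to `analogWrite()` for its LED.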

P5.js

Arduino

Arduinoexercise2-demo

Exercise 3

To complete this exercise, we modified the gravity-ball code provided in class. First, we made the LED light up every time the ball bounces: we maintained a state variable that is set HIGH whenever the ball touches the ground and LOW otherwise. For the second part of the exercise, we reused our idea from Exercise 1, the ultrasonic sensor, to control the wind acting on the ball. We read the distance from the sensor and set a threshold of 50: if the distance is greater than 50, the wind is set to -3, otherwise to 3. This let us push the ball in different directions and steer it with a hand. The p5.js and Arduino code are provided below for reference.
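The two rules above are simple enough to sketch as standalone functions (names are illustrative, not the exact code):

```javascript
// LED state: HIGH while the ball is touching the ground, LOW otherwise
function ledState(ballBottomY, groundY) {
  return ballBottomY >= groundY ? "HIGH" : "LOW";
}

// Wind direction from the ultrasonic distance, using our threshold of 50
function windFromDistance(distanceCm) {
  return distanceCm > 50 ? -3 : 3;
}
```

In the real sketch, the state string becomes a serial message the Arduino turns into `digitalWrite(ledPin, HIGH/LOW)`, and the wind value is added to the ball's horizontal acceleration each frame.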

P5.js

Arduino

exercise3-demo

Week 11: Final Project Idea

For my final project, I am designing a physically interactive dual-player dance survival game that integrates both Arduino and p5.js.

At the core of the interaction is a physical wheel connected to an Arduino. As the player rotates the wheel, the Arduino continuously reads the wheel’s angular position, most likely using a potentiometer (the final sensor choice is still undecided). This sensing becomes the “listening” component of the system. The Arduino then sends this data to p5.js, where it controls the movement of a circle containing two characters.

In the p5 sketch, objects fall randomly from the top of the screen, and the player must rotate the wheel to shift the characters left or right, dodging the falling obstacles. The challenge increases because the wheel controls both characters simultaneously: if either one is hit, the game ends. This creates a dynamic where the player must keep track of two positions at once.

Furthermore, to communicate from p5 back to the Arduino, I plan to light a green LED every time the player successfully dodges an obstacle and a red LED when the game is over.

Week 11: Reading Reflection

Fashion versus discretion is a central theme in design for disability. Traditionally, assistive products like glasses or hearing aids were often designed to be discreet, hidden away to avoid stigma or social attention. But this reading and my own experience show that disability does not have to mean invisibility or shame.

Having worn glasses almost my entire life, I recall how they were initially seen through the lens of social stigma. Comments like “Oh, she has glasses” or the belief that no one looks beautiful wearing them were common. However, over time, societal attitudes changed, and glasses transformed from a clinical aid into a fashion statement. Through this reading, I realised that this shift is not just about evolving social perspectives but also about the revolutionary change in spectacle design. Modern glasses are so stylish, with diverse frames and colors, that even people without any vision impairment now wear them purely as fashion accessories. This evolution speaks volumes about how disability can be embraced rather than hidden. It exemplifies that disability does not need to equate to discretion: why should we be invisible in our differences?

Moreover, it’s encouraging to see how design progress extends beyond spectacles to products like wireless earphones, which have made even hearing aids look stylish, transforming assistive technology into mainstream accessories.

More than simply designing for disability, companies like Apple have shown how to create products like the iPod that work seamlessly for all users, disabled or not. This approach represents the peak of design philosophy: one that emphasises minimalism, accessibility, and universal appeal without differentiating users by ability.

What I deeply take away from this reading is how disability acts as a powerful force in pushing design boundaries. It challenges conventional ideas and fosters innovation, driving designers to think creatively and inclusively.

W10: Instrument

Inspiration

The xylophone has always fascinated me. I loved watching the vibrant melodies come to life as each bar was tapped. This inspired me to create a digital version using everyday materials, giving the classic xylophone a modern, interactive twist.

Concept

The idea was simple yet playful: use aluminum foil as the xylophone buttons. Each strip of foil represents a note, and tapping on it triggers a sound. To bring in the concept of tuning (something I deeply appreciate from my experience playing the violin) we incorporated a potentiometer. This allows the user to adjust the pitch of each note in real time, just as a musician tunes their instrument before performing. By combining tactile interaction with the flexibility of pitch control, we aimed to create an instrument that feels both familiar and innovative.

 

Code I’m Most Proud Of

// Read the tuning knob and convert it to a pitch multiplier (0.60x to 1.80x)
int potVal = analogRead(potPin);
float multiplier = map(potVal, 0, 1023, 60, 180) / 100.0;

if (activeIndex >= 0) {
    // A foil strip is being touched: scale its base note by the multiplier
    int freq = int(baseFreqs[activeIndex] * multiplier);
    freq = constrain(freq, 100, 5000);  // keep the tone safe and audible

    tone(buzzerPin, freq);
    delay(50);
} else {
    noTone(buzzerPin);  // silence the buzzer when nothing is touched
}

What makes this snippet special is how it turns a simple analog input into musical expression. By mapping the potentiometer to a frequency multiplier, each foil strip produces a different tone that can be adjusted on the fly. Using constrain() ensures the sounds remain within a safe, audible range. It was rewarding to see how these functions, which we learned in class, could be combined to create a tactile, musical experience.

Future Improvements

Right now, the instrument plays a sound as long as the foil is touched. In the future, I’d like to add note duration control so that pressing a strip produces a single tone, similar to how a piano note behaves, with a possible fade-out effect when the note ends. This would make the interaction feel more natural and musical.

Another exciting improvement could be a wireless “stick” that triggers the foil strips remotely. This would allow the musician to move freely and perform more expressively, opening up new possibilities for live interaction and playability.

 

W10: Reading Reflection

The Future of Interaction Design

Reading this piece immediately took me back to Steve Jobs’ keynote when he unveiled the iPhone and boldly declared that we don’t need a stylus, that our fingers are the best pointing devices we’ll ever have. Jobs’ vision was revolutionary at the time because it simplified interaction and made technology more accessible. He recognised how naturally intuitive our sense of touch is, the same quality the author values, but he focused on usability rather than physical feel.

While the author criticises “Pictures Under Glass” for robbing us of sensory depth, I see it as a meaningful trade-off. It allowed us to consolidate multiple tools into one, replacing the clutter of physical devices with a single screen that could transform into anything we needed. The flatness of the glass became the canvas of endless interfaces. Even if it dulled the sensation of texture, it heightened the sense of control, mobility, and creative possibility.

That said, I agree that the future can move beyond this limitation. The author’s call to embrace our full tactile and bodily potential opens an exciting direction for technology. What if screens could morph in texture, shape, and resistance depending on the app in use: a photo that feels like paper, a drum pad that vibrates? That would merge Jobs’ vision of simplicity with the author’s longing for physical depth.

Perhaps, then, “Pictures Under Glass” wasn’t the end of interaction design but a stepping stone.

Turning to his responses to the comments, I really agreed with the author’s take on the “iPad bad” remark. I liked how he clarified that the iPad is actually good for now. It was a revolutionary invention that changed how we interact with technology. But I also agree with his warning that if, twenty years from now, all we have is the same flat, glassy device with minor haptic improvements, then it would be bad. His comparison to black-and-white film before color photography made a lot of sense to me. It’s a reminder that innovation should keep evolving rather than settling for what feels advanced in the moment.

W9: Assignment

Concept

Parking lots can often be a frustrating experience, especially when it’s hard to tell whether a spot is free or occupied without driving around aimlessly. I wanted to create a simple, interactive system using Arduino that mimics real-world parking indicators: a yellow light that changes brightness when a car is moving in or out, and a red light that turns on when a spot is occupied. This way, drivers can quickly see which spots are available and which are taken, making the parking process smoother and more intuitive.

Implementation

To achieve this, I used an ultrasonic sensor to detect the movement of cars. The sensor works by sending out a pulse from the trigger pin, which bounces off an object and returns to the echo pin. The Arduino then calculates the distance based on the time it takes for the pulse to return. I mapped this distance to the brightness of a yellow LED, so that the closer a car gets to the parking spot, the brighter the yellow light becomes. A slide switch allows us to manually indicate when a car is parked: flipping the switch turns on a red LED and turns off the yellow light, clearly showing that the spot is occupied. Two 330-ohm resistors ensure the LEDs operate safely without drawing too much current.

cardemo

Code I’m proud of

// Trigger pulse 
digitalWrite(trigPin, LOW);
delayMicroseconds(2);
digitalWrite(trigPin, HIGH);
delayMicroseconds(10);
digitalWrite(trigPin, LOW);

// Read echo
duration = pulseIn(echoPin, HIGH);
distance = duration * 0.0343 / 2.0;

I’m particularly proud of the code I wrote for this project. Writing it taught me a lot about how ultrasonic sensors work and how to use the trigger and echo functionality effectively.
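The distance-to-brightness mapping described above could be sketched like this. The 50 cm range and the function name are assumptions for illustration, not the exact values in my sketch.

```javascript
// Hypothetical distance-to-brightness curve: the closer the car,
// the brighter the yellow LED. maxCm is an assumed sensing range.
function yellowBrightness(distanceCm, maxCm = 50) {
  const d = Math.min(maxCm, Math.max(0, distanceCm)); // clamp to the sensing range
  return Math.round(255 * (1 - d / maxCm));           // 0 cm -> 255, maxCm -> 0
}
```

On the Arduino, the equivalent one-liner is `map()` followed by `constrain()`, with the result passed to `analogWrite()` on the yellow LED's PWM pin.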

Future Developments

For future development, the system could be expanded to include a green LED, which would light up to indicate available parking spots. In that scenario, the green light would show availability, the yellow LED would indicate movement, and the red LED would signal when a spot is taken. Eventually, this could be automated further so that the sensor alone detects whether a car is parked, eliminating the need for the manual switch. Overall, this project was a great exercise in combining sensors, outputs, and user interaction to create a functional and visually intuitive system.

W9: Reading Reflections

Physical Computing’s Greatest Hits (and misses)

While reading this piece, I found myself fascinated by how imagination can stretch beyond the limits of what we typically perceive as possible. The example of the waves of leaves particularly resonated with me. It was such a beautiful and unexpected way to translate nature into sound and movement. I would have never imagined something like that, yet it reminded me that creativity often begins with seeing the ordinary through a new lens. This concept really reflects what this course encourages us to do: to move beyond traditional boundaries and explore how abstract ideas can become tangible experiences. It even made me think about how we could merge this with technology, perhaps building something like a domino-inspired instrument that creates a tune from a movement.

Another concept that stood out to me was Dance Dance Revolution. I’ve always loved dancing and have enjoyed playing this type of game in fun zones, where timing and coordination create a sense of both challenge and joy. Reading about it made me think of how such ideas could evolve into more interactive art experiences. We could probably use this concept to build a “twister”-style game in which, every time someone is out, it makes a buzzing noise.

Overall, this reading reminded me that creativity is not confined to art or technology alone, it’s in how we connect both. The examples encouraged me to think more experimentally and to consider how imagination can be designed into playful, sensory experiences that engage both mind and body.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

I completely agree with what the author is saying in this reading. If you are creating an immersive, interactive experience, you need to let the audience truly be part of it: to explore, engage, and form their own interpretations. That process of interaction is what reveals how deeply people are willing to think about your project and how many different meanings it can evoke. Each person’s response becomes part of the artwork itself, showing you perspectives you may never have considered.

An immersive experience, in a way, is like an open-ended question. There can be multiple interpretations, each valid in its own context. You can build theories around what you intend to express, but you should always leave your audience curious about what the ground truth really is. That curiosity is what keeps the experience alive even after the interaction ends. As a creator, you can guide emotions subtly through design and environment, but once you begin instructing the audience, it stops being interactive and becomes prescriptive. True interactivity lies in that delicate balance between guidance and freedom where the audience feels both engaged and uncertain.

W8: Her Code Got Humans On The Moon Reflection

It’s revolutionary to see the scale of what Margaret Hamilton achieved. She didn’t just break gender stereotypes, she essentially founded an entire discipline that grew into a billion-dollar industry: software engineering. While we all remember Neil Armstrong as the first man to step on the moon, we rarely think about the person who made that step possible. Reading about Hamilton made me realise how much unseen effort lies behind every historic moment.

As a woman in computer science, a field still largely dominated by men, her story feels deeply personal and inspiring. It’s empowering to see someone who not only challenged norms but also redefined “engineering.”

One part of the reading that resonated with me on a technical level was Hamilton’s insistence on anticipating and handling errors. When I first started learning to code, I used to find “try,” “except,” and “catch error” statements frustrating and unnecessary. I would think, why not just tell users not to make mistakes? But Hamilton’s experience showed the flaw in that thinking. Even an astronaut, among the most trained and intelligent individuals, made an oversight that could have led to mission failure. That moment completely reframed my understanding: robust systems are not built on the assumption that people won’t make errors, but on the expectation that they inevitably will.

This reading reminded me that testing, error handling, and designing for failure are not tedious parts of coding, they’re acts of responsibility and necessity. Margaret Hamilton’s story shows that great engineering is not just about writing functional code but about preventing failure, protecting people, and thinking ahead. It’s a mindset I want to carry into every project I work on.

W8: Emotion and Design writing response

I strongly agree with Norman’s idea that beautiful things often appear more usable than others. His argument immediately reminded me of a simple economic distinction: need vs want. The “need” reflects a product’s functionality, while the “want” represents the emotional desire or aesthetic appeal that creates the illusion of greater usability.

A recent experience illustrates this perfectly. My phone’s screen guard and cover had broken, making it look worn out and, to me, almost unusable. I even considered buying a new phone, not because it had stopped working, but because it looked unattractive. However, as soon as I replaced the cover, the phone suddenly felt smooth, neat, and functional again. Nothing changed technically, yet my perception of its usability improved. This small incident made Norman’s point about emotional design feel remarkably real. It underscores how positive affect can shape our judgment of an object’s performance.

This also made me wonder: why do we, as humans, lean so strongly toward attractiveness over function? Is it instinctive, a natural response to seek pleasure in what pleases the eye? Consider the popular Longchamp tote bags that have taken over university campuses. They are stylish and easily recognisable, yet lack practical compartments, making it difficult to organise essentials like a laptop or documents. Despite this, they remain a trend. Perhaps this reflects what Norman calls the emotional pull of design: we forgive functional flaws when an object evokes a certain feeling or identity.

Yet, aesthetics are subjective; what one finds beautiful, another may not. This raises an important question for designers: how should one balance usability with aesthetics when beauty itself cannot be universally defined? Norman suggests that effective design lies in harmonising both, where aesthetic pleasure enhances, but does not replace, functionality. Maybe it is acceptable, to some degree, for design to create an illusion of usability through beauty, as long as that illusion inspires engagement rather than deception.

In the end, I believe the power of design lies in its ability to connect both heart and mind to make people feel good while helping them do well. Beauty without function is momentary, but function without beauty rarely delights. The challenge, as Norman describes, is to design for both.