Final Project

1. Concept

Duet Dance is an interactive, dual-player dance-themed survival game that blends physical input, digital animation, and music into a single cohesive experience. The project uses Arduino as the physical interface and p5.js as the visual environment, creating a tangible game where players control two dancing characters using a physical rotating disc.

Instead of conventional keyboard or mouse input, movement is driven entirely by a hardware wheel connected to a potentiometer. As the wheel rotates, Arduino measures the analogue values, converts them into angle readings, and sends them to p5.js through serial communication. These readings position two characters on a circular track, mirroring a ballroom “duet dance.”

During gameplay, petals fall from above, and the player must rotate the wheel to prevent the dancers from colliding with falling objects. The game combines:

  • Physical movement (wheel rotation)
  • Audio reactivity (LEDs flashing on musical peaks)
  • Visual animation (dancers rotating gracefully)
  • Survival gameplay

This creates a playful duet between physical interaction and digital performance.

2. Images of the Project

Figure 1: User interface of the project

Figure 2: Circuit of the project

3. Schematic

4. User testing videos

final-user-testing

5. Implementation

Interaction:

The core principle behind Duet Dance is physically embodied interaction. Instead of pressing buttons, players rotate a disc, mimicking the motion of turning a vinyl record or spinning a dance partner. This physicality makes the movement more intuitive, rhythmic, and engaging.

  • User rotates the wheel
  • Arduino reads potentiometer
  • p5.js moves the dancers
  • Falling petals appear
  • LEDs respond to music beats
  • Player attempts to survive as long as possible

Arduino:

On the hardware side, Arduino handles:

  • Input: Potentiometer reading (0–1023)
  • Output: Red and green LEDs toggled according to music beats
  • Serial Communication: Continuous two-way messaging with p5.js

The Arduino handles three primary responsibilities within the system. First, it continuously reads the analog input from the potentiometer and sends the corresponding values to the p5.js sketch, ensuring smooth and accurate control of the rotation mechanic. Second, it receives beat indicators from p5.js, sent as a 0 or 1. It uses these signals to toggle between the red and green LEDs, creating a synchronized lighting effect that reacts to the music. Finally, it maintains a consistent serial handshake with p5.js, ensuring stable, real-time communication between the physical hardware and the digital interface.

// pin assignments and LED state (pin numbers here are assumed; adjust to your wiring)
const int redLedPin = 2;
const int greenLedPin = 3;
bool currentColor = false;

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(redLedPin, OUTPUT);
  pinMode(greenLedPin, OUTPUT);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // blink while waiting for serial data
    Serial.println("0");             // send a starting message
    delay(300);                      // wait 300 ms
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing anything
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // LED on while receiving data

    int beat = Serial.parseInt();    // 0 or 1
    if (Serial.read() == '\n') {
      if (beat == 1) {
        currentColor = !currentColor; // toggle LED state on each beat
      }
      digitalWrite(redLedPin, currentColor ? LOW : HIGH);
      digitalWrite(greenLedPin, currentColor ? HIGH : LOW);

      int sensor = analogRead(A0);   // read the potentiometer
      delay(5);
      Serial.println(sensor);        // send the reading back to p5.js
    }
  }
}

Github Link

P5.js:

The p5.js sketch is responsible for managing the entire visual and interactive experience of the game. It handles the visual display, including the background, dancers, and falling petals, while also performing FFT analysis and peak detection to synchronize elements of the game with the music.

Core game logic such as the start screen, active gameplay, and game-over state is controlled within the sketch, alongside character rotation driven by real-time data received from the Arduino. The system also performs collision detection, updates the score, and sends beat signals back to the Arduino to trigger LED responses.
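The collision detection mentioned above reduces to a circle-circle overlap test between each dancer and each petal. A minimal sketch in plain JavaScript (the function name and values are illustrative, not taken from the actual sketch):

```javascript
// Two circles overlap when the distance between their centres is less than
// the sum of their radii; comparing squared distances avoids a sqrt() call.
function collides(ax, ay, ar, bx, by, br) {
  const dx = ax - bx;
  const dy = ay - by;
  return dx * dx + dy * dy < (ar + br) * (ar + br);
}
```

In the game loop this would run once per petal against each dancer's current position on the track.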

Several key mechanics shape the gameplay: the potentiometer value is mapped from 0 to 1023 to a full 0 to 2π rotation for smooth and fast circular motion; the two characters are placed directly opposite each other at 180 degrees; new falling objects spawn every set interval; collisions with petals end the game while successful dodging increases the score; and the LEDs blink in response to musical peaks detected through FFT analysis.
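The first two mechanics, the 0-1023 to 0-2π mapping and the 180-degree offset, can be sketched in plain JavaScript (p5's TWO_PI is written out, and the function names are mine, not from the actual sketch):

```javascript
// Map a 0-1023 potentiometer reading to an angle, and place the partner
// directly opposite on the circular track.
function dancerAngles(potValue) {
  const TWO_PI = 2 * Math.PI;
  const angle1 = (potValue / 1023) * TWO_PI;  // full turn of the pot = full circle
  const angle2 = (angle1 + Math.PI) % TWO_PI; // partner is 180 degrees away
  return [angle1, angle2];
}

// Convert a track angle into canvas coordinates around a centre point.
function onTrack(angle, cx, cy, r) {
  return { x: cx + r * Math.cos(angle), y: cy + r * Math.sin(angle) };
}
```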

if (!port.opened()) {
  text("Disconnected - press space to connect", 250, 30);
} else {
  ////////////////////////////////////
  // READ FROM ARDUINO HERE
  ////////////////////////////////////
  let data = port.readUntil("\n");
  if (data.length > 0) {
    // we received a complete line; split the message
    let fromArduino = split(trim(data), ",");
    if (fromArduino.length == 1) {
      angle = int(fromArduino[0]);
    }

    //////////////////////////////////
    // SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = (sendBeat ? 1 : 0) + "\n";
    port.write(sendToArduino);
  }
  sendBeat = false;
}

Arduino to p5.js Communication

The communication between Arduino and p5.js relies on a continuous serial handshake that enables smooth, real-time interaction. Arduino constantly sends potentiometer readings (formatted as a sensor value followed by "\n") to p5.js, which reads these values using readUntil("\n") to determine the dancers' rotational position. On the other side, p5.js performs FFT-based sound peak detection and sends a beat indicator (either 0 or 1) back to the Arduino. Upon receiving these indicators, the Arduino responds by toggling the red and green LEDs, creating lighting feedback synchronized with the music. This bidirectional exchange ensures that hardware inputs and digital outputs remain tightly synchronized throughout gameplay.
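The message framing described here is simple enough to sketch as two small helpers (plain JavaScript; the function names are mine, for illustration only):

```javascript
// Arduino -> p5.js: one sensor reading per line, e.g. "512\n".
function parseSensorLine(line) {
  const value = parseInt(line.trim(), 10);
  return Number.isNaN(value) ? null : value; // null if the line was empty or partial
}

// p5.js -> Arduino: a newline-terminated beat flag, "1\n" or "0\n".
function beatMessage(sendBeat) {
  return (sendBeat ? 1 : 0) + "\n";
}
```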

6. Challenges Faced

One of the major challenges in this project was ensuring that the potentiometer values were transmitted from Arduino to p5.js without delay. In early iterations, the rotation on-screen lagged behind the physical movement of the wheel because there was no communication loop, only a conditional statement: the Arduino was sending data inconsistently, and p5.js was not reading each line of serial input efficiently. To solve this, I refined the serial communication structure by implementing a clear, continuous handshake and using a while loop on the Arduino side to avoid partial or broken readings. This created a steady flow of data between the two systems, allowing the dancers' rotation to update smoothly in response to real-time physical movement.

Another challenge involved the physical setup: specifically, securely attaching the potentiometer to the wheel so that its rotation accurately corresponded to what happened on-screen. Initially, the potentiometer would slip and the wheel would rotate independently of it, so the characters did not move at all. I experimented with several mounting methods, including tape, cardboard supports, and temporary adhesives, but none provided the stability needed. Eventually, I created a more secure mechanical design, using wires to attach the potentiometer to the wheel so that the wheel turned the potentiometer directly and consistently. Once this was fixed, the rotation displayed in p5.js finally matched the physical motion of the disc, creating a reliable control mechanism.

A third challenge emerged when integrating the audio analysis into the project. While testing the sound input, I realized that the peak detection in p5.js was not registering any audio spikes using the default settings of p5.PeakDetect(). Even when the music clearly had strong beats, the system failed to detect them, which meant the LEDs were not responding to sound as intended. After investigating further, I found that the default sensitivity and frequency range were too broad for the type of audio being used. To address this, I manually adjusted the parameters and set a more appropriate threshold.

7. Code I’m Proud of:

One part of my code that I am particularly proud of is the implementation of p5.js's sound analysis tools, especially because we never learned these functions in class and I had to explore and understand them independently. I incorporated an FFT (Fast Fourier Transform) object and paired it with a p5.PeakDetect object, which I configured with specific frequency and threshold values to isolate beats in the music track. By linking the FFT to the audio input and using peakDetect.onPeak(triggerBeat), I created a custom callback that activates whenever a peak is detected. Inside the callback, the triggerBeat() function sets sendBeat = true, which then sends a signal to the Arduino to toggle the LEDs in sync with the music. I am proud of this implementation because it demonstrates my ability to extend the project beyond what was taught, integrate real-time audio analysis, and create a visual LED effect from it.

fft = new p5.FFT();
peakDetect = new p5.PeakDetect(50, 150, 0.15);
fft.setInput(song);
peakDetect.onPeak(triggerBeat);

function triggerBeat() {
  sendBeat = true;
}

8. Link to Resources

  1. https://editor.p5js.org/creativecoding/sketches/BkBnH1qlN: This reference helped me understand how FFT and peak detection work in p5.js, allowing me to incorporate real-time music analysis into my project. It guided me in configuring frequency ranges, thresholds, and callbacks to detect beats accurately.
  2. https://www.youtube.com/watch?v=8kGl2Yd-clI: This video was essential for learning how to build a functional steering-wheel mechanism using a potentiometer. It showed how to connect and stabilize the hardware so it accurately reflects movement on-screen.

9. Future Improvements

Looking ahead, one potential improvement for Duet Dance is to introduce multiple difficulty levels that adjust both the speed and frequency of the falling objects. This would make the game more engaging for a wider range of players, from beginners to experienced users, and add an element of progressive challenge that encourages repeated play. Another area for enhancement is the hardware interface itself; currently, the single potentiometer limits the type of movement the player can perform. Incorporating a more advanced input device could allow for smoother, more precise control and even open possibilities for multiplayer interactions, further enriching the physical-digital experience.

10. IM Showcase

The project was presented at the NYU Abu Dhabi Interactive Media End of the Semester Showcase. These are some of the pictures and videos from the exhibition.

user1 user2 user3 user4 user5 user6 user7

Final Project- User Testing

User testing played an important role in identifying both design and technical issues. I began by testing the project myself. Originally, the controller was designed as a steering wheel, but I quickly realised that due to the limited space between the wheel and the box, the movement felt restricted. Replacing the wheel with a flat record-disc style controller made the rotation smoother and more comfortable. This change improved the overall usability before involving any external testers.

For the first round of user testing, the participant was able to understand the gameplay immediately and interact with the disc without any issues. This confirmed that the core interaction was clear and functional.

user 1

The second user test, however, revealed a major issue. At one point in the game, the disc stopped turning, and the player became frustrated since they were aiming for a high score. I later realised that this was due to the physical limitation of the potentiometer, which cannot rotate infinitely. Because there was no visual cue indicating the rotation limit, the user assumed the controller had broken. This highlighted the need to add a clear indicator to show when the disc has reached its endpoint and that the user needs to rotate in the opposite direction.

user2

By the third user test, I explained this limitation to the participant beforehand, and the interaction went smoothly. They played without problems and suggested adding a character, such as a doll or a real figure on the screen, to make the visuals more engaging.

user3

Overall, the user testing process helped refine both the physical design of the controller and the communication within the game interface. The feedback guided improvements that made the interaction more intuitive and reliable.

Final Project Progress

Finalised Concept for the Project

My final project is a dual-player wheel controlled dance survival game that uses both Arduino and p5.js to create a physically interactive digital experience. A physical wheel attached to a rotary encoder or potentiometer acts as the primary input device. When the player rotates the wheel, the Arduino reads real-time angle data and sends it to p5, which controls the horizontal position of two characters represented by circles inside a circular arena.

Random obstacles fall from the top of the p5 canvas, and the player must rotate the wheel to move both characters and avoid collisions. If either character touches a falling object, the game ends.

Arduino Program:

Inputs:

  1. Potentiometer: measures the angle of the physical wheel. Maps the analogue input to degree values.
  2. This value is sent to p5 via Serial.

Outputs:

  1. Red and green LEDs: create the disco lighting effect, activating according to the song's beats.

P5 Program:

    1. Draw two player circles on the arena perimeter or surface.
    2. Spawn falling objects at random intervals and positions.
    3. Detect collisions between objects and characters.
    4. Read wheel angle data from Arduino.
    5. Smoothly rotate the two characters horizontally based on the mapped value.
    6. Track the best score.
    7. Play a song in the background and send instructions to Arduino to light up LEDs according to the beats.
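Step 2 of the list above, spawning and advancing the falling objects, can be sketched like this (plain JavaScript; the interval, speed, and object structure are illustrative assumptions):

```javascript
// Every `spawnEvery` frames, add a petal at a random x; then move all petals
// down by `fallSpeed` and discard the ones that have left the canvas.
function updatePetals(petals, frame, spawnEvery, fallSpeed, canvasW, canvasH) {
  if (frame % spawnEvery === 0) {
    petals.push({ x: Math.random() * canvasW, y: 0 });
  }
  for (const p of petals) p.y += fallSpeed;
  return petals.filter(p => p.y <= canvasH);
}
```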

Assignment Exercises

Exercise 1

For this exercise, we decided to use an ultrasonic sensor because we thought it would be interesting to control the ellipse with our hands, similar to Wii video games, where physical hand movement moves objects on the screen. To proceed with this idea, we first established the connection between the Arduino and p5.js, similar to how we did it in class. Then, to control the x-axis, we read the serial value in p5.js and mapped it from 0 to windowWidth. Another thing we noticed while performing this assignment was that the ellipse's movement was very abrupt, so we used the lerp() function. I had previously used this function for one of the p5 assignments, and I recalled that it can smooth the motion. It generates a value between my last x position and the current measured distance at a fixed interval (0.1 in this case), which ensures the circle moves smoothly. Below I have attached my p5 and Arduino code along with the demonstration video.
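For reference, p5's lerp() is just a linear interpolation; a plain-JavaScript version of the smoothing described above (assuming the same 0.1 factor):

```javascript
// Each frame, move 10% of the remaining distance toward the target,
// which turns abrupt jumps into a smooth ease-in.
function lerp(start, stop, amt) {
  return start + (stop - start) * amt;
}

let x = 0;
const target = 100;
x = lerp(x, target, 0.1); // first step covers 10 units
x = lerp(x, target, 0.1); // next step covers 9 units, and so on
```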

P5.js

Arduino

assignment1-demo

Exercise 2

For the second exercise, we chose to create a visual representation of control passing from one side to the other, inspired by a chess clock, where time shifts between players during play. We designed the LEDs so that as one grows brighter, the other dims, symbolizing the passing of control from one side to the other. We started by establishing the serial connection between Arduino and p5.js, similar to the previous exercise and what we did in class. In p5.js we created a slider ranging from 0 to 255, which determines the brightness of LED 1; for LED 2 we set the inverse of that value, so that as one increased, the other decreased. We continuously sent these two mapped values through serial in the format "value1,value2" and read them on the Arduino side to update the LED brightness using analogWrite. This setup allowed us to control both LEDs simultaneously from the browser and visually see the transition between them. Below are the p5.js and Arduino code along with the demonstration video.
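The inverse pairing and the "value1,value2" message can be sketched in a few lines (plain JavaScript; the function name is mine, for illustration):

```javascript
// Slider 0-255 sets LED 1 directly; LED 2 gets the complement, so total
// brightness stays constant and control visibly "passes" between them.
function ledMessage(slider) {
  const led1 = slider;
  const led2 = 255 - slider;
  return led1 + "," + led2; // sent over serial, parsed by the Arduino
}
```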

P5.js

Arduino

Arduinoexercise2-demo

Exercise 3

To complete this exercise we made changes to the gravity-ball code provided. We first made sure that the LED lights up every time the ball bounces; to do so, we maintained a state variable, setting it to HIGH whenever the ball touches the ground and LOW otherwise. For the second module of this exercise, we decided to reuse our concept from exercise 1, i.e. the ultrasonic sensor, to control the wind acting on the ball. We read the distance from the ultrasonic sensor and set a threshold of 50: if the distance is greater than 50, we set the wind speed to -3, otherwise to 3. This let us push the ball in different directions and control it with our hand. We have provided the p5.js and Arduino code below for your reference.
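The wind rule above amounts to a one-line threshold function (plain JavaScript sketch; the name is mine):

```javascript
// A hand farther than the 50 threshold blows the ball one way, nearer the other.
function windFromDistance(distance) {
  return distance > 50 ? -3 : 3;
}
```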

P5.js

Arduino

exercise3-demo

Week 11: Final Project Idea

For my final project, I am designing a physically interactive dual-player dance survival game that integrates both Arduino and p5.js.

At the core of the interaction is a physical wheel connected to an Arduino. As the player rotates the wheel, the Arduino continuously reads the wheel's angular position, most likely using a potentiometer (the final sensor choice is not yet certain). This sensing becomes the "listening" component of the system. The Arduino then sends this data to p5.js, where it controls the movement of a circle containing two characters.

In the p5 sketch, objects fall randomly from the top of the screen, and the player must rotate the wheel to shift the characters left or right, dodging the falling obstacles. The challenge increases because the wheel controls both characters simultaneously: if either one is hit, the game ends. This creates a dynamic where the player must keep track of two positions at once.

Furthermore, to communicate from p5 to Arduino, I plan on lighting a green LED every time the player successfully dodges an obstacle, and a red LED when the game is over.

Week 11: Reading Reflection

Fashion versus discretion is a central theme in design for disability. Traditionally, assistive products like glasses or hearing aids were often designed to be discreet, hidden away to avoid stigma or social attention. But this reading and my own experience show that disability does not have to mean invisibility or shame.

Having worn glasses almost my entire life, I recall how they were initially seen through the lens of social stigma. Comments like “Oh, she has glasses” or the belief that no one looks beautiful wearing them were common. However, over time, societal attitudes changed, and glasses transformed from a clinical aid to a fashion statement. Through this reading, I realised that this shift is not just about evolving social perspectives but also about the revolutionary change in spectacle design. Modern glasses are so stylish, with diverse frames and colors, that even people without any vision impairment now wear them purely as fashion accessories. This evolution speaks volumes about how disability can be embraced rather than hidden. It exemplifies that disability does not need to equate to discretion, why should we be invisible in our differences?

Moreover, it’s encouraging to see how design progress extends beyond spectacles to products like wireless earphones that made even hearing aids look in style, transforming assistive technology into mainstream accessories.

More than simply designing for disability, companies like Apple have shown how to create products like the iPod that work seamlessly for all users, disabled or not. This approach represents the peak of design philosophy: one that emphasises minimalism, accessibility, and universal appeal without differentiating users by ability.

What I deeply take away from this reading is how disability acts as a powerful force in pushing design boundaries. It challenges conventional ideas and fosters innovation, driving designers to think creatively and inclusively.

W10: Instrument

Inspiration

The xylophone has always fascinated me. I loved watching the vibrant melodies come to life as each bar was tapped. This inspired me to create a digital version using everyday materials, giving the classic xylophone a modern, interactive twist.

Concept

The idea was simple yet playful: use aluminum foil as the xylophone buttons. Each strip of foil represents a note, and tapping on it triggers a sound. To bring in the concept of tuning (something I deeply appreciate from my experience playing the violin), we incorporated a potentiometer. This allows the user to adjust the pitch of each note in real time, just as a musician tunes their instrument before performing. By combining tactile interaction with the flexibility of pitch control, we aimed to create an instrument that feels both familiar and innovative.

 

Code I’m Most Proud Of

int potVal = analogRead(potPin);
float multiplier = map(potVal, 0, 1023, 60, 180) / 100.0;

if (activeIndex >= 0) {
    int freq = int(baseFreqs[activeIndex] * multiplier);
    freq = constrain(freq, 100, 5000);

    tone(buzzerPin, freq);
    delay(50);
} else {
    noTone(buzzerPin);
}

What makes this snippet special is how it turns a simple analog input into musical expression. By mapping the potentiometer to a frequency multiplier, each foil strip produces a different tone that can be adjusted on the fly. Using constrain() ensures the sounds remain within a safe, audible range. It was rewarding to see how these functions, which we learned in class, could be combined to create a tactile, musical experience.
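To make the tuning range concrete, here is the same math in plain JavaScript (mapRange mimics Arduino's integer map() for positive inputs; the function names are mine):

```javascript
// Arduino's map() with truncating integer division, for positive ranges.
function mapRange(v, inMin, inMax, outMin, outMax) {
  return Math.floor((v - inMin) * (outMax - outMin) / (inMax - inMin)) + outMin;
}

// The pot scales the base frequency by 0.60x-1.80x, clamped to 100-5000 Hz.
function tuneFrequency(potVal, baseFreq) {
  const multiplier = mapRange(potVal, 0, 1023, 60, 180) / 100.0;
  const freq = Math.round(baseFreq * multiplier);
  return Math.min(5000, Math.max(100, freq));
}
```

So a 440 Hz bar can be detuned anywhere from 264 Hz up to 792 Hz with the pot.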

Future Improvements

Right now, the instrument plays a sound as long as the foil is touched. In the future, I’d like to add note duration control so that pressing a strip produces a single tone, similar to how a piano note behaves, with a possible fade-out effect when the note ends. This would make the interaction feel more natural and musical.

Another exciting improvement could be a wireless “stick” that triggers the foil strips remotely. This would allow the musician to move freely and perform more expressively, opening up new possibilities for live interaction and playability.

 

W10: Reading Reflection

The Future of Interaction Design

Reading this piece immediately took me back to Steve Jobs' keynote when he unveiled the iPhone and boldly declared that we don't need a stylus, that our fingers are the best pointing device we'll ever have. Jobs' vision, at that time, was revolutionary because it simplified interaction and made technology more accessible. He recognised how naturally intuitive our sense of touch is, the same quality the author values, but he focused on usability rather than physical feel.

While the author criticises “Pictures Under Glass” for robbing us of sensory depth, I see it as a meaningful trade-off. It allowed us to consolidate multiple tools into one, replacing the clutter of physical devices with a single screen that could transform into anything we needed. The flatness of the glass became the canvas of endless interfaces. Even if it dulled the sensation of texture, it heightened the sense of control, mobility, and creative possibility.

That said, I agree that the future can move beyond this limitation. The author's call to embrace our full tactile and bodily potential opens an exciting direction for technology. What if screens could morph in texture, shape, and resistance depending on the app in use: a photo that feels like paper, a drum pad that vibrates? That would merge Jobs' vision of simplicity with the author's longing for physical depth.

Perhaps, then, “Pictures Under Glass” wasn’t the end of interaction design but a stepping stone.

Moving forward from his response to the comments, I really agreed with the author’s take on the “iPad bad” comment. I liked how he clarified that the iPad is actually good for now. It was a revolutionary invention that changed how we interact with technology. But I also agree with his warning that if, twenty years from now, all we have is the same flat, glassy device with minor haptic improvements, then it would be bad. His comparison to black-and-white film before color photography made a lot of sense to me. It’s a reminder that innovation should keep evolving rather than settling for what feels advanced in the moment.

W9: Assignment

Concept

Parking lots can often be a frustrating experience, especially when it’s hard to tell whether a spot is free or occupied without driving around aimlessly. I wanted to create a simple, interactive system using Arduino that mimics real-world parking indicators: a yellow light that changes brightness when a car is moving in or out, and a red light that turns on when a spot is occupied. This way, drivers can quickly see which spots are available and which are taken, making the parking process smoother and more intuitive.

Implementation

To achieve this, I used an ultrasonic sensor to detect the movement of cars. The sensor works by sending out a pulse from the trigger pin, which bounces off an object and returns to the echo pin. The Arduino then calculates the distance based on the time it takes for the pulse to return. I mapped this distance to the brightness of a yellow LED, so that the closer a car gets to the parking spot, the brighter the yellow light becomes. A slide switch allows us to manually indicate when a car is parked: flipping the switch turns on a red LED and turns off the yellow light, clearly showing that the spot is occupied. Two 330-ohm resistors ensure the LEDs operate safely without drawing too much current.
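The two conversions in that description, echo time to distance and distance to brightness, can be sketched in plain JavaScript (the 100 cm sensing range and function names are my assumptions, not from the project code):

```javascript
// Sound travels ~0.0343 cm per microsecond; halve it for the round trip.
function echoToDistanceCm(durationMicros) {
  return durationMicros * 0.0343 / 2.0;
}

// Closer car -> brighter yellow LED (0-255), clamped to an assumed 100 cm range.
function yellowBrightness(distanceCm, maxCm = 100) {
  const d = Math.min(Math.max(distanceCm, 0), maxCm);
  return Math.round(255 * (1 - d / maxCm));
}
```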

cardemo

Code I’m proud of

// Trigger pulse
digitalWrite(trigPin, LOW);
delayMicroseconds(2);
digitalWrite(trigPin, HIGH);
delayMicroseconds(10);
digitalWrite(trigPin, LOW);

// Read echo
duration = pulseIn(echoPin, HIGH);
distance = duration * 0.0343 / 2.0;

I’m particularly proud of the code I wrote for this project. Writing it taught me a lot about how ultrasonic sensors work and how to use the trigger and echo functionality effectively.

Future Developments

For future development, the system could be expanded to include a green LED, which would light up to indicate available parking spots. In that scenario, the green light would show availability, the yellow LED would indicate movement, and the red LED would signal when a spot is taken. Eventually, this could be automated further so that the sensor alone detects whether a car is parked, eliminating the need for the manual switch. Overall, this project was a great exercise in combining sensors, outputs, and user interaction to create a functional and visually intuitive system.

W9: Reading Reflections

Physical Computing’s Greatest Hits (and misses)

While reading this piece, I found myself fascinated by how imagination can stretch beyond the limits of what we typically perceive as possible. The example of the waves of leaves particularly resonated with me. It was such a beautiful and unexpected way to translate nature into sound and movement. I would have never imagined something like that, yet it reminded me that creativity often begins with seeing the ordinary through a new lens. This concept really reflects what this course encourages us to do: to move beyond traditional boundaries and explore how abstract ideas can become tangible experiences. It even made me think about how we could merge this with technology, perhaps building something like a domino-inspired instrument that creates a tune from a movement.

Another concept that stood out to me was Dance Dance Revolution. I've always loved dancing and even enjoyed playing this type of game in fun zones, where timing and coordination create a sense of both challenge and joy. Reading about it made me think of how such ideas could evolve into more interactive art experiences. We could probably use this concept to build a "twister" game such that every time someone is out, it creates a buzzing noise.

Overall, this reading reminded me that creativity is not confined to art or technology alone, it’s in how we connect both. The examples encouraged me to think more experimentally and to consider how imagination can be designed into playful, sensory experiences that engage both mind and body.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

I completely agree with what the author is saying in this reading. If you are creating an immersive, interactive experience, you need to let the audience truly be part of it: to explore, engage, and form their own interpretations. That process of interaction is what reveals how deeply people are willing to think about your project and how many different meanings it can evoke. Each person’s response becomes part of the artwork itself, showing you perspectives you may never have considered.

An immersive experience, in a way, is like an open-ended question. There can be multiple interpretations, each valid in its own context. You can build theories around what you intend to express, but you should always leave your audience curious about what the ground truth really is. That curiosity is what keeps the experience alive even after the interaction ends. As a creator, you can guide emotions subtly through design and environment, but once you begin instructing the audience, it stops being interactive and becomes prescriptive. True interactivity lies in that delicate balance between guidance and freedom where the audience feels both engaged and uncertain.