Week 10: Musical Instrument

Group members: Genesis & Nikita

Concept:

video

“WALL-E’s Violin” is inspired by the character WALL-E from the Pixar film of the same name. The idea behind “WALL-E’s Violin” is to mimic WALL-E expressing emotion through music, specifically through a violin — a symbol of sadness and nostalgia. A cardboard model of WALL-E is built with an ultrasonic distance sensor acting as his eyes. Cardboard cutouts of a violin and bow are placed in front of WALL-E to represent him holding and playing the instrument.

As the user moves the cardboard stick (the “bow”) left and right in front of WALL-E’s “eyes” (the ultrasonic sensor), the sensor measures the distance to the bow. Based on the distance readings, different predefined musical notes are played through a piezo buzzer. These notes are chosen for their slightly melancholic tones, capturing the emotional essence of WALL-E’s character.

To add a layer of animation and personality, a push-button is included. When pressed, it activates two servo motors attached to WALL-E’s sides, making his cardboard arms move as if he is emotionally waving or playing along. The mechanical hum of the servos blends into the soundscape, enhancing the feeling of a robot expressing music not just through tones, but with motion — like a heartfelt performance from a lonely machine.

How it Works:

  • Ultrasonic Sensor: Acts as WALL-E’s eyes. It continuously measures the distance between the robot and the cardboard bow moved in front of it.
  • Note Mapping: The Arduino interprets the measured distances and maps them to specific, predefined musical notes. These notes are chosen to sound melancholic and emotional, fitting WALL-E’s character.
  • Piezo Buzzer: Plays the corresponding musical notes as determined by the distance readings. As the user sways the bow, the pitch changes in real time, simulating a violin being played.
  • Servo Motors + Button: Two servo motors are connected to WALL-E’s arms. When the button is pressed, they animate his arms in a waving or playing motion. The sound of the servos adds a robotic texture to the performance, further humanizing the character and blending physical movement with audio expression.

Code:

#include <Servo.h>
#include <math.h>

// Pin Assignments

const int trigPin   = 12;   // Ultrasonic sensor trigger
const int echoPin   = 11;   // Ultrasonic sensor echo
const int buzzerPin = 8;    // Buzzer output
const int buttonPin = 2;    // Button input for servo tap
const int servoPin  = 9;    // First servo control pin (default starts at 0°)
const int servoPin2 = 7;    // Second servo control pin (default starts at 180°)

Servo myServo;    // First servo instance
Servo myServo2;   // Second servo instance

// Bittersweet Melody Settings

int melody[] = {
  294, 294, 370, 392, 440, 392, 370, 294   // D4, D4, F#4, G4, A4, G4, F#4, D4
};
const int notesCount = sizeof(melody) / sizeof(melody[0]);


// Sensor to Note Mapping Parameters

const int sensorMin = 2;     // minimum distance (cm)
const int sensorMax = 30;    // max distance (cm)
int currentNoteIndex = 0;    // current index into melody[]

// Vibrato Parameters 

const float vibratoFrequency = 4.0;  // vibrato rate in Hz
const int vibratoDepth = 2;          // vibrato depth in Hz (+/- around the base note)

// Timer for vibrato modulation

unsigned long noteStartTime = 0;

void setup() {
  // Initialize sensor pins
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);

  // Initialize output pins
  pinMode(buzzerPin, OUTPUT);
  pinMode(buttonPin, INPUT_PULLUP);

  // Initialize the first servo and set it to 0°.
  myServo.attach(servoPin);
  myServo.write(0);

  // Initialize the second servo and set it to 180°.
  myServo2.attach(servoPin2);
  myServo2.write(180);

  // Initialize serial communication for debugging
  Serial.begin(9600);

  // Get the initial note based on the sensor reading
  currentNoteIndex = getNoteIndex();
  noteStartTime = millis();
}

void loop() {
  // Get the new note index from the sensor
  int newIndex = getNoteIndex();

  // Only update the note if the sensor reading maps to a different index
  if (newIndex != currentNoteIndex) {
    currentNoteIndex = newIndex;
    noteStartTime = millis();
    Serial.print("New Note Index: ");
    Serial.println(currentNoteIndex);
  }

  // Apply vibrato to the current note.
  unsigned long elapsed = millis() - noteStartTime;
  float t = elapsed / 1000.0;  // Time in seconds for the sine function
  int baseFreq = melody[currentNoteIndex];
  int modulatedFreq = baseFreq + int(vibratoDepth * sin(2.0 * PI * vibratoFrequency * t));

  // Output the modulated tone
  tone(buzzerPin, modulatedFreq);

  // Check the button press to trigger both servos (movement in opposite directions).
  if (digitalRead(buttonPin) == LOW) {
    tapServos();
  }

  delay(10);
}

// getNoteIndex() measures the sensor distance and maps it to a note index.
int getNoteIndex() {
  int distance = measureDistance();
// Map the distance (between sensorMin and sensorMax) to the range of indices of the array (0 to notesCount-1).
  int index = map(distance, sensorMin, sensorMax, 0, notesCount - 1);
  index = constrain(index, 0, notesCount - 1);
  return index;
}

// measureDistance() triggers the ultrasonic sensor and calculates the distance in cm.
int measureDistance() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2;
  return constrain(distance, sensorMin, sensorMax);
}

// tapServos() performs a tap action on both servos:
// - The first servo moves from 0° to 90° and returns to 0°.
// - The second servo moves from 180° to 90° and returns to 180°.
void tapServos() {
  myServo.write(90);
  myServo2.write(90);
  delay(200);
  myServo.write(0);
  myServo2.write(180);
  delay(200);
}

 

Circuit:

Future Improvements:

    • Add more notes or an entire scale for more musical complexity (sketched below).
    • Add another servo to move WALL-E’s head for extra animation.
    • Use a better speaker for higher-quality sound output.
    • Add an LCD screen to display messages and build out the character.
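
As a quick sketch of the first idea, the eight-note motif could be swapped for a full scale while keeping getNoteIndex() and the mapping code unchanged. The note choices below (a D natural minor run) are only an example, not part of the original build:

// Hypothetical extension: a full D minor scale instead of the 8-note motif.
// notesCount still adapts automatically via sizeof().
int melody[] = {
  294, 330, 349, 392, 440, 466, 523, 587   // D4, E4, F4, G4, A4, Bb4, C5, D5
};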

 

Week 3: Generative artwork using Object-Oriented Programming.

Concept

Prior to this assignment being due, I was going through some of my childhood storybooks, and that’s when The Pied Piper of Hamelin inspired me to create this artwork using OOP. In the original story, the Piper plays his magical tune and all the mice follow him (and later, all the children of the town as well).

Pied Piper Illustration

But I wanted to keep things happy and use just the part where the mice follow him, in my preferred aesthetic: a night sky. In my version, a glowing dot represents the Piper, and mice randomly appear and start following it. But they’re not just mindless followers; if you move your mouse, you can actually distract them!

When the mice get close to the Piper, they start glowing, like they’re enchanted by some unseen force. To make things even more atmospheric, the background keeps shifting, creating a dreamy, night-sky effect.
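
The artwork itself runs as a p5.js sketch, which isn’t reproduced here. Purely to illustrate the object-oriented structure described above, here is a graphics-free C++ analogue; the class shape, radii, and speed values are illustrative assumptions rather than the sketch’s actual code:

// Graphics-free analogue of the sketch's object structure (illustrative only).
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec { float x, y; };

static float dist(Vec a, Vec b) {
  return std::sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y));
}

class MouseFollower {
public:
  Vec pos;
  bool glowing = false;

  explicit MouseFollower(Vec start) : pos(start) {}

  // Each frame: steer toward the cursor if it is close enough to distract,
  // otherwise toward the piper; glow when near the piper.
  void update(Vec piper, Vec cursor) {
    const float distractRadius = 80.0f;   // assumed distraction range
    const float glowRadius     = 40.0f;   // assumed "enchantment" range
    const float speed          = 2.0f;

    Vec target = (dist(pos, cursor) < distractRadius) ? cursor : piper;
    float d = dist(pos, target);
    if (d > 1.0f) {                       // step a fixed distance toward the target
      pos.x += speed * (target.x - pos.x) / d;
      pos.y += speed * (target.y - pos.y) / d;
    }
    glowing = dist(pos, piper) < glowRadius;
  }
};

int main() {
  Vec piper  = {200, 100};
  Vec cursor = {50, 50};
  std::vector<MouseFollower> mice = {MouseFollower({0, 0}), MouseFollower({300, 200})};

  for (int frame = 0; frame < 100; ++frame)
    for (auto& m : mice) m.update(piper, cursor);

  for (const auto& m : mice)
    std::printf("mouse at (%.1f, %.1f) glowing=%d\n", m.pos.x, m.pos.y, m.glowing);
  return 0;
}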

The Artwork

Week 10

const int ldrPin = A0;         // LDR connected to analog pin A0
const int buttonPin = 2;       // Button connected to digital pin 2
const int speakerPin = 9;      // Speaker connected to digital pin 9
const int ledPin = 13;         // LED connected to pin 13

// Dramatically different frequencies (non-musical)
int notes[] = {100, 300, 600, 900, 1200, 2000, 3000};

void setup() {
  pinMode(buttonPin, INPUT);         // HIGH when pressed (assumes an external pull-down resistor)
  pinMode(speakerPin, OUTPUT);     
  pinMode(ledPin, OUTPUT);         
  Serial.begin(9600);              
}

void loop() {
  int buttonState = digitalRead(buttonPin); // Read the button

  if (buttonState == HIGH) {
    int lightLevel = analogRead(ldrPin);         // Read LDR
    int noteIndex = map(lightLevel, 0, 1023, 6, 0); // Bright = low note
    noteIndex = constrain(noteIndex, 0, 6);      // Keep within range
    int frequency = notes[noteIndex];            // Pick frequency

    tone(speakerPin, frequency);                 // Play note
    digitalWrite(ledPin, HIGH);                  // LED on

    Serial.print("Light: ");
    Serial.print(lightLevel);
    Serial.print(" | Frequency: ");
    Serial.println(frequency);
  } else {
    noTone(speakerPin);            // No sound
    digitalWrite(ledPin, LOW);     // LED off
  }

  delay(100);
}

Week 10: Follow-up article

Going through the responses to Victor’s rant, I found it interesting how many people agreed with the idea that touchscreens are a dead end, but still struggled to imagine what a better alternative would look like. It’s almost like we all know something is missing, but we’re too deep inside the current system to clearly picture a different path. I noticed some people pointed to things like haptic feedback or VR as potential improvements, but even those seem to stay within the same basic mindset — still about looking at and manipulating screens, just in fancier ways. It made me realize how hard it is to break out of an existing mental model once it becomes the norm.

What also stood out to me is that a lot of the responses weren’t dismissive or defensive — they were actually pretty hopeful. Even though Victor’s tone was a bit harsh in the original rant, the responses seemed to take it as a genuine challenge rather than just criticism. That feels important, because it shows that many designers do want to think bigger; they just need help finding new tools or ways of thinking. It made me think that maybe progress in interaction design isn’t about inventing some magic new device overnight, but about slowly shifting how we even define interaction in the first place.

Week 10: A Brief Rant on the Future of Interaction Design

As I was reading A Brief Rant on the Future of Interaction Design, what really struck me was how much we’ve just accepted the idea that “the future” means shiny screens everywhere. Bret Victor makes a strong point that even though technology looks cooler and sleeker, the way we interact with it hasn’t fundamentally changed — it’s still just tapping and swiping on glass. It’s kind of depressing when you think about it, because the excitement around “new technology” mostly ignores the fact that humans are physical, three-dimensional beings. We have hands that are capable of so much subtlety, but all we do is poke at flat rectangles. Victor’s frustration feels justified — it’s like we’ve totally surrendered to convenience at the cost of creativity and human potential.

At the same time, I found myself wondering: is it really fair to expect interaction design to be radically different when so much of our world (work, entertainment, communication) has moved into the digital space? Maybe part of the reason we keep using screens is because they’re the simplest way to deal with abstract information. But still, Victor’s examples of more tactile, nuanced designs made me realize we’re probably limiting ourselves by not even trying to imagine alternatives. It’s like we’re stuck optimizing what already exists instead of exploring what could be fundamentally new. After reading this, I feel like a good interaction designer shouldn’t just make apps easier to use, but should rethink what “using” even means.

Week 10 — Assignment

Concept

Our musical instrument is a combination of a mechanical and a digital noise machine, controlled with a surprisingly satisfying potentiometer that modifies the beats per minute (BPM) and a button that changes the octave of the instrument. We had fun with the relationship between the buzzer and the mechanical servo that produces the other noise: the two are inversely proportional! In other words, as the BPM of the buzzer increases, the servo slows down, and vice versa.

Figuring out the right code combination was a bit tricky at first, as we initially made the mistake of nesting both the servo trigger and the buzzer tone in the same beat conditional. This meant we fundamentally could not separate the motor from the buzzer tones. To resolve this, we ended up using two independent timers: one for the buzzer and one for the motor.

The code below shows how we resolved this problem. Once we implemented the fix, we were able to celebrate with our somewhat perplexing instrument, which asks the user to complement the buzzer’s beeping with the servo’s mechanical rhythm.

// --- 3. Perform Actions based on Timers ---

// --- Action Group 1: Metronome Click (Buzzer) ---
if (currentMillis - previousBeatMillis >= beatInterval) {
  previousBeatMillis = currentMillis; // Store the time of this beat

  // Play the click sound
  tone(BUZZER_PIN, currentFreq, CLICK_DUR);

  // Print beat info
  Serial.print("Beat! BPM: ");
  Serial.print(currentBPM);
  Serial.print(" | Freq: ");
  Serial.print(currentFreq);
  Serial.print(" | Beat Interval: ");
  Serial.print(beatInterval);
  Serial.println("ms");
}

// --- Action Group 2: Servo Movement Trigger ---
if (currentMillis - previousServoMillis >= servoInterval) {
  previousServoMillis = currentMillis; // Store the time of this servo trigger

  // Determine Target Angle
  int targetAngle;
  if (servoMovingToEnd) {
    targetAngle = SERVO_POS_END;
  } else {
    targetAngle = SERVO_POS_START;
  }

  // Tell the servo to move (NON-BLOCKING)
  myServo.write(targetAngle);

  // Print servo info
  Serial.print("Servo Trigger! Target: ");
  Serial.print(targetAngle);
  Serial.print("deg | Servo Interval: ");
  Serial.print(servoInterval);
  Serial.println("ms");

  // Update state for the next servo trigger
  servoMovingToEnd = !servoMovingToEnd; // Flip the direction for next time
}
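
For context, the excerpt above assumes pin constants, globals, and setup code that the post doesn’t show. Below is a minimal sketch of what that scaffolding might look like; the pin numbers, BPM range, octave-button behavior, and the exact inverse mapping between beat and servo intervals are assumptions for illustration, not necessarily the group’s actual values:

#include <Servo.h>

// Assumed pin assignments (hypothetical, not from the original post)
const int BUZZER_PIN = 8;
const int POT_PIN    = A0;   // potentiometer controlling BPM
const int BUTTON_PIN = 2;    // octave button
const int SERVO_PIN  = 9;

const int CLICK_DUR       = 30;   // buzzer click length (ms)
const int SERVO_POS_START = 0;
const int SERVO_POS_END   = 90;

Servo myServo;

unsigned long previousBeatMillis  = 0;
unsigned long previousServoMillis = 0;
unsigned long beatInterval  = 500;   // ms between buzzer clicks
unsigned long servoInterval = 500;   // ms between servo triggers
bool servoMovingToEnd = true;
int currentBPM  = 120;
int currentFreq = 440;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  myServo.attach(SERVO_PIN);
  Serial.begin(9600);
}

void loop() {
  unsigned long currentMillis = millis();

  // --- 1. Read the potentiometer and derive the BPM ---
  currentBPM   = map(analogRead(POT_PIN), 0, 1023, 60, 240);
  beatInterval = 60000UL / currentBPM;

  // --- 2. Invert the relationship for the servo ---
  // As the buzzer speeds up, the servo triggers less often (slows down).
  servoInterval = map(currentBPM, 60, 240, 250, 1000);

  // Octave button (assumed behavior): double the click frequency while held.
  currentFreq = (digitalRead(BUTTON_PIN) == LOW) ? 880 : 440;

  // --- 3. Perform Actions based on Timers ---
  // (the two action groups from the excerpt above go here, unchanged)
}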

Reading Response 7 – Future of Interaction Design (Week 10)

Reading 1: A Brief Rant on the Future of Interaction Design
Bret Victor’s “A Brief Rant on the Future of Interaction Design” challenges the prevailing notion that touchscreens represent the pinnacle of user interface design. He critiques the “Pictures Under Glass” paradigm, highlighting how it neglects the tactile and manipulative capabilities of human hands. Victor emphasizes that our hands are not just for pointing and tapping but are essential tools for feeling and manipulating our environment. He argues that current interfaces fail to leverage these capabilities, leading to a diminished user experience. This perspective prompts a reevaluation of how we design technology, suggesting that future interfaces should engage more deeply with our physical senses to create more intuitive and effective tools.

Reading 2: Follow-Up
In the follow-up responses to his rant, Victor addresses common criticisms and clarifies his intentions. He acknowledges that his piece was meant to highlight a problem rather than provide a solution, aiming to inspire further research into more tactile and dynamic interfaces. Victor compares the current state of technology to early black-and-white photography—revolutionary at the time but lacking in certain dimensions. He encourages exploration into areas like deformable materials and haptic holography, emphasizing the need for interfaces that can be seen, felt, and manipulated. I feel like this response reinforces the idea that while current technologies have their merits, there is significant room for innovation that more fully engages our human capabilities.

Week 10 – Musical Instrument

Concept:

We decided to use two digital switches (push switches) to play two different notes, with an LDR effectively acting as an analogue volume control. The video demonstrates how the LDR changes the volume: if you listen closely, when the light reaching the LDR decreases, only a very faint sound remains.

Demo (Circuit):

Demo (Video):

Arduino Code:

// Define pins for the buttons and the speaker
int btnOnePin = 2;
int btnTwoPin = 3;
int speakerPin = 10;

void setup() {
  // Initialize both button pins as inputs with built-in pull-up resistors
  pinMode(btnOnePin, INPUT_PULLUP);
  pinMode(btnTwoPin, INPUT_PULLUP);
  
  // Configure the speaker pin as an output
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  // Check if the first button is pressed
  if (digitalRead(btnOnePin) == LOW) {
    tone(speakerPin, 262);  // Play a tone at 262 Hz
  }
  // Check if the second button is pressed
  else if (digitalRead(btnTwoPin) == LOW) {
    tone(speakerPin, 530);  // Play a tone at 530 Hz
  }
  // No button is pressed
  else {
    noTone(speakerPin);     // Turn off the speaker
  }
}

Challenges:

The initial concept started out with a Light Dependent Resistor (LDR) and the piezo speaker/buzzer alone. We faced issues because the readings from the LDR did not behave as expected, the sound produced was inconsistent, and the resulting change in the music was not adequate.

We also faced challenges with the programming, as the noise production was inconsistent. We fixed this by adjusting the mapping of the notes to produce more distinct frequencies for each push button (red vs. yellow): 262 Hz and 530 Hz respectively.

Done by: Zayed Alsuwaidi (za2256) and Zein Mukhanov (zm2199)

Week 10 – Musical Instrument – Mary had a Little Piano

Groupmates: Liya and Shamma

CONCEPT and WORKING:

For our group assignment, we built a simple digital musical instrument using an Arduino board. Our project was inspired by the children’s song Mary Had a Little Lamb, and we recreated the melody using four push buttons for four of the notes, C, D, E and G respectively. Each button acts as a digital sensor, triggering specific musical notes when pressed. The tones are played through a mini speaker connected to the Arduino, allowing the tune to be heard clearly. This created a very basic piano-style interface, with each button mapped to one of the notes in the song.

In addition to the buttons, we used a potentiometer as our analog sensor. This allowed us to control the frequency of the sound in real time. As the knob is turned, the pitch of the notes shifts, giving the player the ability to customize how the melody sounds. We mapped the potentiometer to an offset of 500–1000 Hz, which is added to each note’s base frequency. It made the experience more interactive and demonstrated how analog inputs can add expressive control to digital systems. We also added labels to the buttons so that it would be easier to play the music.

Videos of testing:

(Sorry in advance, I’m not a singer, I just needed to show the song)

https://youtube.com/shorts/6OhDN7k7KAc?si=WBNaPPqKeTkQCIui

https://youtube.com/shorts/vH0wLT3W5Jk?si=3K7I34cpMZWwC9Ly

The code:

const int buttonPins[4] = {3, 5, 8, 9}; // buttons 

//frequency for each button 
int frequencies[4] = {262, 293, 330, 392}; // C, D, E, G notes
int potValue = 0; //to store potentiometer value

void setup() {
  //initialising buttons pins
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
  
  //for debugging
  Serial.begin(9600);
  
  //speaker pin for output
  pinMode(12, OUTPUT);
}

void loop() {
  //read the potentiometer value
  potValue = analogRead(A0);
  
  //map the potentiometer value to a frequency range 500-1000
  int adjustedFrequency = map(potValue, 0, 1023, 500, 1000);
  
  //for button and the corresponding note
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == LOW) { // Button is pressed (LOW because of INPUT_PULLUP)
      tone(12, frequencies[i] + adjustedFrequency);
      Serial.print("Button ");
      Serial.print(i+1);
      Serial.print(" pressed. Frequency: ");
      Serial.println(frequencies[i] + adjustedFrequency); //serial monitor 
      delay(200); // brief hold so each press registers as one distinct note
    }
  }
  
  //to stop the tone when buttons aren't being pressed
  if (digitalRead(buttonPins[0]) == HIGH && digitalRead(buttonPins[1]) == HIGH &&
      digitalRead(buttonPins[2]) == HIGH && digitalRead(buttonPins[3]) == HIGH) {
    noTone(12);
  }
}

PROBLEMS / IMPROVEMENTS:

As for problems or challenges, we didn’t really have any specific issues with the circuit beyond a few loose wires, which were fixed after debugging and checking again. Something we learned from working together is that having two different perspectives helps a lot in solving problems and finding ideas.

We see a lot of potential for expanding this project in the future. One idea is to add a distance sensor to control volume based on hand proximity, making it even more dynamic. Another would be adding LEDs that light up with each button press to provide a visual cue for the notes being played. We’re also considering increasing the number of buttons to allow more complex songs, and possibly adding a recording function so users can capture and replay their melodies.
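
As a quick sketch of the LED idea, assuming one LED (plus resistor) per button on hypothetical pins 4–7 and the buttonPins[] array from the sketch above, only a few extra lines would be needed:

// Hypothetical extension: LEDs that light with each button press.
const int ledPins[4] = {4, 5, 6, 7};  // assumed pins, one LED per button

void setupLeds() {                    // call once from setup()
  for (int i = 0; i < 4; i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void updateLeds() {                   // call at the top of loop()
  for (int i = 0; i < 4; i++) {
    // Buttons use INPUT_PULLUP, so LOW means pressed.
    digitalWrite(ledPins[i], digitalRead(buttonPins[i]) == LOW ? HIGH : LOW);
  }
}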

It was a fun and educational project that helped us better understand the relationship between hardware inputs and interactive sound output. It was exciting to bring a classic tune to life through code, sensors, and a mini speaker!