Raya Tabassum: Digital Piano with Percussion Effects

Concept:

The Digital Piano with Percussion Effects is an innovative musical instrument that blends traditional piano elements with modern sensor technology to create a unique and interactive musical experience. This project uses an array of digital push buttons connected to an Arduino board to simulate a piano keyboard, where 8 different buttons trigger 8 distinct musical notes (‘C D E F G A B C’, i.e. C4 up to C5, or ‘Sa Re Ga Ma Pa Dha Ni Sa’). In addition to the keyboard, the instrument incorporates an ultrasonic distance sensor, which introduces a dynamic layer of percussion sounds that vary with the distance of the player’s hand. Furthermore, a potentiometer is integrated to alter the pitch of the notes dynamically, offering the ability to manipulate the sound palette expressively.

Components Used:

  • Arduino Uno
  • Breadboard (x2)
  • Jumper Wires
  • Piezo Buzzer (x2)
  • Push Buttons (x8)
  • Potentiometer
  • 10k ohm resistors (x8)
  • Ultrasonic Sensor

Video:

Code:

int buzzerPin = 12;
int buzzer2 = 13;
int potPin = A0;
int keys[] = {2, 3, 4, 5, 6, 7, 8, 9};
// Frequencies for notes (C4 to C5)
int notes[] = {262, 294, 330, 349, 392, 440, 494, 523}; 
int trigPin = 10;
int echoPin = 11;
// Percussion tone frequencies (Hz), selected by hand distance
int bassDrum = 200; 
int snare = 250; 
int hiHat = 300;


void setup() {
  pinMode(buzzerPin, OUTPUT);
  pinMode(buzzer2, OUTPUT);
  // Configure the eight button pins (2-9) as inputs
  for (int i = 0; i < 8; i++) {
    pinMode(keys[i], INPUT);
  }
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  int potValue = analogRead(potPin);
  int volume = map(potValue, 0, 1023, 0, 255); // Map the potentiometer value to a volume range (currently unused; the pitch below is derived from potValue directly)

  // Measure distance using the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2; // Convert echo time to cm (sound travels ~0.034 cm/us; halve for the round trip)

  Serial.print(distance);
  Serial.println(" cm");


  bool isAnyButtonPressed = false;
  for (int i = 0; i < 8; i++) {
    // Shift the note between half and double its base frequency based on the pot
    int modifiedNote = map(potValue, 0, 1023, notes[i] / 2, notes[i] * 2);
    if (digitalRead(keys[i]) == HIGH) {
      tone(buzzerPin, modifiedNote, 100);
      isAnyButtonPressed = true;
      break; // Stop the loop once a pressed button is found
    }
  }

  if (!isAnyButtonPressed) {
    noTone(buzzerPin);
  }
  if (distance < 10) {
    tone(buzzer2, bassDrum, 100);
  } else if (distance >= 10 && distance < 20) {
    tone(buzzer2, snare, 100);
  } else if (distance >= 20 && distance < 30) {
    tone(buzzer2, hiHat, 100);
  } else {
    noTone(buzzer2);
  }
  delay(100);
}

In the loop, the program first reads the potentiometer value and uses it to modify the frequency of the piano notes. Depending on the button pressed, it plays a modified note frequency. If no buttons are pressed, it stops any ongoing tone. Depending on the distance detected, it chooses a percussion sound to play, simulating a drum kit with different sounds for different ranges.

Work Process and Challenges: 

For this assignment I was paired with Snehil. We brainstormed the primary idea together: at first we only planned to build the digital piano with push buttons, and then we added the ultrasonic sensor to measure distance and trigger percussion sounds, leveling up the musical instrument. As a further improvement, Snehil got the potentiometer working to change the pitch of the notes. We initially had trouble getting the for loop to make the push buttons play sound. We also started out using only one piezo for the whole instrument, and the piano didn’t work that way, so we had to troubleshoot many times to fix those problems.

Final instrument with the potentiometer working:

Reading response / Final proposal – Hamdah Alsuwaidi

Reading response:

The narrative woven through the author’s critique and subsequent responses to reader feedback forms a cohesive argument for a transformative approach to interaction design. At the core of this discussion is a pointed critique of the reliance on touchscreens, or what the author refers to as “Pictures Under Glass.” This approach, the author argues, falls short of engaging the full spectrum of human abilities, especially the tactile and manipulative capacities intrinsic to the human experience.

Bringing expertise from a background in designing future interfaces, the author offers a credible perspective on the limitations of current technological trends and makes a clarion call for a reevaluation of our design ethos. The focus is on creating interfaces that not only address problems but also resonate with the rich physical faculties innate to users, suggesting that tools should amplify human capabilities and serve as an extension of human experience rather than its reduction.

The response to critiques emphasizes that the author’s original “rant” was not a dismissal of current devices like iPhones or iPads but a push for acknowledging their current utility while recognizing the imperative for future developments to transcend these existing paradigms. The author envisages a future where technology could tangibly represent and manipulate the environment in ways yet to be discovered, leveraging long-term research.

Voice interfaces, while useful, are seen as insufficient for fostering the depth of creativity and understanding that comes from a more tactile and hands-on interaction. Similarly, gestural interfaces and brain interfaces that circumvent the full use of the body are met with skepticism. The author stresses the need for interfaces that don’t make us sedentary or immobile, advocating against creating a future where interaction is disembodied.

In a profound reflection on the accessibility and apparent simplicity of touchscreens, the author warns against mistaking ease of use for comprehensive engagement, noting that while a two-year-old can swipe a touchscreen, such interaction doesn’t fully capitalize on adult capabilities.

Combining these perspectives, the author’s message is a wake-up call for designers, technologists, and visionaries: an urging to reimagine interaction with the digital world in a way that brings our physical interaction with technology into harmonious synergy, maximizing rather than diluting our human potential. This combined narrative is not just a response to current technological trends but a hopeful blueprint for a future of interaction design that is as rich and complex as the capabilities it aims to serve.

Final proposal:

Concept:
The Interactive Digital Jukebox merges the nostalgic allure of classic jukeboxes with the functionalities of modern digital technology. Utilizing an Arduino Uno for hardware control and p5.js for interactive web-based visualizations, this project aims to deliver a user-friendly music player that offers both physical and web interfaces for song selection, volume control, and music visualization.

Project Description:
The Interactive Digital Jukebox allows users to select and play music tracks from a stored library, offering real-time audio visualizations and track information through a p5.js interface. The system combines simple physical components for user interaction with sophisticated web-based graphics, creating a multifaceted entertainment device.

Components:
– SD Card Module: For storage and retrieval of MP3 music files.
– LCD Screen: To display track information directly on the device.
– Push Buttons: For navigating the music library and controlling playback.
– Speakers/Headphone Jack: To output audio.
– Potentiometer: To manually adjust the volume.

Functionality:
1. Music Playback: Music files stored on the SD card will be accessed and controlled via Arduino, with support for basic functions like play, pause, skip, and back.
2. User Interface: The p5.js interface will display a list of tracks, currently playing song details, and interactive playback controls.
3. Visual Feedback: The web interface will include dynamic visualizations corresponding to the music’s audio features, such as volume and frequency spectrum.
4. Physical Controls: Physical interactions will be enabled through push buttons or a touch screen connected to the Arduino.
5. Volume Control: A potentiometer will allow users to adjust the sound volume, with changes reflected both physically and on the web interface.

Implementation Steps:
1. SD Card Reader Setup: Install the SD card module to the Arduino and load it with MP3 files. Program the Arduino to use an MP3 decoder library for playback control.
2. Control Interface Development: Configure the input mechanisms (buttons, potentiometer) and ensure their functional integrity for user interaction.
3. Web Interface Creation: Develop the visual and interactive components of the p5.js interface, including song listings, playback controls, and visualization areas.
4. Audio Visualization Integration: Utilize the `p5.sound` library to analyze audio signals and generate corresponding visual effects on the web interface.
5. System Integration: Establish robust communication between the Arduino and the p5.js application, likely via serial communication (a minimal sketch of the Arduino side follows below).
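
To make step 5 concrete, here is a minimal sketch of what the Arduino side of that serial link could look like. The pin assignments and the one-line "BTN:"/"VOL:" message format are illustrative assumptions, not a finished design; the p5.js sketch would parse these lines (e.g. via the p5.serialport library), and the real firmware would also need the MP3 decoder logic.

// Hypothetical sketch of the Arduino side of the serial protocol.
// Pin numbers and the message format are assumptions for illustration.
const int playPin = 2;
const int nextPin = 3;
const int prevPin = 4;
const int volPin = A0;

int lastVol = -1;

void setup() {
  pinMode(playPin, INPUT);
  pinMode(nextPin, INPUT);
  pinMode(prevPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Report button presses as one-line messages (simple polling, no debounce shown)
  if (digitalRead(playPin) == HIGH) Serial.println("BTN:PLAY");
  if (digitalRead(nextPin) == HIGH) Serial.println("BTN:NEXT");
  if (digitalRead(prevPin) == HIGH) Serial.println("BTN:PREV");

  // Report the volume knob only when it moves noticeably
  int vol = map(analogRead(volPin), 0, 1023, 0, 100);
  if (abs(vol - lastVol) > 2) {
    Serial.print("VOL:");
    Serial.println(vol);
    lastVol = vol;
  }
  delay(50);
}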

By bringing together the tactile satisfaction of traditional jukeboxes with the visual appeal and interactivity of modern interfaces, this project stands to offer not just a practical entertainment solution but also an educational tool that demonstrates the integration of different technologies. It promises to be an excellent demonstration of skills in both electronics and software development, appealing to a broad audience and fostering a deeper appreciation for the fusion of old and new media technologies.


Reading Reflection – Week 11 – Saeed Lootah

A Brief Rant on The Future of Interactive Design

Firstly, it’s clear that the author has strong feelings about the visions people are suggesting for the future of technology, but I felt that the author described what he felt should change clearly. I personally really liked the example of a hammer, where the handle is made for the human and the other end is the actual tool. It’s a simple object, but it made me realize the importance of human-centered design. The author placed a large emphasis on hands and, more broadly, the body. I liked the distinction he made between touchscreen devices and how each is just a “picture under glass,” meaning there is no tactile feedback based on what part of the screen you are pressing or what program you are using. A personal anecdote of mine is that touch-typing on a laptop or any physical keyboard is much, much easier than on a phone, for example. On a phone it’s hard to tell which key you are pressing, and often the only way of getting it right is by sheer luck. While he did not give any specific solution (which was a common response to the article), it was still necessary to outline the problem no matter how obvious it seemed.

It’s a common trope that in the future we will interact with holograms, swiping or touching imaginary screens in the air that we can only see while wearing some kind of glasses, which is similar to what Apple is attempting with the Apple Vision Pro. While it makes a lot of sense, I’m not sure how it can be made more tactile. For one, there has to be a sense of weight when interacting with something to make it more intuitive, and some kind of feedback at our fingertips. I imagine one way of providing tactile feedback at our fingertips could be a glove which magically gives feedback (maybe in the future the solution will be obvious, but right now it’s magic, at least for me). In any case, I found the reading interesting, and there is a lot to digest and consider despite its conciseness (I think that’s a word).

Final Project Proposal – ArSL to Arabic Translation System

Growing up, I always experienced a communication barrier with my grandfather’s brother, who is hard of hearing. At family gatherings, only a select few who understand Arabic Sign Language (ArSL) could successfully communicate with him. This situation has been frustrating, as he has many adventures and stories that remain unshared and misunderstood by most of our family.

While there are systems available that translate American Sign Language (ASL) into English, the representation of Arabic in the technological domain of sign language translation is lacking. This disparity not only limits communication within diverse linguistic communities but also highlights the urgent need for inclusive technology that bridges linguistic and sensory gaps.

My goal is to develop a real-time translation system for Arabic Sign Language using pose estimation combined with proximity sensing, enabling direct communication for ArSL users by translating their sign language into written Arabic. It would be ideal to use machine learning models that specialize in pose estimation, but I would need to do more research.

Final Project Idea “Love Level” – Shaikha AlKaabi

Project Idea: Love Level Game


The Love Sensor game is an interactive experience designed for two players, each placing one hand on a heart rate monitor. This innovative game uses heart rate data to measure and display the level of affection or excitement between the participants. The faster the heartbeats, the higher the presumed love connection. 

How It Works: 

  1. Heart Rate Monitoring: Each player places a hand on the heart rate monitor, which captures their pulse in real time.
  2. Data Processing: An Arduino board processes the heart rate data, assessing the intensity and changes in each player’s heartbeat (see the sketch after this list).
  3. Dynamic Feedback: The processed data is then sent to a computer where p5.js generates visual and auditory outputs. These outputs are designed to respond dynamically to the heart rate data. For instance, visuals will grow more vibrant as the players’ heart rates increase, symbolizing a stronger connection. Audio feedback may also escalate, creating a rich sensory environment that reflects the intensity of the game.
  4. Interactive Experience: The game aims to create a playful and engaging experience that not only entertains but also visually and audibly illustrates the emotional bond between the players, making it a fascinating exploration of human interactions.
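
As a rough illustration of steps 1 and 2, here is a minimal sketch of the Arduino side, assuming two analog pulse sensors (PulseSensor-style boards on A0 and A1) and a simple threshold-crossing beat detector. The threshold value and the one-line serial format read by p5.js are assumptions; a real build would likely use the sensor vendor's library for more robust beat detection.

// Hypothetical sketch: estimate each player's BPM and send it to p5.js.
// Pins, THRESHOLD, and the "BPM1:<a> BPM2:<b>" format are assumptions.
const int sensor1 = A0;
const int sensor2 = A1;
const int THRESHOLD = 550;  // raw ADC level treated as a beat

bool above1 = false, above2 = false;
unsigned long lastBeat1 = 0, lastBeat2 = 0;
int bpm1 = 0, bpm2 = 0;

void setup() {
  Serial.begin(9600);
}

void checkBeat(int pin, bool *above, unsigned long *lastBeat, int *bpm) {
  int raw = analogRead(pin);
  if (!*above && raw > THRESHOLD) {  // rising edge = one beat
    unsigned long now = millis();
    if (*lastBeat > 0 && now - *lastBeat > 250) {  // ignore implausibly short intervals
      *bpm = 60000 / (now - *lastBeat);            // beats per minute from the interval
    }
    *lastBeat = now;
    *above = true;
  } else if (*above && raw < THRESHOLD - 50) {  // hysteresis on the way down
    *above = false;
  }
}

void loop() {
  checkBeat(sensor1, &above1, &lastBeat1, &bpm1);
  checkBeat(sensor2, &above2, &lastBeat2, &bpm2);
  Serial.print("BPM1:");
  Serial.print(bpm1);
  Serial.print(" BPM2:");
  Serial.println(bpm2);
  delay(20);
}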


Week 11 Reading – Jihad Jammal

Reflecting on the author’s critique of what is dubbed “Pictures Under Glass,” I find a compelling challenge to our current acceptance of interface norms. The disparity highlighted between the tactile richness of our everyday physical interactions and the simplified, touch-based interfaces that dominate our devices urges us to question whether we are curbing our innovative potential. By adhering so strictly to touchscreen technology, we risk neglecting developments in interfaces that could better harness human physicality and sensory feedback.

This critique serves as a springboard for considering alternative interaction paradigms that prioritize enhancing our sensory and physical engagement with technology. While touchscreens undoubtedly provide accessibility and ease, there’s a significant opportunity to explore more immersive and intuitive interfaces. Such technologies would not only extend our capabilities but also deepen our interaction with the digital world, suggesting a future where technology complements rather than constrains our interactions with our environment. This shift could pave the way for more meaningful and natural user experiences that align closely with human behavior and physicality.

Arduino-Powered Joystick Controller Concept

The project’s primary goal is to design and construct a custom joystick game controller using Arduino. This controller will enhance player interaction with a browser-based video game developed in p5.js, offering a more intuitive and engaging gameplay experience compared to standard keyboard or mouse controls.

Controller Features:
  1. Joystick for Directional Control:
    • The joystick will allow players to control movement within the game with greater precision and fluidity. It will capture analog input to provide a smoother response compared to digital keys.
  2. Tactile Feedback Buttons:
    • Additional buttons will be incorporated for game-specific actions (e.g., jumping, shooting). These will use tactile switches to give a satisfying click feel, ensuring immediate and comfortable response.
  3. Vibration Feedback:
    • Incorporate vibration motors that activate during specific game events, such as collisions or other significant interactions, to provide physical feedback that enhances the immersion.
  4. LED Feedback System:
    • LEDs will be programmed to react to different situations in the game, such as game alerts, low health, or achievement notifications, adding another layer of feedback and interaction.
Technical Components:
  • Arduino Board: Acts as the central processing unit for the joystick controller, reading inputs from the joystick and buttons, and managing outputs like LED indicators and vibration motors.
  • Analog Joystick Module: Captures detailed directional inputs from the player, translating them into game movements (a minimal reading sketch follows this list).
  • Vibration Motors: To provide immediate tactile feedback during critical game events, enhancing the physical experience of the gameplay.
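
As a starting point for the firmware, here is a minimal sketch that reads a common two-axis analog joystick module and streams its state to the p5.js game over serial. The pin choices (VRx on A0, VRy on A1, the stick's push switch on pin 2) and the comma-separated message format are assumptions; vibration motors and LEDs are left out for brevity.

// Hypothetical sketch: read a two-axis joystick and send "x,y,button" lines.
// Pin choices and the CSV message format are assumptions for illustration.
const int xPin = A0;
const int yPin = A1;
const int buttonPin = 2;  // joystick push switch, wired with a pull-up

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);  // switch reads LOW when pressed
  Serial.begin(9600);
}

void loop() {
  // Map raw 0-1023 readings to -100..100, centered on the stick's rest position
  int x = map(analogRead(xPin), 0, 1023, -100, 100);
  int y = map(analogRead(yPin), 0, 1023, -100, 100);
  int pressed = (digitalRead(buttonPin) == LOW) ? 1 : 0;

  Serial.print(x);
  Serial.print(",");
  Serial.print(y);
  Serial.print(",");
  Serial.println(pressed);
  delay(20);  // roughly 50 updates per second
}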

Week 11_Music – Saeed, Khalifa, Jihad

Concept:

The concept of this project was to create a mini drum pad, or what is equivalent to one, with the hardware we have available. The device would use buttons to trigger different buzzer sounds, mimicking the functionality of a traditional drum pad. Each button on the device would correspond to a different sound, with the frequency of these sounds adjustable via a potentiometer. This allows the user to modify the pitch of the tones.

Code:

// Defining pins assignments for buttons and buzzers
const int buttonPin1 = 2;
const int buttonPin2 = 3;
const int buttonPin3 = 4;
// Coded with the Aid of ChatGPT
const int buttonPin4 = 5; // Monitoring and playback button
// Coded with the Aid of ChatGPT
const int buzzerPin1 = 8;
const int buzzerPin2 = 9;
const int buzzerPin3 = 10;
const int potPin = A0; // Potentiometer connected to A0 for frequency control

// Variables to manage button states and debounce timing
int buttonState1 = 0;
int lastButtonState1 = 0;
int buttonState2 = 0;
int lastButtonState2 = 0;
int buttonState3 = 0;
int lastButtonState3 = 0;
int buttonState4 = 0;
int lastButtonState4 = 0;

unsigned long lastDebounceTime1 = 0;
unsigned long lastDebounceTime2 = 0;
unsigned long lastDebounceTime3 = 0;
unsigned long lastDebounceTime4 = 0;
unsigned long debounceDelay = 50; // Debounce delay in milliseconds

// Struct to hold buzzer activation data including the pin and frequency
struct BuzzerAction {
  int buzzerPin;
  int frequency;
};

// Coded with the Aid of ChatGPT
BuzzerAction record[100]; // Array to store each buzzer activation
int recordIndex = 0; // Index for recording array
//Coded with the Aid of ChatGPT

void setup() {
  // Initialize all button and buzzer pins
  pinMode(buttonPin1, INPUT);
  pinMode(buttonPin2, INPUT);
  pinMode(buttonPin3, INPUT);
// Coded with the Aid of ChatGPT
  pinMode(buttonPin4, INPUT);
// Coded with the Aid of ChatGPT
  pinMode(buzzerPin1, OUTPUT);
  pinMode(buzzerPin2, OUTPUT);
  pinMode(buzzerPin3, OUTPUT);
  pinMode(potPin, INPUT); // Set up the potentiometer pin as an input
}

void loop() {
  // Reads current state of buttons
  int reading1 = digitalRead(buttonPin1);
  int reading2 = digitalRead(buttonPin2);
  int reading3 = digitalRead(buttonPin3);
// Coded with the Aid of ChatGPT
  int reading4 = digitalRead(buttonPin4);
// Coded with the Aid of ChatGPT
  int potValue = analogRead(potPin); // Reads potentiometer value
  int frequency = map(potValue, 0, 1023, 200, 2000); // Maps potentiometer value to frequency range

  // Handle button 1 press and recording
  debounceAndRecord(reading1, &lastButtonState1, &buttonState1, &lastDebounceTime1, buzzerPin1, frequency);

  // Handle button 2 press and recording
  debounceAndRecord(reading2, &lastButtonState2, &buttonState2, &lastDebounceTime2, buzzerPin2, frequency);

  // Handle button 3 press and recording
  debounceAndRecord(reading3, &lastButtonState3, &buttonState3, &lastDebounceTime3, buzzerPin3, frequency);

  // Handles button 4 for playback
  if (reading4 != lastButtonState4) {
    lastDebounceTime4 = millis();
  }
  if ((millis() - lastDebounceTime4) > debounceDelay) {
    if (reading4 != buttonState4) {
      buttonState4 = reading4;
      if (buttonState4 == HIGH) {
        for (int i = 0; i < recordIndex; i++) {
          // Play each recorded buzzer action with the specific frequency recorded
          tone(record[i].buzzerPin, record[i].frequency, 200);
          delay(250); // Short delay between each buzzer action for clarity
        }
        recordIndex = 0; // Resets record index after playback
      }
    }
  }

  // Update last button states for next loop iteration
  lastButtonState1 = reading1;
  lastButtonState2 = reading2;
  lastButtonState3 = reading3;
  lastButtonState4 = reading4;
}
// Coded with the Aid of ChatGPT
// Function to handle button debouncing and recording buzzer actions
void debounceAndRecord(int reading, int *lastButtonState, int *buttonState, unsigned long *lastDebounceTime, int buzzerPin, int frequency) {
  if (reading != *lastButtonState) {
    *lastDebounceTime = millis(); // Reset debounce timer
  }
  if ((millis() - *lastDebounceTime) > debounceDelay) {
    if (reading != *buttonState) {
      *buttonState = reading; // Updates button state
      if (*buttonState == HIGH && recordIndex < sizeof(record) / sizeof(record[0])) {
        record[recordIndex++] = {buzzerPin, frequency}; // Records the buzzer activation
        tone(buzzerPin, frequency, 200); // Plays buzzer at recorded frequency
      }
    }
  }
  *lastButtonState = reading; // Updates last button state for debouncing
// Coded with the Aid of ChatGPT
}


Hardware Configuration: The system is designed with four button inputs and three buzzer outputs. Additionally, a potentiometer is used to control the frequency of the buzzer sounds.

Button Functionality: Buttons 1 to 3 are connected to buzzers and are responsible for triggering sounds with variable frequencies determined by the potentiometer. Button 4 is designated for playback. It plays back a sequence of sounds that have been recorded based on earlier interactions with buttons 1 to 3.

Frequency Control: The frequency of the sounds is dynamically adjusted using a potentiometer. The analog value from the potentiometer is mapped to a specified frequency range (200 Hz to 2000 Hz), which determines how the buzzers sound.

Debouncing: To ensure reliable button press detection without noise interference, the code implements debouncing logic. This involves measuring the time since the last button state change and updating the state only if this interval exceeds a predefined threshold (50 milliseconds).

Recording and Playback (Aided by ChatGPT)

Recording: When a button (1 to 3) is pressed, the action (which buzzer is activated and at what frequency) is recorded in an array. This includes storing both the pin of the buzzer and the frequency at which it was activated.

Playback: When button 4 is pressed, the system iterates over the recorded actions and plays them sequentially. Each action triggers the corresponding buzzer to sound at the recorded frequency for a short duration.

Loop and Functions: The main loop continuously checks the state of each button and the potentiometer, updating the frequency accordingly. A helper function, debounceAndRecord, is used to handle the logic for both debouncing and recording the buzzer actions associated with each button press.


Video of Project:

Reflection and ideas for future work or improvements:

Integrating a small display screen would significantly improve the device’s functionality, further enhancing the project. This screen would provide real-time visual feedback on button presses and frequency outputs, allow users to scroll through and select different sounds or presets, and serve as a simple interface for directly programming the device. The potential for further development holds exciting prospects: the integration of a display screen and the addition of more customizable buttons are immediate steps that would enhance the device’s usability and versatility. Further innovations could include wireless connectivity for easy integration with music production software, or the addition of sensors to enable gesture-based controls, which would offer an even more dynamic and immersive user experience. Several key insights stand out on reflecting on what this project taught us. First, the practical challenges of hardware interfacing taught us the importance of robust design and a solid plan for achieving it. There is also a need for effective wire management and physical housing to enhance the device’s durability and aesthetics.

Week 11 Reading Response

A Brief Rant is a thought-out argument for changing our perspective on our current approach to interaction design, calling out the need for other mediums. It focuses on the limits of our current technology and the implications they have for us.

The analogy holds: just as a children’s book would deprive adults of the complexity of English vocabulary, touchscreens restrict interaction design and our capabilities.

In considering another real-world example, my mind goes to VR and AR. Although these technologies have advanced considerably in the past couple of years, there’s still lots of room for growth in terms of interaction design. Picture this: VR systems enhanced with haptic feedback technology, enabling far deeper immersion.

The other reading also highlights how adeptly we use our hands in everyday tasks, like opening jars and making sandwiches, contrasting this natural interaction with the detached experience of using flat screens. It questions why we settle for interfaces that don’t harness the richness of human touch and manipulation. The author is calling for an interaction design movement that I do see coming in the next couple of years, hopefully!

Week 11: Musical Instrument – DJ Set [Sarah & Rama]

The concept:

For this week’s assignment Sarah and I created a DJ set! The set is built using two potentiometers, a digital switch, a piezo buzzer, servo motors, and two LEDs (for visual feedback). When connected to power, the set produces a melody that the player (DJ) can manipulate by using the first potentiometer (which controls the pitch) and the second potentiometer (which controls the tempo). The red LED blinks in sync with the tempo set by the second potentiometer. The yellow LED’s brightness is proportional to the pitch (the higher the pitch, the higher the brightness). The servo motor resembles the turntables on an actual DJ set. You can activate it by pressing the digital switch. Upon activation, the servo motor will rotate at a speed equivalent to the beat’s tempo.

Implementation

Effectively, the DJ set plays a sequence of notes in succession, stored in a global 2D array, whose pitch class and beat tempo are controlled by the potentiometers. By switching between tempos and pitch classes, the player generates different tunes and overall musical moods. The player can also switch on the servo motor, whose position incrementally increases at the same rate set by the tempo the player chose. We utilize two Arduinos: one controls the sound manipulation between different tempos and pitches using the potentiometers (as well as the LEDs), and the other controls the servo motor via a switch. The sender program on the first Arduino sends the position of the servo motor, at time intervals corresponding to the current rate set by the tempo, over to the second Arduino using the Inter-Integrated Circuit (I2C) protocol. The receiver program on the second Arduino receives the position and updates the location of the servo, conditional on the button state being on. As the sender program also needed to synchronize sound with the blinking of the LEDs, it was essential to extrapolate the concept of blinking without delay to the playing of tones by the buzzer.
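
The key idea borrowed from "blinking without delay" is to replace delay() with millis() comparisons, so tone playback, LED blinking, and servo updates can all be timed independently in the same loop. Here is a minimal illustration of that pattern applied to a buzzer; the note values, pin, and interval below are placeholders rather than our actual program's values.

// Minimal "blink without delay" pattern applied to tones: advance to the
// next note only when enough time has passed, without blocking the loop.
const int buzzer = 11;                // placeholder buzzer pin
int melody[] = {262, 330, 392, 523};  // placeholder note frequencies (Hz)
const int numNotes = 4;
unsigned long noteInterval = 250;     // ms per note (would come from the tempo pot)
unsigned long lastNoteTime = 0;
int noteIndex = 0;

void setup() {
  pinMode(buzzer, OUTPUT);
}

void loop() {
  unsigned long now = millis();
  if (now - lastNoteTime >= noteInterval) {
    lastNoteTime = now;
    tone(buzzer, melody[noteIndex], noteInterval - 50); // small gap between notes
    noteIndex = (noteIndex + 1) % numNotes;
  }
  // other non-blocking work (LED blink, servo position updates) runs here every pass
}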

Code Snippets

The sender Arduino program reads the two potentiometer values, using the first value to control the pitch (by mapping the value to the range 0-3, corresponding to the range of sequences in the global notes array) and the second to control the tempo/rate of playing a note. The first value is used to index the 2D array, notes[][], which stores the sequence of notes in different pitches; it also sets the brightness of the yellow LED. The second is used to index the duration[] array, which specifies the rate at which notes are played. The notes in notes[potVal1] are played in succession, with the next note in the sequence playing once the specified duration has passed. The state of the red LED is updated to blink according to the rate of the current melody. Finally, the position of the servo motor is also updated and sent to the second Arduino program.

  note++; // cycle through the sequence
  if (note >= 5) {
    note = 0; // Reset note index
  }
  // direction goes forward if position is at 0
  if (position <= 0) {
    servoDir = 1;
  }
  // direction goes backward if position is at 160
  else if (position >= 160) {
    servoDir = -1;
  }
  position = (position + servoDir * 20) % 180;
  Wire.beginTransmission(8); // begin transmission to receiver
  Wire.write(position);      // send over the position of the servo motor
  Wire.endTransmission();    // stop transmitting
  Wire.requestFrom(8, 6);    // request 6 bytes from receiver device #8
  delay(1);
  }
}

The receiver Arduino program receives the position from the sender and updates the position if the switch state is set to 1 (the switch has been pressed to turn on the motor). Since the transmission occurs at a rate equivalent to the rate of the music, the motor will move according to the current tempo of the beat.

#include <Wire.h>
#include <Servo.h>

Servo servo;              // initialize servo
int x = 0;                // variable storing the transmitted data
int position = 0;         // position of the servo motor
const int switchPin = 7;  // switch pin
int buttonState = 0;      // current button state
int prevButtonState = 0;  // previous button state
bool servoMove = false;   // keeps track of the switch state

void setup() {
  Serial.begin(9600);
  pinMode(switchPin, INPUT);
  servo.attach(9);
  Wire.begin(8);
  Wire.onReceive(receiveEvent); // event triggered on receipt of data
  Wire.onRequest(requestEvent); // event triggered on a request for data
}

void receiveEvent(int bytes) {
  x = Wire.read();
  // validate received data
  if (x >= 0 && x <= 180) {
    position = x; // x directly maps to servo position
  }
}

void requestEvent() {
  Wire.write("Hello ");
}

void loop() {
  buttonState = digitalRead(switchPin);
  // toggle the switch state and use it to determine whether the motor should move
  if (buttonState == HIGH && prevButtonState == LOW) {
    servoMove = !servoMove;
  }
  // smoothly move the servo towards the desired position
  if (servoMove) {
    if (position != servo.read()) {
      if (position > servo.read()) {
        servo.write(servo.read() + 1);
      } else {
        servo.write(servo.read() - 1);
      }
      delay(1);
    }
  }
  prevButtonState = buttonState;
}

Circuit Schematic

Here’s a schematic of our circuitry:

Have a look:


Reflections and Challenges

One of the things we struggled with, largely due to both of us lacking knowledge of musical composition, was choosing the right sequence of tones to generate a coherent melody. With a combination of trial and error and some research, we found a suitable sequence. We also faced the challenge of one of our LEDs not turning on when we wired the servo motor into the same circuit. Instead of adding a second power source for the servo motor, we opted to utilize I2C since we had an additional Arduino, which proved to be a useful exercise. Overall, we were happy with the final product, but I think it would be nice to extend this project further and give the player a little more creative control, as the current setup is quite restrictive in terms of the final melodies the player can generate. For instance, we could add more buzzers, each producing a different tone and controlled by switches, so that users could make their own tunes from scratch over a set beat (something like a MIDI controller?).