Week 10: Musical Instrument

Concept
The concept behind this project is inspired by the theremin, one of the earliest electronic musical instruments. Unlike the traditional theremin that uses electromagnetic fields, this version employs an ultrasonic sensor to detect hand movements and converts them into musical notes from the A minor pentatonic scale.

Schematic

Code snippet
The implementation uses an Arduino with an ultrasonic sensor (HC-SR04) and a piezo buzzer. Here are the key components of the code:

// Frequencies in Hz for the A minor pentatonic scale (A, C, D, E, G), spanning several octaves
const int SCALE[] = {
    147, 165, 196, 220, 262, 294, 330, 392, 440,
    523, 587, 659, 784, 880, 1047, 1175, 1319, 1568,
    1760, 2093, 2349
};

The scale array contains frequencies in Hertz, representing notes in the A minor pentatonic scale, spanning multiple octaves. This creates a musical range that’s both harmonious and forgiving for experimentation.

The system operates in two phases:

Calibration Phase

void calibrateSensor() {
    unsigned long startTime = millis();
    // Sample the sensor for CALIBRATION_TIME (5 seconds) and keep the farthest
    // reading seen; this becomes the playable range of the instrument.
    while (millis() - startTime < CALIBRATION_TIME) {
        int distance = measureDistance();
        maxDistance = max(maxDistance, distance);
    }
}

Performance Phase

void loop() {
    int currentDistance = measureDistance();
    // Map the measured distance onto the frequency range of the scale...
    int mappedNote = map(currentDistance, MIN_DISTANCE, maxDistance, 
                        SCALE[0], SCALE[SCALE_LENGTH - 1]);
    // ...then snap it to the nearest note of the pentatonic scale and play it
    int nearestNote = findNearestNote(mappedNote);
    tone(PIEZO_PIN, nearestNote);
    delay(30);
}
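
The loop above relies on two helper functions that are not shown in the snippet. As a minimal sketch (assuming the HC-SR04's trigger and echo pins are defined as TRIG_PIN and ECHO_PIN, and that SCALE_LENGTH holds the number of entries in SCALE), they might look like this:

int measureDistance() {
    // Trigger the HC-SR04 with a 10 µs pulse, then time the echo
    digitalWrite(TRIG_PIN, LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH);
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN, LOW);
    long duration = pulseIn(ECHO_PIN, HIGH);
    return duration / 29 / 2;  // ~29 µs per cm, halved for the round trip
}

int findNearestNote(int frequency) {
    // Return the scale frequency closest to the mapped value
    int nearest = SCALE[0];
    for (int i = 1; i < SCALE_LENGTH; i++) {
        if (abs(SCALE[i] - frequency) < abs(nearest - frequency)) {
            nearest = SCALE[i];
        }
    }
    return nearest;
}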

Demo

When powered on, the instrument takes 5 seconds to calibrate, determining the maximum distance it will respond to. Moving your hand closer to the sensor produces higher pitches, while moving away produces lower ones. The pentatonic scale ensures that all notes work harmoniously together, making it easier to create pleasing melodies.

Reflection and Future Improvements
Current Limitations:

  • The response time has a slight delay due to sensor readings
  • Sound quality is limited by the piezo buzzer
  • Only supports single notes at a time

Potential Enhancements:

  1. Replace the piezo with a better quality speaker
  2. Add an amplifier circuit for improved sound output
  3. Incorporate multiple sensors for more control dimensions

Week 10: Jingle Bells – Speed Variation

Concept 

In this assignment, I collaborated with @Ruslan; we both love Christmas, and the famous song Jingle Bells brings back memories of those times. We explored various possibilities and decided to create a speed variation of the Jingle Bells melody based on distance.

Here is the demonstration Video:

Schematic 

Here is the schematic for our Arduino connections:

Code:

To implement our idea, we searched for combinations of notes and durations that match the Jingle Bells melody and stored them in arrays. We then wrote code that maps distance to note duration; varying the duration of each note makes the melody seem to play faster or slower. Here is the code:

#include "pitches.h"
#define ARRAY_LENGTH(array) (sizeof(array) / sizeof(array[0]))

// Notes and Durations to match the Jingle Bells 
int JingleBells[] = 
{
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_G4,
  NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4,
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_D4, NOTE_D4, NOTE_E4,
  NOTE_D4, NOTE_G4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_G4,
  NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4,
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_D4, NOTE_D4, NOTE_E4,
  NOTE_D4, NOTE_G4,
};

int JingleBellsDurations[] = {
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4
};

const int echoPin = 7;
const int trigPin = 8;
const int Speaker1 = 2;
const int Speaker2 = 3;
int volume;

void setup() 
{
// Initialize serial communication:
  Serial.begin(9600);
  pinMode(echoPin, INPUT);
  pinMode(trigPin, OUTPUT);
  pinMode(Speaker1,OUTPUT);
}

void loop() 
{
  long duration,Distance;
  
// Distance Sensor reading
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  Distance = microsecondsToCentimeters(duration);

// Map Distance to volume range (0 to 255)
  volume = map(Distance, 0, 100, 0, 255);  
  volume = constrain(volume, 0, 255); 

// Play melody with adjusted volume
  playMelody(Speaker1, JingleBells, JingleBellsDurations, ARRAY_LENGTH(JingleBells), volume);
  
// Debug output to Serial Monitor
  Serial.print("Distance: ");
  Serial.print(Distance);
  Serial.print("    Volume: ");
  Serial.print(volume);
  Serial.println();
}
// Convert the sensor's echo time (microseconds) to centimeters
long microsecondsToCentimeters(long microseconds) 
{
  // Sound travels roughly 29 microseconds per centimeter; divide by 2 for the round trip
  return microseconds / 29 / 2;
}
// PlayMelody function to accept volume and adjust note duration
void playMelody(int pin, int notes[], int durations[], int length, int volume) 
{
  for (int i = 0; i < length; i++) 
  {
// Adjust the note Duration based on the volume
    int noteDuration = (1000 / durations[i]) * (volume / 255.0);  

// Play the note with adjusted Durations
    tone(pin, notes[i], noteDuration);
// Delay to separate the notes
    delay(noteDuration * 1.3);  
    noTone(pin); 
  }
}
Reflections

Reflecting on this project, I learned a lot about working with notes and melodies. I was intrigued by the fact that even complex musical arrangements are made up of simple notes. The song “Jingle Bells” in particular made me appreciate the structure of music on a new level. Each note represents a small part of the song, and adjusting the timing or pitch shapes the whole melody.

Working with @Ruslan made the process even more interesting, as we were both curious and explored various approaches before settling on varying the music’s speed. I hope to continue working with musical notes in future projects.

Week 10: Reading Response

Bret Victor’s “A Brief Rant on the Future of Interaction Design” makes us think about how we describe and imagine “interaction.” Victor criticises the way we design interactions, arguing that we should not limit them to finger swipes on touchscreens but should instead engage our hands and other senses.

He says that one of the main problems with the industry is that it only cares about “pictures under glass,” with touchscreens now used for everything. This approach doesn’t use the full ability of human hands, which can do far more than just swipe or tap. This made me think: how often have I treated touchscreens or buttons as “interactive” features without thinking about how they use, or more importantly limit, our physical abilities?

This interpretation also raises a crucial question: how much have we actually improved the ways in which an “interactive system” gives us feedback when we interact with something? In truth, we are nowhere near creating a meaningful interactive system, largely because we have neglected haptic feedback. Our hands are a collection of many kinds of sensors: heat, pressure, force, electrical, and so on. Although Victor’s ideal is employing the hands in a whole spectrum of natural movements, I think haptic feedback may help guide interaction design going forward.

Finding substitutes for actual haptic input interests me as an engineering student. To replicate physical feedback, I might use motor vibrations, tension-based devices, or even resistance-based sensors. That is why, in my creative switch project, I used a pulley mechanism to lift the connecting switch: it invites the user to engage with an interactive physical system and to feel a sense of ‘weight’.

Week 10 – Reading Response

Believe it or not, I nearly believed that the Magic Ink link would be “super-brief.”

Well, in terms of disillusioning, I guess the readings did a great job. On the other hand, I wasn’t really interested (at least at the beginning) in the topic – it’s obvious from the first paragraph what the reading is up to. My immediate response would be, ‘Okay, maybe that’s not a promising vision, but I still want us to achieve it someday.’ However, beyond the ‘rant’ itself, what intrigued me was the idea of ‘the conventional means of interfacing the brain to the world (i.e., the body).’ Essentially, that was my first impulse to get involved with IM: how are we going to interact (input and output information) as human beings in the future?

I have always told people that I don’t like to read—but I’m forced to, simply because text holds (arguably) the densest information on this planet. That’s in terms of ‘the media’—whatever delivers information. At least for now, judging by the amount of information we can quantify, text still has its edge. (Honestly, it was sad news for me to acknowledge a couple of years ago that music, based on audio (two parameters), and painting, based on images (arguably three parameters?), the art forms I love, have inherent limits to what they can express—even if we haven’t reached those limits yet, or maybe we already have.)

On the other hand, what about humans? I mean, what about our bodies? I would say that people who strive to devise new media for conveying information and people who strive to plunge into our cognitive system are two teams approaching the same theme from two angles. I cannot say whether ‘bypassing’ the body is a good thing or not. But, for now, I would say the body is still a component of what we call ‘human.’

Week 10 – Reading Response

Bret Victor’s “A Brief Rant on the Future of Interaction Design” and its follow-up article present an interesting, if not particularly deep, critique of current interaction design trends. Victor’s main argument focuses on the limitations of “Pictures Under Glass” – flat touchscreen interfaces that dominate modern technology. He contends these interfaces fail to fully utilize our hands’ tactile capabilities, which are adept at feeling and manipulating objects.

While not groundbreaking, Victor’s observation challenges us to think beyond the status quo of interaction design. He makes a valid point about prioritizing tactile feedback in future interfaces, arguing that our sense of touch is fundamental to how we interact with the world.

Victor calls for more research into tangible interfaces, dynamic materials, and haptics, suggesting that truly revolutionary interfaces will come from long-term research efforts rather than incremental improvements to existing technology. This emphasis on pushing boundaries through research is noteworthy, even if not explored in great depth.

The follow-up responses highlight that while devices like the iPad are revolutionary, they shouldn’t be the end goal of interface design. They also suggest that dynamic tactile mediums capable of physically representing almost anything are a worthy aspiration for future interfaces.

Overall, Victor’s message prompts us to consider the full range of human sensory capabilities in developing new interaction models, encouraging us to imagine more intuitive and expressive interfaces.

Week 10: A Brief Rant On The Future Of Interaction Design

The reading by Bret Victor describes how interaction design has shifted towards the digital world rather than focusing on the simplicity of tools and the capabilities of physical actions. Through the concept of Pictures Under Glass, he describes a growing disconnect between how humans interact with activities on screen and in reality. The fundamental features of human hands allow us to perceive the world around us. Following his points about human hands, I fully agree that humans learn and understand the world through our senses. We cannot fully interact with the world using only one of our senses, whether vision, touch, smell, taste, or hearing.

While I agree that technology has benefited society in many ways, I do not want a completely digital world where everything is behind a glass screen, or what Victor calls pictures under glass. I think it’s critical for humans to understand the world around us; otherwise we lose compassion for, and a sense of the worth of, whatever we are interacting with. Without our sense of touch, our capabilities as humans diminish because we cannot grasp or use the tools around us. Likewise, if we don’t physically see an object, it becomes increasingly difficult to learn about it and nearly impossible to appreciate it.

Week 10 – Echoes of Light

Concept

Our project, “Echoes of Light,” which Ibrahim and I created together, emerged from a shared interest in using technology to create expressive, interactive experiences. Inspired by the way sound changes with distance, we aimed to build a musical instrument that would react naturally to light and proximity. By combining a photoresistor and a distance sensor, we crafted an instrument that lets users shape sound through simple gestures, turning basic interactions into an engaging sound experience. This project was not only a creative exploration but also a chance for us to refine our Arduino skills together.

Materials Used

  • Arduino Uno R3
  • Photoresistor: Adjusts volume based on light levels.
  • Ultrasonic Distance Sensor (HC-SR04): Modifies pitch according to distance from an object.
  • Piezo Buzzer/Speaker: Outputs the sound with controlled pitch and volume.
  • LED: Provides an adjustable light source for the photoresistor.
  • Switch: Toggles the LED light on and off.
  • Resistors: For the photoresistor and LED setup.
  • Breadboard and Jumper Wires

Code

The code was designed to control volume and pitch through the analog and digital inputs from the photoresistor and ultrasonic sensor. The complete code below includes clear mappings and debugging lines for easy tracking.

// Define pins for the components
const int trigPin = 5;               // Trigger pin for distance sensor
const int echoPin = 6;              // Echo pin for distance sensor
const int speakerPin = 10;           // Speaker PWM pin (must be a PWM pin for volume control)
const int ledPin = 2;                // LED pin
const int switchPin = 3;             // Switch pin
const int photoResistorPin = A0;     // Photoresistor analog pin

// Variables for storing sensor values
int photoResistorValue = 0;
long duration;
int distance;

void setup() {
  Serial.begin(9600);                // Initialize serial communication for debugging
  pinMode(trigPin, OUTPUT);          // Set trigger pin as output
  pinMode(echoPin, INPUT);           // Set echo pin as input
  pinMode(speakerPin, OUTPUT);       // Set speaker pin as output (PWM)
  pinMode(ledPin, OUTPUT);           // Set LED pin as output
  pinMode(switchPin, INPUT_PULLUP);  // Set switch pin as input with pull-up resistor
}

void loop() {
  // Check if switch is pressed to toggle LED
  if (digitalRead(switchPin) == LOW) {
    digitalWrite(ledPin, HIGH);      // Turn LED on
  } else {
    digitalWrite(ledPin, LOW);       // Turn LED off
  }

  // Read photoresistor value to adjust volume
  photoResistorValue = analogRead(photoResistorPin);
  
  // Map photoresistor value to a range for volume control (0-255 for PWM)
  // Higher light level (LED on) -> lower photoresistor reading -> higher volume
  int volume = map(photoResistorValue, 1023, 0, 0, 255); // Adjust mapping for your setup

  // Measure distance using the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  duration = pulseIn(echoPin, HIGH);
  
  // Calculate distance in cm
  distance = duration * 0.034 / 2;

  // Set frequency based on distance in the range of 2-30 cm
  int frequency = 0;
  if (distance >= 2 && distance <= 30) {
    frequency = map(distance, 1, 100, 20000, 2000); // Closer = higher pitch, farther = lower pitch
    tone(speakerPin, frequency);
    analogWrite(speakerPin, volume);  // Apply the volume based on photoresistor reading
  } else {
    noTone(speakerPin);  // Silence the speaker if the distance is out of range
  }

  // Debugging output
  Serial.print("Photoresistor: ");
  Serial.print(photoResistorValue);
  Serial.print("\tVolume: ");
  Serial.print(volume);
  Serial.print("\tDistance: ");
  Serial.print(distance);
  Serial.print(" cm\tFrequency: ");
  Serial.println(frequency);

  delay(100); // Short delay for sensor readings
}

Video Demonstration

In our video demonstration, we showcase how the instrument responds to changes in light and proximity. We toggle the LED to adjust volume and move a hand closer or farther from the ultrasonic sensor to change pitch, demonstrating the instrument’s sensitivity and interactive potential.

Week 10

Reflections

The project successfully combines multiple sensors to create a reactive sound device. The integration of volume and pitch control allows for intuitive, responsive sound modulation, achieving our goal of designing an engaging, interactive instrument.

Improvements:
To improve this instrument, we would enhance the melody range, creating a more refined and versatile sound experience. This could involve using additional sensors or more sophisticated sound generation methods to provide a broader tonal range and a richer melody.
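
As a rough sketch of what a broader tonal range could look like (not our actual implementation; the note table and the 2–30 cm window below are placeholder choices, with the window borrowed from the range check in our loop()), the distance reading could be quantized onto a small scale instead of sweeping the frequency continuously:

// Hypothetical helper: snap the measured distance onto a C major scale (C4–C5)
const int NOTE_TABLE[] = {262, 294, 330, 349, 392, 440, 494, 523};  // frequencies in Hz
const int NOTE_COUNT = sizeof(NOTE_TABLE) / sizeof(NOTE_TABLE[0]);

int distanceToNote(int distanceCm) {
  // Clamp to the assumed 2–30 cm playing window, then pick the matching note index
  int index = map(constrain(distanceCm, 2, 30), 2, 30, 0, NOTE_COUNT - 1);
  return NOTE_TABLE[index];
}

// Possible use inside loop(): tone(speakerPin, distanceToNote(distance));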

Week 10: Musical Instrument

Concept

For this week’s assignment we spent a lot of time brainstorming ideas and developing different components of the instrument. Originally, we considered a light-sensor-activated lullaby machine (it plays a lullaby at night and an alarm in the morning) but weren’t sure that actually constituted an instrument. Therefore, we decided to keep it more straightforward with an electric keyboard design whose note speed, or “consistency,” can be adjusted.

Materials
  • Arduino Uno
  • Jumper wires
  • 7 button switches
  • Speaker
  • SPST Switch
  • Potentiometer
Schematic

Design

This project’s two main components are 7 buttons to play the notes and a potentiometer to control the speed of the notes playing. The green button to the left (closest to our potentiometer) represents middle C, and each subsequent button plays the next note going up that scale. In our design, we also implemented an SPST switch to control whether or not anything can be played, with the intention of mimicking an electric keyboard’s power button. A unique aspect of our design is that we used both of our Arduino breadboards in order to better organize and manage the aesthetics of the design. Additionally, by still using one actual Arduino Uno we were able to avoid any complications with syncing the information, which was convenient in terms of time efficiency.

On the implementation side, our biggest challenge was getting the volume to change with the adjustments of the potentiometer. After a bit of troubleshooting and heading back to the drawing board, we decided to change our original plan and let the potentiometer control the speed of the notes. We intended to recycle as much code as we could from class and ultimately ended up using components from both Arduino’s tone() and button example codes that were reviewed in class. After switching between trying to change the pitch, speed, and volume of the notes with the potentiometer, we settled on speed simply because we felt that the other adjustments weren’t noticeable enough for our intended purpose.

Code
#include "pitches.h"
 
// button pins
const int B_key = 4; 
const int A_key = 5;
const int G_key = 6; 
const int F_key = 7;
const int E_key = 8;
const int D_key = 9; 
const int C_key = 10; 
 
const int SWITCH_PIN = 12;  // switch pin
const int BUZZER_PIN = 2; // buzzer pin
const int POT_PIN = A0; // potentiometer pin 
 
// notes
int melody[] = {
  NOTE_B5, NOTE_A5, NOTE_G4, NOTE_F4, NOTE_E4, NOTE_D4, NOTE_C4
};
 
// Array for button pins
int buttonPins[] = { A_key, B_key, C_key, D_key, E_key, F_key, G_key };
 
void setup() { // initialization of the buttons, buzzer, and switch pins 
  for (int i = 0; i < 7; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP); // sets button pins as inputs with internal pull-up resistors
  }
 
  pinMode(BUZZER_PIN, OUTPUT); // sets the buzzer pin as an output
  pinMode(SWITCH_PIN, INPUT_PULLUP);  // sets the switch pin as an input with an internal pull-up resistor
}
 
void loop() {
  int switchState = digitalRead(SWITCH_PIN);    // reads the state of the switch
  int potValue = analogRead(POT_PIN);   // reads the potentiometer value (0-1023)
 
  int speedDelay = map(potValue, 0, 1023, 50, 500);  // maps potentiometer value to speed delay range (50-500 ms)
 
  // if the switch is HIGH the button functionality is enabled
  if (switchState == HIGH) {
    // continuously checks each button state
    for (int i = 0; i < 7; i++) {
      int buttonState = digitalRead(buttonPins[i]);
        
      // if the button is pressed,  play the corresponding note
      if (buttonState == LOW) {
        tone(BUZZER_PIN, melody[i], 200); // play note for 200ms
        delay(speedDelay); // speed delay based on the position/ value of the potentiometer 
      }
    }
  }
  delay(50);
}

 

Demo video: Final Project music.mov (Google Drive)

Conclusion and Reflection

All in all, we are both rather pleased with the outcome of this week’s assignment. As our first assignment working with another person, it was an interesting opportunity to compare techniques for simpler things like breadboard organization or schematic design preferences. We were able to use our strengths in different ways and support each other in areas where we were weaker. In terms of future developments, we had a few smaller ideas, such as adding LEDs to correspond with the keys or allowing the user to control the pitch of the notes. Looking at the bigger picture, we even discussed what it would look like to let the user choose what kind of electric keyboard sounds they wanted to play (going beyond the speakers we have). Although our project is rather simple, we were truly tested when it came to debugging our code, but we really appreciated the experience of tackling it together.

Week 10 Reading Response

In the article A Brief Rant on the Future of Interaction Design, Bret Victor critiques the current state of interaction design by highlighting our reliance on “pictures under glass,” that is, screen-based electronic technology. He argues that our overuse of these screens limits the way we interact with the digital world. By using only our fingers to complete so many different tasks, we miss out on tactile experience and ultimately sell ourselves short of the full potential of our hands, which are capable of far more complex and creative things. Furthermore, he envisions a future where technology engages all of our senses and thus feels more real, bridging the gap between our physical existence and digital experience. His perspective is interesting to me because it sheds light on how we as a society are not reaching our full potential, making us creatively lazy. I also really enjoyed his argument because it made me think of the show Black Mirror, which critiques society in a similar manner and provides dystopian snapshots of what we may be on track to become.

The second article is a follow-up piece reflecting on audience responses and Victor’s replies to them. I particularly enjoyed this section because it posed a lot of interesting opinions and arguments that I myself had thought of and agreed with while reading both articles. All in all, I think this gave a unique look at how we must balance the presence of technology in interactive media and design. It is a tool with endless capabilities, but we cannot let the excitement of the unknown limit the potential of what we can still grow into. I hope to apply these principles in my design process as I find ways to combine modern, innovative technology with traditional, reliable physical experiences.
