Week 11: (Pi / Darko) Overengineered Guitar – An Arduino Based Musical Instrument

Introduction

This is the “Overengineered Guitar” by Pi Ko and Darko Skulikj.

We massacred an acoustic guitar, left it with a single string, and turned it into an automated musical device that can be played somewhat like a theremin.

Concept – A Cringy Skit

Darko does not know how to play the Pirates of the Caribbean theme song on guitar, so he decided to turn to Arduino for help.

Demonstration and Media

The video demo of the instrument is shown below.

The sound sucks 😡. Moral of the story : Pi should play the guitar instead of the machine 🫵😠🎸.

Concept

In principle, the “Overengineered Guitar” is a one-stringed acoustic guitar played with an array of sensors and servo motors. It has a digital push-button sensor and an analog ultrasonic sensor. The main idea is to control the musical notes with a hand held over the ultrasonic sensor, while a propeller, controlled through the push button, does the mechanical plucking.

This lets the user play a predefined sequence from “He’s a Pirate” (Pirates of the Caribbean) over and over again.

Components

  • Arduino Uno: Serves as the main controller for all the sensors and actuators used.
  • Servo Motors (5x): Five servo motors are attached along the fretboard, each pressing its respective fret to produce the desired note.
  • Ultrasonic Sensor: Used to toggle the fret presses performed by the servo motors.
  • Digital Pushbutton: Pressed to spin up the propeller, which plucks the string.
  • Propeller on a DC Motor: Provides the mechanical pluck on the guitar string.
  • L293D Motor Driver IC: Handles the high current requirement of the propeller motor.
  • External Power Supply: Ensures that power is properly distributed among the various components without overloading the Arduino.

(Arduino, ultrasonic sensor, switch and L293D IC)

(Propeller on DC Motor)

The servo motors are attached to the Arduino as shown below.

Arduino Pin | Motor / Music Note ID | Open Servo Angle | Press Servo Angle
11          | 5                     | 180              | 0
10          | 4                     | 0                | 180
6           | 3                     | 180              | 0
5           | 2                     | 180              | 0
3           | 1                     | 180              | 0
N/A         | 0 (open string)       | N/A (no servo)   | N/A (no servo)

Challenges

Our main challenge was managing power to ensure that each component received the required current. The wiring was also a challenge, since there were a lot of wires.

(Wire management 😎)

Task Allocation

Darko took care of the wiring and the code for the ultrasonic sensor and the switch, using non-blocking code. Pi handled the rest.

Code

The code is below. It has debouncing to guarantee reliable operation of the switch.

#include <Servo.h>

// Define a struct to hold servo data
struct ServoData {
  int pin;
  int openAngle;
  int pressAngle;
  Servo servo;
};

// Create an array of servos for each note
ServoData servos[] = {
  {3, 180, 0}, // Note 1
  {5, 180, 0}, // Note 2
  {6, 180, 0}, // Note 3
  {10, 0, 180}, // Note 4
  {11, 180, 0} // Note 5
};
const int numServos = sizeof(servos) / sizeof(ServoData);

// Note durations in milliseconds and the corresponding note sequence
// (0 = open string, 1-5 = fretted notes pressed by the servos)
int noteDurations[] = {500, 500, 2000, 500, 2000, 500, 1000, 500, 500, 1000};
int noteSequence[] = {0, 1, 2, 3, 4, 5, 3, 2, 1, 2};
const int numNotes = sizeof(noteSequence) / sizeof(int);

unsigned long previousMillis = 0;  // Stores last update time
int currentNoteIndex = 0;          // Index of the current note being played

// Push Button and Propeller control
const int buttonPin = 4; // Pushbutton pin
const int ledPin = 7; // LED pin (for debugging; moved off pin 13, which is used below as the ultrasonic trigger)
int enA = 9; // Enable pin for motor
int in1 = 8; // Motor control pin
int buttonState = 0; // Current button state
// Define the pins for the ultrasonic sensor
const int trigPin = 13;
const int echoPin = 12;

// Define variables for the duration and the distance
long duration;
int distance;


void setup() {
  // Setup for servos
  for (int i = 0; i < numServos; i++) {
    servos[i].servo.attach(servos[i].pin);
    servos[i].servo.write(servos[i].openAngle);
  }

 // Define pin modes for ultrasonic
  pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
  pinMode(echoPin, INPUT); // Sets the echoPin as an Input
  // Setup for button and propeller
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT);
  pinMode(enA, OUTPUT);
  pinMode(in1, OUTPUT);
  analogWrite(enA, 255); // Set propeller speed
  digitalWrite(in1, LOW); // Initially disable propeller
}

void loop() {
  unsigned long currentMillis = millis();
  // Darko - Switch
  // Improved button reading with debouncing
  int readButton = digitalRead(buttonPin);
  if (readButton != buttonState) {
    delay(50); // Debounce delay
    readButton = digitalRead(buttonPin);
    if (readButton == HIGH) {
      digitalWrite(ledPin, HIGH);
      digitalWrite(in1, HIGH); // Enable propeller
    } else {
      digitalWrite(ledPin, LOW);
      digitalWrite(in1, LOW); // Disable propeller
    }
    buttonState = readButton;
  }

  // Darko - Ultrasonic
  // Clear the trigPin condition
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  // Sets the trigPin HIGH (ACTIVE) for 10 microseconds
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  // Reads the echoPin, returns the sound wave travel time in microseconds
  duration = pulseIn(echoPin, HIGH);
  
  // Calculating the distance
  distance = duration * 0.034 / 2; // Speed of sound wave divided by 2 (go and back)
  

  if (distance <= 12) {  // a hand within ~12 cm of the sensor enables playing

    // Handling servo movements based on timing
    if (currentMillis - previousMillis >= noteDurations[currentNoteIndex]) {
      // Move to the next note
      if (noteSequence[currentNoteIndex] != 0) {
        // Release the previous servo, if any
        int prevNote = (currentNoteIndex == 0) ? -1 : noteSequence[currentNoteIndex - 1];
        if (prevNote != -1 && prevNote != 0) {
          servos[prevNote - 1].servo.write(servos[prevNote - 1].openAngle);
        }
        // Press the current servo
        int currentNote = noteSequence[currentNoteIndex];
        if (currentNote != 0) {
          servos[currentNote - 1].servo.write(servos[currentNote - 1].pressAngle);
        }
      } else {
        // Release all servos for open string
        for (int i = 0; i < numServos; i++) {
          servos[i].servo.write(servos[i].openAngle);
        }
      }

      previousMillis = currentMillis; // Update the last actuated time
      currentNoteIndex++;
      if (currentNoteIndex >= numNotes) {
        currentNoteIndex = 0; // Restart the sequence
      }
    }
  }
}
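
For reference, a fully non-blocking alternative to the delay(50) debounce above could look like the sketch below. This is only an illustration (it assumes the same pin assignments as our sketch and is not the exact code we ran); the servo and ultrasonic handling would continue at the end of loop().

// Hypothetical millis()-based debounce for the pluck button, so loop() is never paused.
const int buttonPin = 4;  // pushbutton (same pin as above)
const int in1 = 8;        // L293D direction pin driving the propeller motor
const int enA = 9;        // L293D enable pin (propeller speed)

const unsigned long DEBOUNCE_MS = 50;
unsigned long lastChange = 0;  // time of the last raw reading change
int lastReading = LOW;         // last raw reading from the pin
int stableState = LOW;         // debounced (stable) button state

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(in1, OUTPUT);
  pinMode(enA, OUTPUT);
  analogWrite(enA, 255);   // full propeller speed when enabled
  digitalWrite(in1, LOW);  // propeller off initially
}

void loop() {
  unsigned long now = millis();
  int reading = digitalRead(buttonPin);
  if (reading != lastReading) {  // input changed (possibly bouncing): restart the timer
    lastChange = now;
    lastReading = reading;
  }
  if (now - lastChange >= DEBOUNCE_MS && reading != stableState) {
    stableState = reading;           // reading has been steady long enough to trust
    digitalWrite(in1, stableState);  // propeller follows the debounced button state
  }
  // ...the servo and ultrasonic handling from the main sketch would go here...
}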

 

 

Week 11: A Brief Rant on the Future of Interaction Design & Follow-Up Article

In my Spring semester of last year, I took a class called “Touch” that opened my eyes not only to the primacy of touch, but also to our consistent ignorance of the primacy of touch. Our tactile sensory system is the first to develop in the womb and is arguably the last to deteriorate as we get older. It is also the only reciprocal sense – we can only touch something if it touches us – providing us with a truly connective interface for interacting with the diverse environments around us. The sense of touch is also vital for normal cognitive and social development. Yet prototypes of future technological interfaces reduce the full range of interaction that touch affords us or eliminate it altogether. The future vision for AR/XR technologies seems to be doing away with the tactile. “Pictures Under Glass” technologies, to use the moniker Victor aptly coins in this week’s readings, already offer us only the limited tactile feedback that our fingertips enjoy. We now see the future of AR/XR interfaces, such as headsets with eye-tracking systems, heading in a direction where the visual is prioritized. Our rich haptic system is left to atrophy in the process.

It is indeed worrying that we might reach a time when kids struggle to use a pen or open lids. In attempting to connect to the world via modern-day interfaces, it seems we are disconnecting from it in important ways. We could have envisioned interfaces that integrate all of our varied sensory systems, had we invested the time to make that a priority. Victor’s rant and his follow-up article were both written in 2011. Thirteen years later, his call to invest in research that centers the full range of human capabilities in the design of technological interfaces seems to have gone unanswered. We have yet to make our capabilities a priority in design, but that could change if we collectively dare to choose a different way of envisioning tomorrow’s technologies.

 

 

Afra Binjerais – Week 11 Reading Response

Reading 1:

I found great resonance in the author’s assessment of the shortcomings of modern technological interfaces, particularly the widespread “Pictures Under Glass” approach. It caused me to reevaluate how I use technology and how these everyday interactions lack tactile depth. I became dissatisfied with the devices that rule my life after reading the author’s claim that our future interfaces will not be visionary; instead, they will only perpetuate the problems of the current system.
The criticism forced me to consider how my devices, which are primarily touchscreens, feel impersonal and distant. My laptop, phone, and even the smart devices at home operate with simple, repetitive motions that lack the complex feedback loop that comes with more tactile interactions. This insight made me uncomfortable, since it brought to light a sensory deprivation I was experiencing when interacting with the digital world. Compared to handling tangible objects like books or tools, I’m now more conscious of the shallowness of the experience and the flatness of the screens I swipe across on a regular basis.

Reading 2:

In reflecting on “A Brief Rant on the Future of Interaction Design,” a key question emerges: How can designers and technologists create interactive technologies that truly enhance human capabilities and foster a deeper connection with our physical environment, rather than simply replicating or replacing traditional physical interactions?

This reading essentially aligns with an argument for a more deliberate approach to technology, one that honors and strengthens rather than ignores our cognitive and physical abilities. It serves as a reminder that, whether we are using, designing, or developing technology, our relationship with it should not be passive. Rather, we should support and push for advancements that genuinely increase human potential in terms of interacting with the digital world. This mode of thinking encourages us to imagine a future where technology supports and enhances human talents in more comprehensive and meaningful ways, which is in line with a larger philosophical dilemma concerning the role technology should play in our lives.

Week 10 Reading Response

I enjoyed Tigoe’s piece about creating one’s work and then giving it the freedom to be experienced by the public. In the world of interactive art, instead of laying out your art with a manual, it’s about setting the stage and letting the audience be the audience. The main goal here is to direct a play where the users aren’t told what to think or feel, but are provided with cues and props to navigate their journey. I would 100% prefer having the chance to figure out a task myself to reading a manual or hearing an explanation. It’s like teaching a friend a new card game and saying, “You’ll catch on as we play.” Of course, there will be trial and error and some confusion, but once you’re there, there’s an “ohhh” moment, which is my favourite to experience and witness. This reading made me redefine interactivity in my head; figuring out someone’s piece is so stimulating that it could in itself be the peak of the experience.

In the second reading, I liked the gloves work. Just as the author voices the importance of listening to the audience’s reactions and responses to the artwork, the discussion about sensor integration in gloves underscores the importance of understanding and responding to the gestures and movements of the user. This parallels the dynamic exchange described in the first reading, where meaning unfolds through the engagement between creator and audience.
Overall, both readings focus on the importance of creating interactive experiences that allow users to engage with and contribute to an ongoing painting rather than a final product, whether by exploring an interactive artwork or experimenting with musical expression using gloves.

Pi : Week 11 Reading – The Feel Factor: Escaping the Flatland of Modern Interfaces

Whenever I do the Interactive Media reading assignments, I almost always disagree or criticize the author, or make fun of their claims. But this week will be different. Bret Victor is one of the only two people I idolized when I was a child and one day hope to be like. Whatever those two men (the childhood heroes) say, I will always agree with them 😉 .

I work in the field of haptics – the science of touch. My supervisor once explained that when a human uses a particular tool for a long time, they become deeply accustomed to that tool. Think of carpenters, painters, guitarists, and so on.

This continues to the extent that the tool becomes a part of their body, and their means of sensing the world.

But in order for that tool to become an extension of the body, the tool must give the user “good tangible feedback.” And that kind of feedback is something digital devices truly lack. Try using an actual ruler to measure the length of something… Now try measuring the same distance using the measurement app on the iPad, which is technically intricate but very cumbersome to use. It just does not feel like a ruler.

For one of my personal projects, I draw circuit diagrams with Chinese traditional ink on rice paper—to imagine an alternate universe, where electricity and the industrial revolution have started from the Orient.

In this project, I tried and evaluated two methods: the iPad method and the literal ink method. Somebody whose eyes are not trained will tell you that the results are nearly the same. Thus, in reality, both methods work. However, for my artistic satisfaction, the iPad approach was a disaster. Though I was using a state-of-the-art Chinese brush simulator, I could not feel the resistance of the paper, the flow of the ink, the scratch of the brush against the paper, and so on, ad infinitum.

The iPad approach was simply soulless. Thus, Bret’s vision – or rather, his criticism of the lack of vision in current design trends – echoed deeply with my own experience. “Our hands feel things and our hands manipulate things. Why shoot for anything less than a dynamic medium that we can see, feel, and manipulate?” This should be written on the wall of every tech company rushing to birth the next iteration of soulless gadgets into existence.

No amount of waving a stylus around is going to recreate the sensory feedback of ink travelling onto the paper through my ink brush.

In short, the iPad stylus will never become an extension of my body the way the real paintbrush does.

“Hands feel things, and hands manipulate things,” Bret points out; I couldn’t help but smile at how spectacularly, and in over-the-top fashion, most modern designs reject this.

I really love Bret’s comparison of the whole “Pictures Under Glass” idea to how utterly ridiculous it would have been if someone had once said, “black-and-white is the future of photography.”

In conclusion, Bret’s rant isn’t just a criticism; it’s a roadmap for future innovators.

Bret did not give us a solution, but a question to ponder. That question alone should challenge us to look beyond the conventional in thinking about how we make our interactions with technology as instinctive as moving our limbs and as natural as using our hands.

Week 10 Response – Tom Igoe

Tom Igoe’s articles on physical computing and interactive art describe the growing relationship between technology and user engagement. In Physical Computing’s Greatest Hits (and Misses), Igoe talks about the relationship between creativity and technology, highlighting how simple designs can provoke complex interactions and reflections. Additionally, in Making Interactive Art: Set the Stage, Then Shut Up and Listen, he advocates for a minimalistic approach in guiding the audience’s experience by using the art of subtlety in interactive design.

Igoe’s philosophy resonates deeply with me; it highlights the beauty of discovery within the constraints of design and technology, reminding creators to trust their audience’s intuitive interactions with their work.

Week 10 Assignment – “Who’s First?”

For this assignment, I had a hard time coming up with a creative idea. My twin nephews were over at the time, and I could hear them arguing about something. When I checked up on them, they were fighting over who was faster, so I decided to make something to settle that (Love you, Ahmed and Jassim).

The concept is pretty simple:

It’s a two-player game. One player gets the photoreceptor, and the other gets the button. Whoever can cause their light to flash first proves that they’re faster than the other.

Materials Used:

  • 2x 330-ohm resistors
  • 2x 10k-ohm resistors
  • 8x wires
  • 2x LEDs (one Red, one Yellow)
  • 1x button
  • 1x photoresistor

Setup:

I used examples from previous classes to piggyback on. The slides were very helpful to me.

Although I faced many problems with the resistors, the setup got the job done. I will NOT be using that website again, though.


Code:

 

int photoSensorPin = A1; // photoresistor (player 1)
int buttonPin = 2;       // pushbutton (player 2)
int ledPin1 = 3;         // LED that lights when the photoresistor is covered
int ledPin2 = 4;         // LED that lights when the button is pressed

void setup() {
  pinMode(photoSensorPin, INPUT);
  pinMode(buttonPin, INPUT); // external pull-down assumed, so the pin reads LOW until pressed
  pinMode(ledPin1, OUTPUT);
  pinMode(ledPin2, OUTPUT);
}

void loop() {
  int sensorValue = analogRead(photoSensorPin);
  int buttonState = digitalRead(buttonPin);
  if (buttonState == HIGH) {       // button pressed: light the button player's LED
    digitalWrite(ledPin1, LOW);
    digitalWrite(ledPin2, HIGH);
  } else if (sensorValue < 450) {  // photoresistor covered: light that player's LED
    digitalWrite(ledPin1, HIGH);
    digitalWrite(ledPin2, LOW);
  } else {                         // neither input active: both LEDs off
    digitalWrite(ledPin1, LOW);
    digitalWrite(ledPin2, LOW);
  }
}
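
One thing worth noting: the loop above lights whichever input happens to be active at that moment, so the result disappears as soon as a player lets go. A minimal variation that latches the first player to trigger until the board is reset could look like this (just a sketch, reusing the same pins and the 450 threshold from above):

// Hypothetical variation: remember who triggered first until the Arduino is reset.
int photoSensorPin = A1;
int buttonPin = 2;
int ledPin1 = 3;  // lights for the photoresistor player
int ledPin2 = 4;  // lights for the button player
int winner = 0;   // 0 = no winner yet, 1 = photoresistor, 2 = button

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(ledPin1, OUTPUT);
  pinMode(ledPin2, OUTPUT);
}

void loop() {
  if (winner == 0) {  // keep checking only until someone triggers
    if (digitalRead(buttonPin) == HIGH) {
      winner = 2;
    } else if (analogRead(photoSensorPin) < 450) {
      winner = 1;
    }
  }
  digitalWrite(ledPin1, winner == 1 ? HIGH : LOW);
  digitalWrite(ledPin2, winner == 2 ? HIGH : LOW);
}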

 

Demo:

IMG_0577

Reflection:

Making this game was a pretty fun challenge for me. This language is very new to me, and I’m still having issues getting used to it. I would like to take this a step further and possibly use other forms of input devices to more accurately determine who’s faster (Ahmed won, BTW).

 

 

 

Raya Tabassum: Reading Response 5

Tom Igoe’s reflections on recurring themes in physical computing projects highlight a vibrant field where creativity and technology meet. He encourages embracing projects even if they seem “done” because each iteration can bring new insights and innovations. From theremin-like instruments, which challenge creators to add meaningful physical interactions, to gloves and video mirrors that mix simplicity with potential for deeper engagement, these themes showcase the dynamic range of physical computing. It’s a call to think differently about what might seem mundane or overdone and to find one’s voice within well-trodden paths. This deeper engagement with the familiar challenges us to discover the unseen potentials in everyday ideas.

The other article by him named “Making Interactive Art: Set the Stage, Then Shut Up and Listen,” highlights a fundamental shift required for artists transitioning into interactive art. He stresses the importance of allowing the audience to engage and interact with the artwork without preconceived notions influenced by the artist’s own explanations. This approach champions a more open-ended exploration where the viewer’s personal experience and interpretation are paramount. This concept connects seamlessly with his views in “Physical Computing’s Greatest Hits (and misses),” where he encourages embracing well-worn themes with fresh perspectives. Both writings emphasize the creative potential in letting go of control—whether it’s about reinterpreting common project themes or allowing interactive artworks to speak for themselves through audience interaction.

Raya Tabassum: Emotional Room Temperature Indicator

Concept: This project combines analog and digital inputs to control LEDs in an innovative way. It’s an “Emotional Room Temperature Indicator”, which uses temperature data and user input to reflect the room’s “mood” with light.

Components Used:

  • Arduino Uno
  • TMP36 Temperature Sensor (Analog Sensor)
  • Push-button switch (Digital Sensor)
  • Standard LED (Digital LED)
  • RGB LED (Analog LED)
  • 220-ohm resistors for LEDs
  • 10k-ohm resistor for the button switch
  • Breadboard
  • Jumper wires

Implementation: I made a reel with a description and uploaded it as a YouTube Short, embedded below:

Programming:

#define LED_PIN 3
#define BUTTON_PIN 7
int redPin = 9; 
int greenPin = 10; 
int bluePin = 11;
int sensor = A0;

byte lastButtonState = LOW;
byte ledState = LOW;

unsigned long debounceDuration = 50; // millis
unsigned long lastTimeButtonStateChanged = 0;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(BUTTON_PIN, INPUT);
  pinMode(redPin, OUTPUT); 
  pinMode(greenPin, OUTPUT); 
  pinMode(bluePin, OUTPUT); 
  pinMode(sensor, INPUT);  // Reading sensor data

  Serial.begin(9600);  // Initialize serial communication at 9600 bits per second
}

int givetemp() {
  int reading = analogRead(sensor);
  float voltage = reading * 5.0 / 1024.0;
  float temperatureC = (voltage - 0.5) * 100;
  Serial.print(voltage); 
  Serial.print(" volts  -  ");
  Serial.print(temperatureC); 
  Serial.println(" degrees C  -  ");
  return (int)temperatureC;  // Convert float temperature to int before returning
}

void loop() {
  int currentTemp = givetemp();  // Call once and use the result multiple times

  if (currentTemp >= 20 && currentTemp <= 30) {
    digitalWrite(greenPin, HIGH);
    digitalWrite(bluePin, LOW);
    digitalWrite(redPin, LOW);
  } else if (currentTemp < 20) {
    digitalWrite(greenPin, LOW);
    digitalWrite(bluePin, HIGH);
    digitalWrite(redPin, LOW);
  } else if (currentTemp > 30) {
    digitalWrite(greenPin, LOW);
    digitalWrite(bluePin, LOW);
    digitalWrite(redPin, HIGH);
  }

  if (millis() - lastTimeButtonStateChanged > debounceDuration) {
    byte buttonState = digitalRead(BUTTON_PIN);
    if (buttonState != lastButtonState) {
      lastTimeButtonStateChanged = millis();
      lastButtonState = buttonState;
      if (buttonState == HIGH) {  // Assumes active HIGH button
        ledState = !ledState;  // Toggle the state
        digitalWrite(LED_PIN, ledState);
      }
    }
  }
}

Actual temperature of the room:

The TMP36 senses the room temperature. Depending on the “comfort” range (set here as 20°C to 30°C), the RGB LED changes colors — blue for cool, red for hot, and green for just right.
The push-button lets the user mark the mood of the room. When pressed, it toggles the standard yellow LED on to acknowledge that the room is at a comfortable temperature, while the RGB LED keeps showing the current temperature-based mood of the room. The user thus gets immediate visual feedback on whether the room’s temperature is within a comfortable range or not.
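
As a quick sanity check on the math inside givetemp() (an illustrative calculation only, assuming the Uno’s 5 V reference and 10-bit ADC): the TMP36 outputs 10 mV per °C with a 500 mV offset, so an analogRead() value of about 154 gives 154 × 5.0 / 1024 ≈ 0.75 V, and (0.75 − 0.5) × 100 ≈ 25 °C, which lands in the green “comfort” band.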

Week #10 Assignment

Concept: 
For this assignment, I decided to apply what I had already learned about the ultrasonic sensor and add a button. I wanted to see how far I could go with my knowledge.
Implementation:
The implementation was pretty easy, as I used the previous assignment as an example. I just added the button, changed the code, and voilà!
End result:

Code:

//Intro to IM - Stefania Petre
// Define LED pins
int ledPin[] = {8}; // only one LED wired up for now

// Define Ultrasonic sensor pins
const int trigPin = 2; // or any other unused digital pin
const int echoPin = 3; // or any other unused digital pin

const int buttonPin = 13;
int buttonState = HIGH;
int lastButtonState = HIGH;
long lastDebounceTime = 0;
long debounceDelay = 50;
int pushCounter = 0;
int numberOfLED = 1; // number of LEDs in ledPin[]

void setup() {
  pinMode(buttonPin, INPUT);
  digitalWrite(buttonPin, HIGH); // Activate internal pull-up resistor
  
  // Set up LED pins
  for (int i = 0; i < numberOfLED; i++) {
    pinMode(ledPin[i], OUTPUT);
  }
  
  // Set up Ultrasonic sensor pins
  pinMode(trigPin, OUTPUT); // Sets the trigPin as an OUTPUT
  pinMode(echoPin, INPUT);  // Sets the echoPin as an INPUT
}

void loop() {
  int reading = digitalRead(buttonPin);

  // Check if the button state has changed
  if (reading != lastButtonState) {
    // Reset the debounce timer
    lastDebounceTime = millis();
  }

  // Check if the debounce delay has passed
  if ((millis() - lastDebounceTime) > debounceDelay) {
    // If the button state has changed, update the button state
    if (reading != buttonState) {
      buttonState = reading;

      // If the button state is LOW (pressed), increment pushCounter
      if (buttonState == LOW) {
        pushCounter++;
      }
    }
  }

  // Update the last button state
  lastButtonState = reading;

  // Turn off LED
  for (int i = 0; i < numberOfLED; i++) {
    digitalWrite(ledPin[i], LOW);
  }

  // Perform Ultrasonic sensor reading
  long duration, distance;
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  distance = (duration * 0.0343) / 2; // Calculate distance in cm

  // Perform actions based on distance measured
  if (distance < 30) {
    // Turn on LED
    digitalWrite(ledPin[0], HIGH);
  }

  // Delay before next iteration
  delay(100); // Adjust as needed
}

Comments:

Even though I got it to work, I still would have liked it to change colors depending on the distance from the sensor. I will try to implement that next time!
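
A possible starting point for that idea, purely as a sketch: assuming three LEDs were wired to pins 8, 9, and 10 (hypothetical wiring), the ledPin array could be extended and the distance split into bands, with each band lighting a different LED.

// Hypothetical extension: three LEDs (pins 8, 9, 10 assumed), one per distance band
int ledPin[] = {8, 9, 10};
const int numberOfLED = 3;
const int trigPin = 2;
const int echoPin = 3;

void setup() {
  for (int i = 0; i < numberOfLED; i++) {
    pinMode(ledPin[i], OUTPUT);
  }
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Trigger the ultrasonic sensor and read the echo time
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  long distance = (duration * 0.0343) / 2;  // distance in cm

  // Choose which LED (if any) to light: near, middle, far
  int active = -1;
  if (distance < 10)      active = 0;
  else if (distance < 20) active = 1;
  else if (distance < 30) active = 2;

  for (int i = 0; i < numberOfLED; i++) {
    digitalWrite(ledPin[i], i == active ? HIGH : LOW);
  }

  delay(100);
}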