Week 10: Reading reflection

The portrayal of advanced and seemingly magical technologies in movies, such as those in the Marvel Cinematic Universe, often sparks fascination and a desire to bring such innovations into reality. The way characters interact with sophisticated devices, particularly through hand gestures, presents a captivating vision of the future. These depictions emphasize the incredible capabilities of the human hand, showcasing it as an organ with immense potential beyond its conventional functions. Hands become channels for unlocking extraordinary powers and controlling cutting-edge technologies, blurring the line between science fiction and reality. The realization of such innovations would not only revolutionize our daily interactions but also underscore the profound importance of the hand as a tool for mundane tasks and, potentially, for accessing a realm of possibilities once considered purely imaginative. The fusion of technology and human anatomy is a testament to the boundless creativity and innovation that continue to drive our collective imagination.

The example of a child who can’t tie his shoelaces but can use an iPad draws an analogy between the way tools are designed for adults and children and the complexity of literature aimed at different age groups. The comparison highlights the notion that tools designed for adults should leverage the full capabilities of mature minds and bodies, just as literature for adults delves into deeper complexities than children’s literature. The reference to Shakespeare and Dr. Seuss exemplifies this point, suggesting that while a child may not grasp the nuances of Shakespearean works, they can easily understand the simplicity of Dr. Seuss. The analogy extends to tools, emphasizing that limiting interaction to a single finger, as seen in some interfaces, is akin to restricting literature to a basic vocabulary. The argument suggests that such simplified tools might be accessible to children or individuals with certain disabilities, but fully functional adults deserve, and can benefit from, more sophisticated and nuanced interfaces that make use of their developed cognitive and physical capacities. It prompts a consideration of the balance between accessibility and the potential richness of interaction in the design of tools for adults.

WEEK 10 ASSIGNMENT (PIERRE & DIANA) PIANO/SYNTHESIZER

CONCEPT:

For this project, our goal was to craft a musical instrument within the constraints of the limited resources available. Despite the restricted options, we brainstormed various concepts and ultimately settled on developing a hybrid piano/synthesizer. To accomplish this, we repurposed buttons as keys, combining two types of sensors, analog and digital. Using a breadboard, we configured the keys and then delved into the coding phase. Each button was programmed to generate a distinct note, and the circuit was connected to a speaker borrowed from the IM lab for clearer, better sound quality.

 

As we progressed with the initial setup, the need for an analog sensor arose. We incorporated a potentiometer, not to alter the current, but as a knob to modulate the notes produced by the four buttons. The potentiometer’s range was segmented into three sections, and within the main loop an if-else chain acts like a set of cases, changing which notes the buttons play depending on the knob’s position.

 

Code:
#include "pitches.h"


const int speaker = 12;


const int tone_G = 2;
const int tone_A = 3;
const int tone_B = 4;
const int tone_C = 5;
const int tone_D = 9;
const int tone_E = 10;
const int tone_F = 11;




int buttonState_G = 0;
int buttonState_A = 0;
int buttonState_B = 0;
int buttonState_C = 0;
int buttonState_D = 0;
int buttonState_E = 0;
int buttonState_F = 0;


int potPin = A0;
int potVal = 0;




void setup() {
 // iterate over the notes of the melody:
 pinMode(tone_G, INPUT);
 pinMode(tone_A, INPUT);
 pinMode(tone_B, INPUT);
 pinMode(tone_C, INPUT);
 pinMode(tone_D, INPUT);
 pinMode(tone_E, INPUT);
 pinMode(tone_F, INPUT);
 
 }


void loop() {
 // no need to repeat the melody.


 potVal = analogRead(potPin);




 buttonState_G = digitalRead(tone_G);
 buttonState_A = digitalRead(tone_A);
 buttonState_B = digitalRead(tone_B);
 buttonState_C = digitalRead(tone_C);
 buttonState_D = digitalRead(tone_D);
 buttonState_E = digitalRead(tone_E);
 buttonState_F = digitalRead(tone_F);


if (potVal < 341)  // Lowest third of the potentiometer's range (0-340)
 {   
 if (buttonState_G == HIGH) {
   // play the sound
   tone(speaker,NOTE_G4);
 }
 else if (buttonState_A == HIGH) {
   // play the sound
   tone(speaker,NOTE_AS4);
 }
 else if (buttonState_B == HIGH) {
   // play the sound
   tone(speaker,NOTE_F4);
 }
 else if (buttonState_C == HIGH) {
   // play the sound
   tone(speaker,NOTE_C3);
 }


 else
  {
   noTone(speaker);
   }
 }


 else if (potVal < 682)  // Lowest third of the potentiometer's range (0-340)
 {
   if (buttonState_G == HIGH) {
   // play the sound
   tone(speaker,NOTE_A4); //random note not matching the name
 }
 else if (buttonState_A == HIGH) {
   // play the sound
   tone(speaker,NOTE_B4); //random note not matching the name
 }
 else if (buttonState_B == HIGH) {
   // play the sound
   tone(speaker,NOTE_C4); //random note not matching the name
 }
 else if (buttonState_C == HIGH) {
   // play the sound
   tone(speaker,NOTE_D4); //random note not matching the name
 }


 else
  {
   noTone(speaker);
   }


 }


 else
 {
   if (buttonState_G == HIGH) {
   // play the sound
   tone(speaker,NOTE_AS4);
 }
 else if (buttonState_A == HIGH) {
   // play the sound
   tone(speaker,NOTE_FS4);
 }
 else if (buttonState_B == HIGH) {
   // play the sound
   tone(speaker,NOTE_GS4);
 }
 else if (buttonState_C == HIGH) {
   // play the sound
   tone(speaker,NOTE_CS3);
 }


 else
  {
   noTone(speaker);
   }
 }
  
 }


Week 10: Musical Instrument

Team members: Javeria and Nafiha

For our assignment, we drew inspiration from a synthesizer and a sampler to create our own musical instrument. Our instrument incorporates three buttons, a piezo buzzer, a potentiometer, and a bunch of wires and resistors. It is designed such that each button triggers a distinct melody, and by adjusting the potentiometer, the pitch is modified, consequently altering the played melodies.
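Since our code is not reproduced in this post, here is a minimal sketch of the behaviour described above. The pin numbers, note frequencies, and melodies are example assumptions of our own, not the actual values from the build: each button triggers its own short melody on the buzzer, and the potentiometer scales the pitch of every note before it plays.

// Sketch of the concept only: pins, frequencies, and melodies are assumed examples.
const int buttonPins[3] = {2, 3, 4};  // one button per melody
const int buzzerPin = 8;
const int potPin = A0;

// Three short example melodies (frequencies in Hz)
const int melodies[3][4] = {
  {262, 294, 330, 349},  // C4 D4 E4 F4
  {392, 440, 494, 523},  // G4 A4 B4 C5
  {330, 392, 494, 587}   // E4 G4 B4 D5
};

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(buttonPins[i], INPUT);
  }
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  // The potentiometer scales every note's pitch (roughly 0.5x to 2x).
  float pitchScale = 0.5 + analogRead(potPin) * (1.5 / 1023.0);

  for (int i = 0; i < 3; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {
      // Play this button's melody, shifted by the potentiometer
      for (int n = 0; n < 4; n++) {
        tone(buzzerPin, (unsigned int)(melodies[i][n] * pitchScale), 200);
        delay(250);
      }
      noTone(buzzerPin);
    }
  }
}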

In terms of improving our instrument, one potential feature could be incorporating additional sound effects through the use of the potentiometer. However, overall, working on this assignment was really fun, and we’re pretty pleased with the outcome.

Week 10 Instrument

Video: https://youtu.be/d6cVEQlEJnk

For our group assignment, we weren’t sure how a sensor could be used to generate musical notes, so we first brainstormed an instrument on which we could base this assignment: the accordion. To replicate the keys, we decided to use switches (buttons), and for the “bellows” we decided on a flex sensor. We first planned out our schematic, mapping out the switches and resistors, then the connecting wires to the analog pins. However, when actually delving into the construction of the breadboard and subsequently the coding aspect, we ran into a few problems that required us to improvise. We first realized, through using the serial monitor, that the output generated by the flex sensor was incredibly jittery and inconsistent; it was hard to make out a consistent value range. Hence we decided to use an alternative sensor: the photoresistor. Ultimately our approach was predominantly to improvise. Once we had resolved our main code, we decided, on a whim, to use the LCD to show “:)” or “:(” under specific conditions. This required us to do some research on how the LCD is connected to the breadboard and the code used to display characters: https://docs.arduino.cc/learn/electronics/lcd-displays

HOW IT WORKS:
3 buttons play different notes. The tone of each note varies according to the photoresistor – the more light it receives, the higher the note played. Subsequently, the LCD shows “:)” when the tone frequency, determined by the analog reading from the photoresistor, is higher than the previous frequency; otherwise, it displays “:(“. This comparison is specific to each color (red, green, blue), and the frequency values are mapped accordingly.

Schematic:

Arduino Code:

// include the library code:
#include <LiquidCrystal.h>

// initialize the library by associating any needed LCD interface pin
// with the arduino pin number it is connected to
const int rs = 12, en = 11, d4 = 5, d5 = 4, d6 = 3, d7 = 2;
LiquidCrystal lcd(rs, en, d4, d5, d6, d7);

int buzzerPin = 8;
int redpin = A0;
int greenpin = A1;
int bluepin = A2;
int phopin = A3;
float prev = 0;

void setup() {
  // put your setup code here, to run once:
  pinMode(buzzerPin, OUTPUT);
  pinMode(redpin, INPUT);
  pinMode(greenpin, INPUT);
  pinMode(bluepin, INPUT);
  pinMode(phopin, INPUT);  // photoresistor
  lcd.begin(16, 2);
  Serial.begin(9600);
}

void loop() {
  // put your main code here, to run repeatedly:
  int redState = digitalRead(redpin);
  int greenState = digitalRead(greenpin);
  int blueState = digitalRead(bluepin);
  int flexState = analogRead(phopin);  // photoresistor reading, roughly 350 to 1000 (name kept from the earlier flex-sensor attempt)
  float redvariance = 130.8 + map(flexState, 350, 1050, 0, 130.8);
  float greenvariance = 261.6 + map(flexState, 350, 1050, 0, 261.6);
  float bluevariance = 523.2 + map(flexState, 350, 1050, 0, 523.2);
  if (redState == HIGH) {
    tone(buzzerPin, redvariance);
    if (higherThanPrev(prev, redvariance)) {
      lcd.print(":)");
    } else {
      lcd.print(":(");
    }
    prev = redvariance;
    delay(100);
  } else if (greenState == HIGH) {
    tone(buzzerPin, greenvariance);
    if (higherThanPrev(prev, greenvariance)) {
      lcd.print(":)");
    } else {
      lcd.print(":(");
    }
    prev = greenvariance;
    delay(100);
  } else if (blueState == HIGH) {
    tone(buzzerPin, bluevariance);
    if (higherThanPrev(prev, bluevariance)) {
      lcd.print(":)");
    } else {
      lcd.print(":(");
    }
    prev = bluevariance;
    delay(100);
  } else {
    noTone(buzzerPin);
  }
  lcd.clear();
}

bool higherThanPrev(float prev, float now) {
  return prev < now;
}

Overall we were quite pleased with the end result, even more so with our LCD addition. However, we felt it was difficult to classify our project as a musical instrument since, despite the interplay between analog and digital sensors, the sounds produced were less “musical” in a sense. Furthermore, we realized that whilst the tone of the sound produced by the blue switch was very clearly affected by the amount of light perceived by the photoresistor, this was not the case for the red switch; it was much harder to distinguish a change in its tone. We believe this is because the red button signals the C note in the 3rd octave, and the blue one signals the C note in the 5th octave. Since note frequencies follow the formula Freq = note × 2^(N/12), the changes in frequency mapped onto the notes become more significant as the octave increases. For future improvements, especially with regard to musicality, perhaps we could have each button play a series of notes, so the 3 switches would produce a complete tune. Rather than mapping ranges for the photoresistor, we could compare against a specific threshold value. For example, in a dark room the complete tune would be played in a lower key, whilst in a bright room it would play in a higher key.
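To put rough numbers on that: using standard equal-temperament frequencies, one semitone above C3 (about 130.8 Hz) is roughly 130.8 × 2^(1/12) ≈ 138.6 Hz, a shift of under 8 Hz, whereas one semitone above C5 (about 523.3 Hz) is roughly 523.3 × 2^(1/12) ≈ 554.4 Hz, a shift of over 31 Hz. Likewise, in our mapping the red note only sweeps from about 131 Hz to 262 Hz while the blue note sweeps from about 523 Hz to 1046 Hz, so the same change in light moves the blue tone roughly four times as far in hertz.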

Week 10 : Assignment (Mudi and Mariam)

Concept :
For our musical instrument, we decided to craft an innovative instrument using an ultrasonic sensor, a button, and a buzzer. To kick off the musical vibes, just gently hold down the button. Now, here’s where it gets interesting: when you wave your hand in front of the ultrasonic sensor at varying distances, it unveils a different array of notes!

 

IMG_4039

Code snippet :

int trig = 10;
int echo = 11;
int buttonPin;
long duration;
long distance;

void setup() {
  pinMode(echo, INPUT);
  pinMode(trig, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // Pulse the trigger pin, then read how long the echo takes to return
  digitalWrite(trig, LOW);
  delayMicroseconds(2);
  digitalWrite(trig, HIGH);
  delayMicroseconds(10);
  digitalWrite(trig, LOW);
  duration = pulseIn(echo, HIGH);
  distance = (duration / 2) * .0344;  // speed of sound is 344 m/s = 0.0344 cm/us

  int notes[7] = {261, 294, 329, 349, 392, 440, 494};  // middle C, D, E, F, G, A, B
  buttonPin = analogRead(A0);
  if (distance < 0 || distance > 50 || buttonPin < 100) {  // button not pressed, or hand out of range
    noTone(12);  // don't play anything
  }
  else if (buttonPin > 100) {  // button pressed
    int sound = map(distance, 0, 50, 0, 6);  // map the distance onto the array of notes
    tone(12, notes[sound]);  // play a certain note depending on the distance
  }
}

Challenges:
I wouldn’t call this one a challenge, more of a hiccup really: we found ourselves repeatedly unplugging and replugging everything due to connectivity issues, and the Arduino kept giving errors.

Week 10 – Reading Reflection

A Brief Rant on the Future of Interaction Design

This guy really doesn’t like screens, so much so that he calls screens “Pictures Under Glass”, meaning that they provide no connection to the task that one is performing. And to be honest, that is true. Even while typing this response, I can feel the keys under my fingertips, knowing where each key is without looking at it. I can tell exactly how much pressure I need to apply on each key for it to type something. I hear a clicking sound whenever I press a key, and I can anticipate which key produces which sound before I even press it with my fingers. I honestly love my keyboard because of how it feels to the touch, even though it’s not exactly very dynamic. I can’t say the same for my phone screen.

This guy suggests that screens were never meant to be the final all-encompassing technology of interaction design, but they are merely a transitional technology. After all, what else in the natural world do we use the gesture of sliding our fingers for? Even the act of making a sandwich requires more interaction than screens, the supposed Interface of the Future.

I find this vision of the future of interaction design very interesting, because I’ve never really thought about how much screens on handheld devices actually ignore our very hands before. I am curious to explore some other visions and ideas that consider this incapacity of screens and provide other exciting possibilities for the purpose of interaction design.

 

Responses

This guy mentions the idea of finger-blindness here, which really caught my eye. Finger blindness, as he puts it, is when you don’t use your hands to their full potential for doing what they were meant to do (not swiping away at screens), making it harder to attach meaning and value to the act of touching something, which is a scary thought. Other than that, I love how this guy addresses these responses with humor while also making sense at the same time. He reiterates that screens aren’t necessarily bad – for now. It’s just that we shouldn’t limit ourselves to screens when we think about future advancements in technology, but come up with other receptive devices that appreciate the intricacy of our hands. After all, humans deserve so much more than just screens.

Noctune Noir Week 10

Group Members: Batool Al Tameemi, Arshiya Khattak

Concept: For this project, we knew we wanted to implement something piano-esque, i.e. pressing buttons to produce different sounds. Essentially, it is a simple Arduino circuit that uses three buttons and a buzzer, playing a note every time a button is pressed. However, we wanted to make the concept a little more fun, so we decided to add a photocell (it was also a cool way to add more notes than just 3). When the photocell is uncovered, the buttons play higher-frequency notes (A flat, G, and F), and when it’s covered, they play lower-frequency notes (D, C, and B). The idea is that in daylight the keys play more upbeat-sounding tones, while at night they make more sombre-sounding ones.

Video & Implementation

 

TinkerCad Diagram:

The code for this project was relatively straightforward. The Arduino frequency equivalents for the notes were taken from this article.

int buzzPin = 8;
int keyOne = A1;
int keyTwo = A2;
int keyThree = A5;
int lightPin = A0;
int brightness = 0;
int keyOneState, keyTwoState, keyThreeState;

int flexOneValue;

void setup() {
  Serial.begin(9600);
}

void loop() {
  brightness = analogRead(lightPin);
  keyOneState = digitalRead(keyOne);
  keyTwoState = digitalRead(keyTwo);
  keyThreeState = digitalRead(keyThree);
  Serial.println(brightness);
  if (brightness < 45) {
    // photocell covered: play the lower, more sombre notes
    if (keyOneState == HIGH) {
      playTone(587); // D
    } else if (keyTwoState == HIGH) {
      playTone(523); // C
    } else if (keyThreeState == HIGH) {
      playTone(494); // B
    }
  } else {
    // photocell uncovered: play the higher, brighter notes
    if (keyOneState == HIGH) {
      playTone(831); // A flat
    } else if (keyTwoState == HIGH) {
      playTone(784); // G
    } else if (keyThreeState == HIGH) {
      playTone(698); // F
    }
  }
}


void playTone(int frequency) {
  int duration = 200;
  tone(buzzPin, frequency, duration);
  delay(duration);
  noTone(buzzPin);
}

Future Highlights and Improvements

One thing that we both thought would be cool to implement on a longer form of this project would be to have different levels of brightness play different notes, rather than just having two states. It would also be cool to incorporate different forms of input, such as the flex sensor (we tried to use it but it was a bit buggy so we scrapped the idea).
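As a rough sketch of that first idea (the reading range and notes here are just example assumptions, not tested values), the photocell reading could be mapped straight onto a scale instead of being split into two states:

// Example only: pick a note from the photocell reading directly,
// instead of the two covered/uncovered states used above.
int scaleNotes[8] = {262, 294, 330, 349, 392, 440, 494, 523};  // C4 up to C5

void playByBrightness(int brightness) {
  // Assumed reading range of roughly 0-900; the real range should be checked
  // with Serial.println(brightness) first.
  int idx = constrain(map(brightness, 0, 900, 0, 7), 0, 7);
  playTone(scaleNotes[idx]);  // reuses the playTone() helper defined above
}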


week 10: reading reflection

Last semester, I took a core class called Touch that discussed the very phenomenon Bret Victor is discussing in A Brief Rant on the Future of Interactive Design. In the class, we took time to understand the sheer capability of human touch and the richness of sensory experiences. Inevitably, we also discussed how modern technology is a disservice to the human tactile and proprioceptive experiences, and could even be contributing to the dullness of our sensory capabilities. Needless to say, I believe Victor’s annoyance at the ‘Pictures Under Glass’ world is very understandable and even justified. This might sound like a claim coming from an angry old boomer, but I genuinely believe most people in today’s world (unless they make the effort to seek it) have extremely limited haptic experiences. We live in an era where the current technology renders our haptic system obsolete. I’d even say that this lack is felt to some capacity in individuals, as you often find people gravitating to objects that are old-fashioned, or even cumbersome, just for the vibes or the aesthetic. Mechanical keyboards, polaroid and film cameras, and even flip-phones. This is because humans were inherently made to touch and feel.

In his follow-up, Victor also brings up a point about how using touch-screens is easy because they have been dumbed down enough to be used by toddlers. He says that adults deserve “so much more”, implying that multimodal experiences need to be made for adults. However, I find it important to point out that any sort of primitive interaction can be performed by a child – it’s just that in this day and age, pictures under glass are all we have. Children deserve more as well, and there should be research into such technology for them too.

Week 10 | Musical Instrument

Instrument: piano.

Team members:  Fady John (fje3683) and Victor Alves Gomes Nadu (va2269).

Concept

For our instrument project, we decided to create a piano that plays different tones of the musical scale depending on the measured distance between the player and the sensor. To build the instrument on the breadboard, we utilized an ultrasonic sensor, LEDs, resistors, and jumper wires, creating an interactive and dynamic musical experience. The ultrasonic sensor is responsible for measuring the distance, allowing for an intuitive interaction with the piano. As the user moves closer or farther away, the musical output changes, providing a unique take on the instrument. As for the LEDs, they add a visual element to the project and also give a sense of the distance-based musical scale, since they light up in correspondence with the notes being played.

Circuit Schematic

Figure 1

Code

In the code, we defined the specific pins used for each LED and stored the notes of the musical scale in an array. Specifically, we used the tones from C4 to C5, as shown in figure 3 below.

Figure 2
Figure 3
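Since the sketch itself is only shown in the figures above, here is a rough reconstruction of its logic. The pin numbers, LED count, and distance thresholds below are assumptions of our own, not the exact values from the build: the ultrasonic sensor measures the distance, the reading is mapped onto one of the eight notes from C4 to C5, and the matching LED is lit while the note plays.

// Rough reconstruction only: pin numbers, LED count, and distance range are assumptions.
const int trigPin = 9;
const int echoPin = 10;
const int buzzerPin = 8;
const int ledPins[8] = {2, 3, 4, 5, 6, 7, 11, 12};              // one LED per note
const int scale[8] = {262, 294, 330, 349, 392, 440, 494, 523};  // C4 up to C5

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(buzzerPin, OUTPUT);
  for (int i = 0; i < 8; i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
  // Measure the distance with the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  float distance = duration * 0.0344 / 2;  // convert the echo time to centimetres

  if (distance > 2 && distance < 40) {
    // Pick one of the eight notes based on how far away the hand is
    int idx = constrain(map((long)distance, 2, 40, 0, 7), 0, 7);
    tone(buzzerPin, scale[idx]);
    for (int i = 0; i < 8; i++) {
      digitalWrite(ledPins[i], (i == idx) ? HIGH : LOW);  // light only that note's LED
    }
  } else {
    noTone(buzzerPin);
    for (int i = 0; i < 8; i++) {
      digitalWrite(ledPins[i], LOW);
    }
  }
}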

Reflections and improvements

Initially, we planned to create drums, but we ended up settling on a piano since that was a more feasible and practical instrument. As for improvements, the distance sensor is currently not very accurate, which is something that could be worked on. Other than that, we managed to create something fun, and we are proud of our work.