Reading Week 11

This reading connects immersion to interactivity because Victor argues that real interaction should go beyond just touching a screen: “pictures under glass” like phones, tablets, and flat screens reduce human interaction to simple sliding and tapping, while ignoring how our hands actually work through touching, gripping, holding, and movement. I found this really interesting because it made me think about how much we rely on technology that removes physical experience instead of improving it. It also reminded me of the movie WALL-E, where humans stop moving, stop engaging with the world, and become fully dependent on screens and machines. This made me question whether we are slowly doing the same thing in real life by ignoring our own human capabilities.

I actually made a project for my Understanding Interactive Media class based on this reading, and I called it Felt. My project focuses on immersion and accessibility: while the article asks us to question how technology limits human capabilities, screens and tech can also create access for people with disabilities. This made me ask: how can we design interaction in a way that creates both immersion and inclusion? Felt is built on the idea that immersion is not visual; it’s physical, going beyond vision and our reliance on it. (We see this a lot in many different installations, like teamLab’s.) I chose to focus on blind people, who already experience space differently because they rely more on sound, touch, memory, and balance. For sighted people, vision takes over, making us forget how much the rest of the body can do.

Week 11 – Thank you, my arduino or Alternatively: Recreating the Synthesizer in Thank you, my twilight (FLCL)

When I heard the sound of the button in class on Thursday, the intro of this song INSTANTLY popped in my head (first 10-15 seconds, becomes a repeating riff):

Hence, you can imagine where I’m going with this.

Concept:

This felt very self-indulgent, but I’m a piano/keyboard and electric guitar/bass player, so I wanted to make something I could actually play by pressing things, like my instruments. While I didn’t originally plan on any analog sensors, I thought I could use the potentiometer to control the “piano” sounds and make them vibrate a bit. I also wanted to be able to play the intro to this song note by note, and to have an option to listen to it through the instrument (for when I’m too lazy to play it myself… haha). Usually I have more to say for the concept, but this time I felt very monkey-brained, especially since I still get confused by Arduino: I hear a sound, I associate it with something else, I make something based on that, ta-daa!

Circuit Demonstrations (please ignore the fact that there’s no cursor, I recorded these on my iPad):

Song Button:

Individual Buttons:

Process:

This took me quite a while to do manually, but let’s go step-by-step.

ONE: Find the notes of the song (or part of the song) you want.

While I would have liked to do this with my piano, I 1) don’t have my piano with me and 2) don’t have enough time, so I went online and searched for sheet music of the intro part of the song.

I found this on MuseScore and worked out how many notes there were (8 notes: C6, B♭5, A5, G5, F5, E5, D5, and D♭5). From there, I wrote down the order in which each note came and how long each note lasted.

I split each bar by color and circled all the notes that were quavers (half the length of the un-circled crotchets). Then, using this chart, I also marked each note’s frequency. After figuring this out, I started creating the circuit.

TWO: … make the circuit.

Making the circuit was pretty straightforward. I did opt for a larger breadboard than I usually do (just to fit all the keys), and one thing that did frustrate me was the spacing of the ground and voltage dots on the board (DIAGONAL WIRES???). I had to play around with the spacing of the buttons quite a lot, but otherwise, everything fit well.

THREE: Spend a few hours coding. And coding. And coding a bit more. Oh wait, you missed a comma… I’ll break my code down line by line, mainly the parts that make this an instrument (or else I’ll end up breaking everything down).

I defined each frequency I calculated as its note name, to avoid typing each decimal again and again. I used speedMultiplier because later in the code I messed up the speed at which the notes play (so just temporarily ignore that). staccatoMultiplier lets me shorten each note, since the original song has most notes very short and crisp. I then assigned all of the hardware attached to the Arduino.

#define C6  1046.5
#define B5  932.33 // actually B♭5, per the sheet music
#define A5  880
#define G5  783.99
#define F5  698.46
#define E5  659.26
#define D5  587.33
#define DF5 554.37 // D♭5

float speedMultiplier = 0.9;
float staccatoMultiplier = 0.6;

const int buttons[8]      = {4, 5, 6, 7, 8, 9, 10, 11};
const float baseNotes[8]  = {C6, B5, A5, G5, F5, E5, D5, DF5}; // float, so the fractional Hz aren't truncated
const int PIEZO           = 13;
const int POT             = A0;
const int BTN_PLAY_INTRO  = 3;
const int BTN_STOP_INTRO  = 2;

I coded the song into the second-to-last button on the circuit (I was really proud of this part). I had to write down the order of notes first, and then pick a duration for each note in ms (400 for crotchets, 200 for quavers).

const float introNotes[] = { // float, so the fractional frequencies aren't truncated
  C6, A5, G5, F5, A5, 
  E5, F5, E5, DF5,
  D5, E5, F5, D5, F5, 
  G5, B5, A5, G5, F5, G5,
  C6, A5, G5, F5, A5, 
  E5, F5, E5, DF5,
  D5, E5, F5, D5, F5, 
  G5, B5, A5, G5, F5, G5
};

const int introDurations[] = {
  400, 400, 200, 200, 400,
  400, 400, 400, 400,
  200, 200, 400, 400, 400,
  400, 400, 200, 200, 200, 200,
  400, 400, 200, 200, 400,
  400, 400, 400, 400,
  200, 200, 400, 400, 400,
  400, 400, 200, 200, 200, 200
};

const int INTRO_LEN = 40; // total of 40 notes
int introTimings[40]; // array to store when each note starts
int totalIntroTime = 0;
bool playingIntro = false; // to make sure it doesn't play without pressing button
unsigned long introStartTime = 0;

For setup(), I made sure that the speaker was silent at startup. When I was originally coding, every run of the simulation would blast my ears with random sounds, and I needed that gone. runningTime tracks the accumulated time from the start of the song, so each note’s start time is the sum of the (speed-adjusted) durations of all the notes before it:

void setup() {
  noTone(PIEZO);
  for (int i = 0; i < 8; i++)
    pinMode(buttons[i], INPUT);
  pinMode(PIEZO, OUTPUT);
  pinMode(BTN_PLAY_INTRO, INPUT);
  pinMode(BTN_STOP_INTRO, INPUT);

  int runningTime = 0;
  for (int i = 0; i < INTRO_LEN; i++) { // for each of the 40 notes
    introTimings[i] = runningTime; // remember when THIS note starts
    runningTime += (int)(introDurations[i] * speedMultiplier); // because I needed to fix the speed LOL
  }
  totalIntroTime = runningTime;
}

I noticed an issue where, while the song played, a note that didn’t fit would slip in between the intended ones. To fix this, I used this:

int currentNote = -1; // start with no note
for (int i = 0; i < INTRO_LEN; i++) {
  // elapsed = millis() - introStartTime, i.e. how long the intro has been playing
  if (elapsed >= introTimings[i] &&
      elapsed < introTimings[i] + (int)(introDurations[i] * speedMultiplier * staccatoMultiplier)) {
    currentNote = i;
    break;
  }
}

if (currentNote == -1) {
  noTone(PIEZO); // silence if no note matches current time
} else {
  tone(PIEZO, (int)introNotes[currentNote]);
}

The loop searches for the note whose time window contains the current elapsed time. If no note’s window matches, currentNote stays -1 and the piezo goes silent until the next note’s window begins.

Then I added vibrato to the notes (how shaky or pure each note sounds). The code below handles that, plus stopping the song if you press any piano key while it’s playing. The vibrato is only heard on the individual button notes, not on the programmed song.

// stopping the music: pressing any piano key cancels the intro
for (int i = 0; i < 8; i++) {
  if (digitalRead(buttons[i]) == HIGH) {
    playingIntro = false;
    noTone(PIEZO);
    break;
  }
}
} // closes the intro-playing branch above this excerpt

// individual button mode (no music)
if (!playingIntro) { // only when the intro isn't playing
  float vibratoHz = map(analogRead(POT), 0, 1023, 1, 20);
  float vibratoDepth = 20;

  // calculate vibrato as a sine wave
  unsigned long now = millis(); // current time
  float offset = sin(2.0 * 3.14159 * vibratoHz * now / 1000.0) * vibratoDepth;
  // oscillates between -20 and +20, not too much

  bool anyPressed = false; // any piano note pressed
  for (int i = 0; i < 8; i++) { // find the first pressed button
    if (digitalRead(buttons[i]) == HIGH) {
      int finalFreq = (int)(baseNotes[i] + offset); // the note plus vibrato
      tone(PIEZO, finalFreq);
      anyPressed = true;
      break;
    }
  }
  if (!anyPressed) noTone(PIEZO); // silence when nothing is held
}
} // end of loop()

Schematic:

Reflections:

I’m glad it came out well. I was worried I’d mess this up and wouldn’t be able to hear the sound, especially with all the fumbles in between such as loud sounds that weren’t coded, or bad timing of the notes. I’m also glad that not only the song worked, but so did the notes! I didn’t expect the vibrato to actually work out so you can actually hear it clearly. I had a lot of fun making this. ദ്ദി(。•̀ ,<)~✩‧₊

I do feel like I could have added more things for it to come out the way I would further envision it. I wanted to put LEDs to light up every time you press a button (but was worried about breadboard space), those LED displays to show something while the song played (but didn’t want to venture there just yet), and include a way for people to add on the rest of the song’s instruments, like the drums and guitar (but didn’t know how to do it on TinkerCad). Hopefully I can implement these in my final project! 🙂

Week 11’s Brief Reading Ran- Sorry, Response | A Brief Rant on the Future of Interaction Design

The Rant first:

Before I start dissecting, let me just put it out there that I agree with everything he’s saying here. Now, we proceed.

"A tool addresses human needs by amplifying human capabilities. A tool converts what we can do into what we want to do."

Always good to start with definitions everyone knows before diving in. He’s right that we hear about our tools and our needs again and again. But what makes a tool interesting? What makes one tool capable of replacing another? Maybe it’s that it goes beyond the boxes we had made to define our human capabilities for that specific task or item. The way my brain describes the core argument of the main article (in my notes) is:

I’ve never read an article that talks about the functions of hands in this much detail before. Also, could we come up with ways to interact with things using other body parts too? (That’s a tangent, so I’ll leave it there.) I really liked how he points out that despite our insane number of nerve endings, we still settle for everyone’s favorite, Pictures Under Glass. This was also super cool:

How do people just think of this? When I scroll with two fingers, my fingers curve, but when I scroll with four, my fingers start flattening. Depending on what you play on the guitar, you can manipulate how your fingers bend without even realizing it (barre chords vs. non-barre chords, for example).

I also liked when he talked about Alan Kay and the iPad. He “chased that carrot through decades of groundbreaking research,” decades! If we can spend that long making an iPad with our lovely Pictures Under Glass, surely we can spend some time finding other ways to interact with our hands with technology.

What I found interesting was that he did what good media criticism does: he noticed the assumed thing nobody questions. I would have thought of this, but I wouldn’t have gone all the way to actually further test my theory.

Now… The follow-up. (Since when did ranting need justification?)

  • It’s funny how people say that he didn’t offer a solution. Come up with your own solution then? Sometimes, speaking things out in the void can also end up making change. (For example, we’re reading this, and we’re thinking about what he said, and we can choose to follow his belief and try and do something different.)
  • The second argument is good because it builds on the idea that we can take something good that already exists and make it better. That doesn’t make the original bad… you just add functions that can remove current problems, or simply make it easier to use.
  • “My child can’t tie his shoelaces, but can use the iPad.” Well.
  • He also rebutted my idea of waving hands in the air: your hands think they’re somewhere different from where the computer thinks they are. No thank you.

What I got from this was that, when I design things, I should remember that there are many different ways we can interact with things around us. If my work only talks to eyes and fingers, I’m wasting the whole human body. I wonder how I could implement that with a video game that’s spread worldwide. How long do we think it will take before we actually live a lifestyle that he proposes?

Week 11 – Reading Reflection

A Brief Rant on the Future of Interaction Design

I really liked this text, and I strongly agree with the author. The fact that a lot of people envision our future of interaction and technology as just super-powerful phones and laptops isn’t really encouraging. Even now we have so many technologies and innovative interactive things that saying the superior technology of the future is just a phone simply isn’t right.

Even now, people use a lot of motion-, body-, and voice-driven technology. For instance, scrolling using your head: if I recall correctly, you bow your head up and down to control the screen. Of course, it’s not the most creative, and obviously not the best way to interact, but it is still more interesting than just tapping. Voice input is also really impressive: just by speaking a command, we can control devices, even ones as simple as smart speakers. This shows there are a lot of ways to interact besides the simple “tap here, tap there”.

I also find the author’s point on touch and physical response really interesting. It’s true that the senses in our hands are something we shouldn’t ignore, since they allow for so many ways to interact and so much new technology and art. However, I find it hard to imagine what exactly we could build with these sensations that would be as “useful” or widespread as smartphones. Maybe that is why the author talks about the future and not the present.

This part about hands reminded me of some technologies from Professor Eid’s lab once again. As I wrote in last week’s reading response, they have a device that triggers vibrations on the user’s fingertips when they touch an object in VR.

They also had a really cool technology that I think can be expanded a lot and that fits the author’s idea perfectly: there was a kind of handle, and an app where you choose a texture, for instance, a hard jelly. The handle controls a ball that you see on the screen; as you move the handle, the ball moves too. The thing is, the handle was also “mimicking” the texture: when you try to push the ball through the jelly, you feel resistance and even a “bouncy” sensation, and when it finally breaks through, lightness and zero resistance. I find it SO COOL, and the fact that it’s done with only one handle is mind-blowing. If this technology could be expanded so that the object control depends on the hands, passing those sensations to the hands, it would be exactly what the author of the text was describing.


*This is a short video I filmed of the device in use so you can see how it works

Week 11 – Musical Device

Concept

I really liked the Ultrasonic Distance Sensor, and I really love the idea of using the outer environment and motion capture. At first, I wanted to make a device controlled by buttons or a potentiometer, but then the idea of using something less obvious came to me. I thought that trying to play sound without touching anything could be really interesting, so I decided to use the Distance Sensor and a Photoresistor for this device.

The musical device is pretty simple: the photoresistor has a threshold of 950 (roughly the reading you get by pointing a flashlight right at it), and if it receives more light than that, the device plays; otherwise it stays silent. The distance sensor converts distance into frequency: the farther the object is from the sensor, the lower the frequency played.

Code

The code is pretty simple. It assigns global variables, plus some local variables in loop() (like the distance and the frequency). The frequency output by the buzzer is determined by the distance. I used distance = duration * 0.0343 / 2; to convert the echo duration into centimeters, and then freq = map((int)distance, 5, 200, 800, 200); to map distances from 5 to 200 cm onto frequencies from 800 down to 200 Hz.

There’s a small block at the beginning of my loop() to turn the buzzer off and on. It’s made like that so it can change the frequency it outputs.

int lightVal = 0;
bool lightOn = false;

int trigPin = 6;
int echoPin = 5;
long duration;
float distance;
int freq;

int soundPin = 8;

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  Serial.println(lightVal);
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);

  // the HC-SR04 needs a ~10 microsecond HIGH pulse to trigger a reading
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  duration = pulseIn(echoPin, HIGH);
  // Calculate distance in cm
  distance = duration * 0.0343 / 2;
  freq = map((int)distance, 5, 200, 800, 200);

  lightVal = analogRead(A0);
  lightOn = lightVal > 950;

  if (lightOn) {
    tone(soundPin, freq);
  } else {
    noTone(soundPin); // same pin 8, but named for clarity
  }
}

I was mainly referencing tutorials on the Arduino website, like this one for the distance sensor and this one for the photoresistor, to figure out how to make them work.

Schematic & Preview

The schematic of the device looks like this (I tried my best to draw it correctly):

This is how it looks in real life:

And this is how it works:

Reflection

I really like how it turned out. My main goal was to use Arduino components we haven’t worked with in class, so I achieved that objective. I also like the fact that I can actually “play” this instrument without even touching it; I think that’s pretty cool.

For further improvement, I believe I can make the device more “usable”, because right now pointing the flashlight into the middle of a bunch of wires doesn’t feel great. I also want to work on the device’s short delays: right now, if I flicker the light quickly, it won’t catch it being turned off.

Week 10: Digital (Slide Switch) and Analog (Sound Sensor)

Concept

I wanted to make a lighting system where sound controls the brightness. For my digital sensor, I used a slide switch to turn the LEDs on and off. For my analog sensor, I used a sound sensor: every time it detects a sound, the LEDs get dimmer. I’m quite interested in sound-activated lights, and even though one didn’t come with the kit, I still wanted to give it a try, so I got a KY-037 from Amazon.

Full Code | Video Demo | Schematics

Code that I’m proud of

int soundValue = analogRead(soundPin);
int change = abs(soundValue - 512);

if (change > threshold) {
  brightness = brightness - 50;            // drop brightness by 50 each sound
  brightness = max(brightness, 0);         // keep it from going below 0
  Serial.println("Lowering brightness.."); // debug: confirm sound registered
  delay(200);
}

I’m proud of this part because I had to actually understand what the sound sensor was giving me. I thought it would just tell me loud or quiet (HIGH/LOW), but it outputs a number between 0 and 1023, and in silence it sits around 512 (about the middle of that range). I also added the Serial.println myself because I had no idea if claps were even registering, so I wanted to confirm in the Serial Monitor that it was working before trusting the LEDs.

How this was made

I started with the slide switch wired to pin 2 with a 10kΩ pull-down resistor to GND; the high resistance keeps the pin from floating and giving random readings when the switch is open. When it reads LOW, both LEDs turn off and brightness resets to 255 (maximum brightness) so it starts fresh every time. I had to learn how the sound sensor worked and wire it myself. It outputs a continuous number between 0 and 1023 representing the volume, sitting around 512 (mid-range) in silence; every loop, the code reads that value, subtracts 512, and takes the absolute value to get the amplitude. If that crosses my threshold of 70, the brightness drops by 50. I had a problem with sensitivity at first, as it kept triggering on background noise or missing claps, and I found out the module has a small dial you adjust with a screwdriver. I also added a Serial.println debug line so I could confirm in the Serial Monitor that claps were actually registering before trusting the LEDs. The two LEDs on pins 9 and 10 each have a 330Ω resistor to GND and receive the brightness value through analogWrite() using PWM.

Reflection & Future Improvements

This was quite a challenging assignment because I insisted on using a sound sensor. Even though the sensor was new to me and not something we covered in class, I was able to apply a lot of the analog concepts we already learned (analogRead, analogWrite, and PWM), and they translated over and made it easier to grasp. I went back to the class notes and a few tutorials online (referenced below) to piece it together. If I were to keep going, I’d add more brightness steps so the dimming feels smoother, and I’d revisit an earlier idea where the two LEDs go in opposite directions, one dimming while the other brightens, with live sound driving both. Nevertheless, I’m happy with my output and how it turned out.

References

https://github.com/liffiton/Arduino-Cheat-Sheet

https://docs.arduino.cc/language-reference/

Arduino Sound Sensor: Control an LED with Sound

Week 10 Assignment

Concept

My project is a simple interactive lighting system using one analog sensor and one digital sensor. I used a potentiometer as the analog sensor and a pushbutton as the digital sensor. The potentiometer controls the brightness of one LED, while the button controls whether the other LED turns on or off. I wanted to make a small circuit that shows two different ways Arduino can read input and control output. The project is simple: one LED changes gradually, while the other has only two states, on and off.

How I made it:

const int potPin = A0;      // potentiometer connected to analog pin A0
const int buttonPin = 2;    // pushbutton connected to digital pin 2
const int ledDigital = 13;  // LED controlled by the button
const int ledAnalog = 9;    // LED with adjustable brightness

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);  // use the internal pull-up resistor for the button
  pinMode(ledDigital, OUTPUT);       // set digital LED as output
  pinMode(ledAnalog, OUTPUT);        // set analog LED as output
}

void loop() {
  // read the potentiometer value from 0 to 1023
  int potValue = analogRead(potPin);

  // convert the potentiometer value into a brightness value from 0 to 255
  int brightness = map(potValue, 0, 1023, 0, 255);

  // control the brightness of the LED on pin 9
  analogWrite(ledAnalog, brightness);

  // read the button state
  int buttonState = digitalRead(buttonPin);

  // when the button is pressed, turn on the digital LED
  // when the button is not pressed, turn it off
  if (buttonState == LOW) {
    digitalWrite(ledDigital, HIGH);
  } else {
    digitalWrite(ledDigital, LOW);
  }
}

First, I built the circuit in Tinkercad because I do not have a physical Arduino with me right now. I added an Arduino Uno, a potentiometer, a pushbutton, two LEDs, and resistors. Then I connected the potentiometer to 5V, GND, and A0 so Arduino could read its analog value. After that, I connected the pushbutton to pin 2 and GND so it could work as a digital input.

Next, I connected one LED to pin 13 for simple on and off control, and another LED to pin 9 so I could control its brightness with PWM. In the code, I used analogRead() to read the potentiometer value and map() to change that value into a brightness level from 0 to 255. Then I used analogWrite() to change the brightness of the LED on pin 9. For the button, I used digitalRead() to check whether it was pressed, and then I turned the LED on pin 13 on or off.

 

What I’m proud of

I am proud that I solved a problem by checking my circuit carefully. At the beginning, I placed the button incorrectly, so LED2 could not light up. At first, I thought the code might be wrong, but later I realized the problem was in the button connection. After fixing the button placement, the circuit worked correctly.

This mistake helped me learn that in Arduino projects, the wiring is just as important as the code. Even if the code is correct, the circuit will not work if one part is connected incorrectly. I think this was an important learning moment for me because I am still new to Arduino.

Conclusion

Overall, this project helped me understand the basic difference between analog input and digital input. I learned how a potentiometer can control LED brightness and how a pushbutton can control an LED in a simple on/off way. I also learned how important it is to test carefully and fix mistakes step by step. However, this is my very first time learning Arduino, so the project may lack a bit of creativity. Still, through the process I learned the basics of Arduino.

Week 10 – Arduinooo (#1)

Due to the time crunch this week, I wanted to make something small which I would be able to achieve on time, so unfortunately, this won’t be as fun as usual. (߹ ᯅ ߹)

Concept:

I thought I’d make a Mood Light with a Panic Button: the potentiometer (analog) controls a blue LED that fades smoothly, while a pushbutton (digital) triggers a red LED to blink rapidly, like an alert. I liked the contrast between the two, and I thought it would be fun to see if I could make it work. It kind of reminds me of an Ambulance, maybe.

Process:

Due to my wonderful wi-fi here, my internet kept disconnecting in class, so if there is content here that WAS covered in class but I struggled with, I’m assuming it was mentioned while I was fighting with the wi-fi and my data (sorry!). The first thing I had to understand was the pull-down resistor on the button. I kept seeing it in tutorials without really knowing what it was for, so I looked into it. I found that if I just wire a button between 5V and a digital pin, the pin has no defined state when the button isn’t pressed. It floats and picks up random electrical noise, which makes it read random HIGH and LOW values. A pull-down resistor (10kΩ in my case) connects the pin to GND through a high resistance, so when the button isn’t pressed, the pin reads LOW; when the button is pressed, the pin connects to 5V and reads HIGH.

The rest of the circuit was pretty straightforward. In the beginning, I kept forgetting which was the cathode and anode in the LEDs and I had some issues with figuring out the wires (silly mistakes and TinkerCad creating wires when I wanted to click on something else).

The analog part reads the potentiometer using analogRead(), which returns a value between 0 and 1023. Since analogWrite() expects a value between 0 and 255, I divide the reading by 4 to scale it into range. The digital part reads the button with digitalRead(), and if it’s HIGH (pressed), the red LED alternates on and off with a short delay.

const int POT_PIN    = A0;
const int BUTTON_PIN = 2;
const int FADE_LED   = 9; //analog
const int BLINK_LED  = 8; // digital

void setup() {
  pinMode(BUTTON_PIN, INPUT);
  pinMode(FADE_LED, OUTPUT);
  pinMode(BLINK_LED, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // Analog, potentiometer controls fade LED brightness
  int potValue = analogRead(POT_PIN);
  int brightness = potValue / 4;       // 0–1023 → 0–255
  analogWrite(FADE_LED, brightness);

  // Digital, button triggers blink LED
  int buttonState = digitalRead(BUTTON_PIN);
  if (buttonState == HIGH) {
    digitalWrite(BLINK_LED, HIGH);
    delay(100);
    digitalWrite(BLINK_LED, LOW);
    delay(100);
  } else {
    digitalWrite(BLINK_LED, LOW);
  }

  Serial.print("Pot: "); Serial.print(potValue);
  Serial.print("  Button: "); Serial.println(buttonState);
}

One thing I initially got wrong: I had the fade LED on pin 7 and couldn’t figure out why it was only turning fully on or fully off with no in-between. Pin 7 doesn’t support PWM, and only pins with the ~ symbol can use analogWrite(), so I moved it to pin 9 and it started working. Good to know.

Schematic:

I don’t even know if this is right, so if it’s not, I am so sorry. I tried to look at class diagrams and went ?, so then I looked at TinkerCad’s schematic and went even more ???.

Circuit:

Reflection/Improvements:

  • Right now, the buttons just trigger one fixed blink pattern. It would be more interesting to have it cycle through different patterns on each press (slow blink, fast blink, heartbeat, etc.) using a counter variable. That would also give the panic button more character.
  • Next time, I’d want to do something much more ambitious. As long as I figure out the beginning things, I can try to make cooler projects with this. I do want to explore actions that aren’t just “a person pressing a thing” to make things interesting.
  • (I can’t wait to get my physical Arduino kit so I can try to make this IRL! (≧▽≦))

Week 10 – Reading Reflection

Making Interactive Art: Set the Stage, Then Shut Up and Listen

I strongly agree with the author of this text, especially about the fact that the artist should let the audience experience, interpret, and feel the artwork without any instructed guidance.

“Ideally they will understand what you’re expressing through that experience.” I really believe that if the artist wants the audience to engage with the artwork on a deeper level, they should give them some space. Through interaction, through emotions and reactions, most of those who approach the interaction thoughtfully will get the feeling, or at least the abstract idea, behind the artwork, while still having their own interpretation and emotional response attached to it. Instructions kill the emotions, and personally I believe artworks should first and foremost make people feel.

What I noted down is that it is really important to set up and design the experience in a way that makes people perform the interactions you want them to. It goes back to our previous readings, and having this concept mentioned again only amplifies its importance.

Physical Computing’s Greatest Hits (and misses)

“What’s great about the themes that follow here is that they allow a lot of room for originality.”

I believe that the “core” concepts used in many physical computing works are just like the core concepts and principles of any science or form of traditional art. “Everything is a remix” says basically the same thing: the core concepts that artists are inspired by become the basis to which they add new, unique ideas.

As for the works mentioned in the text, I really like the “gloves” concept. I love the idea of interacting with art and tech with your body, whether it’s through a projection or something else, and I find the idea of using just your limbs to produce something, without any additional “parties”, especially appealing. I know a lot of projects that depend on this technology, or use a similar technique, not only for art but for more practical uses. If I recall correctly, around a year ago at NYUAD in Professor Eid’s lab I saw a project that was essentially a training program for children with CP, transferred from an offline exercise to an online VR experience. To keep the “senses” in the hands while completing the exercises, they built a technology that sends vibrations to the fingers when you touch something in VR, with the frequency and strength of the vibrations adjusted depending on the touch. This is just another example of how the “gloves” concept is used in physical computing, even if it isn’t really about art. I believe the same vibration, touch-mimicking technology could be applied really wisely in immersive and interactive art: letting users interact with an object while also letting them “feel” it, even if it isn’t real, is a really strong and impressive idea.

 

Week 10: Reading Response

Physical Computing’s Greatest Hits (and misses)

This article goes through a recurring list of physical computing project themes that show up in classes every year, and I found it quite fascinating how the author encourages students to pursue repeated ideas. That resonated with me because I sometimes catch myself thinking of a project idea, only to search whether it has been done and end up not going through with it, with the mindset that someone has already done it, and even better than what I would have made. I think that has stunted growth and exploration I probably could have learned a lot from. It also reminded me of how, in traditional art, everyone paints a still life or draws a figure at some point. Nobody tells you not to draw a bowl of fruit because it has been done before. These things become learning stepping stones in your work, and I think that is just as valuable.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

This article felt like a continuation of that idea. The main argument here likened interactive art to the beginning of a conversation rather than a finished statement like a painting or a sculpture: the moment you over-explain what something means or how someone is supposed to feel about it, you have ended that conversation before it started. Coming from a background where I spent some time exploring traditional art, where the work is usually a fixed object that speaks for itself, I found this shift in thinking genuinely difficult to wrap my head around at first. A painting hangs on a wall and you bring yourself to it. Interactive work is different because the piece is actually incomplete until someone engages with it, and what they do becomes part of what the work is. I resonated with his comparison to a theater director working with actors: you can give an actor props and suggest intentions, but you cannot tell them how to feel; they have to find it themselves. I think that is a really honest way of describing what good interactive design should do: you are building the conditions for something to happen. And I think that is harder than it sounds, because there is a natural instinct, when you make something, to want people to get it the way you intended. I feel it every time I finish a project and immediately want to stand next to it and explain it to whoever walks by. Reading this made me realize that impulse, as understandable as it is, is actually working against the experience I am trying to create.