Week 10 reading response: A Brief Rant on the Future of Interaction Design & its follow-up

This was the first time I’d ever seen that Microsoft Productivity Future Vision video, and though I know it wasn’t their intention, it felt incredibly depressing and even somewhat dystopian to me. Something about a world so sanded down in its aesthetics and devoid of any rich sensory experiences – all in the name of efficiency.

It makes me think of the minimalism movement that’s been slowly eating away at our world for the past several decades. Creative shapes and textures and architecture have been replaced by simple, monotone geometric shapes and plain, shiny surfaces. You may have seen these a bunch of times already, but just take a look at what popular businesses like McDonald’s and Hot Topic used to look like a couple of decades ago or so:

And compare that to what they look like now.

It’s not just technology; everything is turning as flat and smooth and understimulating as possible. But why? I mean the answer is the same as it always is: money. It’s easier to sell blank slates, so it’s safer to keep your real estate as generic as you can get away with than to allow room for creativity and differentiation. And speaking of money, I think the reason technological advances are continuing to shift towards the “pictures under glass” format the author was talking about is probably, in part, due to the fact that it just sells better.

Humans are fundamentally lazy creatures, in an adaptive sense. We subconsciously try to spend fewer resources for the same results – and those resources include mental and physical energy. It’s an evolutionary trait, obviously: the more efficient we are, the more resources and energy we have to shift focus to other things (the industrial revolution, for example). So it follows that the simplest, most brain-rotting designs that require the least amount of physical and mental effort will sell the best.

We’ve almost given up on the body already. We sit at a desk while working, and sit on a couch while playing, and even sit while transporting ourselves between the two. We’ve had to invent this peculiar concept of artificial “exercise” to keep our bodies from atrophying altogether.

It won’t be long before almost every one of our daily activities is mediated by a “computer” of some sort. If these computers are not part of the physical environment, if they bypass the body, then we’ve just created a future where people can and will spend their lives completely immobile.

This applies to you and me just as much as it does anyone else. We have to make an active effort not to get sucked in by the allure of algorithms that spoon-feed us simple, mind-numbing social media content (which, by the way, is associated with gray matter atrophy*). In a way, this instinct for efficiency is what keeps us advancing as a species. However, if left unchecked, we might end up giving up things we value about ourselves: our creativity and imaginations… or our ability to think critically.

Also, the author accidentally predicted AI slop:

Creating: I have a hard time imagining Monet saying to his canvas, “Give me some water lilies. Make ’em impressionistic.”

 

*Not to imply a causal relationship – there isn’t enough data to come to any firm conclusions.

Week 8

The Concept:

Using three LEDs and an ultrasonic sensor, I made a little traffic light thing that goes from green to yellow to red the closer you get to it.

Here’s the (crudely drawn) schematic sketch:

Unfortunately, I encountered an error trying to upload my sketch to the Arduino that persisted throughout every single troubleshooting method and fix I could find.

This means that, while I can show the physical Arduino wiring, I’ll have to use Tinkercad to demonstrate it in action.

Here’s the Arduino:

Here’s the video demonstrating the Tinkercad circuit:

The code:

As for the code itself, I used a public domain tutorial for some help because I had no idea how to read the ping from the sensor and convert it to centimeters.

Here’s the GitHub link.

int cm = 0;
int triggerPin = 7;    // TRIG pin
int echoPin = 6;    // ECHO pin
int red = 12;
int yellow = 11;
int green = 10;
int ping = 0;
int close = 50;    // distance thresholds in cm
int far = 100;

void setup()
{
  pinMode(red, OUTPUT);
  pinMode(yellow, OUTPUT);
  pinMode(green, OUTPUT);
  pinMode(triggerPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
}

void loop()
{
  // trigger the ultrasonic pulse: settle LOW first, then a 10 us HIGH pulse
  digitalWrite(triggerPin, LOW);
  delayMicroseconds(2);
  digitalWrite(triggerPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(triggerPin, LOW);

  // measure duration of pulse from ECHO pin
  ping = pulseIn(echoPin, HIGH);

  // calculate the distance: sound travels ~0.034 cm/us, halved for the round trip
  cm = 0.017 * ping;
  Serial.print(cm);
  Serial.println(" cm");
  Serial.print(ping);
  Serial.println(" us"); // pulseIn() returns microseconds, not ms
  delay(100); // Wait for 100 millisecond(s)
  
  digitalWrite(green, LOW);
  digitalWrite(yellow, LOW);
  digitalWrite(red, LOW);

  
  if (cm < close) {
    digitalWrite(red, HIGH);
  } else if (cm < far) {
    digitalWrite(yellow, HIGH);
  } else {
    digitalWrite(green, HIGH);
  }
  
  delay(100);
}
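As a sanity check on the math itself (outside the Arduino): pulseIn() reports the echo time in microseconds, sound covers roughly 0.034 cm per microsecond, and the pulse travels to the object and back, hence the 0.017 factor. Here’s a rough sketch of that conversion and the traffic-light thresholds – the function names are mine, just for illustration:

```javascript
// Convert an HC-SR04 echo pulse (microseconds) to centimeters.
// Sound covers ~0.034 cm/us; halve it for the round trip -> 0.017.
// Math.floor mirrors the truncation of assigning to an int.
function pulseToCm(pingMicros) {
  return Math.floor(0.017 * pingMicros);
}

// Which LED lights up for a given distance, using the sketch's
// close = 50 cm and far = 100 cm thresholds.
function lightFor(cm, close = 50, far = 100) {
  if (cm < close) return "red";
  if (cm < far) return "yellow";
  return "green";
}
```

So a 1000 µs echo works out to 17 cm, which would put the light solidly in the red zone.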

Week 9 reading response – Physical Computing’s Greatest Hits (and misses)

I attended the IM final exhibition last semester, and my absolute favorite was this tilty stand by Hubert Chang where you had to stand on a skateboard and steer it to control a little spaceship in a video game. It was so much fun to play, and the one I found most engaging due to the full-body involvement. I really love skating, so I had the highest score – until some girl cheated by keeping her hands on the table to balance herself.

(ignore the terrible posture)

Week 9 reading response – Making Interactive Art

I definitely agree with this, and I think it applies to art as a whole to some extent as well. I believe that the artist’s interpretation matters less than the viewer’s, because one of the greatest things about art is the creativity and inspiration it can spark in people. Some of my favorite movies (listed below) are ones that are so confusing and surreal that I have to construct my own narrative afterwards to put the pieces together. Like waking up and trying to recall a dream.

  • Possession (1981) (Also indirectly introduced me to Sonic Youth, my favorite band of all time, via a YouTube edit)
  • Stay (2005)
  • I Saw the TV Glow (2024)
  • Climax (2018) (this one has a more defined plot, but it’s such a dizzying movie I had to include it)
  • Enter the Void (2009) (VERY mixed feelings about this one…)

Week 9:

int brightness = 0;
int buttonState = 0;


void setup()
{
  
  pinMode(13, OUTPUT);
  pinMode(11, OUTPUT);
  pinMode(2, INPUT_PULLUP);
}

void loop()
{
  // button acts as a digital sensor; with INPUT_PULLUP it reads LOW when pressed
  buttonState = digitalRead(2);
  if (buttonState == LOW) {
    digitalWrite(13, HIGH);
  } else {
    digitalWrite(13, LOW);
  }

  // photoresistor sets LED brightness: map the 10-bit reading to 8-bit PWM
  int sensorValue = analogRead(A1);
  brightness = map(sensorValue, 0, 1023, 0, 255);
  analogWrite(11, brightness);

  delay(10);
}

I don’t have my Arduino yet, so I had to do this on Tinkercad.

One LED is triggered by a button, which acts as a digital sensor, and the other is controlled by a photoresistor, increasing in brightness as more light is detected in the environment. Not really creative, but I’m crunched for time.
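For reference, the map() call in the code above is just a linear rescale from the 0–1023 analog reading to the 0–255 PWM range. A quick sketch of the same arithmetic (mapRange is my own name for illustration, not the Arduino API):

```javascript
// Linear rescale, mirroring Arduino's map(sensorValue, 0, 1023, 0, 255):
// shift into the input range, scale by the ratio of the ranges, shift out.
// Math.trunc mirrors Arduino's integer (long) division.
function mapRange(x, inMin, inMax, outMin, outMax) {
  return Math.trunc(((x - inMin) * (outMax - outMin)) / (inMax - inMin)) + outMin;
}
```

A mid-scale reading like 512 comes out around 127, so half the light gives roughly half the brightness.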

UPDATE

That was TERRIBLE, I didn’t know what I was doing. Anyways, I finally got my Arduino kit, so I made a quick alternative version:

I completely forgot there had to be two sensors, but I hope that the previous iteration can make up for that. Please have mercy.

Here’s a video of the circuit in action, and below is the schematic.

Here’s the GitHub link.

int brightness = 0;
int buttonState = 0;


void setup()
{
  pinMode(12, OUTPUT);
  pinMode(11, OUTPUT);
  pinMode(2, INPUT);
}

void loop()
{
  // fading the yellow LED up with PWM
  for (brightness = 0; brightness <= 255; brightness += 5) {
    analogWrite(11, brightness);

    // checking the button every fade step so the red LED responds right away
    buttonState = digitalRead(2);
    if (buttonState == HIGH) {
      digitalWrite(12, HIGH);
    } else {
      digitalWrite(12, LOW);
    }

    delay(30);
  }
}

Week 8 reading response – Her Code Got Humans on the Moon

Hamilton wanted to add error-checking code to the Apollo system that would prevent this from messing up the systems. But that seemed excessive to her higher-ups. “Everyone said, ‘That would never happen,’” Hamilton remembers.

But it did.

This really raises the question of whether they would’ve taken her concerns more seriously had she been male. It’s insane to think that, despite the extent to which she had already proven her competency, they still dismissed her – especially when her concerns turned out to have been very reasonable. Even today, women have to work harder than men in most places to be perceived as equally competent – and I think plenty of women would understand what I mean here. So I can’t imagine how much more difficult it must have been back in the 60s.

Stereotype threat is a psychological phenomenon defined as the pressure a person feels to disprove negative stereotypes regarding a group they identify with – be it race, gender, class, etc. Studies have shown that stereotype threat (ironically) has a significant negative impact on a person’s performance. Even being reminded of the stereotype before a task can change the outcome. It’s a vicious self-fulfilling prophecy that further perpetuates negative stereotypes and can hurt a person’s self-esteem, which then further affects performance.

Being a woman who grew up in a misogynistic environment, I really struggled with this cycle for as long as I can remember. I have so much respect for Hamilton given what she must have had to overcome to thrive as she did.

Week 8 reading response – Attractive Things Work Better

The author discussed two types of information processing: “affect” and cognition. Of course, these aren’t exact neuroscientific terms (as he mentions himself: “to avoid the technical debate … I use the reasonably neutral term of ‘affect'”), but I really appreciated his interpretation of this concept, as it reflects a very real biological mechanism that significantly impacts our day-to-day lives.

Neuroscientists understand that, to some extent, our reasoning can come after we make a decision. As animals first and foremost, we fundamentally operate on instinct and unconscious processes, especially in faster or emotionally salient situations.

Take a simple example — trying to escape a hazardous situation. Suppose that fleeing people encounter a door that won’t open. The anxiety-produced response is to try again harder. When the first push doesn’t open the door, press harder, kick, and even throw the body against it.

This is illustrated beautifully by studies on subjects with a severed corpus callosum: when one half of a subject’s brain is asked to explain an action that was triggered and carried out by the other half (thus completely outside its control and awareness), the subject may provide a reasonable rationale and experience the temporal illusion that this reasoning came first.

But all this thinking comes after the fact: the affective system works independently of thought. Your thoughts are occurring after the affective system has released its chemicals.

Affect, as described by the author, is an instinctual gut reaction, while cognition comes afterward. You can see this pattern emerge especially in fields such as philosophy. In fact, I think philosophy – particularly ethics – is a perfect example. There is an almost universal, intuitive sense of right and wrong among our species: a gut feeling that assigns value judgments, just as the author describes (this idea is controversial, but I’m referring to instinctual affect applied to and affected by our species’ advanced social environment.) Ethical philosophy emerges when someone attempts to construct a cognitive framework through which these gut value judgments can be derived. Of course, since these judgments are instinctual, there is no inherent logical framework underlying moral affect, which is why there is no universal agreement on which ethical philosophy is most reliable or “true” (as far as I know).

Each system impacts the other: some emotions — affective states — are driven by cognition, and cognition is impacted by affect.

Remy midterm project

 

Embedded sketch:
Overall concept:

My goal with this midterm was to create a demo of a video game – one that I’m planning to expand on in the future. The concept I had in mind for this demo was a retro-style horror pixel game that takes place in a lab. The player experiences a cutscene and is then placed into an environment where they must interact with the setting in some way.

The story, which isn’t really relevant in the demo version, is supposed to follow a young woman (the player character) working a late-night shift in a laboratory, where she begins to see things in the dark. Below are some of the sprites and assets (used and unused) I created for this project.

player character sprite sheet

unused non-player character sprite sheet

cutscene art

 

laboratory background

How it works and what I’m proud of:

To start with the assets and how I obtained them: all visual elements were drawn by me using the online pixel-art program Pixilart.com. All the sound effects and background noise were downloaded and cut from copyright-free YouTube audio.

As for the code, rest assured there was absolutely no ChatGPT usage or any other form of AI coding. I did attempt to go to two friends – one CS major senior and one graduated CS major – and they somehow only managed to make things worse. I figured everything out myself through either research or agonizing, tear-inducing, Vyvanse-fueled trial and error.

Below I’ll share and briefly describe snippets of code I’m proud of.

//item toggling: the player must be within a certain distance, facing the item, and the item must still be in its initial state
if (keyIsDown(ENTER)) {
  //pc1
  if (
    pc1Opacity == 0 &&
    x > midX - bgWH / 2 + 220 &&
    x < midX - bgWH / 2 + 300 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 1
  ) {
    pc1Opacity = opaque;
    inRange = true;
    //pc2
  } else if (
    pc2Opacity == 0 &&
    x > midX + bgWH / 2 - 280 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 1
  ) {
    inRange = true;
    pc2Opacity = opaque;
    //pc3
  } else if (
    pc3Opacity == 0 &&
    x > midX + bgWH / 2 - 280 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 3
  ) {
    inRange = true;
    pc3Opacity = opaque;
    //trash
  } else if (
    trashCanOpacity == 0 &&
    x > midX + bgWH / 2 - 460 &&
    x < midX + bgWH / 2 - 440 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 1
  ) {
    inRange = true;
    garbageOpacity = 0;
    trashCanOpacity = opaque;
  } else if (
    tableOpacity == 0 &&
    x < midX + bgWH / 2 - 290 &&
    x > midX - bgWH / 2 + 310 &&
    y == midY + bgWH / 2 - 320 &&
    direction === 0
  ) {
    inRange = true;
    tableOpacity = opaque;
  } else {
    inRange = false;
  }
  //playing the toggle sound every time all parameters are met
  if (inRange) {
    toggle.setVolume(0.1);
    toggle.play();
  }
}

Okay, so I won’t say I’m exactly too proud of this one because it’s really clunky and a bit repetitive, and I’m sure I would’ve found a much more efficient way to put it had I been more experienced. It does, however, do its job perfectly well, and for that I think it deserves a place here. It’s probably one of the parts I struggled with the least given how straightforward it is.
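For what it’s worth, one way I can imagine flattening that if/else chain is to describe each interactable as data and loop over the list. This is just a sketch with made-up names and simplified coordinates, not the project’s actual structure:

```javascript
// Hypothetical data-driven version of the toggle chain: each item records
// its x-range, y position, and the facing direction required to interact.
const items = [
  { name: "pc1",   xMin: 220, xMax: 300, y: 390, direction: 1 },
  { name: "trash", xMin: 440, xMax: 460, y: 390, direction: 1 },
];

// Return the name of the first item the player can interact with, or null.
function findInteractable(x, y, direction, list) {
  for (const item of list) {
    if (x > item.xMin && x < item.xMax && y === item.y &&
        direction === item.direction) {
      return item.name;
    }
  }
  return null;
}
```

Adding a new interactable then only means adding one entry to the list instead of another else-if branch.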

for (let j = 0; j < 4; j++) {
  sprites[j] = [];
  for (let i = 0; i < 4; i++) {
    sprites[j][i] = spritesheet.get(i * w, j * h, w, h);
  }
}

//cycling through the sprite array, moving by the speed value when arrow keys are pressed; % 4 wraps back to the first sprite in the row (0)
if (keyIsDown(DOWN_ARROW)) {
  direction = 0;
  y += speed;
  step = (step + 1) % 4;
} else if (keyIsDown(LEFT_ARROW)) {
  direction = 2;
  x -= speed;
  step = (step + 1) % 4;
} else if (keyIsDown(UP_ARROW)) {
  direction = 1;
  y -= speed;
  step = (step + 1) % 4;
} else if (keyIsDown(RIGHT_ARROW)) {
  direction = 3;
  x += speed;
  step = (step + 1) % 4;
  //when no key is being pressed, sprite goes back to the standing position (0,j)
} else {
  step = 0;
}

//keeping the sprite from walking out of bounds
if (y >= midY + bgWH / 2 - 320) {
  y = midY + bgWH / 2 - 320;
}
if (y <= midY - bgWH / 2 + 390) {
  y = midY - bgWH / 2 + 390;
}
if (x >= midX + bgWH / 2 - 180) {
  x = midX + bgWH / 2 - 180;
}
if (x <= midX - bgWH / 2 + 175) {
  x = midX - bgWH / 2 + 175;
}

I probably included this snippet in my progress post, since it’s the code I worked on before anything else; everything else was kind of built around it. (Keep in mind that in the actual sketch, the array is created in the setup function and the rest is in the draw function – I just combined them here for simplicity.)
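The two mechanisms at work here – the % 4 frame cycle and the out-of-bounds clamp – can be pulled out and sanity-checked on their own (helper names below are mine, just for illustration):

```javascript
// Advance the walk-cycle frame, wrapping 0 -> 1 -> 2 -> 3 -> 0,
// exactly like step = (step + 1) % 4 in the sketch.
function nextStep(step) {
  return (step + 1) % 4;
}

// Keep a coordinate inside [lo, hi], mirroring the boundary checks
// that stop the sprite from walking out of the background.
function clampPos(v, lo, hi) {
  return Math.min(Math.max(v, lo), hi);
}
```

The modulo is what makes the walk animation loop forever without the step counter growing unbounded.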

function cutScene1() {
  background(0, 8, 9);
  jumpscare.setVolume(1);
  spookyNoise.setVolume(0.05);
  spookyNoise.play();
  
  //having the creature jitter randomly 
  let y = randomGaussian(midY + 50, 0.4);
  let wh = bgWH;

  tint(255, doorwayOpacity);
  image(doorway, midX, midY + 55, wh, wh);
  noTint();

  //creature fading in
  if (a >= 0) {
    a += 0.5;
    tint(255, a);
    image(creature, midX, y, wh, wh);
    noTint();
  }

  // triggering the jumpscare once opacity reaches a certain value
  if (a >= 50) {
    jumpscare.play();
  }

  //ending the function
  if (a > 54) {
    doorwayOpacity = 0;
    background(0);
    spookyNoise.stop();
    jumpscare.stop();
    START = false;
    WAKE = true;
  }
}

This is one of the last functions I worked on. I actually messed this one up quite a bit, because my initial attempts really overcomplicated the animation process, and I didn’t know how to make the code execute in a certain order rather than all at the same time. I tried using a for() loop for the creature fading in, and honestly I really hate for() and while() loops because they kept crashing for some goddamn reason and I kept losing so much progress. It didn’t occur to me at first that I could just… not use a for() loop to increment the opacity. It also took a few tries to get the timing right. One thing I’ll improve on here if I can is adding a visual element to the jump scare; I’d probably have to draw another frame for that.
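The pattern that finally worked – bumping the opacity a little on every draw() call instead of looping – is easy to sketch in isolation. The names and numbers below are mine, mirroring the a += 0.5 increment and the a > 54 cutoff in the function above:

```javascript
// One fade step per animation frame. Because draw() runs many times a
// second, nudging the opacity a little each call animates the fade
// without blocking the sketch the way a for() loop would.
function fadeStep(a, rate = 0.5) {
  return a + rate;
}

// How many frames until the cutscene ends (i.e. until a > endAt)?
function framesUntilDone(a, rate = 0.5, endAt = 54) {
  let frames = 0;
  while (a <= endAt) {
    a = fadeStep(a, rate);
    frames++;
  }
  return frames;
}
```

With a 0.5 step, the fade takes 109 frames to pass the 54 cutoff – under two seconds at 60 fps, which matches the quick fade-in in the cutscene.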

Another thing I’d improve on is adding some dialogue and text-narration to the sequence so that the player has a better idea of what’s going on. I was also planning on implementing some dialogue between the player character and the doctor right after the cutscene, though I unfortunately didn’t have the time for that.

Overall, I’m mostly proud of the visual elements (I’ll be honest, I spent MUCH more time on the visual elements and designing the assets than on the rest), because I think I managed to make everything look balanced and consistent – integrating the sprite well with the environment, while having the interactions remain, as far as I’m aware, bug free.

Week 5 reading response

Human vision and computer vision are actually quite similar in a way. Humans are especially attuned to detecting even the most subliminal changes in their physical environment, be it sound or light or movement, etc.

To illustrate just how important change is for our “vision”: next time you’re in a very dark room (when you go to bed tonight), try to stare at the far corner without blinking or moving your eyes. You’ll begin to notice that, gradually, a darkness creeps up from your peripheral vision and slowly makes its way toward the center point of your vision. I thought this was so cool when I first discovered it – it felt like I was falling into a void. This happens because the rods in your eyes (which are attuned to both light and movement) and the cones (which are attuned to color) are almost completely deprived of stimulation. Your brain figures you don’t need your vision if there’s nothing to detect.

This is also the reason why we are constantly moving our eyes. Ever notice the little micromovements your eyes are always making when your attention is focused externally? They need the movement to stay stimulated enough to see. And ever notice how, when there’s a noise that’s been going on for a long time, you only notice it when it suddenly stops? The brain kind of filters out stimuli that are continuous and unchanging. It’s looking for change, just like computer vision does.
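That last point can be made concrete: the most basic form of motion detection in computer vision is frame differencing – compare each pixel’s brightness between two frames and count what changed. A minimal sketch of the idea (my own simplification, not any particular library’s API):

```javascript
// Count pixels whose brightness changed by more than `threshold` between
// two frames (each frame is a flat array of brightness values 0-255).
// Unchanging pixels are "filtered out" -- just like the brain does with
// a continuous noise you only notice when it stops.
function changedPixels(prevFrame, currFrame, threshold) {
  let count = 0;
  for (let i = 0; i < prevFrame.length; i++) {
    if (Math.abs(currFrame[i] - prevFrame[i]) > threshold) count++;
  }
  return count;
}
```

A static scene produces a count of zero, so anything above zero is treated as movement.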

It’s important to realize how inseparable technology and art both are from human biology; it’s all modeled off our understanding of ourselves. The farther we progress in the fields of biology, medicine, neuroscience, and psychology, the greater capacity we have for advancements and inspiration in AI, computers, architecture, and, by extension, interactive media art.

Week 3

  • Concept:

Simple: I wanted to create a cute blinking birdie staring at some trippy stars and contemplating things. What on earth could he possibly be pondering? I fear we will never know.

  • A highlight of some code that I’m particularly proud of:

I used a while() loop and if() statements to make the background animation. It’s quite literally just a bunch of thin white concentric circle patterns bouncing off the edges of the canvas, overlapping with each other and with a static one in the middle. Pretty neat.

//sky pattern
 rectMode(CENTER);
 strokeWeight(0.4);
 stroke(255);
 stroke("white");
 noFill();

 while (sky > 1 && sky < 900) {
   circle(200, 200, 1);
   circle(200, 200, sky);
   sky += 10;
 }

 stroke(0);

 //pattern 1
 Circle(10, x, y);
 if (y > 375 || y < 25) {
   speedY = speedY * -1;
 }
 if (x > 375 || x < 25) {
   speedX = speedX * -1;
 }
 x = x + speedX;
 y = y + speedY;

 // pattern 2
 Circle(10, a, b);

 if (b > 375 || b < 25) {
   speedB = speedB * -1;
 }
 if (a > 375 || a < 25) {
   speedA = speedA * -1;
 }
 a = a + speedA;
 b = b + speedB;

I also used the randomGaussian() function to have the birdie blink at random intervals.

function lilGuy() {
  push();
  let r = randomGaussian(50, 150);
  stroke(0);
  strokeWeight(1);
  translate(90, 0);
  fill(255);
  arc(195, 355.5, 80, 160, 270, 0, PIE);
  circle(195, 265, 39);
  arc(194, 280, 55, 25, 180, 270, PIE);
  strokeWeight(0);
  arc(195.5, 360.5, 80, 170, 270, 0);
  circle(195, 265, 38);
  strokeWeight(1.5);
  fill(255);
  ellipse(192, 267, w, h);
  if (r < 51 && r > 45) {
    h = 1;
  } else {
    h = 17;
  }
  pop();
}
  • Reflection and ideas for future work or improvements:

If I had more time, I’d definitely add an interactive element – maybe some dialogue options so you can chat with the strange bird and get to the bottom of what he’s been musing about all mysteriously.