Week 9: analog input & output

int brightness = 0;
int buttonState = 0;

void setup()
{
  pinMode(13, OUTPUT);       // button-controlled LED
  pinMode(11, OUTPUT);       // PWM pin for the photoresistor LED
  pinMode(2, INPUT_PULLUP);  // button (reads LOW while pressed)
}

void loop()
{
  // digital sensor: the button toggles the first LED
  buttonState = digitalRead(2);
  if (buttonState == LOW) {
    digitalWrite(13, LOW);
  } else {
    digitalWrite(13, HIGH);
  }

  // analog sensor: photoresistor reading mapped to LED brightness
  int sensorValue = analogRead(A1);
  brightness = map(sensorValue, 0, 1023, 0, 255);
  analogWrite(11, brightness);

  delay(10);
}

I didn’t have my Arduino yet, so I had to do this on Tinkercad.

one LED is triggered by a button, which acts as a digital sensor; the other is controlled by a photoresistor and increases in brightness the more light is detected in the environment. not really creative, but I’m crunched on time.
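the analog half is really just linear rescaling: Arduino’s map() squeezes the 10-bit ADC reading (0–1023) into the 8-bit PWM range (0–255). here’s a quick sketch of that same arithmetic in plain JavaScript, just to show the math — `rescale` is my own made-up name, not an Arduino or p5 function:

```javascript
// rescale: the same linear interpolation Arduino's map() performs,
// truncated to an integer like Arduino's integer math does
function rescale(value, inMin, inMax, outMin, outMax) {
  return Math.trunc((value - inMin) * (outMax - outMin) / (inMax - inMin)) + outMin;
}

// a dark room reads low, bright light reads high
console.log(rescale(0, 0, 1023, 0, 255));    // 0   (pitch black -> LED off)
console.log(rescale(512, 0, 1023, 0, 255));  // 127 (roughly half brightness)
console.log(rescale(1023, 0, 1023, 0, 255)); // 255 (full brightness)
```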

week 8 reading response – her code got humans on the moon

Hamilton wanted to add error-checking code to the Apollo system that would prevent an astronaut’s mistake from messing up the systems. But that seemed excessive to her higher-ups. “Everyone said, ‘That would never happen,’” Hamilton remembers.

But it did.

This really raises the question of whether they would’ve taken her concerns more seriously had she been male. It’s insane to think that, despite how thoroughly she had already proven her competence, they still dismissed her – especially when her concerns turned out to be entirely reasonable. Even today, women have to work harder than men in most places to be perceived as equally competent – and I think plenty of women would understand what I mean here. So I can’t imagine how much more difficult it must have been back in the ’60s.

Stereotype threat is a psychological phenomenon defined as the pressure a person feels to disprove negative stereotypes about a group they identify with – be it race, gender, class, etc. Studies have repeatedly shown that stereotype threat (ironically) has a significant negative impact on a person’s performance. Even being reminded of the stereotype before a task can change the outcome. It’s a vicious self-fulfilling prophecy that further perpetuates negative stereotypes and can hurt a person’s self-esteem, which then further affects performance.

Being a woman who grew up in a misogynistic environment, I really struggled with this cycle for almost as long as I can remember. I have so much respect for Hamilton given what she must have had to overcome to thrive as she did.

week 8 reading response – attractive things work better

The author discussed two types of information processing: “affect” and cognition. Of course, these aren’t exact neuroscientific terms (as he mentions himself: “to avoid the technical debate … I use the reasonably neutral term of ‘affect'”), but I really appreciated his interpretation of this concept, as it reflects a very real biological mechanism that significantly impacts our day-to-day lives.

Neuroscientists understand that, to some extent, our reasoning can come after we make a decision. As animals first and foremost, we fundamentally operate on instinct and unconscious processes, especially in faster or emotionally salient situations.

Take a simple example: trying to escape a hazardous situation. Suppose fleeing people encounter a door that won’t open. The anxiety-driven response is to try again, harder: when the first push doesn’t open the door, they press harder, kick, and even throw their bodies against it.

This is illustrated beautifully by studies on subjects with a severed corpus callosum: when one half of a subject’s brain is asked to explain an action that was triggered and carried out by the other half (thus completely outside its control and awareness), the subject may provide a reasonable rationale and experience the temporal illusion that this reasoning came first.

But all this thinking comes after the fact: the affective system works independently of thought. Your thoughts are occurring after the affective system has released its chemicals.

Affect, as described by the author, is an instinctual gut reaction, while cognition comes afterward. You can see this pattern emerge especially in fields such as philosophy. In fact, I think philosophy – particularly ethics – is a perfect example. There is an almost universal, intuitive sense of right and wrong among our species: a gut feeling that assigns value judgments, just as the author describes (this idea is controversial, but I’m referring to instinctual affect applied to and affected by our species’ advanced social environment.) Ethical philosophy emerges when someone attempts to construct a cognitive framework through which these gut value judgments can be derived. Of course, since these judgments are instinctual, there is no inherent logical framework underlying moral affect, which is why there is no universal agreement on which ethical philosophy is most reliable or “true” (as far as I know).

Each system impacts the other: some emotions — affective states — are driven by cognition, and cognition is impacted by affect.

remy midterm project


Embedded sketch:
Overall concept:

My goal with this midterm was to create a demo of a video game – one that I’m planning to expand on in a future opportunity. The concept I had in mind for this demo was a retro-style horror pixel game that takes place in a lab. The player will experience a cutscene and then be placed into an environment where they must interact with the setting in some way.

The story, which isn’t really relevant in the demo version, is supposed to follow a young woman (the player character) working a late-night shift in a laboratory, where she begins to see things in the dark. Below are some of the sprites and assets (used and unused) I created for this project.

player character sprite sheet

unused non-player character sprite sheet

cutscene art


laboratory background

How it works and what I’m proud of:

To start with the assets and how I obtained them: all visual elements were drawn by me using the online pixel-art program Pixilart.com. All the sound effects and background noise were downloaded and cut from copyright-free YouTube audio.

As for the code, rest assured there was absolutely no ChatGPT usage or any other form of AI coding. I did attempt to go to two friends – one senior CS major and one graduated CS major – and they somehow only managed to make things worse. I figured everything out myself through either research or agonizing, tear-inducing, Vyvanse-fueled trial and error.

Below I’ll share and briefly describe snippets of code I’m proud of.

//item toggling; ensuring you need to be within a certain distance, facing the item to interact with it, and the item is still in its initial state
if (keyIsDown(ENTER)) {
  //pc1
  if (
    pc1Opacity == 0 &&
    x > midX - bgWH / 2 + 220 &&
    x < midX - bgWH / 2 + 300 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 1
  ) {
    pc1Opacity = opaque;
    inRange = true;
    //pc2
  } else if (
    pc2Opacity == 0 &&
    x > midX + bgWH / 2 - 280 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 1
  ) {
    inRange = true;
    pc2Opacity = opaque;
    //pc3
  } else if (
    pc3Opacity == 0 &&
    x > midX + bgWH / 2 - 280 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 3
  ) {
    inRange = true;
    pc3Opacity = opaque;
    //trash
  } else if (
    trashCanOpacity == 0 &&
    x > midX + bgWH / 2 - 460 &&
    x < midX + bgWH / 2 - 440 &&
    y == midY - bgWH / 2 + 390 &&
    direction === 1
  ) {
    inRange = true;
    garbageOpacity = 0;
    trashCanOpacity = opaque;
  } else if (
    tableOpacity == 0 &&
    x < midX + bgWH / 2 - 290 &&
    x > midX - bgWH / 2 + 310 &&
    y == midY + bgWH / 2 - 320 &&
    direction === 0
  ) {
    inRange = true;
    tableOpacity = opaque;
  } else {
    inRange = false;
  }
  //playing the toggle sound every time all parameters are met
  if (inRange) {
    toggle.setVolume(0.1);
    toggle.play();
  }
}

Okay, so I won’t say I’m exactly proud of this one, because it’s really clunky and a bit repetitive, and I’m sure I would’ve found a much more efficient way to write it had I been more experienced. It does, however, do its job perfectly well, and for that I think it deserves a place here. It’s probably one of the parts I struggled with the least, given how straightforward it is.
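if i ever clean it up, one idea (an untested sketch – the names and numbers here are invented, not my actual project values) is to describe each interactable as data and loop over the list instead of chaining else-ifs:

```javascript
// hypothetical data-driven version of the interaction check:
// each entry holds the x-range, required y position, and required facing direction
const interactables = [
  { name: "pc1",   xMin: 220, xMax: 300, yAt: 390, facing: 1 },
  { name: "trash", xMin: 40,  xMax: 60,  yAt: 390, facing: 1 },
];

// returns the first item the player is in range of and facing, or null
function findInteractable(x, y, direction, items) {
  for (const it of items) {
    if (x > it.xMin && x < it.xMax && y === it.yAt && direction === it.facing) {
      return it;
    }
  }
  return null;
}

console.log(findInteractable(250, 390, 1, interactables)); // the pc1 entry
console.log(findInteractable(250, 390, 0, interactables)); // null (wrong facing)
```

adding a new object would then just mean adding one line to the array, instead of another else-if branch.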

for (let j = 0; j < 4; j++) {
  sprites[j] = [];
  for (let i = 0; i < 4; i++) {
    sprites[j][i] = spritesheet.get(i * w, j * h, w, h);
  }
}

//cycling through the sprite array while arrow keys are pressed: the position moves by the speed value, and %4 wraps the step back to the first sprite in the row (0)
if (keyIsDown(DOWN_ARROW)) {
  direction = 0;
  y += speed;
  step = (step + 1) % 4;
} else if (keyIsDown(LEFT_ARROW)) {
  direction = 2;
  x -= speed;
  step = (step + 1) % 4;
} else if (keyIsDown(UP_ARROW)) {
  direction = 1;
  y -= speed;
  step = (step + 1) % 4;
} else if (keyIsDown(RIGHT_ARROW)) {
  direction = 3;
  x += speed;
  step = (step + 1) % 4;
  //when no key is being pressed, sprite goes back to the standing position (0,j)
} else {
  step = 0;
}

//keeping the sprite from walking out of bounds
if (y >= midY + bgWH / 2 - 320) {
  y = midY + bgWH / 2 - 320;
}
if (y <= midY - bgWH / 2 + 390) {
  y = midY - bgWH / 2 + 390;
}
if (x >= midX + bgWH / 2 - 180) {
  x = midX + bgWH / 2 - 180;
}
if (x <= midX - bgWH / 2 + 175) {
  x = midX - bgWH / 2 + 175;
}

I probably included this snippet in my progress post, since it’s the code I worked on before anything else. Everything else was built around this. (Keep in mind that in the actual sketch, the array is created in the setup function and the rest is in the draw function. I just combined them here for simplicity.)
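the two bits of arithmetic doing the heavy lifting here are the %4 wrap-around and the bounds clamping. pulled out as tiny standalone functions (my own names, just for illustration), they look like this:

```javascript
// walking animation: advance one frame per draw, wrapping back to 0 after frame 3
function nextStep(step) {
  return (step + 1) % 4;
}

// keep a coordinate inside [min, max] — same effect as the chained if-statements
function clampToBounds(v, min, max) {
  return Math.min(Math.max(v, min), max);
}

console.log(nextStep(3));                  // 0 — wraps back to the first frame
console.log(clampToBounds(999, 175, 820)); // 820 — pushed back inside the room
```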

function cutScene1() {
  background(0, 8, 9);
  jumpscare.setVolume(1);
  spookyNoise.setVolume(0.05);
  spookyNoise.play();
  
  //having the creature jitter randomly 
  let y = randomGaussian(midY + 50, 0.4);
  let wh = bgWH;

  tint(255, doorwayOpacity);
  image(doorway, midX, midY + 55, wh, wh);
  noTint();

  //creature fading in
  if (a >= 0) {
    a += 0.5;
    tint(255, a);
    image(creature, midX, y, wh, wh);
    noTint();
  }

  // triggering the jumpscare once opacity reaches a certain value
  if (a >= 50) {
    jumpscare.play();
  }

  //ending the function
  if (a > 54) {
    doorwayOpacity = 0;
    background(0);
    spookyNoise.stop();
    jumpscare.stop();
    START = false;
    WAKE = true;
  }
}

This is one of the last functions I worked on. I actually messed this one up quite a bit because my initial attempts really overcomplicated the animation process, and I didn’t know how to make sure the code executed in a certain order rather than at the same time. I tried using a for() loop for the creature fading in, and honestly I really hate for() and while() loops because they keep crashing for some goddamn reason and I kept losing so much progress. It didn’t occur to me at first that I could just… not use a for() loop to increment the opacity. It also took a few tries to get the timing right. One thing I’ll improve on here if I can is add a visual element to the jump scare. I’d probably have to draw another frame for that.
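the lesson, for anyone fighting the same thing: draw() already is your loop. a for() loop runs the entire fade in a single frame (and can freeze the sketch); incrementing a little each call and branching on the value spreads it across frames. here’s a bare-bones simulation of that idea outside p5 (names invented, and with a once-only guard on the scare trigger, which is a small tweak on my version):

```javascript
// each call represents one draw() frame of the cutscene's fade logic
function fadeStep(state) {
  if (state.phase === "fading") {
    state.alpha += 0.5;                          // creature fades in a bit per frame
    if (state.alpha >= 50 && !state.scared) {
      state.scared = true;                       // trigger the jumpscare exactly once
    }
    if (state.alpha > 54) {
      state.phase = "done";                      // hand off to the next scene
    }
  }
  return state;
}

let s = { phase: "fading", alpha: 0, scared: false };
for (let frame = 0; frame < 200; frame++) fadeStep(s);
console.log(s.phase, s.scared); // "done" true
```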

Another thing I’d improve on is adding some dialogue and text-narration to the sequence so that the player has a better idea of what’s going on. I was also planning on implementing some dialogue between the player character and the doctor right after the cutscene, though I unfortunately didn’t have the time for that.

Overall, I’m most proud of the visual elements (I’ll be honest, I spent MUCH more time on the visuals and asset design than on the rest), because I think I managed to make everything look balanced and consistent – integrating the sprite well with the environment while keeping the interactions, as far as I’m aware, bug-free.

week 5 reading response

human vision and computer vision are actually quite similar in a way. humans are especially attuned to detecting even the most subtle changes in their physical environment, be it sound, light, movement, etc.

to illustrate just how important change is for our “vision”, next time you’re in a very dark room (when you go to bed tonight), try to stare at the far corner without blinking or moving your eyes. you’ll notice that, gradually, a darkness creeps up from your peripherals and slowly makes its way toward the center point of your vision. i thought this was so cool when i first discovered it; felt like i was falling into a void. this happens because the rods in your eyes (which are attuned to both light and movement) and the cones (which are attuned to color) are almost completely deprived of stimulation. your brain figures you don’t need your vision if there’s nothing to detect.

this is also why we are constantly moving our eyes. ever notice the little micromovements your eyes are always making when your attention is focused externally? they need that movement to stay stimulated enough to see. and ever notice how, when there’s a noise that’s been going on for a long time, you only notice it once it suddenly stops? the brain filters out stimuli that are continuous and unchanging. it’s looking for change, just like computer vision does.
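that “looking for change” is literal in the simplest computer vision techniques: frame differencing just subtracts the previous frame from the current one, pixel by pixel, and anything that moved lights up. a toy version on grayscale pixel arrays (my own illustration, not from the reading):

```javascript
// count how many pixels changed by more than a threshold between two grayscale frames
function motionAmount(prevFrame, currFrame, threshold) {
  let changed = 0;
  for (let i = 0; i < prevFrame.length; i++) {
    if (Math.abs(currFrame[i] - prevFrame[i]) > threshold) changed++;
  }
  return changed;
}

const before = [10, 10, 10, 10];
const after  = [10, 200, 10, 10]; // something moved across one pixel
console.log(motionAmount(before, after, 30)); // 1
```

a static scene produces a difference of zero everywhere, which is exactly the “unchanging stimulus gets filtered out” behavior described above.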

it’s important to realize how inseparable technology and art both are from human biology, it’s all modeled off of our understanding of ourselves. the farther we progress in the fields of biology, medicine, neuroscience, and psychology, the greater capacity we have for advancements and inspiration in ai, computers, architecture, and, by extension, interactive media art.

week 3

    • concept:

simple, i wanted to create a cute blinking birdie staring at some trippy stars and contemplating things. what on earth could he possibly be pondering about? i fear we will never know.

  • A highlight of some code that i’m particularly proud of:

i used while() loops and if() statements to make the background animation. it’s quite literally just a bunch of thin white concentric circle patterns bouncing off the edges of the canvas, overlapping with each other and with a static one in the middle. pretty neat.

//sky pattern
 rectMode(CENTER);
 strokeWeight(0.4);
 stroke(255);
 noFill();

 while (sky > 1 && sky < 900) {
   circle(200, 200, 1);
   circle(200, 200, sky);
   sky += 10;
 }

 stroke(0);

 //pattern 1
 Circle(10, x, y);
 if (y > 375 || y < 25) {
   speedY = speedY * -1;
 }
 if (x > 375 || x < 25) {
   speedX = speedX * -1;
 }
 x = x + speedX;
 y = y + speedY;

 // pattern 2
 Circle(10, a, b);

 if (b > 375 || b < 25) {
   speedB = speedB * -1;
 }
 if (a > 375 || a < 25) {
   speedA = speedA * -1;
 }
 a = a + speedA;
 b = b + speedB;

i also used a randomGaussian() function to have the birdie blink at random intervals.

function lilGuy() {
  push();
  let r = randomGaussian(50, 150);
  stroke(0);
  strokeWeight(1);
  translate(90, 0);
  fill(255);
  arc(195, 355.5, 80, 160, 270, 0, PIE);
  circle(195, 265, 39);
  arc(194, 280, 55, 25, 180, 270, PIE);
  strokeWeight(0);
  arc(195.5, 360.5, 80, 170, 270, 0);
  circle(195, 265, 38);
  strokeWeight(1.5);
  fill(255);
  ellipse(192, 267, w, h);
  if (r < 51 && r > 45) {
    h = 1;
  } else {
    h = 17;
  }
  pop();
}
  • Reflection and ideas for future work or improvements:

if i had more time, i’d definitely add an interactive element, maybe some dialogue options so you can chat with the strange bird and get to the bottom of what he’s been musing about all mysteriously.

week 5 – midterm project

project concept:

for my midterm project, i’m planning to create a simple pixel RPG-style demo. the demo will start with an interactive cutscene, then the player will be able to navigate a room / setting. the details are very vague, but i’ll figure out what works as i make progress.

design:

my visual inspirations for this project are the games Undertale (particularly the way the sprites and backgrounds are designed), and Sally Face, which i’m using as a reference on how to visually incorporate the dialogue interactions as well as the vibe i’m going for.

i drew the sprites via Pixilart.com (honestly i spent more time on them than on the coding itself…), and the background music is an 8-bit cover i found of a Deftones song, which i thought sounded pretty awesome. going forward, i intend to draw a pixel background with some interactive elements for the player to navigate. i want the overall experience to look eerie and sickly (which is why the sprite i made may seem a little jaundiced).

most frightening part and how i tackled it:

having to animate a sprite was definitely the most intimidating part for me. to start off, i reread the slides and really studied the examples provided. problem was, i wanted my sprite to be able to move while the arrow keys are held down, unlike the example in the slides where you have to spam the keys rapidly. to figure out how to achieve this, i did some googling and scrounged around for (mostly useless) advice on the internet (obviously including AI overview), and ultimately was referred back to the keyIsDown() reference page on p5.js. however, in trying to incorporate what i was learning, the code got extremely messy and buggy. all sorts of horrendous things happened to my little sprite – i cannot bear to speak of it. eventually, i figured things out myself through trial and error (like always), and, while i relied heavily on my references to keep me on the right track, all the code is written by me. (i shall add comments later when i continue to work on the project.)

function draw() {
  background(0);
  if (keyIsDown(DOWN_ARROW)) {
    direction = 0;
    y += speed;
    step = (step + 1) % 4;
  } else if (keyIsDown(LEFT_ARROW)) {
    direction = 2;
    x -= speed;
    step = (step + 1) % 4;
  } else if (keyIsDown(UP_ARROW)) {
    direction = 1;
    y -= speed;
    step = (step + 1) % 4;
  } else if (keyIsDown(RIGHT_ARROW)) {
    direction = 3;
    x += speed;
    step = (step + 1) % 4;
  } else {
    step = 0;
  }
  if (y <= 0) {
    y = 0;
  }
  if (x <= 0) {
    x = 0;
  }
  if (y >= windowHeight - 126) {
    y = windowHeight - 126;
  }
  if (x >= windowWidth - 60) {
    x = windowWidth - 60;
  }
  image(sprites[direction][step], x, y, 70, 147);
}

references:

as mentioned earlier, AI was used in the sense that it popped up and tried to provide answers to the questions i googled. i used it as a tool to try to understand how keyIsDown() works when i was experiencing bugs, analyzed the (very simple) examples it provided, then tried to implement what i learned in my code. I did not ask it to fix my bugs or provide me with code.

https://p5js.org/reference/p5/keyIsDown/

https://drive.google.com/file/d/18ZMq9BB1l5XhMx5OfzNciU2OJQbUKvg3/view?usp=sharing

reading reflection, week 3

First of all, I really enjoyed the way this was written. ironically, I think it succeeds in emulating interactivity itself in a way – at least to the highest extent a static piece of text is capable of. There’s a clear pattern in the way he writes: he establishes a prospect, makes an attempt at predicting the reader’s reflections or reactions, and then responds accordingly.

He makes the argument that written words cannot be interactive. However, if we take his own definition of interactivity (two parties who, in turn, listen, think, and respond to one another), and if we assume he is at least somewhat correct in his estimations of the reader’s thoughts, there is clearly an interaction going on between author and reader. The author is performing an interaction – perhaps not with any reader in particular, but he is clearly attempting to reflect on and respond to a hypothetical audience. At one point he says that “movies don’t listen to their audience, nor do they think about what the audience may be saying,” and yet he manages to do exactly that through words alone.

I’m unsure how intentional this was on the author’s part – I was inclined to give him all the credit, yet he makes his opinion on the matter abundantly clear. I do agree that it isn’t ‘real’ interaction in the traditional sense, but it is a deeply psychological form of interaction that I think should be appreciated more. I think we could all benefit from achieving this to some degree in our own writing.


reading response, week 4

I hate Apple and their designs so much. it’s very intentional, too, the way they make it such an inconvenience to deal with any non-Apple electronic device that you feel compelled to buy more of their products just to accommodate your phone. The way they impose unnecessary updates so that you’re almost forced to upgrade to a newer model because your phone is no longer compatible with even their own products. The way they make it such a hassle to transfer your photos and data to a non-Apple product, so the longer you have an iPhone, the harder it is to switch to another brand. The way they insist upon automatically sharpening your photos for no reason, making your selfies look fried to absolute hell with no ability to disable it. (Still don’t understand that one, it’s just stupid design.) The way their products are so awfully flimsy that they just break for no goddamn reason and Oh no! looks like you need to upgrade to another stupid ass iPhone, because all your photos are automatically uploaded to iCloud and you had no warning or prep time to save everything to another cloud! pisses me off. someday i’ll free myself of this corporate curse…

anyways, if there’s one lesson about design you can glean from this god-awful mess of an evil corporation, it’s to not make your consumers hate you, i guess. put the people first. also, MacBooks are so confusing to navigate i still refuse to use one.

homework week 4

i tried

so for this one i ended up making this floating eyeball angel thing that generates some cryptic dialogue in a manner reminiscent of 2d indie game boss fight interactions. (click the screen).

the background is very similar to the one i made for my week 2 assignment, nothing special there. I used frameCount to animate the ominous red rays emanating from the eyeball. The slightly trickier part was getting them to actually follow the eye as it oscillates, though it was pretty easy to figure out.

as for the eyeball itself, i implemented some rotation to give it some depth, and had it oscillate via a sin() function, for which i referred back to the slides and studied the example projects provided. i think he’s my favorite part since it turned out exactly as i intended.

//eyeball & bg animation settings
push();
let pos = 200 + sin(angle) * amp; //sin oscillation
angle += 1;
translate(100, 0);
strokeWeight(0.5);
stroke(0);
noFill();

let x = 300;
let y = pos;
let pattern = 10;

if (frameCount % 8 > 6) {
  pattern = 29;
} else if (frameCount % 8 > 4) {
  pattern = 24;
} else if (frameCount % 8 > 2) {
  pattern = 19;
} else {
  pattern = 17;
}
while (pattern > 1 && pattern < 1000) {
  strokeWeight(1);
  circle(x, y, pattern);
  pattern += 14;
}

//eyeball
strokeWeight(1);
stroke(255);
fill(0);
circle(300, pos, 95);
rotate(-5);
ellipse(262, pos + 27, 40, 55);
ellipse(265, pos + 27, 25, 35);
stroke(255);
strokeWeight(2);
fill(255, 0);
rotate(15);
ellipse(333, pos - 100, 120, 20);
pop();

the generated text was incredibly frustrating to figure out on short notice, especially since i insisted upon having it animated in typewriter style. I’ll be honest, I still don’t fully understand what I did, especially since I stayed up all night figuring it out after having accidentally deleted all my code (would not recommend). Here I had a friend teach me some tricks, and I asked AI mode on Google to help explain how text generation works; however, i did all the actual coding myself. My friend really saved my life, though.
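for anyone else fighting the typewriter effect: the core trick, as i eventually understood it, boils down to showing a longer substring of the full text as frames pass. a stripped-down sketch of that idea (simplified made-up names, not my actual project code):

```javascript
// reveal one extra character every `framesPerChar` frames of the sketch
function typewriter(fullText, frameCount, framesPerChar) {
  const visible = Math.floor(frameCount / framesPerChar);
  return fullText.slice(0, visible);
}

console.log(typewriter("BEHOLD", 0, 4));  // ""       (nothing typed yet)
console.log(typewriter("BEHOLD", 8, 4));  // "BE"     (two characters in)
console.log(typewriter("BEHOLD", 99, 4)); // "BEHOLD" (finished, stays put)
```

in p5 you’d call something like this from draw() with the built-in frameCount, and hand the result to text() — the string just grows on its own as the sketch runs.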

i was thinking of adding some interactive dialogue options, but thats for another time perhaps.

im so tired bro