Week 10 Reading Response: Tom Igoe

Making Interactive Art: Set the Stage, Then Shut Up and Listen

This was a pretty interesting look at interactive media as a participatory medium. Artists often tend to be afraid of their art being assigned the wrong meaning, and artistic clarifications usually keep popping up well after a work is created (J.K. Rowling is a particularly glaring example). It stems from a fear that our own beliefs, our own creation, will inevitably be challenged by the audience. This is even more true for something that involves two-way communication, like an interactive art showcase. Even in our own documentation, it is tempting to include an exact user guide of how to “use” our art. And while user guides are helpful for products, they might not have the same effect in art.

That is not to say, however, that all interactive art has to be fully participatory and unscripted. This again goes back to the debate we had when discussing Chris Crawford’s The Art of Interactive Design. Once again, the question pops up of whether interactive art is participatory, or whether it has to be organic. For example, a lot of games are deeply scripted, and while it may be argued that playing a game is an organic experience of its own, it doesn’t change the fact that in many games you will see most, if not all, of the events that are meant to happen. Visual novels are another example. By most definitions of interactivity, they are not interactive, being no better than picture books. Yet they do feel interactive in their own way.

At the end of the day, it is up to the artist and what compromise they make between interactivity and meaning.

Physical Computing’s Greatest Hits (and misses)

This was a nice look at different classes of physical computing projects and the benefits and challenges of each. There seems to be a common theme of a trade-off between implementation and engagement. Relatively simple projects like theremins, floor pads, video mirrors, and Scooby-Doo paintings may be easier to implement and more common, but they also lose engagement quickly. On the other hand, more complex mechanical pixels, body/hand cursors, multi-touch interfaces, and fields of grass may be more engaging, but they often run into implementation or maintenance issues that take away some of the charm. My main takeaway from the article was that, rather than the physical computing mechanism itself, what matters more is the story and how the mechanisms tie into the story being told.

Week 10: Analog/Digital (Parking Sensor)

Introduction

For this project, I wished to use the ultrasonic sensor in some form, especially as it used to be my favorite sensor back when I experimented with LEGO Mindstorms. I remembered that, conceptually, a parking sensor is, in its simplest form, a combination of a digital input (reverse gear on/off) and an analog input (an ultrasonic/electromagnetic sensor). So, I decided to emulate that here.

Components
  • 1 Arduino Uno R3 SMD
  • 1 HC-SR04 4-pin Ultrasonic Distance Sensor
  • Slideswitch
  • Arduino Piezo Buzzer
  • 10 kΩ Resistor
  • 330 Ω Resistor
  • Red LED
  • 1 Green LED
  • Jumper Wires (3 red, 5 black, 2 blue, 2 yellow, 2 white, 1 green [in lieu of black, as per convention])

Circuit Schematic and Simulation

The first step was to prepare a circuit schematic in TinkerCAD. The digital input took the form of a slideswitch feeding into a digital input pin (pin 12) through a 10 kΩ pull-down resistor. The analog input (which I later discovered was actually a digital input) came from the Echo pin of the ultrasonic sensor (pin 8), while the ultrasonic pulses were triggered from pin 9. Pin 7 and PWM pin 5 controlled the digital-controlled green LED and the analog-controlled red LED respectively, while the buzzer was driven from PWM pin 3.

Figure 1: Component Simulation View

Figure 2: Schematic View

TinkerCAD also has a handy simulation feature that even allows you to upload Arduino code and test how the circuit would behave under simulated conditions. This definitely helped in fixing bugs before testing with the actual circuit, and also made it possible to simulate each component individually before assembling everything together.

Usage and Implementation

So, the “parking sensor” is first switched on with the slideswitch, which simulates the reverse gear being engaged. The green LED turns on, indicating that the sensor is now active.

The analog part works by taking the echo reflection time reported by the ultrasonic sensor and converting it to a distance using the speed of sound. The distance is then mapped to the brightness of the red LED and to how rapidly the buzzer beeps.

The circuit was assembled almost identically to the above schematic. As I had run out of black jumper wires, I substituted a green jumper wire, as per convention. No other connections used green jumper wires. Other than the convention of red for live / black for neutral, I followed some additional conventions of my own: blue for inputs, white for digital outputs, and yellow for analog outputs.

Figure 3: Front View

Figure 4: Top View

Figure 5: Arduino Pins Connection

Code
//Global
int distance = 0;
int duration = 0;
int brightness = 0;
int minDistance = 5;
int maxDistance = 30;

//Pins
int digitalIn = 12;
int trigger = 9;
int echo = 8;
int green = 7;
int red = 5;
int buzzer = 3;

// Ultrasonic Function
long readUltrasonicDistance(int triggerPin, int echoPin)
{
  pinMode(triggerPin, OUTPUT);
  digitalWrite(triggerPin, LOW);  // Clear the trigger pin
  delayMicroseconds(2);
  // Sets the trigger pin to HIGH state for 10 microseconds
  digitalWrite(triggerPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(triggerPin, LOW);
  pinMode(echoPin, INPUT);
  // pulseIn() returns the round-trip echo time in microseconds; multiplying by
  // ~0.017 cm/µs (half the speed of sound) gives the one-way distance in centimeters
  return 0.01723 * pulseIn(echoPin, HIGH);
}

void setup()
{
  pinMode(digitalIn, INPUT);
  pinMode(green, OUTPUT);
  pinMode(red, OUTPUT);
  pinMode(buzzer, OUTPUT);
  Serial.begin(9600);
}

void loop()
{
  if (digitalRead(digitalIn) == HIGH) {
    digitalWrite(green, HIGH);
    distance = readUltrasonicDistance(trigger, echo);
    if (distance < maxDistance && distance != 0) { // distance of 0 typically means pulseIn() timed out (no echo)
      duration = map(distance, 0, maxDistance, 5, 60); // pause between beeps grows with distance
      brightness = map(distance, 0, maxDistance, 220, 0);
      analogWrite(red, brightness);
      tone(buzzer, 523); // play tone C5 (523 Hz)
      delay(100);        // hold the beep for 100 ms
      if (distance > minDistance){
        noTone(buzzer);
        delay(duration); // Wait for (duration) millisecond(s)
      }
    } else {
      analogWrite(red, 0);
      noTone(buzzer);
    }
    Serial.println(distance);
    delay(10); // Wait for 10 millisecond(s)
  } else {
    digitalWrite(green, LOW);
    analogWrite(red, 0);
    noTone(buzzer);
  }
}

The code is not too different from the examples we did in class, other than the function for the Ultrasonic Sensor. For that, I followed this helpful tutorial on the Arduino Project Hub.

Showcase

Reflections

I enjoyed working on this project, although figuring out the ultrasonic sensor and debugging it did take a while. I was actually impressed by how sensitive the sensor turned out to be, and how it managed to sense even objects passing perpendicular to its field of view (as demonstrated towards the end of the demo). Thus, other than being a parking sensor, it actually works as a rudimentary safety sensor, being able to detect pedestrians as well.

I originally wished to include a third LED (a yellow one that would fade with distance, leaving the red one to light up only below a threshold minimum distance), but I ran out of black jumper wires and viable space on the breadboard, so I cut it down to two. Also, not shown in the showcase itself is an issue the circuit has with powering off: most likely because the blocking delay() and pulseIn() calls mean the switch is only re-read once per pass through loop(), there is a noticeable lag between the switch moving to the off position and the LEDs actually turning off.

Also, I realized only later that ultrasonic sensors are technically digital inputs, but since they transduce an analog measurement (distance) into a digital pulse, I felt that it worked for the scope of this project.

Week 9: Unusual Switch (Sleep-Snitcher 3000)

The Switch itself

Before going into the background, let’s look at the switch itself:

Concept

A switch controlled by a body part other than the hand is not completely out of the realm of imagination, but it is quite difficult to come up with one, especially for a circuit based on an Arduino. When I went to collect the copper strip from the IM lab, I got the idea of taping it to my forehead. And that’s how Sleep-Snitcher 3000 was born.

Conceptually, it is a tongue-in-cheek detector of whether a student is sleeping in class. When a student’s forehead is on the table, the LED lights up. This may also be the right place to add a disclaimer that the maker of this project does not advocate the use of this in schools or other educational institutions. It is meant to only be a fun and slightly satirical project.

Design

Figure 1: Circuit diagram. The circuit goes from the 5 V pin on the Arduino, through a 330 Ω resistor, to the components acting as the switch, then to the LED, and finally to the GND pin.

The circuit is pretty simple. After all, this is all that the assignment actually requires. The above is a slightly confusing diagram from TinkerCAD that shows the circuit design.

The 5V DC power output pin is used to keep the resistor and one terminal of what is technically a push-button switch live. When my forehead is on the table, it connects the two terminals of the “push-button switch” and thus lights up the LED, whose anode is connected to the switch and whose cathode is connected to GND (ground).

Materials

  1. Arduino Uno Rev3 SMD x1
  2. Jumper wires x5
  3. 330Ω carbon resistor x1
  4. Breadboard with 400 contacts x1
  5. Basic Red 620nm LED x1
  6. Aluminum Foil x1 A4 sheet (courtesy of Indian by Nature!)
  7. Copper Tape x10 inches
  8. Paper Bag x1

Implementation

At first, I was planning to connect one terminal to an aluminum foil sheet on the table and the other to my forehead. This came with a major problem: the wires were just too short. Besides, a hypothetical customer of this product wouldn’t want to connect and disconnect themselves from a circuit repeatedly. Then I remembered that in push-button switches, the button itself is usually not a terminal of the switch but acts as a wire bridging the two terminals. So, I came up with the following split-terminal design using aluminum sheets.

Figure 2: Switch terminals. Two aluminum sheets connected to wires linked to a breadboard (the green terminal is +ve and the yellow terminal is -ve).

The copper strip simply goes across the forehead and that’s pretty much it.

And finally, the breadboard components were obtained from the Sparkfun kit we were provided, and the Arduino was connected to a laptop for power supply.

Reflections

This turned out to be more fun and simpler than I thought it would be. This was my first experience with Arduino outside of our class, but it felt nice to directly apply the concepts that were drilled into us by both high school physics and FOS. Also, the switch was surprisingly consistent, as I had expected that the circuit would be interrupted by a layer of insulating aluminum oxide. All in all, it was a fun project, with a fun concept.

Use of Generative AI

Generative AI (OpenAI ChatGPT 3.5) was used to name the project (the original name was A Student’s Nightmare) and to generate the disclaimer at the end of the video.

Week 8 Reading Response: Don Norman and Margaret Hamilton

Don Norman’s Emotion and Design: Attractive things work better

The most interesting part of this reading for me (and the most important, I believe) was the part about affect theory and how it influences design considerations. I was always aware of the tenets of affect theory, since in essence it is a core part of behavioral ecology, i.e. how the behavior of animals influences their survival and reproduction, although I never knew of “affects” by name prior to this reading. I particularly liked reading about affect, especially negative affect, being a threshold effect, with low levels of negative affect increasing concentration. That is because negative affect mainly draws instinctual reactions of fear or anger, and both of these reactions evolved specifically to increase concentration in survival situations. But when the negative affect gets stronger, the fear or anger triggered becomes overwhelming and leads to anxiety and freezing up, which, to be fair, is another evolution-designed response to challenges that cannot be solved by “fight” or “flight”. In general, as Norman points out, negative affect causes “tunnel visioning”, while positive affect causes a “broadening of the thought process”.

So, knowing about affects becomes an important consideration in design. This part of the reading is also highly interesting. Norman compares scenarios with a degree of external negative affect (like emergencies or dangerous work), neutral affect (most day-to-day actions), or positive affect (like creative and safe spaces). In the negative affect case, Norman asserts that designs should emphasize function and minimize irrelevancies. For example, emergency exit doors should, by design, immediately tell users which way they swing (besides, they should swing outwards anyway to prevent crowd crush, but that is a matter of building codes rather than design). This is a sentiment Norman has expressed earlier in “The Design of Everyday Things”. But in neutral and positive affect scenarios, it becomes important to also consider aesthetics, and small sacrifices of functionality for attractive design become increasingly tolerable as the affect becomes more positive. For example, we often find ourselves gravitating towards better-looking pencil boxes, better-looking soap dispensers, ornate wall clocks and wristwatches, and sleeker smartphones, among others. In each case, as long as the product scores high enough in our mental calculations regarding usability and cost-effectiveness, we often do go for the more attractive product.

And this is most apparent when looking at phone sales. For example, recent iPhones (like the 14 and the 14 Pro) actually have higher sales in colors like Purple, Gold, and Blue than in Black (an ever-popular color), and significantly higher sales than in Silver/White. iPhone colors (other than Red) generally do not cost the user more, so in the absence of any influence of design on sales, the expected result would be a near-even distribution of sales across all color options, or at least a distribution that reflects production and availability (because Black is usually overproduced compared to other colors). The fact that the sales distribution is skewed goes to show that beautiful products are automatically seen as more desirable.

Her Code Got Humans on the Moon—And Invented Software Itself

This reading was very inspiring. I had known of the work of female mathematicians like Katherine Johnson and Dorothy Vaughan on the Mercury and Apollo programs, mostly thanks to the film Hidden Figures. Their work on calculating trajectories and backup trajectories for the Apollo missions was instrumental in the program’s success and even saved the lives of the Apollo 13 astronauts. However, I was unaware of the key contributions of Margaret Hamilton to both the moon landings and modern software design through her work on the Apollo Guidance Computer.

I was especially surprised to read that, despite Hamilton’s insistence on including exception handling in the software (now essentially a Software Engineering 101 concept, as far as I’m aware), NASA nearly rejected it as excessive. However, Apollo 8 showed the importance of such error handling. I had also heard about Apollo 11’s memory overflow error before (apparently a result of the rendezvous radar being left on when it was not needed), but through this article I learned that Margaret Hamilton was the one who came up with the solution to it.

Reading further about this incident, I found out about another contribution Margaret Hamilton made to the success of the Apollo 11 mission, specifically when it came to interaction. While the “priority displays” exception-handling mechanism was innovative, the low processing power and slow speed of the Apollo 11 computer meant there was a risk that the astronauts’ inputs and the computer could go out of sync while it was trying to load the priority sub-routines. This is why Hamilton issued a standing instruction that, when the priority displays came online, astronauts should wait 5 seconds for everything to load properly before entering any inputs, which helped prevent knock-on memory overflows and asynchronous input-output logic.

Overall, Margaret Hamilton’s work is highly inspiring and aspects of it can still be seen in software design today.

Midterm Project: Immune Defenders!

Introduction and Concept

The idea for this project came as a sort of natural follow-up to my projects in this class so far. I have tried to include elements of bioinspiration in almost every assignment I have done (barring the Week 4 data visualization), so I already knew I wanted to do something within that theme.

My inspiration for this game came almost directly from the retro arcade game Galaga by Namco (see image), and the many space shooters it inspired.

Namco’s Galaga

So, my goal was to now somehow combine a space shooter with the immune system. I also wanted it to be relatively accurate to the human immune system, so there was that added challenge.

The end result is a game where you control an effector B-cell (or plasma cell) and shoot antibodies to neutralize bacteria, just like your own immune system does millions of times each day, even when you’re healthy.

Code and Design

With a whopping 783 lines of code (including class definitions), this is easily the largest project I have ever coded. Here are some chunks that I am particularly proud of:

if (keyCode == 32) { // 32 = spacebar: shoot
  shot.play();
  // Revert to a single bullet once the triple-shot boost has expired
  if (int((millis() - lastboosttime) / 1000) > boosttime) {
    numBullets = 1;
  }
  // Spawn the bullets, offset symmetrically around the player
  for (let i = 1; i <= numBullets; i++) {
    bulletx =
      lives[0].x +
      lives[0].r * 0.25 * (-2 * pow(-1, i) * i + pow(-1, i) - 1);
    bullety = lives[0].y;
    bulletr = height / 100;
    bulletspeed = height / 20;
    var bullet = new Bullet(
      bulletx,
      bullety,
      bulletr,
      bulletspeed,
      bulletsprite
    );
    bullets.push(bullet);
  }
}

So, the above code is used to shoot the bullets that are a core component of the game. The code is written mainly to handle the challenge of spawning multiple bullets at once during the Vitamin C powerup (more on that later). I could have just called it a day and made three separate “new Bullet” calls with the correct x-positions, but I wanted the logic to stay uniform no matter how many bullets the powerup added. This tool from Wolfram Alpha was essential here, as it produced a mathematical formula (the part with several terms involving -1 raised to the power of the bullet number). So, whether I add 3, 5, or 19 bullets, I should theoretically get a consistent result.
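To see what the formula actually produces, here is a small standalone check in plain JavaScript (not part of the game code; the radius value is a made-up placeholder):

// Quick check of the bullet-offset formula: offset = r * 0.25 * (-2 * (-1)^i * i + (-1)^i - 1)
function bulletOffset(i, r) {
  return r * 0.25 * (-2 * Math.pow(-1, i) * i + Math.pow(-1, i) - 1);
}

const r = 40; // placeholder player radius
for (let i = 1; i <= 5; i++) {
  console.log(i, bulletOffset(i, r));
}
// Prints: 1 0, 2 -40, 3 40, 4 -80, 5 80
// i.e. the first bullet is centered and each later pair fans out by ±r, ±2r, ...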

// Bullet display, movement, and collision
for (let i = 0; i < bullets.length; i++) {
  bullets[i].display();
  bullets[i].move();
  //Delete bullet and bacterium, and add score when killing bacteria
  for (let j = 0; j < bacteria.length; j++) {
    if (bullets.length > i) {
      // length check needed because bullets[i] may already have been spliced out in this pass
      if (bullets[i].collide(bacteria[j])) {
        bullets.splice(i, 1);
        bacteria.splice(j, 1);
        score++;
      }
    }
  }
  //Delete when bullets fly off-screen
  if (bullets.length > i) {
    // length check needed because bullets[i] may already have been spliced out in this pass
    if (bullets[i].wallCollide()) {
      bullets.splice(i, 1);
    }
  }
}

The above part of the code deals with the collision logic for the bacteria, and is another part I’m proud of. Using splice() instead of pop() ensures that the correct bacterium is deleted, since splice() removes the element at a specific index rather than just the last one; this matters because the bacteria move out of order due to their different speeds.
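One variant I could imagine (a minimal sketch, not what my game actually does) is to loop backwards, so that splicing an element never shifts the indices of the elements still to be visited:

// Sketch: iterating from the end keeps the remaining indices stable after a splice.
for (let i = bullets.length - 1; i >= 0; i--) {
  bullets[i].display();
  bullets[i].move();
  let removed = false;
  for (let j = bacteria.length - 1; j >= 0; j--) {
    if (bullets[i].collide(bacteria[j])) {
      bullets.splice(i, 1);
      bacteria.splice(j, 1);
      score++;
      removed = true;
      break; // this bullet is gone, so stop checking it against other bacteria
    }
  }
  if (!removed && bullets[i].wallCollide()) {
    bullets.splice(i, 1);
  }
}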

Other Code Features

Other features of the code that are important for the functioning of the game, but not particularly remarkable, are the event handlers for clicks and key presses. When clicking to change stages, I had to ensure that the user wouldn’t accidentally click through the game into the game-over screen, but that was easy enough with some conditionals. The key event listeners handle shooting with the keyboard, movement via the left/right or A/D keys, and using powerups via the Z, X, and C keys. Each has its respective actions coded within the event handler itself.
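As a rough illustration of how such a handler can be laid out in p5.js (the helper names here, like shootBullets() and activateBoost(), are placeholders rather than the actual functions in my sketch):

// Illustrative p5.js key handler; the helper functions are hypothetical.
function keyPressed() {
  if (keyCode === 32) {                                // spacebar: shoot
    shootBullets();
  } else if (keyCode === LEFT_ARROW || key === 'a') {  // move left
    player.moveLeft();
  } else if (keyCode === RIGHT_ARROW || key === 'd') { // move right
    player.moveRight();
  } else if (key === 'z') {                            // triple-shot powerup
    activateBoost();
  } else if (key === 'x') {                            // slow-down powerup
    activateNet();
  } else if (key === 'c') {                            // screen-clearing powerup
    activateBomb();
  }
}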

There is also a timer that tracks the time from each iteration of the game starting anew (even in the same sketch run), as well as a counter to track score. The timer is used to both indicate the length of time survived by the player, as well as to control the powerup cooldowns.
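A minimal sketch of how a millis()-based run timer and powerup cooldown can work in p5.js (the variable names here are illustrative, not necessarily the ones in the game):

// Illustrative only: survival time and a powerup cooldown tracked with millis().
let runStartTime = 0;       // when the current run began
let lastBoostTime = -99999; // last time the boost powerup was used
const BOOST_COOLDOWN = 10;  // assumed cooldown in seconds

function startNewRun() {
  runStartTime = millis();
}

function secondsSurvived() {
  return int((millis() - runStartTime) / 1000);
}

function boostReady() {
  return (millis() - lastBoostTime) / 1000 >= BOOST_COOLDOWN;
}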

Classes

There are a total of 7 classes: Bacteria (display, movement and collision of bacteria), Immune (immune cell display), Bullet (display, movement and collision of bullet), Boost, Net, Bomb (display and update of the cooldown of the powerups), and Button (hover/click behavior of end-screen buttons). I realize now that I could have probably included the three powerup classes under one, but I had initially planned to have their respective functions as class methods. I could probably still have done that by inheritance, but I wasn’t aware of how to make parent classes in p5 or JS, and I did not have sufficient time to learn.
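For reference, JavaScript class inheritance (which works the same way inside a p5.js sketch) uses extends and super; a minimal sketch of how the three powerups could have shared a parent class (the class and field names here are hypothetical):

// Hypothetical refactor: a parent class holding the shared cooldown logic.
class Powerup {
  constructor(cooldownSeconds) {
    this.cooldownSeconds = cooldownSeconds;
    this.lastUsed = -99999;
  }
  ready() {
    return (millis() - this.lastUsed) / 1000 >= this.cooldownSeconds;
  }
  use() {
    if (!this.ready()) return false;
    this.lastUsed = millis();
    this.activate(); // each subclass overrides activate()
    return true;
  }
  activate() {}
}

class Boost extends Powerup {
  activate() {
    numBullets = 3; // triple shot
  }
}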

Gameplay

As described earlier, the game is basically a Galaga clone. All you primarily do is shoot down or dodge waves of bacteria (enemy spaceships). To make the gameplay more interesting, however, I included three powerups.

The first powerup (try pressing the Z key) allows you to shoot out 3 antibodies per shot instead of just one. That turns out to be particularly useful when the bacteria are just off the exact center mark of the white blood cell.

The second powerup (X key) allows you to slow down the bacteria, giving you more time to shoot or dodge them, whichever you prefer. This powerup was based on the ability of some neutrophils (one of the types of white blood cells that act as a first-line defender) to produce traps that literally stick the bacteria in place so that they can be neutralized and then consumed by other immune cells.

The third and final powerup (C key) almost feels like cheating as it completely nukes everything on the screen and gives you the rewards for it. Initially, I wanted to balance it by reducing the score you get back, but I realized that would confuse players (after all, the bacteria are being killed). So, instead I balanced it with a high score cost, which does kind of match real life. Such an attack that kills multiple colonies of bacteria in one go often results in the immune cells attacking the body’s own healthy cells as well, engaging in inflammatory reactions with massive collateral damage, often causing even more damage than the disease on its own.

The best part about these powerups, in my opinion, is that they’re all at least loosely based on real biological concepts, and are not just make-believe gameplay conveniences.

Graphics

Graphics were mostly obtained from royalty-free clipart on the web, which I then recolored to increase contrast/make them look more interesting. The title card’s image was generated using DALLE-3 on Bing Copilot. Any editing required was easily done in Microsoft Paint 3D.

All of the graphics elements involved in this project.

Sound and Music

Pixabay has always been a lifesaver for me when it comes to obtaining high-quality royalty-free/Creative Commons-licensed music and sound effects without any pesky registration or annoying PLUS subscriptions (this is not an ad). I already had a general idea of the feel of the music I was going for, so I searched a few categories and stuck with the tracks that immediately clicked with me. I trimmed a few of the sound clips using Audacity and also ran them through FileConvert’s compression tool to reduce the burden on p5.js, which does tend to struggle with loading heavy images/sounds. My only regret is not leaving enough time to include SFX for the powerups.

Pixabay User Interface

Challenges, Improvements and Future Considerations

Bug-fixing was probably the most challenging aspect. Because I worked on the project over several days, I often found I had forgotten details, which led to weird interactions and things not going as expected. Asking my roommate to play-test the game definitely did help.

There are many things that I wanted to include that had to be left out in the interest of time. I had planned to include other pathogens, including viruses that didn’t damage your health but instead temporarily blocked your ability to shoot, and even a parasite final boss that would not attack you directly but instead open up more wounds for hordes of bacteria and viruses to enter the blood vessel and overwhelm the player.

Additionally, as mentioned earlier, I would have preferred to have more sound effects, not just for the powerups but also for when the player is hit by a bacterium and loses a life. However, overall, I am happy with the final result, and I can say that it closely matched my initial expectations.

Midterm Demo

Fullscreen Link: https://editor.p5js.org/amiteashp/full/Uvgv-fIWb

Link to project on P5 editor: https://editor.p5js.org/amiteashp/sketches/Uvgv-fIWb

 

Week 5 Reading Response: Computer Vision

I am not a programmer or a videographer, so I’ll mostly speak about the first topic, Computer Vision in Interactive Art.

I was really impressed by Krueger’s Videoplace. It seemed to be a highly interactive Computer Vision piece. I found it especially interesting that this project is actually quite old, older even than the computer mouse. This surprised me as I thought that computer vision, at least the kind that could track poses accurately, was relatively new. It’s pretty amazing that the piece was working even in 2006.

Also, the part about computer vision, and interactive art based on it, being a response to increasing surveillance really stood out to me. Art has often been a response to various kinds of troubles or challenges faced by society, and is often a means of coping with societal trauma. Therefore, it is no surprise that the increased surveillance following 9/11 and the War on Terror, especially machine-based surveillance, triggered an outpouring of interactive art pieces based on computer vision. Lozano-Hemmer’s Standards and Double Standards is probably the best example of this, as to me it represents how surveillance makes people more distant from each other (represented by the “absent crowd”) while enforcing a sense of oppression and authority (represented by the belts).

Rokeby’s Sorting Daemon was another particularly interesting project, especially after I visited his website to understand how the composites were made. On the left side of the composite are the faces, sorted from dark skin to fair, left to right. The right side meanwhile captures details of their clothes, sorted into mosaics by similar color. I found it to be a rather fitting representation of the profiling of people, which is often racial and unfairly targets racial minorities and immigrants who appear more similar to the stereotypical “terrorist” or “criminal”. It is also a visual representation of the dehumanization of people into datapoints that are monitored by the state.

Overall, this was a very interesting reading about the history of Computer Vision-based art. While I regret not being able to understand the technical aspects better, I would say this was quite a well-written article, which simplified many complex concepts of this old, yet cutting-edge, field.

Week 5 Assignment: Midterm Progress Report

Organ Defenders (WIP)

Concept

For this project, I was inspired mostly by Namco’s Galaga (image attached) and other games in the shoot ’em up genre. In the game, there are waves of enemies that you have to shoot down to score points. Being shot by the enemy or colliding into them makes you lose a life. Finally, there are some enemies that have special attacks that don’t kill you outright but make it harder for you to survive the wave.

A game of Galaga. The player controls a spaceship that is trying to dodge attacks from enemy spaceships. The boss spaceship has a tractor beam that spreads out in front of itself towards the player.

Galaga

As the biology major of this class, I thought of combining the gameplay of Galaga with a concept based on our body’s immune system in order to make something new. Thus, Organ Defenders (I would appreciate suggestions on the name) was born.

Design

This can be divided into three aspects: the game design, the code structure, and the visual design.

Game Design: I mostly went with the tried and true format of Galaga. The player has four lives and can control a white blood cell in the x-axis at the bottom of the screen. The player will be able to shoot bullets of enzymes at the enemies to bring them down and score points.

Waves of bacteria and viruses keep generating at the top and moving downwards (I might change this to right-to-left later). I may give some of the bacteria the ability to shoot toxins later, but for now, colliding with a bacterium makes the player lose a life. Viruses won’t directly kill the player but will disable the gun controls, preventing the player from earning any points.

As mentioned before, score is earned based on how many bacteria/viruses have been killed. If I have the time, I might create a hard mode, where the more bacteria/viruses are allowed to leave the bottom of the screen, the more new bacteria are generated at the top. This will attempt to simulate infection.

Additionally, I plan to include three different kinds of timed powerups that can also help the player in challenging situations. Again, the hard mode might include some kind of cost to use the powerups more than once, to prevent spamming. The powerups I am currently planning for will be Antibody Rush (freezes the bacteria/viruses in place), Cytokine Boost (the player will be able to shoot more projectiles per shot), and Complement Bomb (basically nukes everything on the screen).

Code Structure: I plan to have 4 classes: the white blood cell, bacteria, virus, and the bullet. Here’s an example of how the classes will look, using my Bacteria class.

class Bacteria {
  constructor(initX, initY, radius, speed) {
    this.x = initX;
    this.y = initY;
    this.r = radius;
    this.moveY = speed;
  }

  display() {
    fill("green");
    ellipse(this.x, this.y, 2 * this.r);
  }

  move() {
    this.y += this.moveY;
  }

  collide(other) {
    return dist(this.x, this.y, other.x, other.y) <= this.r + other.r;
  }

  wallCollide() {
    return this.y > height;
  }
}

Visual Design: While it is a bit early to finalize the visual design before most of the gameplay is ready, I can still talk about it briefly. I plan to use Creative Commons-licensed sprites for the white blood cell, bacteria, and viruses. I might use spritesheets to show some animation, but I am not sure whether that will add to the game’s visual aspect as things might be going too fast anyway for a player to enjoy animations. At the most, I might include animations for when the white blood cell shoots a projectile.

Frightening Aspects / Challenges

There are quite a few aspects that seem daunting now:

  • Actually creating a desirable core gameplay loop, something that will encourage the player to keep playing
  • The powerups seem like a hassle to implement, especially the way I’m planning to implement them. Also, it will be challenging to show the timer of how much time is left before the next use of a powerup.
  • Timing of the animations will also need to be fine-tuned to avoid distracting the player and detracting from the game experience.
  • I also need to find a way to keep the bacteria from clumping together, which would unfairly make the player lose more lives. I tried for a bit to use while loops to keep regenerating random positions until a non-colliding one came up, but it didn’t seem to work properly (one possible approach is sketched right after this list).
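
One way the regeneration approach could work is rejection sampling with a cap on attempts, so the loop cannot run forever; a minimal sketch, assuming hypothetical spawnBacterium() and overlapsAny() helpers and the Bacteria class from above:

// Sketch: keep regenerating a random spawn position until it doesn't overlap anything.
// spawnBacterium() and overlapsAny() are hypothetical helpers, not part of the prototype.
function spawnBacterium() {
  let candidate;
  let attempts = 0;
  do {
    candidate = new Bacteria(random(width), -20, 15, 2); // radius/speed values are placeholders
    attempts++;
  } while (overlapsAny(candidate, bacteria) && attempts < 100);
  bacteria.push(candidate);
}

function overlapsAny(candidate, others) {
  for (let other of others) {
    if (dist(candidate.x, candidate.y, other.x, other.y) < candidate.r + other.r) {
      return true;
    }
  }
  return false;
}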

Risk Prevention

  • I’ll have to work on extensive playtesting to resolve bugs. I also aim to have my roommates/friends playtest it to fine-tune the UI/UX.
  • Implementing the powerups will require me to learn some new skills.
  • I should have a core gameplay loop ready, along with visuals and SFX before I attempt to include the powerups, so that in the worst-case scenario, I still have a ready project.
  • Regarding the bacteria clumping together, I need to either find a way to unclump them or give the player i-frames (invincibility frames) upon losing a life. I might include the i-frames anyway, but will need to find a way to indicate that to the player (in fact, my prototype currently has half a second of i-frames, clearly not enough).
  • I should also make my code more responsive and check for any in-built functions I use that may respond unexpectedly on other systems.

Midterm Prototype

Finally, here’s the prototype. So far, only the player movements, the bacteria collisions, and the game over/game restart logic have been put in place.

Week 4 Assignment: Personal Electronics Market in India

I had initially wanted to work with generative text to create poems in my native language (Bengali), which would have translations in English. Midway through this project, however, I realized how challenging it was, as the rules of grammar in English and Bengali are quite different, and I couldn’t just make one-to-one sentences, even simple ones.

Then I decided to go through the Kaggle website to search for suitable datasets. The dataset on the Device market in India over last 15 years (https://www.kaggle.com/datasets/michau96/device-market-in-india-over-last-15-years) was a trending dataset, so I decided to use that. Since the dataset was on a monthly basis, I first used Excel to take averages across each year, converting the dataset to an annual one.

When it came to making the plot itself, I first tried getting the basic stacked histogram right. This was done using rectMode(CORNERS), as it allows you to specify opposite corners of each rectangle. The x-position of each bar was specified using the Year column, and the y-position and height were set using the percentage value of each column, normalized to the desired height of the plot.

    rectMode(CORNERS); //to allow stacking of the bars
    //bar for mobile
    fill(this.mobileColor);
    rect(
      (this.dataYear - 2007) * w,
      y,
      w * (this.dataYear - 2007 + 1),
      y - (h / 100 * this.mobile)
    );
    //bar for desktop
    fill(this.desktopColor);
    rect(
      (this.dataYear - 2007) * w,
      y - (h/100 * this.mobile),
      w * (this.dataYear - 2007 + 1),
      y - (h / 100 * this.mobile) - (h / 100 * this.desktop)
    );
    //bar for tablet
    fill(this.tabletColor);
    rect(
      (this.dataYear - 2007) * w,
      y - (h / 100 * this.mobile) - (h/100 * this.desktop),
      w * (this.dataYear - 2007 + 1),
      y - h
    );

After that, I decided to work on the graph legend. The legend class takes one of the bars and builds the legend based on it (this works since every bar is identical when it comes to the number of groups and the color associated with each group).
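A minimal sketch of what such a legend class might look like (the color fields are the ones from the bar code above; the positions and labels are illustrative):

// Illustrative sketch of a legend built from one bar's color fields.
class Legend {
  constructor(bar, x, y) {
    this.entries = [
      { label: "Mobile", color: bar.mobileColor },
      { label: "Desktop", color: bar.desktopColor },
      { label: "Tablet", color: bar.tabletColor },
    ];
    this.x = x;
    this.y = y;
  }

  display() {
    for (let i = 0; i < this.entries.length; i++) {
      fill(this.entries[i].color);
      // rectMode(CORNERS) is assumed to be active, as in the bar-drawing code
      rect(this.x, this.y + i * 25, this.x + 15, this.y + i * 25 + 15);
      fill(0);
      text(this.entries[i].label, this.x + 25, this.y + i * 25 + 12);
    }
  }
}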

Finally, I wanted to add a level of interactivity in the form of a popup that comes up when you hover over a bar, similar to the statistics website Statista (statista.com). I tried using the mouseOver() event listener, but that didn’t work with the bar object (as far as I can tell, p5’s mouseOver() only applies to DOM elements, not to shapes drawn on the canvas), so I went the harder route of checking mouseX and mouseY against each bar’s dimensions.
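A rough sketch of that hover check (the bounds would be computed from the same dataYear, w, h, and y values used when drawing each bar; the popup helper is hypothetical):

// Illustrative hover check against a bar's bounding box.
function isHovering(barLeft, barRight, barTop, barBottom) {
  return mouseX >= barLeft && mouseX <= barRight &&
         mouseY >= barTop && mouseY <= barBottom;
}

// In draw(), after drawing a bar:
// if (isHovering(left, right, top, bottom)) {
//   drawPopup(yearData); // hypothetical helper that draws the Statista-style popup
// }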

The final result is below:

 

I would have loved to make this more generalized and capable of visualizing any dataset loaded by a user (kind of like ggplot in R). In that case, the program would need to work for any number of columns. But until I figure out how to make function arguments optional, this is kind of impossible.
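For what it’s worth, JavaScript does let you make function arguments optional through default parameter values; a minimal, self-contained sketch (the function and values here are purely illustrative):

// Default parameter values: arguments not passed in fall back to the defaults.
function describePlot(rows, plotHeight = 400, colors = ["red", "blue", "green"]) {
  return rows.length + " rows, height " + plotHeight + ", " + colors.length + " colors";
}

console.log(describePlot([1, 2, 3]));                // uses the defaults
console.log(describePlot([1, 2, 3], 600, ["#888"])); // overrides them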

Week 4 Reading Response: The Psychopathology of Everyday Things

This was a really interesting reading on the principles of good design. A lot of considerations mentioned by Don Norman seem obvious from the user standpoint but are often forgotten by product designers. Norman goes in depth regarding the principles he sees as crucial for designing good products, especially products with increasing functional complexity.

In some ways, this reading is archaic; after all, the latest technologies mentioned in it include new types of landline telephones. But everything mentioned here is still relevant in the Web age. Even now, we see complicated products on the market that require you to search for an instruction manual on Google. And UI/UX is a very important consideration for websites and applications. Yet many websites fail at one or more of Norman’s principles of good design: providing good conceptual models, making things visible, mapping, and feedback. A common example is that many websites that are mapped well on desktop completely lose all sense of mapping on mobile, even when (and sometimes especially when) using the mobile version of the site. An example from the application side might be Slack, whose problems range from separate channels usually not displaying notifications unless you are actively logged into them (at least in my experience), to an interface that seems hostile to new users with its multiple menus of options (some of which don’t visibly change anything), to notification “quiet hours” being turned on by default. Many of these problems have slowly been fixed by updates, yet Discord, or even WhatsApp, feels like a better alternative to Slack as a group messaging service. Which is not to say that the other two apps don’t have their own UX problems.

To close off, the final line about the paradox of technology struck a chord with me. “Added complexity and difficulty cannot be avoided when functions are added, but with clever design, they can be minimized.” This, after all, is the very principle of UX design.

Week 3 Assignment: The Ant Life

For this assignment, I had initially planned to create Agar Plate V2, but quickly found out that there would be very few ways to add more features and convert it to follow the guidelines of OOP without overhauling some major parts, so I decided to start from scratch. The bouncing balls example we did in class inspired me to do a project on generative art using ants.

Fig.: A column of ants, hard at work carrying food back to their colony

The patterns formed by scout ants searching for food and then leading worker ants to the food using pheromones have always fascinated me. I wanted to replicate something along those lines in a generative art project. Because this was generative art, most of it had to involve randomness in initial positions, directions, and motion. I first got to work converting the bouncing balls into ants. For this, I also converted my simple speedX and speedY parameters to be calculated from polar coordinates instead. The hope was that using angles instead of plain coordinate offsets would allow better control over the behavior when reflecting off surfaces. I also finally figured out how to set up collision between two objects in p5.js, which I implemented as below:

collide(other) {
  var d = dist(this.x, this.y, other.x, other.y);
  if (d < this.rad + other.rad) {
    return true;
  } else {
    return false;
  }
}

// In draw(): deflect both ants by 90 degrees when they meet (unless at the colony)
for (var j = 0; j < ants.length; j++) {
  if (i != j && ants[i].collide(ants[j]) && !(ants[i].collide(colony))) {
    ants[i].speedDir += PI / 2;
    ants[j].speedDir += PI / 2;
  }
}

I initially had the ants go in opposite directions when they met, but decided that having them bounce off at 90 degrees instead better matched the behavior of real ants, which often exchange information about the presence or absence of food along their already-searched paths and then move off in different directions.

This is the final result:

Finally, I decided to add a bit of randomness to the motion, to avoid having the ants move in exact straight lines and to better match ant-like motion.
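A minimal sketch of what that polar-coordinate motion with a random wobble can look like inside the ant class (speedDir matches the field used in the collision code above; speedMag is an assumed name for the speed magnitude):

// Illustrative move() method: the position is updated from a magnitude and an angle,
// with a small random wobble added to the angle each frame.
move() {
  this.speedDir += random(-PI / 16, PI / 16); // slight random drift in heading
  this.x += this.speedMag * cos(this.speedDir);
  this.y += this.speedMag * sin(this.speedDir);
}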

Overall, there are many more things that I wanted to include in this project, but figuring out directions and the p5.js rotate() function took the bulk of my time. I had initially planned to have user interactivity in the form of sugar cubes the user could drop, which would cause all the nearby ants to rush towards them. But I had already grown tired of the math required just to figure out which direction the ants would be oriented in when they bounced off the walls, and I wasn’t too eager for more math. If I have more time in the future, I would love to revisit this project and improve upon it.