Assignment 2 – Loops

Concept:

For this assignment, I wanted to make a design that moves and looks alive, compared to my static design for the first assignment. I used nested loops to draw a grid of circles, and each circle changes size and color using a sine wave. I had to learn how to use the sin() and map() functions so I could control how the circles grow and change colors, which was a little tricky at first but really fun to figure out. The loops made it so much easier to draw many circles without writing a lot of code.

Code Highlight:

I’m really proud of how I made the circles pulse and change color using the sin() and map() functions. Even though it’s just a few lines of code, it completely changes how the artwork feels. Figuring out how to use the wave and map together was really challenging at first. I had to do a lot of trial and error and watch some YouTube videos, but it was satisfying to see it finally work.

let wave = sin((frameCount * 0.07) + (x + y) * 0.05); // simple wave, ranges from -1 to 1
let size = map(wave, -1, 1, 10, spacing); // circle size changes with wave
let colorValue = map(wave, -1, 1, 50, 255); // color changes with wave
fill(colorValue, 170, 255); // color channels range from 0 to 255
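For context, here is a minimal, self-contained version of the idea, assuming a square canvas and a spacing variable for the grid (the names and exact numbers are assumptions, not my sketch verbatim):

// Minimal pulsing-grid sketch: nested loops draw the grid, and each
// circle samples the wave based on its position.
let spacing = 40;

function setup() {
  createCanvas(400, 400);
  noStroke();
}

function draw() {
  background(20);
  for (let x = 0; x < width; x += spacing) {
    for (let y = 0; y < height; y += spacing) {
      let wave = sin((frameCount * 0.07) + (x + y) * 0.05);
      let size = map(wave, -1, 1, 10, spacing);
      let colorValue = map(wave, -1, 1, 50, 255);
      fill(colorValue, 170, 255);
      circle(x + spacing / 2, y + spacing / 2, size);
    }
  }
}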

 

Embedded Sketch:

Reflections & Future improvements:

This assignment was initially challenging because I had to figure out how to make the circles pulse and change color using loops and functions. I spent some time watching tutorials and testing different ideas to get the animation to look right, especially figuring out how to use the wave and map functions together. Completing this project helped me understand how to combine new concepts with what I already know, such as loops and grids, to create something that actually moves and changes on the screen.

For the future, I’d like to make it more interactive. Right now, the circles just pulse on their own, but it would be cool if the user could click or drag the mouse to change the colors or the speed of the animation.
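One simple way to do that (a hypothetical tweak, not in my current sketch) would be to let mouseX drive the wave speed inside the loop:

// Hypothetical: map the mouse position to the pulse speed (left = slow, right = fast)
let speed = map(mouseX, 0, width, 0.01, 0.2);
let wave = sin((frameCount * speed) + (x + y) * 0.05);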

Week 2 – Video Reflection

Watching this video, what struck me most was how something so structured could still feel alive. It’s strange to see code—something I usually think of as rigid—turn into movement and shape that almost feels organic. It made me wonder if maybe rules and freedom aren’t opposites after all. Maybe rules are what allow freedom to show up in the first place.

I kept thinking about how in life we spend so much time wanting total freedom, no limits, no boundaries. But when I watched these shifting patterns, I realized the boundaries are what made them beautiful. If there were no rules, it would just be noise. The art only worked because there was a balance—enough structure to hold it together, and enough randomness to make it feel alive.

That thought sat with me: maybe life is like that too. Too much control makes things rigid, too much chaos makes them meaningless. The sweet spot is somewhere in between, and it’s not something we can design perfectly—it has to come naturally.

Week 2 – Reading Response

After I watched Casey Reas’s Chance Operations talk I was filled with different emotions, curious and even inspired. The way he treated randomness was probably the most interesting part of it, although it may sound cliché. The progression of his works that were shown was fascinating. He treated randomness not as something chaotic but rather as something intentional, and the precision and effort it took to create that art is remarkable. Even a single dash or slash, once given a set of rules and a little unpredictability, could turn into visuals that felt alive. It made me think about my own tendency to over-control creative work; sometimes the most interesting results come when I just let the imagination be (just like in homework for this class).

I also liked the way he uses geometry as a way to move from points to lines, and even into dimensions that are very difficult to picture in the mind. At the same time, his examples reminded me that digital randomness isn’t truly random at all; it’s always controlled by algorithms and history.

What stayed with me most was his idea that just a “slight bit of noise” keeps systems alive, which I think is relatable in our daily life as well, because would life be so interesting if it weren’t for imperfections?



Week 2: Work of Art

Portrait Concept

While browsing the old computer art magazines I came across the two images below and felt they perfectly captured the contrast that can be achieved with computer art. The first one is mechanical, sharp and exact, while the second one feels more natural and human due to its varying corners and circular flow. I derived my concept from this contrast, creating a work that captures both ends of the scale using interactivity built with loops. Initially the screen displays a constant line that simulates the trace of a heart rate monitor; it is mechanical and rigid, what one could describe as lifeless. Then, when the user comes into the picture and interacts with the work by pressing the mouse, the line begins to transition into a more organic flowing curve, along with a pulsing heart that appears in the middle and in a way adds life to the piece. This highlights the collaboration between human and machine that computer art embodies, combining mechanical precision with human emotion.

Image 1: Polygonzug
Image 2: Kreis-Variationen

 

Code that I am Proud of  

// Runs inside draw(), between beginShape() and endShape();
// `trans` (0 to 1) is the transition factor defined earlier in the code.
for (let x = 0; x < width; x++) {
    let y_machine = baseline;

    let lineX = (x + frameCount*1.5) % 130;

    //Machine heart beat based on x position
    if (lineX < 15) {
      y_machine = baseline;
    } else if (lineX < 30) {
      y_machine = baseline - 60;
    } else if (lineX < 45) {
      y_machine = baseline + 30;
    } 

    //Human heartbeat wave
    let y_human = baseline + sin((x + frameCount) * 0.05) * 40;

    //Transitioning between machine and human
    let y = y_machine + (y_human - y_machine) * trans;

    noFill();
    strokeWeight(2);
    
    //Color changing with transition
    let r = 255 * trans;
    let g = 255 * (1 - trans);
    let b = 50 * trans;
    stroke(r, g, b);

    vertex(x, y);
  }

Most of my time on this assignment was spent creating and refining the transition between the mechanical and human line, trying to recreate the vision. After multiple tries I ended up with the for loop above, which both creates the lines and blends them together using the transition factor defined earlier in the code. I believe this is the most important element of the code, the part that captures the concept, which is why it occupied most of my attention when creating the work. Creating it taught me a lot about the technical and conceptual thinking that goes into coding computer art. It brought my attention to things like the use of color and the technical possibilities for transitioning it, which I integrated into the code.
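For reference, the transition factor could be driven roughly like this (a minimal sketch with assumed names; the actual update lives elsewhere in my code):

// Hypothetical easing of the transition factor `trans`
// (0 = machine line, 1 = human line).
let trans = 0;

function updateTrans() {
  let target = mouseIsPressed ? 1 : 0; // press to become human, release to revert
  trans = lerp(trans, target, 0.05);   // ease toward the target each frame
}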

Embedded Sketch

Reflection

I really enjoyed exploring the possibilities that come with using loops in the creation of a work of art. After completing my previous assignment and getting more comfortable with the program, I had a goal of shifting my focus a little toward the concept behind the work, which I think I was able to achieve this assignment with the use of the computer art magazines. With their assistance I was able to get inspired and integrate meaning into the code. For future work I’d like to use the techniques I learnt here to create a work where the user interaction affects not only the visuals, but also the narrative and direction of the piece. With this type of interactivity the work can become a collaboration between the creator, the users and the computer all at once.

Week 2 Assignment – Loops

My concept

This week after learning about loops, for my design I wanted to explore how I could use different shapes and in some way combine or merge them to create different patterns.

For this assignment I created two designs after experimenting with the use of loops and shapes. The first one, as shown below, is composed of a for() loop including two sets of lines and two sets of ellipses. As shown in the second design, the first layer is a set of lines underneath one set of ellipses, which produces a pattern that resembles fish scales. After this, I explored adding a second layer of ellipses with a different width and height, creating the pattern shown in the first design.

Highlight code

The code I was most proud of was the pair of loops I used to make the scales. The first drew lines that followed the code we did in class, but with slight changes to the width, height and spacing. The second drew the scales themselves, consisting of ellipses placed on top of the lines.

 

/// Diagonal lines (light gray stroke)
stroke(200, 200, 200);
fill(235, 0, 255); // fill is set before the loop so it applies to the shapes drawn below
for (let x = 0; x <= width; x += 60) {
  for (let y = 0; y <= height; y += 60) {
    line(x, y, x + 75, y + 75);
  }
}

/// Scales
fill(225, 100, 230);
for (let x = 0; x <= width; x += 30) {
  for (let y = 0; y <= height; y += 30) {
    ellipse(x, y, 50, 50);
  }
}

 

Embedded sketch

 

Reflection and ideas for future work or improvements

One of the things I struggled the most with was getting the loop code right for the lines. At first, I thought it would be the same as what we did for the ellipses; however, after testing it out I realized this did not work. Nonetheless, after discussing my situation with the professor, I was able to understand my mistake: unlike ellipse(), whose last two arguments are a width and height, line() takes a second point, so the values after (x, y) needed a plus sign to offset the endpoint, resulting in this:

line(x, y, x + 75, y + 75);

For my next project, I will look at more tutorials beforehand so I have more preparation and a better understanding of the code I will be using, in order to get a more precise result. I will also produce multiple sketches of what I want my final product to be so I can aim for a specific design in case I don’t have the time to explore different outcomes. Lastly, I would also love to add more animation and movement to my sketches.

Final Project Documentation

Concept

I was thinking long and hard about a final project that would be a great finale to all of the things we have learned during the semester. I decided I wanted to make a robot. Robot is a broad term and I had to decide the purpose of mine, and since I wanted to create something innovative, fun and creative I decided to make a maze-solving robot. The initial plan was to make the robot go through the maze on its own and have the user just set up the maze, but we will get to why it didn’t work out like that later. Instead of the robot solving the maze on its own, it’s now the user who is in control, trying to go through it “blind”, using only the ultrasonic sensors as guides. The user who controls the robot does not see the maze and solves it based just on the sensors, while their friend rearranges the maze between rounds to keep the game fun and interesting throughout the session.

Video of user interaction with the project
Arduino code
const int AIN1 = 13;
const int AIN2 = 12;
const int PWMA = 11;

const int PWMB = 10;
const int BIN2 = 9;
const int BIN1 = 8;

const int trigPinFront = 6;
const int echoPinFront = 5;

const int trigPinLeft = A0;
const int echoPinLeft = 2;

const int trigPinRight = 4;
const int echoPinRight = 3;

unsigned long lastEchoTime = 0;
const unsigned long echoInterval = 300;

void setup() {
  Serial.begin(9600);

  pinMode(AIN1, OUTPUT); 
  pinMode(AIN2, OUTPUT); 
  pinMode(PWMA, OUTPUT);
  pinMode(BIN1, OUTPUT); 
  pinMode(BIN2, OUTPUT); 
  pinMode(PWMB, OUTPUT);
  pinMode(trigPinFront, OUTPUT); 
  pinMode(echoPinFront, INPUT);
  pinMode(trigPinLeft, OUTPUT); 
  pinMode(echoPinLeft, INPUT);
  pinMode(trigPinRight, OUTPUT); 
  pinMode(echoPinRight, INPUT);

  Serial.println("READY");
}

void loop() {
  if (Serial.available()) {
    char command = Serial.read();

    //Respond to commands to move the robot
    switch (command) {
      case 'F':
        leftMotor(50); rightMotor(-50);
        delay(1000);
        leftMotor(0); rightMotor(0);
        break;
      case 'B':
        leftMotor(-50); rightMotor(50);
        delay(1000);
        leftMotor(0); rightMotor(0);
        break;
      case 'L':
        leftMotor(200); rightMotor(200);
        delay(300);
        leftMotor(200); rightMotor(200);
        delay(300);
        leftMotor(0); rightMotor(0);
        break;
      case 'R':
        leftMotor(-200); rightMotor(-200);
        delay(300);
        leftMotor(-200); rightMotor(-200);
        delay(300);
        leftMotor(0); rightMotor(0);
        break;
      case 'S':
        leftMotor(0); rightMotor(0);
        break;
    }
  }

  //Send distance data to the serial
  unsigned long currentTime = millis();
  if (currentTime - lastEchoTime > echoInterval) {
    float front = getDistance(trigPinFront, echoPinFront);
    float left = getDistance(trigPinLeft, echoPinLeft);
    float right = getDistance(trigPinRight, echoPinRight);

    Serial.print("ECHO,F,"); Serial.println(front);
    Serial.print("ECHO,L,"); Serial.println(left);
    Serial.print("ECHO,R,"); Serial.println(right);

    lastEchoTime = currentTime;
  }
}

//Logic for controlling the movement of the right and left motor
void rightMotor(int motorSpeed) {
  if (motorSpeed > 0) {
    digitalWrite(AIN1, HIGH);
    digitalWrite(AIN2, LOW);
  } else if (motorSpeed < 0) {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, HIGH);
  } else {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, LOW);
  }
  analogWrite(PWMA, abs(motorSpeed));
}

void leftMotor(int motorSpeed) {
  if (motorSpeed > 0) {
    digitalWrite(BIN1, HIGH);
    digitalWrite(BIN2, LOW);
  } else if (motorSpeed < 0) {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, HIGH);
  } else {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, LOW);
  }
  analogWrite(PWMB, abs(motorSpeed));
}

//Logic for measuring distance
float getDistance(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  long duration = pulseIn(echoPin, HIGH);
  float distance = duration / 148.0; // convert echo time (microseconds) to inches
  return distance;
}

The Arduino’s main purpose is to handle motor movement and to read the distance sensors and send their data to p5. For movement, it takes commands from p5, which the user enters by pressing keys on the keyboard, and translates them into motor movement that imitates the movement on screen. The data from the 3 ultrasonic sensors is picked up by the Arduino and sent over serial to be picked up by p5.

The p5 code takes the echo values that the Arduino sends and uses that data to draw the “echo lines” which the user uses to “see” the maze, with the walls becoming visible every now and then when in range. p5 is also used to take user input and send it to the Arduino, which translates it into movement. It also contains the code that serves as the main connection between the Arduino and p5.
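The parsing side is simple; here is a minimal sketch of how p5 could split the Arduino’s “ECHO,direction,distance” lines (function and variable names are assumptions, not my exact code):

// Latest reading per sensor, in inches (matching the Arduino sketch above)
let distances = { F: 0, L: 0, R: 0 };

function handleSerialLine(line) {
  let parts = line.trim().split(",");
  if (parts.length === 3 && parts[0] === "ECHO") {
    let dist = parseFloat(parts[2]);
    if (!isNaN(dist)) distances[parts[1]] = dist; // parts[1] is "F", "L" or "R"
  }
}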

Here is the schematic of the circuit. One of the most challenging parts of this project was connecting all the wires and making sure they wouldn’t disconnect during transportation and during the showcase. I kept the wires as far apart from each other as possible and made sure everything that could move them was securely fastened to the plate.

The making of the project

Making this project was a journey. What seemed to be a straightforward project turned out to be a two-week process of trial and error until I got the final result.

As I mentioned above, in the beginning the idea was to make the robot go through the maze on its own and have the user just set up the maze.

This is one of the photos of the early stages of development of the robot. As you can see, it looks much different from the final product. This was the part of the project where I was focusing on just getting the movement working and some readings from the sensors.

After I managed to get the movement done with the cable attached, sending commands through my laptop, I was ready to move on to the next phase: adding 2 more sensors and having the robot move on its own. But before I could even do that I wanted to start working on the maze. The base of the maze was 120 cm wide and 180 cm tall, so if I stood it up it would be roughly the same size as me. I also had to make the walls of the maze, each 20 cm tall, so they could be picked up by the sensors on the robot. I also created temporary walls that could be moved by the users to give more interactivity to the project. This turned out to be a much more time-consuming and painful process than I thought, because I had to use scraps of cardboard and make sure each piece was not only 20 cm tall, but also that the cut on the side of the piece was straight enough to stick to the next piece. After that was done, testing for the autonomous movement was ready to start.

The movement seemed to be alright, but if you watched carefully, in the beginning the paperclip that was used as the 3rd wheel got a bit stuck on the cardboard. At the time of recording that video I didn’t think it would be an issue, but damn was I wrong. After more testing something scary started happening. The paperclip wouldn’t just get stuck a bit and make the robot slow down, it would actually fly right off the robot every time it got stuck. Other problems came up as well, such as the robot constantly resetting in place every time it started without a cable attached, and the third sensor not reading anything. So let’s go through one problem at a time.

The problem with the robot resetting was very easy to debug and fix. If it happened only when the cable was not plugged in, something with the power had to be wrong. At the time I was using four 1.5 V batteries to power the motors of the robot as well as the Arduino, which proved to be insufficient. The fix was to connect a 9 V battery to the motors, leaving the 1.5 V batteries just for the Arduino, which solved the problem. The next problem was the reading of the ultrasonic sensors. From all the wiring I had run out of digital pins and had only one digital and one analog pin left for the sensor. After searching the internet I read that it should be fine since the analog pins can behave as digital ones, but my readings were still just 0. After talking with the professor, who went through the Arduino source code, he discovered that “Indeed there is a conversion table for pin numbers to their internal representations (bits in a port) and the table only includes the digital pins!”. The fix the professor suggested, and which ended up working, was plugging the Echo pin into the analog pin and the Trig pin into the digital one. This helped me move on with the project, and I would like to thank Professor Shiloh one more time for saving me countless hours of debugging!

Back to the movement issue. Because the paperclip kept getting stuck and flying off, I decided to remove it completely and instead use a wheel in the back which would act as support and allow the robot to move over all the bumps in the floor without a problem, or so I thought. After cutting the acrylic and getting the wheel in place I spent 2 hours trying to get the acrylic to stick to the base of the robot, and in the process I superglued my finger to my phone, which was not a fun experience at all! When I managed that, I started the robot up and all seemed fine until I tried it out on the maze. Not only was the robot not detecting the walls, it was also not turning at all. After countless more hours of debugging, here is what happened.

First of all, the ultrasonic sensors are very unreliable, which I found was the reason the robot wasn’t seeing the walls. Sometimes the reading would be 25 inches and would suddenly jump to 250 inches, which was impossible because it would mean the walls were too far apart. This was the main reason I decided to switch from an autonomous robot to one controlled by the user. As for the movement, since the back wheel was made of rubber it created friction with the cardboard and didn’t allow the robot to make a turn. It took me a lot of trial and error to realize the problem and come up with somewhat of a solution. I taped the bottom of the wheel with clear duct tape, which slipped on the cardboard and allowed turning. The problem with this was the slipping itself, as one press of a button would make the robot spin a full 360. I put tape on the bottom of the cardboard to stop that, but it would also sometimes stop a turn midway. In retrospect, I would have saved myself so much trouble if I had just let go of the idea of the cardboard floor!

And so we come to the final design of the robot and the maze.

Areas of improvement

There are definitely some areas that could be worked on to make the robot more functional. The first and most obvious one would be replacing the cardboard floor with something else, perhaps a surface that doesn’t have bumps and wouldn’t create too much friction. Another is removing the back wheel and adding something else in its place, something that allows the robot to spin in place but creates enough friction that it is not just spinning around in circles. I would also add an instructions page in p5, as I realized some users were confused about what to do when they approached the computer. Finally, I would try to find an alternative to the ultrasonic sensors, something much more reliable that would allow autonomous movement of the robot.

Things I am proud of and conclusion

I am honestly proud of the whole project. I think it is a great reflection of the things we have learned during the semester of Intro to IM and a great representation of how far we have come. When I started the course I didn’t even know how to connect wires to make an LED light up, and here I am 14 weeks later making a robot that drives around a maze. Even though the journey to the end product was a bumpy one, I am grateful for everything I have learned in the process, every wire I had to cut and put back 20 times, all the sensors I went through to find the ones that work; all of it taught me valuable lessons and I am excited to start new projects in the future. Thank you for reading through my journey through this class and I hope I will have a chance to write a blog again when I start my next project!

Final Project Documentation

I initiated this project to emulate the card-scanning excitement of Yu-Gi-Oh duel disks, in which tapping cards summons monsters and spells. Users present one or more RFID tags, each representing a cowboy, astronaut or alien, to an MFRC522 reader connected to an Arduino Uno. The system then allocates a five-second selection window before launching one of three interactive mini-games in p5.js: Stampede, Shooter or Cookie Clicker. In Stampede you take the helm of a lone rider hurtling through a hazardous space canyon, dodging bouncing rocks and prickly cacti that can slow you or shove you backwards, all while a herd of cosmic cows closes in on your tail. Shooter throws two players into a tense standoff: each pilot manoeuvres left and right, firing lasers at their opponent and scrambling down shields to block incoming beams until one side breaks. Cookie Clicker is pure, frenzied fun: each participant pounds the mouse on a giant on-screen cookie for ten frantic seconds, racing to rack up the most clicks before time runs out. All visual feedback appears on a browser canvas, and audio loops accompany each game.

 

 

Components

The solution comprises four principal components:

  • RFID Input Module: An MFRC522 reader attached to an Arduino Uno captures four-byte UIDs from standard MIFARE tags.
  • Serial Bridge: The Arduino transmits single-character selection codes (‘6’, ‘7’ or ‘8’) at 9600 baud over USB and awaits simple score-report messages in return.
  • P5.js Front End: A browser sketch employs the WebSerial API to receive selection codes, manage global state and asset loading, display a five-second combo bar beneath each character portrait, and execute the three mini-game modules.
  • Mechanical Enclosure: Laser-cut plywood panels, secured with metal L-brackets, form a cuboid housing; a precision slot allows the 16×2 LCD module to sit flush with the front panel.

Hardware Integration

The MFRC522 reader’s SDA pin connects to Arduino digital pin D10 and its RST pin to D9, while the SPI lines (MOSI, MISO, SCK) share the hardware bus. In firmware, the reader is instantiated via “MFRC522 reader(SS_PIN, RST_PIN);” and a matchUID() routine compares incoming tags against the three predefined UID arrays.

Integrating a standard 16×2 parallel-interface LCD alongside the RFID module proved significantly more troublesome. As soon as “lcd.begin(16, 2)”  was invoked in setup(), RFID reads ceased altogether. Forum guidance indicated that pin conflicts between the LCD’s control lines and the RC522’s SPI signals were the most likely culprit. A systematic pin audit revealed that the LCD’s Enable and Data-4 lines overlapped with the RFID’s SS and MISO pins. I resolved this by remapping the LCD to use digital pins D2–D5 for its data bus and D6–D7 for RS/Enable, updating both the wiring and the constructor call in the Arduino sketch.

P5.js Application and Mini-Games

The browser sketch orchestrates menu navigation, character selection and execution of three distinct game modules within a single programme.

A single “currentState” variable (0–3) governs menu, Stampede, Shooter and Cookie Clicker modes. A five-second “combo” timer begins upon the first tag read, with incremental progress bars drawn beneath each portrait to visualise the window. Once the timer elapses, the sketch evaluates the number of unique tags captured and transitions to the corresponding game state.
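In sketch form, the selection window works roughly like this (the variable names besides currentState are assumptions):

// 0 = menu, 1 = Stampede, 2 = Shooter, 3 = Cookie Clicker
let currentState = 0;
let comboStart = null;    // millis() timestamp of the first tag read
let tagsSeen = new Set(); // unique selection codes seen this window

function onTagCode(code) {          // called when serial delivers '6', '7' or '8'
  if (comboStart === null) comboStart = millis();
  tagsSeen.add(code);
}

function updateCombo() {            // called once per frame from draw()
  if (comboStart !== null && millis() - comboStart > 5000) {
    currentState = tagsSeen.size;   // 1, 2 or 3 unique tags select the game
    comboStart = null;
    tagsSeen.clear();
  }
}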

Merging three standalone games into one sketch turned out to be quite the headache. Each mini-game had its own globals, things like score, stage and bespoke input handlers, which clashed as soon as I tried to switch states. To sort that out, I prefixed every variable with its game name (stampedeScore, sh_p1Score, cc_Players), wrapped them in module-specific functions and kept the global namespace clean.

The draw loop needed a rethink, too. Calling every game’s draw routine in sequence resulted in stray graphics popping up when they shouldn’t. I restructured draw() into a clear state machine: only the active module’s draw function runs each frame. That meant stripping out stray background() calls and rogue translate()s from the individual games so they couldn’t bleed into one another.
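Conceptually, the restructured draw() reduces to a dispatcher like this (the four function names stand in for each module’s real draw routine):

function draw() {
  switch (currentState) {
    case 0: drawMenu(); break;
    case 1: drawStampede(); break;
    case 2: drawShooter(); break;
    case 3: drawCookieClicker(); break;
  }
}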

Finally, unifying input was tricky. I built a single handleInput function that maps RFID codes (‘6’, ‘7’, ‘8’) and key presses to abstract commands (move, shoot, click), then sends them to whichever module is active. A bit of debouncing logic keeps duplicate actions at bay, especially critical during that five-second combo window, so you always get predictable, responsive controls.
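A hedged sketch of that dispatcher; the 150 ms threshold and the per-module handler bodies are assumptions:

let lastActionTime = {}; // last accepted millis() per abstract command

const moduleHandlers = {
  1: (cmd) => { /* Stampede input handling */ },
  2: (cmd) => { /* Shooter input handling */ },
  3: (cmd) => { /* Cookie Clicker input handling */ },
};

function handleInput(raw) {
  const commands = { '6': 'move', '7': 'shoot', '8': 'click' };
  const cmd = commands[raw];
  if (!cmd) return;
  const now = millis();
  if (now - (lastActionTime[cmd] || 0) < 150) return; // suppress duplicate reads
  lastActionTime[cmd] = now;
  const handler = moduleHandlers[currentState];
  if (handler) handler(cmd); // route to whichever module is active
}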

The enclosure is constructed from laser-cut plywood panels, chosen for their sustainability and structural rigidity, and finished internally with a white-gloss plastic backing to evoke a sleek, modern aesthetic. Metal L-brackets fasten each panel at right angles, avoiding bespoke fasteners and allowing straightforward assembly or disassembly. A carefully dimensioned aperture in the front panel accommodates the 16×2 LCD module so that its face sits perfectly flush with the surrounding wood, maintaining clean lines.

Switching between the menu and the individual mini-games initially caused the sketch to freeze on several occasions. Timers from the previous module would keep running, arrays retained stale data and stray transformations lingered on the draw matrix. To address this, I introduced dedicated cleanup routines, resetStampede(), shCleanup() and ccCleanup(), that execute just before currentState changes. Each routine clears its game’s specific variables, halts any looping audio and calls resetMatrix() (alongside any required style resets) so that the next module starts with a pristine canvas.
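Put together, the transition guard might look like this (resetMatrix() is a real p5 call; the cleanup names come from above):

function changeState(next) {
  if (currentState === 1) resetStampede();
  else if (currentState === 2) shCleanup();
  else if (currentState === 3) ccCleanup();
  resetMatrix();       // drop any transforms left on the draw matrix
  currentState = next;
}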

Audio behaviour also demanded careful attention. In early versions, rapidly switching from one game state to another led to multiple tracks playing at once or to music cutting out abruptly, leaving awkward silences. I resolved these issues by centralising all sound control within a single audio manager. Instead of scattering stop() and loop() calls throughout each game’s code, the manager intercepts state changes and victory conditions, fading out the current track and then initiating the next one in a controlled sequence. The result is seamless musical transitions that match the user’s actions without clipping or overlap.
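In code, the manager’s core can be as small as this (a sketch assuming tracks loaded with p5.sound’s loadSound(); the fade times are illustrative):

let currentTrack = null;

function switchTrack(next) {
  if (currentTrack === next) return;  // already playing the right music
  if (currentTrack && currentTrack.isPlaying()) {
    currentTrack.setVolume(0, 0.5);   // fade out over half a second
    currentTrack.stop(0.5);           // stop once the fade completes
  }
  next.setVolume(1, 0.5);             // fade the new track in
  next.loop();
  currentTrack = next;
}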

The enclosure underwent its own process of refinement. My first plywood panels, cut on a temperamental laser cutter, frequently misaligned: the slot for the LCD would be too tight to insert the module, or so loose that it rattled. After three iterative cuts, I tweaked the slot width, adjusted the alignment tabs and introduced a white-gloss plastic backing. This backing not only conceals the raw wood edges but also delivers a polished, Apple-inspired look. Ultimately, the panels now fit together snugly around the LCD and each other, creating a tool-free assembly that upholds the project’s premium aesthetic.

Future Plans

Looking ahead, the system lends itself readily to further enhancement through the addition of new mini-games. For instance, a puzzle challenge or a rhythm-based experience could leverage the existing state framework; each new module would simply plug into the central logic, reusing the asset-loading and input-dispatch infrastructure already in place.

Beyond additional games, implementing networked multiplayer via WebSockets or a library such as socket.io would open the possibility of remote matches and real-time score sharing, transforming the project from a local-only tabletop experience into an online arena. Finally, adapting the interface for touch input would enable smooth operation on tablets and smartphones, extending the user base well beyond desktop browsers.

Conclusion

Working on this tabletop arcade prototype has been both challenging and immensely rewarding. I navigated everything from the quirks of RFID timing and serial communications to the intricacies of merging three distinct games into a single p5.js sketch, all while refining the plywood enclosure for a polished finish. Throughout the “Introduction to Interactive Media” course, I found each obstacle, whether in hardware, code or design, to be an opportunity to learn and to apply creative problem-solving. I thoroughly enjoyed the collaborative atmosphere and the chance to experiment across disciplines; I now leave the class not only with a functional prototype but with a genuine enthusiasm for future interactive projects.

 

Final Project Documentation

Concept / Description

My project was inspired by the simple robots / AI of the 90s and early 2000s (like Tamagotchi pets) made to just be fun toys for kids. In our current age, we’re so used to advanced AIs that can complete complex thoughts, but I wanted to inspire a sense of nostalgia and comfort with this robot. It also serves as an anchor point to see how far we’ve come in the past two decades as more “intelligent” AI develops. The main interaction and premise of this robot are centered around its hunger and feeding it. It starts off neutral, but as you feed the robot, it gets happier. However, if you overfeed it, it’ll get nauseous. If you don’t feed it at all, over time it’ll get incredibly sad. You need to watch out for its needs and make sure it’s in a Goldilocks state of happiness and being well-fed. The robot loves attention, so if you hold its hand, it’ll also get happy regardless of its hunger levels. However, if you hold its hand with too much force, it’ll feel pain and get sad.

The music and sounds from the p5 sketch use 8-bit audio to tie in the retro feel of the robot. The limited pixels and display of the LCD screen also give a sense of limited technology to take you back a few decades.

Video Demonstration:

Cleaner version: https://drive.google.com/file/d/15zkLTwSH97eqe1FHWSYq188_5F6aHUkX/view?usp=sharing

Messier version:

https://drive.google.com/file/d/1rzX4EbBVYXzRDgda-7Dk08BkqQ0m9Qx8/view?usp=sharing

Media (photos)

Implementation

Link to sketch: https://editor.p5js.org/bobbybobbb/full/yeMCC3H4B

p5 and Arduino communicate by sending each other values such as the robot’s emotional state and the FSR readings. p5 controls the emotion value (each number represents a different emotion) and sends it to Arduino so that the LCD screen displays the correct facial expression and the LED lights show the corresponding colors. The emotional state also controls the servo motors that act as the legs. The force-sensitive resistor values get sent to p5 to control sadness and happiness, since the FSRs act as hands being held. Interactions also correspond with specific sounds, which I’m particularly proud of, as it adds a lot more atmosphere to the experience. For example, holding hands triggers a specific sound, holding the hands too hard triggers another, feeding the robot another, the hunger bar going down another, and feeding on a full stomach yet another.
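The exchange can be sketched like this on the p5 side (the thresholds, names, and the injected writeLine function are assumptions, not my actual code):

// 0 = neutral, 1 = happy, 2 = sad, 3 = nauseous
let emotion = 0;

function onFsrReading(fsr) {        // value parsed from the serial stream
  if (fsr > 800) emotion = 2;       // squeezed too hard: pain, so sad
  else if (fsr > 200) emotion = 1;  // gentle hand-holding: happy
}

function sendEmotion(writeLine) {   // writeLine sends one line over serial
  writeLine(String(emotion));       // Arduino maps this to a face + LED color
}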

Once I had all my functionality implemented, the code and the circuit, I moved on to beautifying the robot by building a casing for it. The wires and circuit make it hard to build a simple box around the robot, so I had to do a lot of paper prototyping at first to get the shape and dimensions of the casing. By using paper, I could easily cut and paste pieces together to fit around the robot. Even if I made mistakes, the adaptability of paper made them simple to fix. Once I found the right dimensions, I created Illustrator files to laser-cut pieces of acrylic. From there, I needed to drill the sides together to create a 3-dimensional box shape.

Early prototype:

Video of early prototype (Had to make sure all the functionality worked before the visuals came in):

https://drive.google.com/file/d/1RJzqBWGN9Tan1qQ-CqXS2n1jlQ580AKP/view?usp=sharing

User Testing

When user testing, peers commented on the user interface of the p5 sketch and mentioned how it’d be nice if the sketch matched the physical body of the robot better. They also mentioned the awkward holding of the robot (before it was encased). I was at a loss for how to build the casing of the body, so I asked some of my peers who are more experienced with these kinds of things for suggestions. I ended up using L-shaped brackets to help make the box and laser cutting it out of acrylic under the advice of Sumeed and David, and with the help of IM lab assistants.

Difficulties

Communication between p5 and Arduino was difficult to implement because my computer crashed at some point from the code. I wasn’t sure what I did wrong, so I referred to the example from class, replicated it and changed some values to test out simple functionality at first. Once I made sure Arduino and p5 were communicating in real time, I started building my project from there.

Most of my difficulties came from hardware and building the physical robot, since I’m less familiar with hardware than software. For example, I wanted the FSRs to resemble hands poking out of the robot, but upon taping one down, I realized that where you tape the FSR affects the sensor readings. There’s also very limited room on the base plate I’m using to hold the Arduino and breadboard for all the wiring involved. I wanted everything to be contained in a neat box, but the Neopixel wires stick out quite a bit. I ended up just making a bigger box to counteract this.

Using Neopixels was a huge part of my project and a must. To use them, I needed to solder wires to the Neopixels, which took a really long time because instead of soldering into a hole, I was soldering to a flat surface and had to make sure the wires stuck to that flat copper pad. Sometimes the wires would fall off, or it’d just be really difficult to get the wire to stick to the solder on the copper surface. After soldering came the software; I tested using Adafruit’s strandtest example, but it didn’t produce the correct outcome even though the lights turned on perfectly. They weren’t displaying the right colors. Mind you, I randomly took these from the IM lab, so I had no idea what type of Neopixels they were. It simply came down to testing the settings for the different Neopixel types that exist (they differ in things like color byte order) until I hit the right one.

The LCD screen is also technically upside-down on the robot body because that was the only way to leave maximum room on the breadboard for wires. Since I had no other option but to mount the screen upside down, I had to draw and display all the pixels and bytes upside down. This required a lot of coordination and rewiring my brain’s perspective.
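Rotating a 5×8 LCD glyph 180° boils down to reversing the row order and mirroring the five bits within each row. Here is a small illustration of that transform (written in JavaScript for convenience; the actual byte arrays live in the Arduino sketch, and this helper name is made up):

// rows: 8 numbers, each using only the low 5 bits (one LCD glyph)
function flipGlyph(rows) {
  return rows.slice().reverse().map((row) => { // reverse rows = flip vertically
    let mirrored = 0;
    for (let b = 0; b < 5; b++) {
      if (row & (1 << b)) mirrored |= 1 << (4 - b); // mirror bits = flip horizontally
    }
    return mirrored;
  });
}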

Future Improvements

In the future, I want to use an alternate source of power for the servo motors and Neopixels because every time the servo motors run, the LCD screen blinks and is less bright because the motors take up a lot of power. Every time the Neopixels switch from one color to another, the LCD screen is also affected. I think hooking up a battery to the circuit would solve this problem. In the future, I think more sensors and ways to interact with the robot would also be nice.

Progress Report

Current Progress

The p5.js environment has been successfully designed; I went from a 2D version to a 3D avatar rendered in WEBGL. The system also includes a custom font and a wooden background platform for visual warmth. A floating instruction frame appears at the beginning of the interaction, prompting users to “press to start.”

The Arduino hardware components (photoresistor, DIY capacitive touch sensor, LEDs, and buzzer) are currently being tested. I am actively working on matching sensor input with the avatar’s behavior (e.g., facial expression, sound).

Video 

What’s Being Tested

    • Touch Sensor + LEDs → Plant’s mood environment (happy, sad)

    • Touch Input → Start and Display instructions

    • Avatar Design → Body, leaf animation, emotional face drawn in p5.js

    • Instructions Interface → Initial user onboarding screen

Pending Tasks

    • Finalizing the integration of the Arduino circuit into the physical plant (soldering and arranging).

    • Smoothing the interaction between sensor readings and p5.js visual/audio feedback.

    • Conducting user tests to assess how people engage with the plant-avatar system.

      Avatar Demo

Week 11- Serial Communication

1. Concept

There are 3 parts to the project:

(1) Light Dependent Resistor (LDR) readings are sent from Arduino to p5js. The ellipse in p5js moves on the horizontal axis in the middle of the screen depending on the LDR readings. Nothing on arduino is controlled by p5js.
(2) Control the LED brightness from p5js using mouseX position. The more right the mouse position, the higher the LED brightness.
(3) Taking the gravity wind example (https://editor.p5js.org/aaronsherwood/sketches/I7iQrNCul), every time the ball bounces one led lights up and then turns off, and you can control the wind from the potentiometer (analog sensor).
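A minimal p5js sketch for part (1), assuming sensorValue holds the most recent LDR reading (0 to 1023) parsed from serial elsewhere in the code:

let sensorValue = 0; // updated by the serial-read callback

function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(230);
  let x = map(sensorValue, 0, 1023, 0, width); // brighter surroundings push the circle right
  ellipse(x, height / 2, 40, 40);
}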

2. Highlights: Challenges and Overcoming Them

(1) The circle position was not changing in response to the brightness of the surroundings. We tackled this problem by checking the serial monitor to see whether LDR readings were being read at all. After confirming the LDR’s functionality, we closed the serial monitor, went back to p5js, and made sure we were using the right serial port. However, the circle position was still not changing. With help from the Professor, we streamlined our code to include only what seemed necessary. This worked!

 

(2) The LED was flickering and we did not know why. Alisa thought the delay(50) should come after analogWrite(ledPin, brightness) instead of before it. However, that did not solve the problem. Samuel thought to remove the delay(50) entirely. It still did not work. We then decided to map the mouseX position (ranging from 0 to 600) to the range 0 to 255 in Arduino instead of p5js. This worked!

 

(3) For the 3rd assignment, I worked alongside many different people: Alisa, Samuel, Haris, and even Prof Aya’s slides. I was having trouble at every stage of the hardware aspect, from Safari stopping the code from interacting with the Arduino to serial issues with the laptop. I was able to piece together work until I finally had it working in the end. The coding aspect was simple overall, as the base code only needed minor amendments to take inputs from the Arduino, and the LED had to complete an action based on the running programme.

3. Video

Part 1:

Part 2:

Part 3:

4. Reflection and Future Areas of Improvement

Our challenges emphasized the need to understand the code that allows communication between p5js and Arduino. Without understanding it, it is difficult to make the right removals or changes to achieve the results we are hoping for.

It would be great to find out why the LED did not flicker when the mapping was performed in Arduino rather than p5js. Is this related to PWM?