Final Project Documentation

Concept

I thought long and hard about a final project that would be a great finale to everything we have learned during the semester, and I decided I wanted to make a robot. Robot is a broad term, so I had to decide on a purpose for mine, and since I wanted to create something innovative, fun and creative, I settled on a maze-solving robot. The initial plan was to have the robot go through the maze on its own while the user just set up the maze, but we will get to why it didn't work out like that later. Instead of the robot solving the maze on its own, it's now the user who is in control, trying to go through it "blind" using the ultrasonic sensors as guides. The user controlling the robot cannot see the maze and solves it based on the sensors alone, while a friend rearranges the maze between sessions to keep the game fun and interesting.

Video of user interaction with the project
Arduino code
const int AIN1 = 13;
const int AIN2 = 12;
const int PWMA = 11;

const int PWMB = 10;
const int BIN2 = 9;
const int BIN1 = 8;

const int trigPinFront = 6;
const int echoPinFront = 5;

const int trigPinLeft = A0;
const int echoPinLeft = 2;

const int trigPinRight = 4;
const int echoPinRight = 3;

unsigned long lastEchoTime = 0;
const unsigned long echoInterval = 300;

void setup() {
  Serial.begin(9600);

  pinMode(AIN1, OUTPUT); 
  pinMode(AIN2, OUTPUT); 
  pinMode(PWMA, OUTPUT);
  pinMode(BIN1, OUTPUT); 
  pinMode(BIN2, OUTPUT); 
  pinMode(PWMB, OUTPUT);
  pinMode(trigPinFront, OUTPUT); 
  pinMode(echoPinFront, INPUT);
  pinMode(trigPinLeft, OUTPUT); 
  pinMode(echoPinLeft, INPUT);
  pinMode(trigPinRight, OUTPUT); 
  pinMode(echoPinRight, INPUT);

  Serial.println("READY");
}

void loop() {
  if (Serial.available()) {
    char command = Serial.read();

    // Respond to commands that move the robot
    switch (command) {
      case 'F':
        leftMotor(50); rightMotor(-50);
        delay(1000);
        leftMotor(0); rightMotor(0);
        break;
      case 'B':
        leftMotor(-50); rightMotor(50);
        delay(1000);
        leftMotor(0); rightMotor(0);
        break;
      case 'L':
        leftMotor(200); rightMotor(200);
        delay(300);
        leftMotor(200); rightMotor(200);
        delay(300);
        leftMotor(0); rightMotor(0);
        break;
      case 'R':
        leftMotor(-200); rightMotor(-200);
        delay(300);
        leftMotor(-200); rightMotor(-200);
        delay(300);
        leftMotor(0); rightMotor(0);
        break;
      case 'S':
        leftMotor(0); rightMotor(0);
        break;
    }
  }

  // Send distance data over serial
  unsigned long currentTime = millis();
  if (currentTime - lastEchoTime > echoInterval) {
    float front = getDistance(trigPinFront, echoPinFront);
    float left = getDistance(trigPinLeft, echoPinLeft);
    float right = getDistance(trigPinRight, echoPinRight);

    Serial.print("ECHO,F,"); Serial.println(front);
    Serial.print("ECHO,L,"); Serial.println(left);
    Serial.print("ECHO,R,"); Serial.println(right);

    lastEchoTime = currentTime;
  }
}

// Logic for controlling the right and left motors
void rightMotor(int motorSpeed) {
  if (motorSpeed > 0) {
    digitalWrite(AIN1, HIGH);
    digitalWrite(AIN2, LOW);
  } else if (motorSpeed < 0) {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, HIGH);
  } else {
    digitalWrite(AIN1, LOW);
    digitalWrite(AIN2, LOW);
  }
  analogWrite(PWMA, abs(motorSpeed));
}

void leftMotor(int motorSpeed) {
  if (motorSpeed > 0) {
    digitalWrite(BIN1, HIGH);
    digitalWrite(BIN2, LOW);
  } else if (motorSpeed < 0) {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, HIGH);
  } else {
    digitalWrite(BIN1, LOW);
    digitalWrite(BIN2, LOW);
  }
  analogWrite(PWMB, abs(motorSpeed));
}

//Logic for measuring distance
float getDistance(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  long duration = pulseIn(echoPin, HIGH);
  // Convert the round-trip echo time (microseconds) to inches (~148 us per inch)
  float distance = duration / 148.0;
  return distance;
}

The Arduino's main purpose is to handle motor movement and to read the distance sensors and send their data to p5. For movement, it takes the commands the user enters in p5 by pressing keys on the keyboard and translates them into motor movement that mirrors the movement on screen. The readings from the three ultrasonic sensors are picked up by the Arduino and written to the serial port to be read by p5.

The p5 code takes the echo values that the Arduino sends and uses that data to draw the "echo lines" which the user relies on to "see" the maze, with the walls becoming visible every now and then when in range. p5 also takes user input and sends it to the Arduino, which translates it into movement, and it contains the code that serves as the main connection between the Arduino and p5.
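On the p5 side, each incoming serial line has to be split back into a sensor label and a distance before an echo line can be drawn. A minimal sketch of that parsing, with a hypothetical helper name (the actual sketch may structure this differently):

```javascript
// Hypothetical helper: turn one serial line such as "ECHO,F,12.50"
// into a sensor label and a distance in inches. Lines that don't
// match the ECHO format (e.g. "READY") return null and are skipped.
function parseEchoLine(line) {
  const parts = line.trim().split(",");
  if (parts.length !== 3 || parts[0] !== "ECHO") return null;
  const distance = parseFloat(parts[2]);
  if (Number.isNaN(distance)) return null;
  return { sensor: parts[1], distance }; // sensor is "F", "L" or "R"
}
```

The draw loop can then decide which of the front, left or right echo lines to update based on the sensor field.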

Here is the schematic of the circuit. One of the most challenging parts of this project was connecting all the wires and making sure they wouldn’t disconnect during transportation and during the showcase. I kept all the wires as far apart from each other as possible and made sure everything that could move them is nicely secured to the plate.

The making of the project

Making this project was a journey. What seemed to be a straightforward project turned out to be a two-week process of trial and error before I got the final result.

As I mentioned above, the initial idea was to have the robot go through the maze on its own and have the user just set up the maze.

This is one of the photos from the early stages of the robot's development. As you can see, it looks much different from the final product. At this stage I was focusing on just getting the movement working and some readings from the sensors.

After I managed to get the movement working with the cable attached, sending commands through my laptop, I was ready to move on to the next phase: adding two more sensors and having the robot move on its own. But before that, I wanted to start working on the maze. The base of the maze was 120cm wide and 180cm tall, so stood upright it would be roughly my size. I also had to make the walls of the maze, each 20cm tall so they could be picked up by the sensors on the robot, plus temporary walls that users could move around to make the project more interactive. This turned out to be a much more time-consuming and painful process than I expected, because I had to use scraps of cardboard and make sure each piece was not only 20cm tall but also cut straight enough on the side to stick to the next piece. After that was done, testing the autonomous movement could begin.

The movement seemed to be alright, but if you watched carefully, at the beginning the paperclip that served as the third wheel got a bit stuck on the cardboard. At the time of recording this video I didn't think that would be an issue, but damn was I wrong. With more testing, something scary started happening: the paperclip wouldn't just get stuck and slow the robot down, it would actually fly right off the robot every time it caught. Other problems came up too, such as the robot constantly resetting in place whenever it started without a cable attached, and the third sensor not reading anything. So let's go through one problem at a time.

The resetting problem was very easy to debug and fix. If something like that happens only when the cable is unplugged, the power supply is the likely culprit. At the time I was using four 1.5V batteries to power both the motors and the Arduino, which proved insufficient. The fix was to connect a 9V battery to the motors, leaving the 1.5V batteries just for the Arduino, and that solved it. The next problem was the readings from the ultrasonic sensors. With all the wiring I had run out of digital pins and had only one digital and one analog pin left for the third sensor. After searching the internet I read that it should be fine, since the analog pins can behave as digital ones, but my readings were still just 0. After talking with the professor, who went through the Arduino source code, he discovered that "Indeed there is a conversion table for pin numbers to their internal representations (bits in a port) and the table only includes the digital pins!". The fix he suggested, and which ended up working, was plugging the Echo pin into the analog pin and the Trig into the digital one. This let me move on with the project, and I would like to thank Professor Shiloh one more time for saving me countless hours of debugging!

Back to the movement issue. Because the paperclip kept getting stuck and flying off, I decided to remove it completely and instead use a wheel at the back to act as support and let the robot roll over all the bumps in the floor without a problem, or so I thought. After cutting the acrylic and getting the wheel in place, I spent two hours trying to get the acrylic to stick to the base of the robot, and in the process I superglued my finger to my phone, which was not a fun experience at all! When I managed that, I started the robot up and all seemed fine, until I tried it out on the maze. Not only was the robot not detecting the walls, it was also not turning at all. After countless more hours of debugging, here is what happened.

First of all, the ultrasonic sensors are very unreliable, which turned out to be the reason the robot wasn't seeing the walls. Sometimes the reading would be 25 inches and then suddenly jump to 250 inches, which was impossible, since it would mean the walls were that far apart. This was the main reason I decided to switch from an autonomous robot to one controlled by the user. As for the movement, since the back wheel was made of rubber, it created friction with the cardboard and didn't let the robot make a turn. It took me a lot of trial and error to identify the problem and come up with somewhat of a solution. I taped the bottom of the wheel with clear duct tape, which slipped on the cardboard and allowed turning. The problem was that same slipping: one press of a button would make the robot spin a full 360. I added tape to the bottom of the cardboard to stop that, but it would also sometimes stop a turn midway. In retrospect, I would have saved myself so much trouble if I had just let go of the idea of the cardboard floor!

And so we come to the final design of the robot and the maze.

Areas of improvement

There are definitely some areas that could be improved to make the robot more functional. The first and most obvious one would be replacing the cardboard floor with something else, perhaps a surface without bumps that doesn't create too much friction. Another is removing the back wheel and adding something in its place that lets the robot spin in place but creates enough friction that it doesn't just spin in circles. I would also add an instructions page in p5, as I realized some users were confused about what to do when they approached the computer. Finally, I would look for a more reliable alternative to the ultrasonic sensors, which would make autonomous movement of the robot possible.

Things I am proud of and conclusion

I am honestly proud of the whole project. I think it is a great reflection of the things we have learned during the semester of Intro to IM and a great representation of how far we have come. When I started the course I didn't even know how to connect wires to make an LED light up, and here I am 14 weeks later making a robot that drives around a maze. Even though the journey to the end product was a bumpy one, I am grateful for everything I learned in the process: every wire I had to cut and reconnect 20 times and all the sensors I went through to find ones that work taught me valuable lessons, and I am excited to start new projects in the future. Thank you for reading about my journey through this class, and I hope I will have a chance to write a blog again when I start my next project!

Final Project Documentation

I initiated this project to emulate the card-scanning excitement of Yu-Gi-Oh duel disks, in which tapping cards summons monsters and spells. Users present one or more RFID tags, each representing a cowboy, astronaut or alien, to an MFRC522 reader connected to an Arduino Uno. The system then allocates a five-second selection window before launching one of three interactive mini-games in p5.js: Stampede, Shooter or Cookie Clicker. In Stampede you take the helm of a lone rider hurtling through a hazardous space canyon, dodging bouncing rocks and prickly cacti that can slow you down or shove you backwards, all while a herd of cosmic cows closes in on your tail. Shooter throws two players into a tense standoff: each pilot manoeuvres left and right, firing lasers at their opponent and raising shields to block incoming beams until one side breaks. Cookie Clicker is pure, frenzied fun: each participant pounds the mouse on a giant on-screen cookie for ten frantic seconds, racing to rack up the most clicks before time runs out. All visual feedback appears on a browser canvas, and audio loops accompany each game.

 

 

Components

The solution comprises four principal components:

  • RFID Input Module: An MFRC522 reader attached to an Arduino Uno captures four-byte UIDs from standard MIFARE tags.
  • Serial Bridge: The Arduino transmits single-character selection codes (‘6’, ‘7’ or ‘8’) at 9600 baud over USB and awaits simple score-report messages in return.
  • P5.js Front End: A browser sketch employs the WebSerial API to receive selection codes, manage global state and asset loading, display a five-second combo bar beneath each character portrait, and execute the three mini-game modules.
  • Mechanical Enclosure: Laser-cut plywood panels, secured with metal L-brackets, form a cuboid housing; a precision slot allows the 16×2 LCD module to sit flush with the front panel.

Hardware Integration

The MFRC522 reader’s SDA pin connects to Arduino digital pin D10 and its RST pin to D9, while the SPI lines (MOSI, MISO, SCK) share the hardware bus. In firmware, the reader is instantiated via “MFRC522 reader(SS_PIN, RST_PIN);” and a matchUID() routine compares incoming tags against the three predefined UID arrays.
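The firmware itself is C++, but the comparison inside matchUID() boils down to byte-for-byte equality against each stored UID. A sketch of that logic in JavaScript, with made-up UIDs standing in for the real tags:

```javascript
// Made-up four-byte UIDs; the real values come from the actual tags.
const KNOWN_UIDS = {
  cowboy:    [0xDE, 0xAD, 0xBE, 0xEF],
  astronaut: [0x12, 0x34, 0x56, 0x78],
  alien:     [0xAA, 0xBB, 0xCC, 0xDD],
};

// Return the character whose stored UID matches byte-for-byte, or null.
function matchUID(uid) {
  for (const [name, known] of Object.entries(KNOWN_UIDS)) {
    if (uid.length === known.length && uid.every((b, i) => b === known[i])) {
      return name;
    }
  }
  return null;
}
```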

Integrating a standard 16×2 parallel-interface LCD alongside the RFID module proved significantly more troublesome. As soon as “lcd.begin(16, 2)” was invoked in setup(), RFID reads ceased altogether. Forum guidance indicated that pin conflicts between the LCD’s control lines and the RC522’s SPI signals were the most likely culprit. A systematic pin audit revealed that the LCD’s Enable and Data-4 lines overlapped with the RFID’s SS and MISO pins. I resolved this by remapping the LCD to use digital pins D2–D5 for its data bus and D6–D7 for RS/Enable, updating both the wiring and the constructor call in the Arduino sketch.

P5.js Application and Mini-Games

The browser sketch orchestrates menu navigation, character selection and execution of three distinct game modules within a single programme.

A single “currentState” variable (0–3) governs menu, Stampede, Shooter and Cookie Clicker modes. A five-second “combo” timer begins upon the first tag read, with incremental progress bars drawn beneath each portrait to visualise the window. Once the timer elapses, the sketch evaluates the number of unique tags captured and transitions to the corresponding game state.
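The evaluation at the end of the window can be sketched as a small pure function; the exact mapping from unique-tag count to game state is my assumption here:

```javascript
// Hypothetical sketch: map the unique selection codes seen during the
// five-second combo window to a currentState value
// (1 = Stampede, 2 = Shooter, 3 = Cookie Clicker, 0 = stay on the menu).
function evaluateComboWindow(codesSeen) {
  const uniqueTags = new Set(codesSeen);
  if (uniqueTags.size >= 3) return 3;
  if (uniqueTags.size === 2) return 2;
  if (uniqueTags.size === 1) return 1;
  return 0;
}
```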

Merging three standalone games into one sketch turned out to be quite the headache. Each mini-game had its own globals (things like score, stage and bespoke input handlers) which clashed as soon as I tried to switch states. To sort that out, I prefixed every variable with its game name (stampedeScore, sh_p1Score, cc_Players), wrapped them in module-specific functions and kept the global namespace clean.

The draw loop needed a rethink, too. Calling every game’s draw routine in sequence resulted in stray graphics popping up when they shouldn’t. I restructured draw() into a clear state machine: only the active module’s draw function runs each frame. That meant stripping out stray background() calls and rogue translate()s from the individual games so they couldn’t bleed into one another.

Finally, unifying input was tricky. I built a single handleInput function that maps RFID codes (‘6’, ‘7’, ‘8’) and key presses to abstract commands (move, shoot, click), then sends them to whichever module is active. A bit of debouncing logic keeps duplicate actions at bay, which is especially critical during that five-second combo window, so you always get predictable, responsive controls.
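The debouncing can be distilled into a small time-window filter. This is a sketch under the assumption that a repeated command arriving within a short window should be dropped; the names and the window length are illustrative:

```javascript
// Create a filter that drops a command when the same command arrived
// less than windowMs earlier; different commands always pass through.
function makeDebouncer(windowMs) {
  let lastCommand = null;
  let lastTime = -Infinity;
  return function accept(command, nowMs) {
    if (command === lastCommand && nowMs - lastTime < windowMs) {
      return false; // duplicate inside the window: ignore it
    }
    lastCommand = command;
    lastTime = nowMs;
    return true;
  };
}
```

handleInput would call accept(code, millis()) and only dispatch the command when it returns true.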

The enclosure is constructed from laser-cut plywood panels, chosen for both their sustainability and structural rigidity, and finished internally with a white-gloss plastic backing to evoke a sleek, modern aesthetic. Metal L-brackets fasten each panel at right angles, avoiding bespoke fasteners and allowing for straightforward assembly or disassembly. A carefully dimensioned aperture in the front panel accommodates the 16×2 LCD module so that its face sits perfectly flush with the surrounding wood, maintaining clean lines.

Switching between the menu and the individual mini-games initially caused the sketch to freeze on several occasions. Timers from the previous module would keep running, arrays retained stale data and stray transformations lingered on the draw matrix. To address this, I introduced dedicated cleanup routines (resetStampede(), shCleanup() and ccCleanup()) that execute just before currentState changes. Each routine clears its game’s specific variables, halts any looping audio and calls resetMatrix() (alongside any required style resets) so that the next module starts with a pristine canvas.
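The pattern of running the outgoing module's cleanup before flipping currentState can be sketched like this, with stubs standing in for the real routines:

```javascript
// Stubs standing in for the real cleanup routines; here they just
// record that they ran so the pattern is visible.
const ranCleanups = [];
function resetStampede() { ranCleanups.push("stampede"); }
function shCleanup()     { ranCleanups.push("shooter"); }
function ccCleanup()     { ranCleanups.push("cookie"); }

const cleanups = { 1: resetStampede, 2: shCleanup, 3: ccCleanup };

let currentState = 0; // 0 = menu

// Run the outgoing state's cleanup, then switch states.
function changeState(nextState) {
  const cleanup = cleanups[currentState];
  if (cleanup) cleanup(); // leaving the menu needs no cleanup
  currentState = nextState;
}
```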

Audio behaviour also demanded careful attention. In early versions, rapidly switching from one game state to another led to multiple tracks playing at once or to music cutting out abruptly, leaving awkward silences. I resolved these issues by centralising all sound control within a single audio manager. Instead of scattering stop() and loop() calls throughout each game’s code, the manager intercepts state changes and victory conditions, fading out the current track and then initiating the next one in a controlled sequence. The result is seamless musical transitions that match the user’s actions without clipping or overlap.

The enclosure underwent its own process of refinement. My first plywood panels, cut on a temperamental laser cutter, frequently misaligned: the slot for the LCD would be too tight to insert the module or so loose that it rattled. After three iterative cuts, I tweaked the slot width, adjusted the alignment tabs and introduced a white-gloss plastic backing. This backing not only conceals the raw wood edges but also delivers a polished, Apple-inspired look. Ultimately, the panels now fit together snugly around the LCD and each other, creating a tool-free assembly that upholds the project’s premium aesthetic.

Future Plans

Looking ahead, the system lends itself readily to further enhancement through the addition of new mini-games. For instance, a puzzle challenge or a rhythm-based experience could leverage the existing state framework; each new module would simply plug into the central logic, reusing the asset-loading and input-dispatch infrastructure already in place.

Beyond additional games, implementing networked multiplayer via WebSockets or a library such as socket.io would open the possibility of remote matches and real-time score sharing, transforming the project from a local-only tabletop experience into an online arena. Finally, adapting the interface for touch input would enable smooth operation on tablets and smartphones, extending the user base well beyond desktop browsers.

Conclusion

Working on this tabletop arcade prototype has been both challenging and immensely rewarding. I navigated everything from the quirks of RFID timing and serial communications to the intricacies of merging three distinct games into a single p5.js sketch, all while refining the plywood enclosure for a polished finish. Throughout the “Introduction to Interactive Media” course, I found each obstacle, whether in hardware, code or design, to be an opportunity to learn and to apply creative problem-solving. I thoroughly enjoyed the collaborative atmosphere and the chance to experiment across disciplines; I now leave the class not only with a functional prototype but with a genuine enthusiasm for future interactive projects.

 

Final Project Documentation

Concept / Description

My project was inspired by the simple robots / AI of the 90s and early 2000s (like Tamagotchi pets) that were made to just be fun toys for kids. In our current age we’re so used to advanced AIs that can complete complex thoughts, but I wanted this robot to inspire a sense of nostalgia and comfort. It also serves as an anchor point to see how far we’ve come in the past two decades as more “intelligent” AI develops. The main interaction and premise of this robot center around its hunger and feeding it. It starts off neutral, but as you feed the robot, it gets happier. However, if you overfeed it, it’ll get nauseous. If you don’t feed it at all, over time it’ll get incredibly sad. You need to watch out for its needs and make sure it’s in a Goldilocks state of happiness and being well-fed. The robot loves attention, so if you hold its hand, it’ll also get happy regardless of its hunger levels. However, if you hold its hand with too much force, it’ll feel pain and get sad.

The music and sounds from the p5 sketch use 8-bit audio to tie in the retro feel of the robot. The limited pixels of the LCD display also give a sense of limited technology, taking you back a few decades.

Video Demonstration:

Cleaner version: https://drive.google.com/file/d/15zkLTwSH97eqe1FHWSYq188_5F6aHUkX/view?usp=sharing

Messier version:

https://drive.google.com/file/d/1rzX4EbBVYXzRDgda-7Dk08BkqQ0m9Qx8/view?usp=sharing

Media (photos)

Implementation

Link to sketch: https://editor.p5js.org/bobbybobbb/full/yeMCC3H4B

p5 and Arduino communicate with each other by exchanging values such as the emotional state of the robot and the FSR readings. p5 controls the emotion value (each number represents a different emotion) and sends it to the Arduino so that the LCD screen displays the correct facial expression and the LED lights show the corresponding colors. The emotional state also controls the servo motors that act as the legs. The force-sensitive resistor values get sent to p5 to control sadness and happiness, since they act as hands being held. Interactions also correspond with specific sounds, which I’m particularly proud of, as it adds a lot more atmosphere to the experience. For example, holding hands triggers one sound, squeezing the hands too hard triggers another, feeding the robot triggers another, the hunger bar going down triggers one, and feeding on a full stomach triggers yet another.
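As a rough illustration of the hand-holding logic, the FSR reading can be bucketed into an emotion code before it is sent to the Arduino. The thresholds and code numbers below are made up for the example; the real sketch's values may differ:

```javascript
// Hypothetical emotion codes and FSR thresholds (illustrative only).
const EMOTIONS = { NEUTRAL: 0, HAPPY: 1, SAD: 2 };

function emotionFromFSR(fsrValue) {
  if (fsrValue > 800) return EMOTIONS.SAD;   // squeezed too hard: pain
  if (fsrValue > 200) return EMOTIONS.HAPPY; // gentle hand-holding
  return EMOTIONS.NEUTRAL;                   // not being held
}
```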

Once I had all my functionality implemented, the code and the circuit, I moved on to beautifying the robot by building a casing for it. The wires and circuit make it hard to build a simple box around the robot, so I had to do a lot of paper prototyping at first to get the shape and dimensions of the casing. By using paper, I could easily cut and paste pieces together to fit around the robot, and even if I made mistakes, the adaptability of paper made them simple to fix. Once I found the right dimensions, I created Illustrator files to laser-cut pieces of acrylic. From there, I needed to drill the sides together to create a 3-dimensional box shape.

Early prototype:

Video of early prototype (Had to make sure all the functionality worked before the visuals came in):

https://drive.google.com/file/d/1RJzqBWGN9Tan1qQ-CqXS2n1jlQ580AKP/view?usp=sharing

User Testing

When user testing, peers commented on the user interface of the p5 sketch and mentioned how it’d be nice if the sketch matched the physical body of the robot better. They also mentioned the awkward holding of the robot (before it was encased). I was at a loss for how to build the casing of the body, so I asked some of my peers who are more experienced with these kinds of things for suggestions. I ended up using L-shaped brackets to help make the box and laser cutting it out of acrylic, under the advice of Sumeed and David and with the help of IM lab assistants.

Difficulties

Communication between p5 and Arduino was difficult to implement because my computer crashed at some point from the code. I wasn’t sure what I did wrong, so I referred to the example from class, replicated it and changed some values to test out simple functionality at first. Once I made sure Arduino and p5 were communicating in real time, I started building my project from there.

Most of my difficulties came from hardware and building the physical robot, since I’m far less familiar with hardware than software. For example, I wanted the FSRs to resemble hands poking out of the robot, but upon taping them down I realized that where you tape an FSR affects its sensor readings. There’s also very limited room on the base plate holding the Arduino and breadboard for all the wiring involved: I wanted everything contained in a neat box, but the Neopixel wires stick out quite a bit, so I ended up making a bigger box to counteract this.

Using Neopixels was a huge part of my project and a must. To use them, I needed to solder wires to the Neopixels, which took a really long time, because instead of soldering into a hole I was soldering onto a flat copper surface and had to make sure the wires stuck to it. Sometimes the wires would fall off, or it would just be really difficult to get a wire to stick to the solder on the copper surface. After soldering came the software: I tested using Adafruit’s strandtest example, but it didn’t produce the correct outcome even though the lights turned on perfectly; they weren’t displaying the right colors. Mind you, I had randomly taken these from the IM lab, so I had no idea what type of Neopixels they were. It simply came down to testing the different settings for the different types of Neopixels that exist until I hit the right one.

The LCD screen is also technically upside-down on the robot’s body, because that was the only way to leave maximum room on the breadboard for wires. Since I had no other option but to mount the screen upside down, I had to draw and display all the pixels and bytes upside down, which required a lot of coordination and rewiring of my brain’s perspective.

Future Improvements

In the future, I want to use an alternate source of power for the servo motors and Neopixels because every time the servo motors run, the LCD screen blinks and is less bright because the motors take up a lot of power. Every time the Neopixels switch from one color to another, the LCD screen is also affected. I think hooking up a battery to the circuit would solve this problem. In the future, I think more sensors and ways to interact with the robot would also be nice.

Progress Report

Current Progress

The p5.js environment has been successfully designed: I went from a 2D version to a 3D avatar rendered in WEBGL. The system also includes a custom font and a wooden background platform for visual warmth. A floating instruction frame appears at the beginning of the interaction, prompting users to “press to start.”

The Arduino hardware components (photoresistor, DIY capacitive touch sensor, LEDs, and buzzer) are currently in the process of being tested. I am actively working on matching sensor input with the avatar’s behavior (e.g., face expression, sound).

Video 

What’s Being Tested

    • Touch Sensor + LEDs → Plant’s mood environment (happy, sad)

    • Touch Input → Start and Display instructions

    • Avatar Design → Body, leaf animation, emotional face drawn in p5.js

    • Instructions Interface → Initial user onboarding screen

 Pending Tasks

    • Finalizing the integration of the Arduino circuit into the physical plant (soldering and arranging).

    • Smoothing the interaction between sensor readings and p5.js visual/audio feedback.

    • Conducting user tests to assess how people engage with the plant-avatar system.

      Avatar Demo

Week 11- Serial Communication

1. Concept

There are 3 parts to the project:

(1) Light Dependent Resistor (LDR) readings are sent from the Arduino to p5js. The ellipse in p5js moves along the horizontal axis in the middle of the screen depending on the LDR readings. Nothing on the Arduino is controlled by p5js.
(2) Control the LED brightness from p5js using mouseX position. The more right the mouse position, the higher the LED brightness.
(3) Taking the gravity wind example (https://editor.p5js.org/aaronsherwood/sketches/I7iQrNCul), every time the ball bounces one LED lights up and then turns off, and you can control the wind with the potentiometer (an analog sensor).

2. Highlights: Challenges and Overcoming Them

(1) The circle position was not changing in response to the brightness of the surroundings. We attempted to tackle this problem by checking the serial monitor to see whether the LDR readings were coming through. After confirming the LDR’s functionality, we closed the serial monitor and proceeded to p5js, making sure to use the right serial port. However, the circle position was still not changing. With help from the Professor, we streamlined our code down to just what seemed necessary. This worked!

 

(2) The LED was flickering and we did not know why. Alisa thought that the delay(50) should come after, instead of before, analogWrite(ledPin, brightness). However, that did not solve the problem. Samuel thought to remove the delay(50). It still did not work. We then decided to map the mouseX position (ranging from 0 to 600) to the range 0 to 255 on the Arduino instead of in p5js. This worked!
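The fix amounts to Arduino's integer map() call on the incoming value. The same mapping, replicated in JavaScript for the mouseX range used here:

```javascript
// Re-implementation of Arduino's map() for positive integer ranges:
// mouseX in [0, 600] becomes a PWM brightness in [0, 255].
function mapRange(x, inMin, inMax, outMin, outMax) {
  return Math.floor(((x - inMin) * (outMax - outMin)) / (inMax - inMin) + outMin);
}
```

Arduino's map() truncates with integer division, which Math.floor matches for positive results, so mapRange(300, 0, 600, 0, 255) yields 127 rather than 127.5.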

 

(3) For the third exercise I worked alongside many different people: Alisa, Samuel, Haris (and even Prof Aya’s slides). I had trouble at every stage of the hardware side, from Safari stopping the code from interacting with the Arduino to serial issues with the laptop. I was able to piece the work together until I finally had it working in the end. The coding aspect was simple overall, as the base code only had to be slightly amended to take inputs from the Arduino and make the LED act based on the running programme.

3. Video

Part 1:

Part 2:

Part 3:

4. Reflection and Future Areas of Improvement

Our challenges emphasized the need for understanding the code that allows communication between p5js and Arduino. Without understanding the code, it is difficult to make the right removals or changes necessary to achieve the results we are hoping for.

It would be great to find out why the LED did not flicker when the mapping was performed in Arduino rather than p5js. Is this related to PWM?

Week 11 Project Exercises

 

 

 

Exercise 1.1

let port;
let connectBtn;
let baudrate = 9600;
let lastMessage = "";

function setup() {
  createCanvas(400, 400);
  background(220);

  port = createSerial();

  // in setup, we can open ports we have used previously
  // without user interaction

  let usedPorts = usedSerialPorts();
  if (usedPorts.length > 0) {
    port.open(usedPorts[0], baudrate);
  }

  // any other ports can be opened via a dialog after
  // user interaction (see connectBtnClick below)

  connectBtn = createButton("Connect to Arduino");
  connectBtn.position(80, 200);
  connectBtn.mousePressed(connectBtnClick);

}

function draw() {
  background(255);

  // Read from the serial port. This non-blocking function
  // returns a complete line up to "\n", or "" if none is ready
  let str = port.readUntil("\n");
  if (str.length > 0) {
    // Complete line received
    lastMessage = str.trim();
  }

  // Display the most recent message
  text("Last message: " + lastMessage, 10, height - 20);

  // change button label based on connection status
  if (!port.opened()) {
    connectBtn.html("Connect to Arduino");
  } else {
    connectBtn.html("Disconnect");
  }

  // use the latest value from the Arduino as the circle's x position
  if (lastMessage.length > 0) {
    ellipse(int(lastMessage), 10, 20, 20);
  }
}

function connectBtnClick() {
  if (!port.opened()) {
    port.open("Arduino", baudrate);
  } else {
    port.close();
  }
}

 

Exercise 1.2

let port;
let connectBtn;
let baudrate = 9600;
function setup() {
  createCanvas(255, 285);
  port = createSerial();
  let usedPorts = usedSerialPorts();
  if (usedPorts.length > 0) {
    port.open(usedPorts[0], baudrate);
  } else {
    connectBtn = createButton("Connect to Serial");
    connectBtn.mousePressed(() => port.open(baudrate));
  }
}

function draw() {
  background(225);
  circle(140, mouseY, 25);
  let sendToArduino = String(mouseY) + "\n";
  port.write(sendToArduino);
}

Exercise 1.3

int led = 5;

void setup() {
  Serial.begin(9600);
  pinMode(led, OUTPUT); // the LED pin must be configured as an output
}
 
void loop() {
  if (Serial.available()) {
    int ballState = Serial.parseInt(); 
    if (ballState == 1) {
      digitalWrite(led, HIGH); // ball on ground
    } else {
      digitalWrite(led, LOW); // ball in air or default
     }
  }
  // read the input pin:
  int potentiometer = analogRead(A1);                  
  int mappedPotValue = map(potentiometer, 0, 1023, 0, 900); 
  // print the value to the serial port.
  Serial.println(mappedPotValue);                                            
  
  delay(100);
}

Reading Response: Design Meets Disability

The text prompted me to reflect deeply on the role of design in shaping not only products but also perceptions and identities- especially for those of us working in product design. The text challenges the traditional boundaries between medical necessity and mainstream desirability, and as a designer, I find both inspiration and discomfort in this tension.

One point that stands out is the way the Eameses’ leg splint for the U.S. Navy, originally a response to a medical brief, became a catalyst for innovations that transformed mainstream furniture design. The author’s admiration for the leg splint’s “organic form and geometric holes, the combination of subtle surfaces and crisp edges” resonates with me because it exemplifies how constraints can drive creativity. The evidence is clear: the technology and aesthetic language developed for a medical device directly influenced the iconic Eames furniture. This makes me think about how often, in design for disability, we excuse poor design because of the market or the context, rather than holding it to the same standards as mainstream products. It prompts me to question: why shouldn’t design for disability be as beautiful, as considered, and as desirable as anything else?

However, I am uneasy about the persistent bias towards discretion in assistive products. The text critiques the tradition of camouflaging medical devices- pink plastic hearing aids, for instance- in an attempt to make them invisible. The author argues that this approach can inadvertently reinforce the idea that disability is something to be hidden, rather than embraced. The evidence comes from the evolution of eyewear: glasses have transitioned from stigmatized medical appliances to fashion statements, even to the point where people without visual impairments wear them as accessories. This shift did not come from making glasses invisible, but from making them objects of desire. It makes me realise that, as a designer, I should challenge the default of discretion and instead explore how products can project positive identities.

The discussion of fashion’s role in design for disability is particularly provocative. The text points out that fashion and discretion are not true opposites, but there is a creative tension between them. Fashion’s embrace of diversity and its ability to make wearers feel good about themselves stands in stark contrast to the clinical, problem-solving culture that dominates medical design. The evidence is in the HearWear project, where inviting designers from outside the medical field led to radical new concepts for hearing aids- some playful, some overtly technological, and some drawing on jewellery and body adornment. This makes me reflect on my own practice: am I too quick to prioritise technical performance and problem-solving at the expense of self-expression and emotional value?

What I particularly like about the text is its insistence on keeping “the design in design for disability.” The author links the success of products like the iPod to a relentless focus on simplicity and attention to detail, arguing that these same sensibilities are often missing from assistive products because designers are not involved early enough in the process. The point is well made: design is not just about how something looks, but how it works and how it makes people feel. The evidence is in the contrast between the iPod’s iconic simplicity and the “flying submarine” syndrome of overburdened, over-complicated universal designs that try to be everything to everyone and end up pleasing no one. This reminds me that good design often means having the courage to say no to unnecessary features, and instead focusing on the essence of the product and the experience it creates.

Yet, I dislike the way the field of design for disability is still so often siloed and marginalised, both in practice and in perception. The text highlights how multidisciplinary teams developing prosthetics rarely include industrial or fashion designers, let alone sculptors or artists. The result is products that may function well but fail to resonate emotionally or culturally with their users. The evidence comes from the stories of Aimee Mullins and Hugh Herr, who both see their prostheses not just as tools but as extensions of their identity- sometimes even as sources of pride or advantage. This makes me think about the importance of diversity, not only among users but also among designers. We need to bring in more voices, more perspectives, and more creativity if we are to create products that are truly inclusive and empowering.

In conclusion, this text has challenged me to rethink my approach as a designer. It has made me more aware of the cultural and emotional dimensions of product design, especially in the context of disability. I am inspired to seek a healthier balance between problem-solving and playful exploration, between discretion and fashion, and between universality and simplicity. Most of all, I am reminded that design has the power to shape not only products but also identities and societies- and that this responsibility should never be taken lightly.

Preliminary Concept: Multiplayer Character Card Game System Using Arduino and p5.js

Finalised Concept

For my final project, I am designing an interactive multiplayer game system inspired by the iconic Yu-Gi-Oh! duel disk, but reimagined for creative, social play. Players use physical cards embedded with unique identifiers (such as RFID or NFC tags) to represent custom characters they create. These cards are scanned by an Arduino-powered duel disk, allowing each player to join the game with their personalized avatar and stats. The system supports multiple players competing in a series of mini-games or trivia challenges, with all real-time visuals and game logic handled by a p5.js interface on the computer.

This project merges tangible interaction (physical cards and duel disk) with digital feedback (customizable avatars, live scores, and dynamic mini-games), creating a seamless, engaging experience that emphasizes both individual expression and social play.

Arduino Program Design

Inputs:

  • Multiple RFID/NFC readers (or a shared reader for sequential scans), each detecting when a player places their card on the duel disk.

  • Optional: Buttons or touch sensors for additional in-game actions.

Outputs:

  • LEDs, buzzers, or vibration motors embedded in the disk to provide physical feedback (e.g., indicate successful scans, turns, or game outcomes).

Communication:

  • When a card is scanned, Arduino reads the card’s unique ID and sends it to the computer via serial communication.

  • Arduino can also receive commands from p5.js (e.g., to trigger LEDs or buzzers when a player wins a round).

Summary of Arduino’s Role:

  • Listen for card scans and input actions.

  • Transmit card IDs and sensor data to p5.js promptly.

  • Receive feedback commands from p5.js to control physical outputs.

p5.js Program Design

Responsibilities:

  • Listen for incoming serial data from the Arduino (card IDs, button presses).

  • Match each card ID to a player profile, loading their custom character (name, avatar, stats).

  • Allow players to customize their characters through a user-friendly interface.

  • Manage the game flow: let multiple players join, select mini-games, and track scores.

  • Display real-time game visuals, avatars, and results.

  • Send commands back to Arduino to trigger physical feedback (e.g., light up the winning player’s section).

Data Flow:

  • On card scan: p5.js receives the ID, loads or prompts for player customization, and adds the player to the game.

  • During play: p5.js updates visuals and scores based on game logic and player actions.

  • On game events: p5.js sends output commands to Arduino for physical feedback.

Interaction Design

  • Joining the Game: Each player places their card on the duel disk. The Arduino detects the card, and p5.js loads their profile or prompts for customization.

  • Customizing Characters: Players can use the p5.js interface to personalize their avatars, choose stats, and save progress.

  • Starting a Game: Once all players have joined, they select a mini-game or trivia challenge.

  • Gameplay: Players compete head-to-head or in teams, with p5.js managing the game logic and displaying results. Physical feedback (lights, sounds) enhances the experience.

  • Winning and Progression: Scores and achievements are tracked per player, and leaderboards are displayed at the end of each round.

System Communication

  • Arduino → p5.js: Sends card IDs and sensor/button states.

  • p5.js → Arduino: Sends commands to trigger LEDs, buzzers, or other outputs based on in-game events.

Project Progress and Next Steps

  • Prototyping: I am currently prototyping the card scanning system with Arduino and testing serial communication with p5.js.

  • UI/UX: Early sketches for the p5.js interface focus on clear avatar displays, easy customization, and intuitive game flow.

  • Game Logic: I am developing the first mini-game (a trivia challenge) and designing the multiplayer logic to support dynamic player counts.

Why This Project?

This system blends physical and digital interaction in a way that is social, customizable, and fun. It encourages users to invest in their characters and compete or collaborate with others, making every session unique. The project leverages Arduino for timely, tangible sensing and feedback, while p5.js handles multimedia processing and engaging visual responses- fulfilling the assignment’s requirements for a responsive, multimedia interactive system.

Week 11 – Assignment

Task 1:

We made it so that the program shows a circle on the screen that moves left and right when you rotate the potentiometer. The Arduino sends values to p5, and p5 reads those values through the serial port. As the potentiometer is rotated, the circle moves across the canvas to match its position. There's also a button in the sketch that lets you connect to the Arduino manually.

p5.js

let serialPort;         // connection to the arduino
let connectButton;      // button to connect to arduino
let serialSpeed = 9600; // speed of communication between p5 and arduino
let xCircle = 300;      // starting x position of circle

function setup() {
  createCanvas(300, 300);
  background(245);

  serialPort = createSerial();

  let previous = usedSerialPorts();
  if (previous.length > 0) {
    serialPort.open(previous[0], serialSpeed);
  }

  connectButton = createButton("Connect Arduino"); // connect button
  connectButton.mousePressed(() => serialPort.open(serialSpeed)); // when clicked, connect
}

function draw() {
  let data = serialPort.readUntil("\n");  // reads the data from arduino

  if (data.length > 0) {       // if data received
    let sensorValue = int(data); // convert it to a number
    xCircle = sensorValue;       // use it to update the circles x position
  }

  background(245);
  fill(255, 80, 100);
  noStroke();
  ellipse(xCircle, height/2, 35); // draw circle at new position
}

Arduino

void setup() {
  Serial.begin(9600); // initialize serial communications
}
 
void loop() {
  // read the input pin:
  int potentiometer = analogRead(A1);
  // remap the pot value to 0-300 to match the canvas width:
  int mappedPotValue = map(potentiometer, 0, 1023, 0, 300);
  Serial.println(mappedPotValue);
  // short delay so p5 is not flooded with readings:
  delay(100);
}

Task 2:

When the Arduino receives the letter ‘H’, it turns the LED on. When it receives the letter ‘L’, it turns the LED off. This lets you control the LED from p5 by pressing the “Turn ON” or “Turn OFF” buttons.

p5.js

let serialPort;
let connectButton;
let serialSpeed = 9600;

function setup() {
  createCanvas(300, 200);
  background(240);

  serialPort = createSerial(); // create serial port connection

  let prev = usedSerialPorts(); // check if theres a previously used port
  if (prev.length > 0) {
    serialPort.open(prev[0], serialSpeed);
  }

  connectButton = createButton("Connect Arduino");
  connectButton.position(10, 10); // button position
  connectButton.mousePressed(() => serialPort.open(serialSpeed)); // open port when button clicked

  let onBtn = createButton("Turn ON"); // button to turn the LED on
  onBtn.position(10, 50); // position of ON button
  onBtn.mousePressed(() => serialPort.write('H')); // Send 'H' to arduino when pressed

  let offBtn = createButton("Turn OFF"); // button to turn the LED off
  offBtn.position(90, 50); // position of OFF button
  offBtn.mousePressed(() => serialPort.write('L')); // send 'L' to arduino when pressed
}

function draw() {
}

Arduino

void setup() {
  pinMode(9, OUTPUT);        // LED on pin 9
  Serial.begin(9600);        // start serial communication
}

void loop() {
  if (Serial.available()) {
    char c = Serial.read();

    if (c == 'H') {
      digitalWrite(9, HIGH); // turn LED on
    }
    else if (c == 'L') {
      digitalWrite(9, LOW);  // turn LED off
    }
  }
}

Task 3:

We used serialPort to read the sensor value and mapped it to a wind force. To light the LED only once per bounce, we added a boolean flag (ledTriggered). When the ball hits the ground, p5 sends a ‘1’ to the Arduino, which flashes the LED.

p5.js

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;
let serial;
let connectBtn;
let ledTriggered = false;

function setup() {
  createCanvas(640, 360);
  noFill();
  position = createVector(width / 2, 0);
  velocity = createVector(0, 0);
  acceleration = createVector(0, 0);
  gravity = createVector(0, 0.5 * mass);
  wind = createVector(0, 0);

  // setup serial connection
  serial = createSerial();
  let previous = usedSerialPorts();
  if (previous.length > 0) {
    serial.open(previous[0], 9600);
  }

  connectBtn = createButton("Connect to Arduino");
  connectBtn.position(10, height + 10); // button position
  connectBtn.mousePressed(() => serial.open(9600));
}

function draw() {
  background(255);
  // check if we received any data
  let sensorData = serial.readUntil("\n");

  if (sensorData.length > 0) {
    // convert string to an integer after trimming spaces or newline
    let analogVal = int(sensorData.trim());
    let windForce = map(analogVal, 0, 1023, -1, 1);
    wind.x = windForce; // horizontal wind force
  }

  applyForce(wind);
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x, position.y, mass, mass);
  if (position.y > height - mass / 2) {
    velocity.y *= -0.9; // A little dampening when hitting the bottom
    position.y = height - mass / 2;

    if (!ledTriggered) {
      serial.write("1\n");   // trigger arduino LED
      ledTriggered = true;
    }
  } else {
    ledTriggered = false;
  }
}

function applyForce(force) {
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed() {
  if (key === ' ') {
    mass = random(15, 80);
    position.y = -mass;
    velocity.mult(0);
  }
}

Arduino

int ledPin = 5;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // send sensor value to p5.js
  int sensor = analogRead(A1);
  Serial.println(sensor);
  delay(100);

  // check for '1' from p5 to trigger LED
  if (Serial.available()) {
    char c = Serial.read();
    if (c == '1') {
      digitalWrite(ledPin, HIGH);
      delay(100);
      digitalWrite(ledPin, LOW);
    }
  }
}

Assignment 10: Make a musical instrument

This is my Melodic Button Machine. It uses three push buttons (digital sensors) and a potentiometer (analog sensor) to create a simple, playful musical instrument. Each button plays a different musical note, while the potentiometer allows the player to bend the pitch of the note in real time- much like a musician bending a guitar string or sliding a trombone.

Machine Shown in Class

Assignment Brief

The assignment challenged us to create a musical instrument using Arduino technology. The requirements were clear: incorporate at least one digital sensor (such as a switch or button) and at least one analog sensor (like a potentiometer, photoresistor, or distance sensor). The instrument should respond to user input in a way that is both interactive and expressive.

Conceptualisation

The idea for this project emerged from my fascination with the simplicity of early electronic instruments. I remembered a childhood toy keyboard that could produce a handful of notes, and how magical it felt to create music with just a few buttons. I wanted to recreate that sense of wonder, but with a modern DIY twist. I also wanted to explore how analog and digital sensors could work together to give the user expressive control over the sound.

Process

Component Selection: I started by gathering the essential components: an Arduino Uno, a breadboard, three push buttons, a potentiometer, a piezo buzzer, jumper wires, and a handful of resistors. The buttons would serve as the digital inputs for note selection, while the potentiometer would act as the analog input to modulate pitch.

Circuit Assembly: The buttons were wired to digital pins 2, 3, and 4 on the Arduino, with internal pull-up resistors enabled in the code. The potentiometer’s middle pin was connected to analog pin A0, with its outer pins going to 5V and GND. The piezo buzzer was connected to digital pin 8, ready to bring the project to life with sound.

Code Development: I wrote Arduino code that assigned each button a specific musical note: C, D, or E. The potentiometer’s value was mapped to a pitch modulation range, so turning it would raise or lower the note’s frequency. This allowed for playful experimentation and made the effect of the potentiometer obvious and satisfying. I tested the code, tweaking the modulation range to make sure the pitch bend was dramatic and easy to hear.

Testing and Tuning: Once everything was wired up, I played simple tunes like “Mary Had a Little Lamb” and “Hot Cross Buns” by pressing the buttons in sequence. The potentiometer added a fun twist, letting me add vibrato or slides to each note.

Challenges

Pitch Range Calibration:
Finding the right modulation range for the potentiometer was tricky. If the range was too wide, the notes sounded unnatural; too narrow, and the effect was barely noticeable. After some trial and error, I settled on a ±100 Hz range for a musical yet expressive pitch bend.

Wiring Confusion:
With multiple buttons and sensors, it was easy to mix up wires on the breadboard. I solved this by colour-coding my jumper wires and double-checking each connection before powering up.

Potential Improvements

More Notes:
Adding more buttons would allow for a wider range of songs and melodies. With just three notes, the instrument can play simple tunes, but five or more would open up new musical possibilities.

Polyphony:
Currently, only one note can be played at a time. With some code modifications and additional hardware, I could allow for chords or overlapping notes.

Alternative Sensors:
Swapping the potentiometer for a light sensor or distance sensor could make the instrument even more interactive.

Visual Feedback:
Adding LEDs that light up with each button press or change colour with the pitch would make the instrument more visually engaging.

Schematics

Source Code

const int button1Pin = 2;
const int button2Pin = 3;
const int button3Pin = 4;
const int potPin = A0;
const int buzzerPin = 8;

// Define base frequencies for three notes (C4, D4, E4)
const int noteC = 262;  // C4
const int noteD = 294;  // D4
const int noteE = 330;  // E4

void setup() {
  pinMode(button1Pin, INPUT_PULLUP);
  pinMode(button2Pin, INPUT_PULLUP);
  pinMode(button3Pin, INPUT_PULLUP);
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  int potValue = analogRead(potPin);
  // Map potentiometer to a modulation range of +/-100 Hz
  int modulation = map(potValue, 0, 1023, -100, 100);

  if (digitalRead(button1Pin) == LOW) {
    tone(buzzerPin, noteC + modulation); // Button 1: C note, modulated
  } else if (digitalRead(button2Pin) == LOW) {
    tone(buzzerPin, noteE + modulation); // Button 2: E note, modulated
  } else if (digitalRead(button3Pin) == LOW) {
    tone(buzzerPin, noteD + modulation); // Button 3: D note, modulated
  } else {
    noTone(buzzerPin);
  }
}