Final Project: Mar’s Pie Shop

CONCEPT:

I love baking, and I enjoy baking games. For my final project I decided to make a pie shop… Throughout my journey in Intro to IM, I made sure to always add a piece of me to every assignment and project. This final one is especially dear to me because I was able to finish it after a lot of hard work, and because I was able to add my own art to it (the background and pie images). I think it all goes well together, and I’m proud of it.

FUN FACT: I have never had pie in my life!!!

HIGHLIGHT OF MY CODE IN P5.JS:

Entire Code: https://editor.p5js.org/mariamalkhoori/sketches/VUY94T9vh

-Function to check the correct match

function submitAnswer() {
  // Define the expected ranges for each statement
  const rawRange = [0, 200];
  const undercookedRange = [201, 400];
  const perfectRange = [401, 600];
  const overcookedRange = [601, 800];
  const burntRange = [801, 1023];

  let expectedRange;

  // Determine the expected range based on the current sentence
  switch (currentSentence) {
    case "I would like a raw pie to bake at home.":
      expectedRange = rawRange;
      break;
    case "I like my pie extremely soft.":
      expectedRange = undercookedRange;
      break;
    case "I want a perfect pie!":
      expectedRange = perfectRange;
      break;
    case "Crispy around the edges!!":
      expectedRange = overcookedRange;
      break;
    case "BURNT IT >:D":
      expectedRange = burntRange;
      break;
  }

  // Check if the current image range matches the expected range
  if (x >= expectedRange[0] && x <= expectedRange[1]) {
    // Increment the score for a correct match
    score++;
    ding.play();
  }
}
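Since the same sentence-to-range pairing drives the whole game, a lookup object is a compact alternative to the switch. This is just a sketch of the same matching logic, not the code I used; `checkMatch` and `RANGES` are names I’m introducing here, with the sentences and ranges copied from the function above:

```javascript
// Hypothetical helper: maps each order sentence to its doneness range,
// then checks whether the potentiometer value x falls inside it.
const RANGES = {
  "I would like a raw pie to bake at home.": [0, 200],
  "I like my pie extremely soft.": [201, 400],
  "I want a perfect pie!": [401, 600],
  "Crispy around the edges!!": [601, 800],
  "BURNT IT >:D": [801, 1023],
};

function checkMatch(sentence, x) {
  const range = RANGES[sentence];
  if (!range) return false; // unknown sentence: no match
  return x >= range[0] && x <= range[1];
}
```

Adding a new order would then only take one new entry in RANGES instead of a new case.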

-Different screen pages:

 

if (gameState === "start") {
  drawStartScreen();
} else if (gameState === "playing") {
  displayFoodImage();
  displayTimer();
  textSize(24);
  fill(255);
  text("Score: " + score, width - 100, 30);
} else if (gameState === "end") {
  // Display results in the end screen
  textSize(210);
  fill(255);
  text("Your Score: " + score, width / 2, height / 2);
  playAgainButton.show(); // Show the button on the end screen
}
else {
  // Draw other game elements here
  playAgainButton.hide(); // Hide the button if not in the end state
}

MAIN COMPONENTS:

-LED Button

-Potentiometer

OVEN PROTOTYPE AND SCHEMATICS:

 

 

USER TESTING:

 

CHALLENGES:

  • Many challenges came up in the coding part.
  • The button wasn’t working, both to confirm answers and to change the order requests.
  • Sometimes the images controlled by the potentiometer would not show, or would not change.
  • I also had some issues with the serial connection because of the wiring.

FUTURE IMPROVEMENTS:

  • Make the game restart without having to exit full screen.
  • Make the prototype a little prettier.
  • Fine-tune the difficulty level based on user feedback. Adjust timer durations, scoring mechanisms, or image recognition ranges to make the game challenging yet enjoyable.
  • Introduce new challenges or power-ups to add variety to the gameplay.
  • Ensure that the game is responsive and looks good on various screen sizes.
  • Add a leaderboard to display at the end.

IM SHOWCASE:

Final Project: Remember To Keep A Little Heart..

Mega trying out my project at the Showcase.

Concept:

My primary source of inspiration was Andrew Schneider’s ‘Stars’ exhibit, as it would not be an overstatement to say that it changed my life. His method of combining technology with poetry resonated with me because he allowed me to see that what I wanted to do was possible. I also want to acknowledge how much his approach of jumping into the project before sorting out the functional technicalities inspired me. Too often, I get discouraged from starting projects because of my insecurities surrounding my computational knowledge. Schneider reminded me to approach creating the way a child would: prioritizing fun over understanding. I didn’t have to be an expert to have a vision, or to create something great.

My second source of inspiration was my favorite installation at Manar Abu Dhabi, which detected your pulse in order to make a field of light glow with your heartbeat. Both this installation and Schneider’s used technology that evoked breathtaking, light-filled phenomena. But somehow, through either words or your heartbeat, they connected these surroundings back to you, indicating a kind of macro-micro spiritual connection. I wanted my project to accomplish an effect along these lines.

A more tangential inspiration was TikTok. Most people my age consume a large number of reels day to day, but every once in a while, a deeper, more meaningful video will surface from the usual, mind-numbing conglomeration that is TikTok and Instagram reels. I would find myself surprisingly moved, and I thought it was interesting how much of our generation connects to inspiration or meditative clarity through these short-form videos. Because of the small scale of my project, I wanted to achieve an emotional effect closer to that of a meditative TikTok: short, sweet, but profound enough to make you pause and feel something before resuming daily life.

The Process, Beginning With Arduino:

I started by tackling the Arduino part of the project. I followed a page linked here, which, thankfully, contained the exact instructions I needed. I followed the schematic they displayed, see below:

But I also drew my own schematic, here:

But here is an actual photograph, just to be sure:

The Arduino wiring couldn’t have been simpler, and frankly, the Arduino coding wasn’t much more difficult either. I downloaded a PulseSensor library that I found on the page and modified the code for my own objectives. Here’s the code below:

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  // Blink the built-in LED to show Arduino is waiting for the handshake

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH);
    Serial.println("0,0");
    delay(300);
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH);
    int left = Serial.parseInt();
    int right = Serial.parseInt();
 // Parse the two comma-separated values sent from p5.js; Arduino reads the left and right components so the handshake stays in sync and the loop can reply with fresh sensor data
    if (Serial.read() == '\n') {
      int pulseSensor = analogRead(A0);
// Mapping analog values to digital values so pulse sensor can be read and show up on P5JS
      int mappedPulseSensor = map(pulseSensor, 0, 1023, 0, 255);
      delay(5);
      Serial.print(mappedPulseSensor);
      Serial.print(',');
      Serial.println(mappedPulseSensor);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}

The most important section worth mentioning is the mapping of the pulse sensor to a digital range. Because the pulse sensor reads analog signals, I plugged its signal wire (apart from the voltage and ground wires) into A0 and then mapped its 0-to-1023 range to 0-to-255, so that P5JS could read its detections and convert them into the pulses of the ellipse on the screen.
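For reference, Arduino’s map() is just a linear re-mapping. Here is a plain JavaScript sketch of the same formula (mapRange is my own name for it; both Arduino and p5.js call theirs map):

```javascript
// Linear re-mapping, equivalent to Arduino's map(value, 0, 1023, 0, 255):
// shift into the input range, scale by the ratio of the ranges, shift out.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

console.log(Math.round(mapRange(0, 0, 1023, 0, 255)));    // 0
console.log(Math.round(mapRange(1023, 0, 1023, 0, 255))); // 255
console.log(Math.round(mapRange(512, 0, 1023, 0, 255)));  // 128
```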

The Process, Continuing With P5JS:

The most challenging section of this project was the coding regarding P5JS. I started by coding a glowing ellipse and making sure that it pulsed with the readings of the sensor, linked here, but also shown below:

let rVal = 0;
let pulse = 255;
let left = 0;
let right = 0;


function setup() {
  createCanvas(600,600);
  textSize(18);
}

function draw() {
  // one value from Arduino controls the background's red color
  background(0);

  // the other value controls the text's transparency value
  fill(255, 255, 255);
  strokeWeight(0);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
  }
 
  
  drawingContext.shadowBlur = 80;
  drawingContext.shadowColor = color(255);
  ellipse(width/2, height/2, map(pulse, 0, 255, 10, 100))

  
}
 


function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 2) {
      // only store values here
      // do everything with those values in the main draw loop
      rVal = fromArduino[0];
      pulse = fromArduino[1];
      print(pulse);
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = left + "," + right + "\n";
    writeSerial(sendToArduino);
  }
}

I coded the ellipse in the above section, mapping its diameter to the signals of the sensor to create the pulsing effect. The readSerial section handles how P5JS reads the information from Arduino but also sends information back to Arduino (the handshake), so that Arduino and P5JS work in tandem to make the ellipse pulse.
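The read half and the reply half of the handshake can be sketched as two plain functions, which makes the comma-separated format easy to test outside p5.js. These names (parsePulseLine, buildReply) are hypothetical; the real sketch does this inline in readSerial:

```javascript
// Parses one "rVal,pulse" line from Arduino; returns null if malformed.
function parsePulseLine(data) {
  if (data == null) return null;
  const parts = data.trim().split(",");
  if (parts.length !== 2) return null;
  return { rVal: Number(parts[0]), pulse: Number(parts[1]) };
}

// The reply half of the handshake is just "left,right" plus a newline,
// which is what Serial.parseInt() on the Arduino side expects.
function buildReply(left, right) {
  return left + "," + right + "\n";
}
```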

The Process, Continuing With P5JS (Stars and Words):

After sorting out the ellipse pulsing, which was the main component, I began coding the stars and words I wanted to go along with the pulsing ellipse. I was really inspired by Pierre’s midterm project, because it gave that same all-encompassing starry night effect that Schneider’s installation did. I began by trying to make modifications to Pierre’s code, seen here, but I quickly realized after trying to insert the ellipse into it that coding within the constraints of WEBGL would be too difficult for me for the purposes of this project. While experimenting, I accidentally created this, which might be one of the coolest things I’ve ever made. I plan to pursue WEBGL on my own time, but I acknowledged that working within a two-dimensional realm for this project would suffice.

So, I began by coding the stars, linked here. I’ve attached the most difficult section of the code below:

function createStars() {
  for (let i = 0; i < 100; i++) {
    let x = random(width);
    let y = random(height);
    let radius = random(1, 3);
// I made the radius random within these constraints so the ellipses weren't all the same size, giving that 'starry night' effect and the illusion of distance, since I couldn't hack WEBGL
    let speedX = random(1, 3) * (random() > 0.5 ? 1 : -1); // So that the stars randomly go left and right
    let speedY = random(1, 3) * (random() > 0.5 ? 1 : -1); // So that the stars randomly go up or down 
    stars.push({ x, y, radius, speedX, speedY });
  }
}

function moveStars() {
  for (let star of stars) {
    star.x += star.speedX;
    star.y += star.speedY;

    // Makes the stars stay within the constraints of the display screen
    if (star.x < 0 || star.x > width || star.y < 0 || star.y > height) {
      star.x = random(width);
      star.y = random(height);
    }
  }
}

I wanted the size to be random within the constraints of 1 and 3 to give the ellipses that ‘starry night’ effect. Making their radii vary also gave the illusion of depth and distance, since I couldn’t hack WEBGL. I made their movement as random as I could, sending them in different directions to add as much dynamism to the project as possible. The focus of the project, the fixed ellipse, doesn’t move much on its own, so I knew the stars needed to add that layer of movement to keep the project engaging.
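The random-direction trick is worth isolating: random(1, 3) always produces a positive speed, and multiplying by a coin flip of 1 or -1 lets each axis independently go either way. Here is a plain JavaScript sketch of the same idea (randomSpeed is my own name for it):

```javascript
// Returns a speed whose magnitude is in [1, 3) with a random sign,
// mirroring random(1, 3) * (random() > 0.5 ? 1 : -1) from the p5.js sketch.
function randomSpeed() {
  const magnitude = 1 + Math.random() * 2;   // uniform in [1, 3)
  const sign = Math.random() > 0.5 ? 1 : -1; // coin flip for direction
  return magnitude * sign;
}
```

Calling it once for speedX and once for speedY is what makes each star drift in its own diagonal direction.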

After coding the stars, I combined the code of that with the pulsing ellipse so that I could begin adding the words.

The Result!!!!

And this was the result! Linked here, but also seen below!

let rVal = 0;
let pulse = 255;
let left = 0;
let right = 0;
let stars = [];
let ellipseVisible = false;
let starsVisible = false;
let keyPressedOnce = false;
let showAdditionalText = false;
let mPressed = false;

function setup() {
  createCanvas(2000, 1300);
  textSize(34);
  createStars();
}

function draw() {
  background(0);
  fill(255);
  textAlign(CENTER, CENTER);
  
  
//Code that ensures that the instructions for the interaction, and the serial port appear before the ellipse and stars appear
  if (!serialActive) {
    text("PRESS SPACE TO SELECT SERIAL PORT", width / 2, height / 2);
  } else if (!ellipseVisible && !starsVisible) {
    text("LIGHTLY TOUCH YOUR FINGER TO THE PULSE SENSOR", width / 2, height / 2);
  } else if (ellipseVisible || starsVisible) {
    
//Responsible for making the ellipse glow     
    drawingContext.shadowBlur = 80;
    drawingContext.shadowColor = color(255);
// Maps ellipse to arduino board, making it pulse
    let ellipseSize = map(pulse, 0, 255, 10, 300);
// Determines dimensions of ellipse
    ellipse(width / 2, height / 2, ellipseSize);
  }

// Once the ellipse and stars appear, the text taking you through the interaction appears too
  if (ellipseVisible && !starsVisible) {
    text("GIVE THE SENSOR TIME TO PICK UP ON YOUR PULSE\nBUT ONCE YOU SEE YOUR HEART BEATING, ACKNOWLEDGE\nTHAT THIS PULSE IS YOU, ALIVE AND BREATHING\n\nPress 'n'", width / 2, height - 180);
  } else if (starsVisible && !keyPressedOnce) {
// ensures that the stars appear and keep moving with each frame
    moveStars();
    displayStars();
    text("YOU ARE ALIVE AND BREATHING IN THE UNIVERSE\nLOOK AT ALL THE STARS\nBUT SOMETIMES, THE UNIVERSE SEEMS SO BIG\n IT'S EASY TO FEEL SMALL\n\nPress 'm'", width / 2, height - 180);
  } else if (starsVisible && !showAdditionalText) {
    moveStars();
    displayStars();
    text("BUT THE UNIVERSE IS ALIVE WITH YOU TOO\n\nPress 'v'", width / 2, height - 180);
  } else if (showAdditionalText) {
    moveStars();
    displayStars();
    text("TELL YOUR FRIENDS YOU LOVE THEM\nREMEMBER TO FEEL YOUR HEARTBEAT WHEN YOU\n LOOK UP AT THE SKY, BECAUSE SOMEDAY\nYOU'LL WISH YOU COULD'VE LIVED IT ALL OVER AGAIN", width / 2, height - 180);
  }
}

function createStars() {
// Defines speed and number of stars
  for (let i = 0; i < 100; i++) {
    let x = random(width);
    let y = random(height);
// Makes sure the stars are randomly different sizes within these constraints to give them that 'starry sky' effect
    let radius = random(1, 3);
// Defines random speed of stars within these constraints
    let speedX = random(1, 3) * (random() > 0.5 ? 1 : -1);
    let speedY = random(1, 3) * (random() > 0.5 ? 1 : -1);
    stars.push({ x, y, radius, speedX, speedY });
  }
}

function moveStars() {
  for (let star of stars) {
// After the speed of the stars has been defined, use moveStars to make sure they keep moving through the entire project 
    star.x += star.speedX;
    star.y += star.speedY;

// Makes sure stars move within the display screen
    if (star.x < 0 || star.x > width || star.y < 0 || star.y > height) {
      star.x = random(width);
      star.y = random(height);
    }
  }
}

// Ensures the stars never disappear with each changing slide
function displayStars() {
  noStroke();
  for (let star of stars) {
// Makes stars glow after key 'm' is pressed 
    if (mPressed) {
      fill(255, 255, 255, 180);
// Defines the size of the shadow that gives the stars their 'glow' effect
      ellipse(star.x, star.y, star.radius * 7);
      // for (i = 0; i < 100; i++) {
      //   ellipse(star.x, star.y, (star.radius * i * 1) / 20);
      // }
    }
    fill(255);
    ellipse(star.x, star.y, star.radius * 2, star.radius * 2);
  }
}

// My attempt to make the stars pulse, but I was okay with them simply glowing, so I never used updateStars but I kept it just in case
function updateStars() {
  let rValMapped = map(rVal, 0, 255, -0.1, 0.1);
  let pulseMapped = map(pulse, 0, 255, 1, 3);
  let pulsingFactor = map(pulse, 0, 255, 0.5, 2);

  for (let star of stars) {
    star.speedX += rValMapped;
    star.speedY += rValMapped;

    star.radius = pulseMapped * pulsingFactor;
  }
}


// These are all the instructions so that the project knows to move to the next slide when certain keys are pressed
function keyPressed() {
  if (key == " " && !serialActive) {
    setUpSerial();
  } else if (key == "n" && ellipseVisible && !starsVisible) {
    starsVisible = true;
  } else if (key == "m" && starsVisible && !mPressed) {
    keyPressedOnce = true;
    mPressed = true;
  } else if (key == "m" && starsVisible && mPressed) {
    mPressed = false;
  } else if (key == "v" && starsVisible && keyPressedOnce) {
    showAdditionalText = true;
//Code that allows one to exit out of and enter fullscreen by pressing the key 'f'
  } else if (key == "f") {
    if (!fullscreen()) {
      fullscreen(true);
    } else {
      fullscreen(false);
    }
  }
}

// How P5JS knows to read from Arduino in order to give the ellipse the pulse
function readSerial(data) {
  if (data != null) {
    let fromArduino = split(trim(data), ",");
    if (fromArduino.length == 2) {
      rVal = fromArduino[0];
//The pulse from Arduino that P5JS was able to read to make the ellipse pulse
      pulse = fromArduino[1];
    }

    let sendToArduino = left + "," + right + "\n";
    writeSerial(sendToArduino);
// If a pulse greater than 0 is detected from the sensor and the ellipse is visible, then the pulsing begins
    if (pulse > 0 && !ellipseVisible) {
      ellipseVisible = true;
    }
  }
}


// Ensures stars and ellipse stay visible when mouse is pressed 
function mousePressed() {
  if (!starsVisible && ellipseVisible) {
    keyPressedOnce = true;
    starsVisible = true;
  }
}

I commented everything, but the most difficult part of the code was definitely this section:

// Once the ellipse and stars appear, the text taking you through the interaction appears too
  if (ellipseVisible && !starsVisible) {
    text("GIVE THE SENSOR TIME TO PICK UP ON YOUR PULSE\nBUT ONCE YOU SEE YOUR HEART BEATING, ACKNOWLEDGE\nTHAT THIS PULSE IS YOU, ALIVE AND BREATHING\n\nPress 'n'", width / 2, height - 180);
  } else if (starsVisible && !keyPressedOnce) {
// ensures that the stars appear and keep moving with each frame
    moveStars();
    displayStars();
    text("YOU ARE ALIVE AND BREATHING IN THE UNIVERSE\nLOOK AT ALL THE STARS\nBUT SOMETIMES, THE UNIVERSE SEEMS SO BIG\n IT'S EASY TO FEEL SMALL\n\nPress 'm'", width / 2, height - 180);
  } else if (starsVisible && !showAdditionalText) {
    moveStars();
    displayStars();
    text("BUT THE UNIVERSE IS ALIVE WITH YOU TOO\n\nPress 'v'", width / 2, height - 180);
  } else if (showAdditionalText) {
    moveStars();
    displayStars();
    text("TELL YOUR FRIENDS YOU LOVE THEM\nREMEMBER TO FEEL YOUR HEARTBEAT WHEN YOU\n LOOK UP AT THE SKY, BECAUSE SOMEDAY\nYOU'LL WISH YOU COULD'VE LIVED IT ALL OVER AGAIN", width / 2, height - 180);
  }

Making the words appear in the order I wanted, without the ellipse or stars disappearing, was difficult and something I had to work at for a while. I solved it by coding successive “if, else” statements and establishing this:

let starsVisible = false;
let keyPressedOnce = false;
let showAdditionalText = false;

at the beginning of the code. My friend Zion helped me extensively through this part of the coding, so thank you Zion.
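Another way to look at this chain of boolean flags is as a small state machine with one ordered stage variable. This is only a hypothetical restatement, not the code I used; advanceStage and the stage names are mine:

```javascript
// Hypothetical restatement of the flag chain as one ordered state machine.
// Each key only advances from its own stage, matching the guarded if/else
// chain in keyPressed(): 'n' from pulse, 'm' from stars, 'v' from glow.
function advanceStage(stage, key) {
  if (stage === "pulse" && key === "n") return "stars";
  if (stage === "stars" && key === "m") return "glow";
  if (stage === "glow" && key === "v") return "final";
  return stage; // wrong key: stay put, just like the original flags
}
```

With a single stage variable, draw() would switch over one value instead of testing several flag combinations.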

After that, I just had to make the small adjustments of sizing the text the way I wanted and adjusting the stars to full screen, and the project was good to go. I didn’t know how to upload the song I wanted, “Stone in Focus” by Aphex Twin, as a file to P5JS, so I just played it in a YouTube tab while the project was running.

I also made the stars glow with the ellipse when the line “BUT THE UNIVERSE IS ALIVE WITH YOU TOO” appears, to give a sense of connection with the universe. The stars are triggered to glow after the ‘m’ key is pressed.

Showcase Day:

On the day of the showcase, I constructed my setup, seen here:

You can see the headphones that people put on to hear the music, but I want to bring special attention to the box:

One of my biggest concerns while developing the project was the unreliability of the pulse sensor itself. It would reliably pick up a pulse, but it would also pick up any extra movement of your hand, making the pulse look wonky. This is because the pulse sensor works by shining a light on your finger and reading your pulse through the way your blood flow affects the light, so if you move your finger too much, it disrupts the sensor’s ability to pick up your heartbeat. Even at Manar Abu Dhabi, the pulse sensor could take up to a few minutes to get a stable reading, so creating better pulse sensors for future interactive installations remains a concern. To counteract this problem, I constructed a box that the audience member could naturally rest their hand on, so that the pulse sensor could get a more stable reading. It also made the piece look more approachable and finished as an interactive installation.
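One common technique for taming a jittery optical pulse reading, which I didn’t implement but which would fit this problem, is a small moving average over the incoming values. A hedged sketch in plain JavaScript (makeSmoother is a hypothetical helper, not part of my sketch):

```javascript
// Hypothetical moving-average smoother for noisy sensor readings.
// Keeps the last `size` samples and returns their mean.
function makeSmoother(size) {
  const window = [];
  return function smoothPulse(value) {
    window.push(value);
    if (window.length > size) window.shift(); // drop the oldest sample
    return window.reduce((a, b) => a + b, 0) / window.length;
  };
}
```

A window of three to five samples would likely absorb hand jitter while keeping the beat itself visible.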

One concern I left with, though, from a design point of view, was that people tended to put their right hand on the sensor, which forced them to reach over with their left hand to press the keys to go through the project. I would want to make my project more convenient and interactive by fixing this issue.

Takeaways:

In the end, I am so proud of what I made, because I was able to capture a portion, albeit a small one, of the feeling that I experienced at Schneider’s installation and Manar Abu Dhabi. Here are some people interacting with my project:

What they’re going through is seeing their pulse on the screen and then reading successive phrases that remind them of their connection with the universe, giving them that same macro-micro spiritual connection that Andrew Schneider’s installation gave me. One of my favorite moments was when Linda, an audience member, saw her pulse on the screen and gasped, “It’s me!” That’s the exact effect I wanted to provide.

At the beginning of this post, I posted a picture of Mega and me. After she had completed the project, she turned to me and said, “Elora, I love you.” Even though the effect was small, I had emotionally achieved what I set out to do with this project, and I am so grateful.

User Testing- Puzzle Picture

 

It was pretty exciting to see my roommate and friend dive into the project I’ve been working on. They handled it surprisingly well, navigating through most of it without needing any pointers from me, which is a good sign that things are intuitive.

Most parts of the project flowed smoothly for them, but the joystick caused a bit of confusion. This is something I need to focus on, especially for those who aren’t familiar with sliding puzzles; making it clearer would make the project more user-friendly and increase usability. Once I explained the basic instructions (reading the guidelines and hitting the right buttons), they sailed through the rest. Straightforward instructions can really make a difference.

There are definitely some areas I want to improve. The shuffling aspect of the puzzle needs tweaking: sometimes it gets twisted up in a way that makes the puzzle impossible to solve. The key-press functions also need fixing, since having to stop and restart the whole thing instead of pressing a simple restart button is a hassle. But overall, I’m pretty proud of where the project stands. It feels good to be at a comfortable point, just needing a few tweaks to make it even better. Excited to keep refining it.

Final Project Second Draft: Mar’s Pie Shop

Concept:

I love baking, and I enjoy baking games. For my final project I decided to make a pie shop… Throughout my journey in Intro to IM, I made sure to always add a piece of me to every assignment and project. This final one is especially dear to me because I was able to finish it after a lot of hard work, and because I was able to add my own art to it (the background and pie images). I think it all goes well together, and I’m proud of it.

FUN FACT: I have never had pie in my life!!!

Highlight of My Code in P5.js:

-Function to check the correct match

function submitAnswer() {
  // Define the expected ranges for each statement
  const rawRange = [0, 200];
  const undercookedRange = [201, 400];
  const perfectRange = [401, 600];
  const overcookedRange = [601, 800];
  const burntRange = [801, 1023];

  let expectedRange;

  // Determine the expected range based on the current sentence
  switch (currentSentence) {
    case "I would like a raw pie to bake at home.":
      expectedRange = rawRange;
      break;
    case "I like my pie extremely soft.":
      expectedRange = undercookedRange;
      break;
    case "I want a perfect pie!":
      expectedRange = perfectRange;
      break;
    case "Crispy around the edges!!":
      expectedRange = overcookedRange;
      break;
    case "BURNT IT >:D":
      expectedRange = burntRange;
      break;
  }

  // Check if the current image range matches the expected range
  if (x >= expectedRange[0] && x <= expectedRange[1]) {
    // Increment the score for a correct match
    score++;
    ding.play();
  }
}

-Different screen pages:

if (gameState === "start") {
  drawStartScreen();
} else if (gameState === "playing") {
  displayFoodImage();
  displayTimer();
  textSize(24);
  fill(255);
  text("Score: " + score, width - 100, 30);
} else if (gameState === "end") {
  // Display results in the end screen
  textSize(210);
  fill(255);
  text("Your Score: " + score, width / 2, height / 2);
  playAgainButton.show(); // Show the button on the end screen
}
else {
  // Draw other game elements here
  playAgainButton.hide(); // Hide the button if not in the end state
}

Arduino IDE Code:

int potPin = A0;      // Potentiometer connected to analog pin A0
int buttonPin = 2;    // Digital button connected to digital pin 2

void setup() {
  Serial.begin(9600); // Set the baud rate to the same value as in p5.js
  pinMode(buttonPin, INPUT_PULLUP); // Set the digital button pin as input with internal pull-up resistor
}

void loop() {
  int potValue = analogRead(potPin); // Read potentiometer value (0 to 1023)
  int buttonState = digitalRead(buttonPin); // Read button state (HIGH or LOW)
  Serial.print(potValue); // Send potentiometer value to p5.js
  Serial.print(',');
  Serial.println(buttonState); // Send button state to p5.js

  delay(550); 
}

Components:

-LED Button

-Potentiometer

Oven Prototype and Schematics:

 

 

User Testing:

 

Challenges:

  • Many challenges came up in the coding part.
  • The button wasn’t working, both to confirm answers and to change the order requests.
  • Sometimes the images controlled by the potentiometer would not show, or would not change.
  • I also had some issues with the serial connection because of the wiring.
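For the button problem specifically, one possible fix is edge detection on the p5.js side: act only on the frame where the reported state changes from released to pressed, instead of on every serial message. Here is a sketch under the INPUT_PULLUP wiring used in the Arduino code below, where 0 means pressed (makeEdgeDetector is a name I’m inventing, not something from my sketch):

```javascript
// Hypothetical edge detector: returns true only on the frame where
// the button state goes from released (1) to pressed (0).
function makeEdgeDetector() {
  let previous = 1; // INPUT_PULLUP idles HIGH (1) when unpressed
  return function justPressed(state) {
    const pressed = previous === 1 && state === 0;
    previous = state;
    return pressed;
  };
}
```

Feeding each incoming buttonState through justPressed would make one physical press confirm exactly one answer, no matter how many serial frames arrive while the button is held.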

Future Improvements:

  • Make the game restart without having to exit full screen.
  • Make the prototype a little prettier.
  • Fine-tune the difficulty level based on user feedback. Adjust timer durations, scoring mechanisms, or image recognition ranges to make the game challenging yet enjoyable.
  • Introduce new challenges or power-ups to add variety to the gameplay.
  • Ensure that the game is responsive and looks good on various screen sizes.

 

FINAL PROJECT- MUSICAL MAP

For my final project, I had to change the concept at the last minute, since my midi controller idea fell through and proved unfit for this assignment. My project is now a musical map that lets users play music from different regions of the world. I initially wanted to create a map of the globe but decided to consolidate my map into one continent: South America (SA). South America resonated with me the most because of my love of Latin American music! I was initially going to take an image of the map off the internet, but decided to design my own map of SA. This was designed in Adobe Illustrator.

I picked a total of 13 songs all from different parts of SA:

Brazil: Dindi – Sylvia Telles

Venezuela: Mi Corazon – Los Melodicos

Colombia: Tobaco y Ron – Rodolfo y su typica ra7

Ecuador: Escribeme – Tierra Canela

Suriname: Moi Misi – Sabokoe

Guyana: Catch me Lova – JMC

French Guiana: Kraze – Dj Action X Jaydieff

Peru: Peru de Valses: Y se llama Peru – Oscar Avilles

Chile: 1977 – Ana Tijoux

Argentina: Buenos Aires – Nathy Peluso

Paraguay: Inevitable – Shako El Sh

Uruguay: Cora Aislado – Bebe 21

Bolivia: Amor Segrado – Chila Jatun

 

USER TESTING:

MUSICAL MAP

Interaction Design:

Users can simply move the joystick, which acts as a cursor on the screen, and press the squares in the p5 sketch to trigger a unique sound file from each country.

Aesthetic Design:

I made my box out of cardboard and painted on several separate sheets of paper a similar color scheme to my digital design of the map. I then glued those pieces of paper onto the box!

P5 Code Sketch:

https://editor.p5js.org/dianadonatella/sketches/NDBhUIeWI

P5 Code:

let img; //variable for image

//variables for square buttons.
let Colombia;
let Argentina;
let Peru;
let Paraguay;
let Uruguay;
let Guyana;
let Fguiana;
let Venezuela;
let Brazil;
let Bolivia;
let Ecuador;
let Chile;
let Suriname;

//colors for each country's buttons
let color_state_col;
let color_state_ven;
let color_state_arg;
let color_state_bra;
let color_state_bol;
let color_state_per;
let color_state_guy;
let color_state_par;
let color_state_uru;
let color_state_chi;
let color_state_fgu;
let color_state_sur;
let color_state_ecu;

//PLAYING AND PAUSING states
let isColombiaPlaying = false;
let isVenezuelaPlaying = false;
let isArgentinaPlaying = false;
let isBrazilPlaying = false;
let isBoliviaPlaying = false;
let isPeruPlaying = false;
let isGuyanaPlaying = false;
let isParaguayPlaying = false;
let isUruguayPlaying = false;
let isChilePlaying = false;
let isFguianaPlaying = false;
let isSurinamePlaying = false;
let isEcuadorPlaying = false;

//ORIGINAL BUTTON COLORS
let originalColorCol;
let originalColorVen;
let originalColorArg;
let originalColorBra;
let originalColorBol;
let originalColorPer;
let originalColorGuy;
let originalColorPar;
let originalColorUru;
let originalColorChi;
let originalColorFgu;
let originalColorSur;
let originalColorEcu;

//For joystick
let joystick_X;
let joystick_Y;
let joystick_button;
let circleX;
let circleY;

//Only update when Joystick is pressed, making it equal to the time that the joystick was pressed at
let LastTime = 0;
// Always updated with the latest number of Millis
let CurrentTime = 0;

function success(e) {
  console.log("works: ", e);
}

function error(e) {
  console.log("oopsie: ", e);
}

function loading(e) {
  console.log("loading", e);
}


function preload() {
  img = loadImage("SA_MAP.png");
  Colombia = loadSound("./COLOMBIA.mp3", success, error, loading);
  Argentina = loadSound("./ARGENTINA.mp3");
  Peru = loadSound("./PERU.mp3");
  Paraguay = loadSound("./PARAGUAY.mp3");
  Uruguay = loadSound("./URUGUAY.mp3");
  Guyana = loadSound("./GUYANA.mp3");
  Fguiana = loadSound("./FGUIANA.mp3");
  Venezuela = loadSound("./VENEZUELA.mp3");
  Brazil = loadSound("./BRAZIL.mp3");
  Bolivia = loadSound("./BOLIVIA.mp3");
  Ecuador = loadSound("./ECUADOR.mp3");
  Chile = loadSound("./CHILE.mp3");
  Suriname = loadSound("./SURINAME.mp3");
}

function setup() {
  createCanvas(1920, 1020);
  background(220);
  //   colors for each button
  color_state_col = color(32, 37, 179);
  color_state_ven = color(35, 60, 98);
  color_state_arg = color(32, 37, 179);
  color_state_bra = color(100, 57, 79);
  color_state_bol = color(35, 60, 98);
  color_state_per = color(32, 37, 179);
  color_state_guy = color(35, 60, 98);
  color_state_par = color(24, 32, 89);
  color_state_uru = color(24, 32, 89);
  color_state_chi = color(100, 57, 79);
  color_state_fgu = color(24, 32, 89);
  color_state_sur = color(32, 37, 179);
  color_state_ecu = color(24, 32, 89);

  // store the original colors so they can be restored when a song is paused
  originalColorCol = color_state_col;
  originalColorVen = color_state_ven;
  originalColorArg = color_state_arg;
  originalColorBra = color_state_bra;
  originalColorBol = color_state_bol;
  originalColorPer = color_state_per;
  originalColorGuy = color_state_guy;
  originalColorPar = color_state_par;
  originalColorUru = color_state_uru;
  originalColorChi = color_state_chi;
  originalColorFgu = color_state_fgu;
  originalColorSur = color_state_sur;
  originalColorEcu = color_state_ecu;

  circleX = width / 2;
  circleY = height / 2;
}

function draw() {
  if (img) {
    image(img, 0, 0, width, height);
  }
  
  // draw each country's button using its current color and size
  //COLOMBIA
  fill(color_state_col);
  rect(790, 135, 40, 40);

  //VENEZUELA
  fill(color_state_ven);
  rect(910, 80, 35, 35);

  //ARGENTINA
  fill(color_state_arg);
  rect(940, 700, 50, 50);

  //BRAZIL
  fill(color_state_bra);
  rect(1150, 400, 50, 50);

  //BOLIVIA
  fill(color_state_bol);
  rect(935, 435, 40, 40);

  //PERU
  fill(color_state_per);
  rect(800, 360, 40, 40);

  //GUYANA
  fill(color_state_guy);
  rect(1038, 125, 23, 23);

  //PARAGUAY
  fill(color_state_par);
  rect(1012, 490, 30, 30);

  //URUGUAY
  fill(color_state_uru);
  rect(1120, 700, 30, 30);

  //CHILE
  fill(color_state_chi);
  rect(800, 600, 40, 40);

  //FGUIANA
  fill(color_state_fgu);
  rect(1170, 90, 30, 30);

  //SURINAME
  fill(color_state_sur);
  rect(1090, 58, 25, 25);

  //ECUADOR
  fill(color_state_ecu);
  rect(650, 200, 30, 30);
  
  // move the cursor based on the joystick's X and Y readings

  if (joystick_X > 800) {
    circleX += 5;
  }
  if (joystick_X < 200) {
    circleX -= 5;
  }

  if (joystick_Y > 800) {
    circleY -= 5;
  }
  if (joystick_Y < 200) {
    circleY += 5;
  }
  
  //cursor shape and design
  
  fill(0, 0, 0, 100);
  circle(circleX, circleY, 30);
  stroke(255);
  line(circleX, circleY + 30, circleX, circleY - 30);
  line(circleX - 30, circleY, circleX + 30, circleY);
  JoyStickPressed();
  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
  }

  // console.log(circleX); // debug output left over from development
}
// Checks whether the joystick button was pressed, and plays or pauses the song for the country under the cursor

function JoyStickPressed() {


  //Updating current time
  CurrentTime = millis();
  if (joystick_button == 1 && CurrentTime > LastTime + 2000) {
    // Check if the joystick is within the Colombia button
    if (
      circleX > 790 &&
      circleX < 790 + 40 &&
      circleY > 135 &&
      circleY < 135 + 40
    ) {
      if (!isColombiaPlaying) {
        Colombia.play();
      } else {
        Colombia.pause();
      }
      isColombiaPlaying = !isColombiaPlaying;
      color_state_col = isColombiaPlaying ? color(255) : originalColorCol;
    }

    if (
      circleX > 910 &&
      circleX < 910 + 35 &&
      circleY > 80 &&
      circleY < 80 + 35
    ) {
      if (!isVenezuelaPlaying) {
        Venezuela.play();
      } else {
        Venezuela.pause();
      }
      isVenezuelaPlaying = !isVenezuelaPlaying;
      color_state_ven = isVenezuelaPlaying ? color(255) : originalColorVen;
    }

    if (
      circleX > 940 &&
      circleX < 940 + 50 &&
      circleY > 700 &&
      circleY < 700 + 50
    ) {
      if (!isArgentinaPlaying) {
        Argentina.play();
      } else {
        Argentina.pause();
      }
      isArgentinaPlaying = !isArgentinaPlaying;
      color_state_arg = isArgentinaPlaying ? color(255) : originalColorArg;
    }

    if (
      circleX > 1150 &&
      circleX < 1150 + 50 &&
      circleY > 400 &&
      circleY < 400 + 50
    ) {
      if (!isBrazilPlaying) {
        Brazil.play();
      } else {
        Brazil.pause();
      }
      isBrazilPlaying = !isBrazilPlaying;
      color_state_bra = isBrazilPlaying ? color(255) : originalColorBra;
    }

    if (
      circleX > 935 &&
      circleX < 935 + 40 &&
      circleY > 435 &&
      circleY < 435 + 40
    ) {
      if (!isBoliviaPlaying) {
        Bolivia.play();
      } else {
        Bolivia.pause();
      }
      isBoliviaPlaying = !isBoliviaPlaying;
      color_state_bol = isBoliviaPlaying ? color(255) : originalColorBol;
    }

    if (
      circleX > 800 &&
      circleX < 800 + 40 &&
      circleY > 360 &&
      circleY < 360 + 40
    ) {
      if (!isPeruPlaying) {
        Peru.play();
      } else {
        Peru.pause();
      }
      isPeruPlaying = !isPeruPlaying;
      color_state_per = isPeruPlaying ? color(255) : originalColorPer;
    }
    if (
      circleX > 1038 &&
      circleX < 1038 + 23 &&
      circleY > 125 &&
      circleY < 125 + 23
    ) {
      if (!isGuyanaPlaying) {
        Guyana.play();
      } else {
        Guyana.pause();
      }
      isGuyanaPlaying = !isGuyanaPlaying;
      color_state_guy = isGuyanaPlaying ? color(255) : originalColorGuy;
    }

    if (
      circleX > 1012 &&
      circleX < 1012 + 30 &&
      circleY > 490 &&
      circleY < 490 + 30
    ) {
      if (!isParaguayPlaying) {
        Paraguay.play();
      } else {
        Paraguay.pause();
      }
      isParaguayPlaying = !isParaguayPlaying;
      color_state_par = isParaguayPlaying ? color(255) : originalColorPar;
    }

    if (
      circleX > 1120 &&
      circleX < 1120 + 30 &&
      circleY > 700 &&
      circleY < 700 + 30
    ) {
      if (!isUruguayPlaying) {
        Uruguay.play();
      } else {
        Uruguay.pause();
      }
      isUruguayPlaying = !isUruguayPlaying;
      color_state_uru = isUruguayPlaying ? color(255) : originalColorUru;
    }

    if (
      circleX > 800 &&
      circleX < 800 + 40 &&
      circleY > 600 &&
      circleY < 600 + 40
    ) {
      if (!isChilePlaying) {
        Chile.play();
      } else {
        Chile.pause();
      }
      isChilePlaying = !isChilePlaying;
      color_state_chi = isChilePlaying ? color(255) : originalColorChi;
    }

    if (
      circleX > 1170 &&
      circleX < 1170 + 30 &&
      circleY > 90 &&
      circleY < 90 + 30
    ) {
      if (!isFguianaPlaying) {
        Fguiana.play();
      } else {
        Fguiana.pause();
      }
      isFguianaPlaying = !isFguianaPlaying;
      color_state_fgu = isFguianaPlaying ? color(255) : originalColorFgu;
    }

    if (
      circleX > 1090 &&
      circleX < 1090 + 25 &&
      circleY > 58 &&
      circleY < 58 + 25
    ) {
      if (!isSurinamePlaying) {
        Suriname.play();
      } else {
        Suriname.pause();
      }
      isSurinamePlaying = !isSurinamePlaying;
      color_state_sur = isSurinamePlaying ? color(255) : originalColorSur;
    }

    if (
      circleX > 650 &&
      circleX < 650 + 30 &&
      circleY > 200 &&
      circleY < 200 + 30
    ) {
      if (!isEcuadorPlaying) {
        Ecuador.play();
      } else {
        Ecuador.pause();
      }
      isEcuadorPlaying = !isEcuadorPlaying;
      color_state_ecu = isEcuadorPlaying ? color(255) : originalColorEcu;
    }
    LastTime = millis();
  }
}
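The thirteen near-identical hit checks above could be collapsed into a data-driven loop. The sketch below is illustrative (BUTTONS, hitTest, and toggleCountry are names I'm introducing, not part of the original code); the hit-test itself is a pure function that can be tested on its own:

```javascript
// One entry per country, matching the rects drawn in draw().
const BUTTONS = [
  { name: "Colombia",  x: 790, y: 135, size: 40 },
  { name: "Venezuela", x: 910, y: 80,  size: 35 },
  // ... remaining eleven countries
];

// True when the point (px, py) falls inside a square button.
function hitTest(btn, px, py) {
  return px > btn.x && px < btn.x + btn.size &&
         py > btn.y && py < btn.y + btn.size;
}

// JoyStickPressed() could then become a single loop:
// for (const btn of BUTTONS) {
//   if (hitTest(btn, circleX, circleY)) toggleCountry(btn.name);
// }
```

Each entry could also carry its sound object and color state, so adding a fourteenth country would be one new table row rather than a new if-block.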

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}


//TO SET UP SERIAL CONNECTION
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 3) {
      // only store values here
      // do everything with those values in the main draw loop

      // We take the string we get from Arduino and explicitly
      // convert it to a number by using int()
      // e.g. "103" becomes 103
      joystick_X = int(fromArduino[0]);
      joystick_Y = int(fromArduino[1]);
      joystick_button = int(fromArduino[2]);
    }
  }
}
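The parsing inside readSerial() can be factored into a pure function, so the format handling is testable without a serial connection. parseJoystickLine is an illustrative name; it mirrors the trim/split/int() steps above and returns null for malformed lines:

```javascript
// Parse one "x,y,button" line from the Arduino into numbers,
// or return null if the line is missing or malformed.
function parseJoystickLine(data) {
  if (data == null) return null;
  const parts = data.trim().split(",");
  if (parts.length !== 3) return null; // wrong field count
  const [x, y, button] = parts.map(Number);
  if ([x, y, button].some(Number.isNaN)) return null; // non-numeric field
  return { x, y, button };
}
```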

Arduino Code:

int xValue = 0 ;
int yValue = 0 ; 
int bValue = 0 ;

void setup()	
{	
    Serial.begin(9600) ;
    pinMode(8,INPUT); 
    digitalWrite(8,HIGH); // enable the internal pull-up on pin 8 (equivalent to INPUT_PULLUP)
}	

void loop()	
{	
    xValue = analogRead(A2);	
    yValue = analogRead(A1);	
    bValue = digitalRead(8);	
    Serial.print(xValue,DEC);
    Serial.print(",");
    Serial.print(yValue,DEC);
    Serial.print(",");
    Serial.println(!bValue);

    delay(10);	
}

Schematic:

 

CHALLENGES:

I initially had quite a few issues in p5.js trying to upload the song files. I realized it was because I had placed all the songs in a folder, and p5.js couldn’t find the path to that folder. I also had issues with organizing the square buttons on the screen, as well as cueing and pausing the music in combination with the joystick!

Improvements:

I would really like to make my p5.js sketch a bit more elaborate, and maybe add some sort of extra LED aspect to my Arduino to make things a bit more exciting!

 

VIDEOS FROM THE SHOWCASE:

IMG_7438

IMG_7445

The showcase was incredibly fun and I really loved seeing everyone’s projects. I am very excited for building more fun projects in the future!!

 

PicturePuzzle- Final Project

Concept:

My project concept is one I came to fall in love with. I have always liked taking pictures, so being able to create an interactive environment where I can solve a puzzle using a picture I just took is very fun. I think it is also a metaphor for just how challenging but fun this class was. The puzzle represents some of the challenges I faced throughout the class, and building it represents the way my skills developed over the 14 weeks. Looking at the finished puzzle and feeling proud of yourself for solving it represents how happy and proud I am of myself, not only for finishing the class, but for doing so with new skills and, I guess, a love for physical computing. I feel like a genius, which is exactly what my LCD screen says when I complete the puzzle.

Schematic and Modelling:

Images of project:

User Testing videos:

How does the implementation work?

The project’s core idea is straightforward: manipulate the puzzle pieces using the joystick and follow on-screen instructions by pressing the buttons accordingly. The inspiration behind this setup actually came from the PS5 controller. I was dead set on incorporating a joystick into the design, and that sparked the whole concept.

Description of interaction design:

The design is meant to be user-friendly. I aimed for simplicity, assuming everyone knows the basics of a joystick. The on-screen instructions should guide players smoothly through the game. It’s a balance—simple enough for anyone to dive in yet not overly basic to bore them. That’s where the difficulty levels come in. I’ve got three, but I’ve only tested two myself, only managing to crack the first level. Give it a shot and see how far you can go!

Description of Arduino code + code snippets

The Arduino sketch I’ve got sets up a cool interface using buttons and a joystick to handle an LCD display and communicate using serial communication. First off, it gets things going by setting up pins for the buttons and joystick, along with initializing a LiquidCrystal display. The setup() function takes care of configuring serial communication, initializing pins, and prepping the LCD screen. The loop() function takes care of button presses and joystick moves. And when it detects these actions, it sends messages through serial communication. Each action—like a mouse click, difficulty level selection, joystick move, or even a command to snap a picture—is translated into a specific message. This sketch is a great listener too. It keeps an ear out for incoming messages through the serial port. When it gets specific ones like ‘TIMER:’, ‘MOVES:’, or ‘SOLVED’, it updates the LCD screen accordingly. So, you’ll see things like timer values ticking away, move counts racking up, and a sweet congratulatory message popping up when you crack that puzzle.
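On the p5.js side, each of the strings this sketch prints has to be mapped to a game action. Below is a hedged sketch of that dispatch; dispatchArduinoMessage and the returned action labels are illustrative names, not the project's actual functions:

```javascript
// Map one incoming serial line from the Arduino to a game action label.
function dispatchArduinoMessage(msg) {
  switch (msg.trim()) {
    case "MOUSE_CLICK":
      return "click";                       // X button acts as a mouse click
    case "1": case "2": case "3":
      return "difficulty:" + msg.trim();    // difficulty-level selection
    case "LEFT": case "RIGHT":
    case "UP": case "DOWN":
      return "move:" + msg.trim();          // joystick tile moves
    case "C":
      return "capture";                     // snap a picture
    default:
      return "unknown";                     // anything unrecognized
  }
}
```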

#include <LiquidCrystal.h>

const int XbuttonPin = 2;
const int SbuttonPin = 3;
const int TbuttonPin = 4;
const int CbuttonPin = 5;
const int joystickXPin = A0; // Analog pin for joystick X-axis
const int joystickYPin = A1; // Analog pin for joystick Y-axis
const int threshold = 50; // Threshold for joystick sensitivity
//bool isDifficulty = false;

LiquidCrystal lcd(6, 12, 11, 10, 9, 8);

void setup() {
  Serial.begin(9600);
  pinMode(XbuttonPin, INPUT_PULLUP);
  pinMode(SbuttonPin, INPUT_PULLUP);
  pinMode(TbuttonPin, INPUT_PULLUP);
  pinMode(CbuttonPin, INPUT_PULLUP);

  lcd.begin(16, 2);
  lcd.clear();
}

void loop() {
  if (digitalRead(XbuttonPin) == LOW) {
    Serial.println("MOUSE_CLICK");
    delay(1000); // Debounce delay
  }
  
  if (digitalRead(SbuttonPin) == LOW) {
    Serial.println('2');
    delay(100); // Debounce delay
  }
  
  if (digitalRead(TbuttonPin) == LOW) {
    Serial.println('1');
    delay(1000); // Debounce delay
  }
  
  if (digitalRead(CbuttonPin) == LOW) {
    Serial.println('3');
    delay(100); // Debounce delay
  }

  // NOTE: TbuttonPin is read again here, so a single press of the T button
  // sends both '1' (above) and 'C'; this may be intentional or a leftover.
  if (digitalRead(TbuttonPin) == LOW) {
    Serial.println('C');
    delay(100); // Debounce delay
  }
  
  int xVal = analogRead(joystickXPin); // Read X-axis value
  int yVal = analogRead(joystickYPin); // Read Y-axis value

  if (xVal < 512 - threshold) {
    Serial.println("LEFT");
    delay(500); // Debounce delay
  } else if (xVal > 512 + threshold) {
    Serial.println("RIGHT");
    delay(500); // Debounce delay
  }

  if (yVal < 512 - threshold) {
    Serial.println("DOWN");
    delay(500); // Debounce delay
  } else if (yVal > 512 + threshold) {
    Serial.println("UP");
    delay(500); // Debounce delay
  }
  if (Serial.available() > 0) {
    String message = Serial.readStringUntil('\n');
    
    if (message.startsWith("TIMER:")) {
      lcd.setCursor(0, 0);
      lcd.print(message.substring(6)); // Print the timer message
    } else if (message.startsWith("MOVES:")) {
      lcd.setCursor(0, 1);
      lcd.print("Moves: " + message.substring(6)); // Print the move counter message
    } else if (message == "SOLVED") {
      lcd.clear();
      lcd.setCursor(0, 0);
      lcd.print("You are a GENIUS!!!");
      delay(5000); // Display the congratulations message for 5 seconds
      lcd.clear();
    }
  }
}

Description of p5.js code + code snippets + embedded sketch

My p5.js puzzle game lets you interact with images, customizable with varying difficulty levels. It progresses through welcome, instructions, and gameplay phases. The Puzzle class manages the game mechanics—moves, shuffling tiles, checking progress, and joystick input. The draw() function orchestrates screens, responding to commands from sources like Arduino. Background music sets the tone for welcome and instructions, fading during gameplay focus. The setup() function initializes the canvas, video feed, and initial puzzle grid, making it the core of this interactive experience.
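Two of the Puzzle-class responsibilities described above — checking progress and shuffling — can be sketched as standalone, testable functions. This is an illustrative restatement, not the original class: it assumes a row-major board layout (index k is column k % cols, row floor(k / cols)) and uses -1 for the blank tile. Shuffling by legal blank-neighbor swaps guarantees the board stays solvable, unlike a naive random permutation:

```javascript
// True when the board matches the stored solved ordering.
function isSolved(board, solvedOrder) {
  return board.length === solvedOrder.length &&
         board.every((v, k) => v === solvedOrder[k]);
}

// Shuffle by repeatedly sliding a random neighbor into the blank.
function shuffleBySwaps(board, cols, rows, steps) {
  const b = board.slice();
  for (let s = 0; s < steps; s++) {
    const blank = b.indexOf(-1);
    const col = blank % cols;
    const row = Math.floor(blank / cols);
    const neighbors = [];
    if (col > 0) neighbors.push(blank - 1);            // tile to the left
    if (col < cols - 1) neighbors.push(blank + 1);     // tile to the right
    if (row > 0) neighbors.push(blank - cols);         // tile above
    if (row < rows - 1) neighbors.push(blank + cols);  // tile below
    const pick = neighbors[Math.floor(Math.random() * neighbors.length)];
    [b[blank], b[pick]] = [b[pick], b[blank]];         // slide one tile
  }
  return b;
}
```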

function captureAndSetupPuzzle(video) {
  if (video) {
    source = video.get();
    source.loadPixels(); // Ensure pixels are loaded
    if (source.width > 0 && source.height > 0) {
      // Resize the source image to fit the canvas
      source.resize(width, height);
      video.hide();

      w = Math.floor(width / cols);
      h = Math.floor(height / rows);

      for (let i = 0; i < cols; i++) {
        for (let j = 0; j < rows; j++) {
          let x = i * w;
          let y = j * h;
          let img = source.get(x, y, w, h); // Get a portion of the image for each tile

          if (i === cols - 1 && j === rows - 1) {
            board.push(-1);
            puzzle.tiles.push(new Tile(-1, img));
          } else {
            let index = i + j * cols;
            board.push(index);
            puzzle.tiles.push(new Tile(index, img));
          }
        }
      }

      puzzle.board = board.slice();
      puzzle.simpleShuffle(puzzle.board);

      currentScreen = 'game';
      puzzle.startTimer();
    } else {
      console.error("Error loading the video source");
    }
  }
}
function joystick(puzzle, direction) {
  let xOffset = (width - puzzle.w * puzzle.cols) / 1.3;
  let yOffset = (height - puzzle.h * puzzle.rows) / 2.5;

  // Calculate the tile indices based on joystick direction
  let i = -1,
    j = -1;
  let blank = puzzle.findBlank();
  let blankCol = blank % puzzle.cols;
  let blankRow = Math.floor(blank / puzzle.rows); // NOTE: assumes a square grid; floor(blank / puzzle.cols) would generalize

  switch (direction) {
    case 'LEFT':
      i = blankCol + 1;
      j = blankRow;
      moveSound.play();
      break;

    case 'RIGHT':
      i = blankCol - 1;
      j = blankRow;
      moveSound.play();
      break;

    case 'UP':
      i = blankCol;
      j = blankRow + 1;
      moveSound.play();
      break;

    case 'DOWN':
      i = blankCol;
      j = blankRow - 1;
      moveSound.play();
      break;

    default:
      // Handle other cases or unknown commands
      break;
  }

  if (i >= 0 && i < puzzle.cols && j >= 0 && j < puzzle.rows) {
    puzzle.move(i, j, puzzle.board);
  }
  writeSerial("MOVES:" + puzzle.getMoves() + "\n");
 // puzzle.updateTimer(); // Update the timer
 // puzzle.displayTimer(); // Display the timer
}
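The direction-to-tile arithmetic in the switch above can be extracted into a pure function for testing. tileToSlide is an illustrative name; here the row is derived with floor(blank / cols), which also works for non-square grids:

```javascript
// Given the blank tile's index and a joystick direction, return the
// grid coordinates {i, j} of the tile that should slide into the
// blank, or null when that tile would be off the board.
function tileToSlide(blank, cols, rows, direction) {
  const col = blank % cols;
  const row = Math.floor(blank / cols);
  let i = -1, j = -1;
  switch (direction) {
    case "LEFT":  i = col + 1; j = row;     break; // tile right of blank moves left
    case "RIGHT": i = col - 1; j = row;     break; // tile left of blank moves right
    case "UP":    i = col;     j = row + 1; break; // tile below blank moves up
    case "DOWN":  i = col;     j = row - 1; break; // tile above blank moves down
  }
  if (i < 0 || i >= cols || j < 0 || j >= rows) return null;
  return { i, j };
}
```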

Description of communication between Arduino and p5.js

In my project, the Arduino and p5.js are like buddies chatting through a USB connection using serial communication. The Arduino sends messages over to p5.js, which eagerly listens and understands what’s being said. Depending on these messages, p5.js swings into action—triggers different functions, or tweaks how the whole thing behaves. It’s like they’re choreographing a dance together! This connection allows the tangible aspects of my project—like button pushes or sensor readings from the Arduino—affect what happens on the digital playground of the p5.js sketch. It’s a smooth back-and-forth: the Arduino talks to p5.js, guiding its moves, while p5.js keeps the Arduino in the loop about how the game’s going—sending updates on moves made and puzzles solved. They’ve got this teamwork thing down, with the Arduino shaking things up in the digital world and p5.js keeping it informed about what’s happening in the game.
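The Arduino's half of this conversation — the startsWith() checks in its LCD handler — can be restated in JavaScript and checked without hardware. handleLcdMessage is an illustrative name; the line numbers and texts mirror the Arduino sketch above:

```javascript
// Emulate the Arduino's LCD message handling: return which LCD line
// to write and what text, or null for unrecognized messages.
function handleLcdMessage(message) {
  if (message.startsWith("TIMER:")) {
    return { line: 0, text: message.substring(6) };            // timer value
  } else if (message.startsWith("MOVES:")) {
    return { line: 1, text: "Moves: " + message.substring(6) }; // move counter
  } else if (message === "SOLVED") {
    return { line: 0, text: "You are a GENIUS!!!" };            // win message
  }
  return null; // anything else is ignored
}
```

On the p5.js side, the matching messages are built with plain string concatenation, as in the `writeSerial("MOVES:" + puzzle.getMoves() + "\n")` call shown earlier.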

What are some aspects of the project that you’re particularly proud of?

One thing I’m really excited about is achieving two-way communication in my project. It was a bit of a hurdle because I’ve always been more comfortable with software like p5.js, which made tasks like controlling mouse clicks and key presses with Arduino a breeze. But the real challenge came when I needed to send information back from p5.js to the Arduino IDE. Figuring out how to establish that connection was a bit tricky, but once I got it working, it felt like a big win.

Another part that I’m super proud of is the capture process. Initially, I had the picture saved and then converted it into the grid. However, I realized this meant that everyone’s picture would end up stored on my laptop, and I wasn’t keen on that idea. So, I reworked the code to immediately convert the picture as soon as it’s taken. I love this feature because it ensures privacy by not saving anyone’s picture on my device, and it’s more immediate and seamless for the users.

Links to resources used.

SparkFun Inventor’s Kit Experiment Guide – v4.1 – SparkFun Learn

Coding Challenge 165: Slide Puzzle – YouTube

reference | createVideo() (p5js.org)

How to connect and use an Analog Joystick with an Arduino – Tutorial – YouTube

LED pattern with button control on Arduino | Arduino Tutorial – YouTube

Challenges:

One of the toughest parts I tackled was turning the picture into a grid and getting those tiles to shift around. The real challenge here wasn’t just making the picture the puzzle, but doing it on the spot—taking a photo and instantly turning it into a puzzle. It felt like a puzzle itself! For a while, I hit a roadblock. I could snap the picture and form the tiles, but they just wouldn’t budge. Turned out, they weren’t connecting to my puzzle class properly. It was like they were stuck in place. To fix this, I had to dive into the puzzle class and really fine-tune how the capturing worked. It was trial and error, a lot of experimenting. Then, I stumbled upon the createVideo() function in p5.js, and that was a game-changer. It helped me get things back on track and finally, my project started to click.

Future Work:

I’ve been thinking about how to take this project to the next level and turn it into more of a game. For me, this whole thing has been a way to relax and unwind, not really about competing. But I think adding a competitive element could make it way more interesting. I’ve been considering the idea of making it multiplayer, so that more than one player can get involved in the game. Imagine friends coming together, enjoying some puzzling challenges, and maybe even a bit of friendly competition. It could be a great way for them to hang out and have a good time. By making these changes, I believe it’ll become a fun escape for folks who enjoy puzzles or just want to kick back and have some light-hearted rivalry.

IM SHOWCASE!!!!

Final Project – MINI DISCO

Mini Disco

User testing:

Reflecting on the user testing of my Mini Disco project with my friend, I found that the experience was quite self-explanatory and intuitive. The design, featuring an arrow pointing towards the sound sensor, seemed to effectively guide the user without the need for additional instructions. My friend was able to figure out the interaction – singing or making noise to trigger the light show – quite easily.

From this testing, I realized the strengths of my project lie in its simplicity and the immediate engagement it offers. Users can interact naturally without needing a detailed explanation, which I believe is a key aspect of a successful interactive design.

However, I see an opportunity for improvement in the visual aspect of the project. Initially, I used cotton to diffuse the light from the RGB LED, but a dome might have been the better option; with the lack of materials, however, I wasn’t able to use one. My aim is to enhance the visual impact of the light display, and I envision that a dome could better reflect and spread the light, creating a more immersive and expansive effect that more closely mimics the vibrant atmosphere of a disco.

This change, I believe, could elevate the overall experience, making it not just an interactive piece but also a more captivating visual spectacle. The challenge will be to integrate the dome in a way that complements the existing design while also enhancing the interplay of light and sound.

Concept:

Originally, I envisioned creating a dynamic light source that would visually respond to sound. My plan was to set up a microphone/sound sensor(analog signal) to capture the surrounding audio vibes, where different frequencies and volumes would trigger varied light displays in an RGB LED. For instance, high-pitched sounds would shift the LED towards blue hues, while deep bass notes would turn it red.

I intended to use P5.js for color mapping, transforming the intensity and frequency of the captured sound into dynamic, responsive color schemes. The idea was to have the visuals come alive with vibrant colors and gradients, creating a visually harmonious representation of the audio.

Despite a minor adjustment to my original plan, the essence of the project remains intact. Initially, I intended to use a frequency-sensitive sound sensor, but due to its malfunction I had to opt for a readily available sensor that operates on a digital signal. While this new sensor doesn’t detect varied sound frequencies, it reliably registers volume levels, so the color transitions of the LED now respond to the loudness or softness of the surrounding sounds.

 

How does the implementation work?

Arduino Implementation:

In my Arduino setup, I began by establishing serial communication at a baud rate of 9600, a crucial step for enabling data exchange between the Arduino and my computer. I configured pin 8 as an input to connect my digital sound sensor, which serves as the project’s primary interactive element. Additionally, pins 9, 10, and 11 were set as outputs for controlling the red, green, and blue channels of an RGB LED, allowing me to create a wide range of colors. In the loop function, I constantly read the state of the sound sensor. If no sound is detected (soundData == LOW), the RGB LED emits a blue light; with sound, it glows red. This immediate visual feedback is achieved by manipulating the LED’s color through the changeLEDColor function, using analogWrite to adjust each color channel. Alongside controlling the LED, I also send the sound sensor’s state as serial data to my computer, where it’s used in the p5.js sketch for a corresponding visual display.

p5.js Sketch Implementation

In parallel with the Arduino setup, I developed a p5.js sketch to create a digital visual representation corresponding to the physical inputs from the sound sensor. The sketch initializes by creating a canvas and populating it with a series of particles, each represented by an instance of the Particle class. These particles are given random positions across the canvas, along with properties for size, color, and movement speed. The heart of the sketch lies in the readSerial function, responsible for reading and processing the serial data sent from the Arduino. This data, indicating the presence or absence of sound, is used to dynamically alter the behavior of the particles on the canvas. In the draw function, I update the background and set the text properties. If the serial connection is not yet established, the sketch prompts the user to initiate the connection. Once connected, the sketch confirms this with a display message and starts animating the particles based on the sensor data. The particles grow in size and move smoothly across the canvas when sound is detected, creating a visually engaging and responsive digital environment that mirrors the physical inputs from the Arduino.

Schematic

Description of Arduino code

Arduino code:
void setup() {
  Serial.begin(9600);
  pinMode(8, INPUT); // Sound sensor input

  // RGB LED pins
  pinMode(9, OUTPUT); // Red
  pinMode(10, OUTPUT); // Green
  pinMode(11, OUTPUT); // Blue
}

void loop() {
  int soundData = digitalRead(8); // Read the sound sensor
  Serial.println(soundData);      // Send sound data to serial for debugging

  if (soundData == LOW) {
    // Sound not detected - change LED to one color 
    changeLEDColor(0, 0, 255); // Blue
  } else {
    // Sound detected - change LED to another color (e.g., red)
    changeLEDColor(255, 0, 0); // Red
    delay(50);
  }
}

void changeLEDColor(int redValue, int greenValue, int blueValue) {
  analogWrite(9, redValue);   // Red channel
  analogWrite(10, greenValue); // Green channel
  analogWrite(11, blueValue);  // Blue channel
}

 

Setup Function:

void setup() {
  Serial.begin(9600);
  pinMode(8, INPUT); // Sound sensor input
  pinMode(9, OUTPUT); // Red
  pinMode(10, OUTPUT); // Green
  pinMode(11, OUTPUT); // Blue
}
  • Initializes serial communication at a baud rate of 9600. This is used for debugging purposes to send data to the serial monitor of the Arduino IDE.
  • Configures the pin connected to the sound sensor (pin 8) as an input.
  • Sets up the RGB LED pins (pins 9, 10, and 11) as outputs. Each pin controls one color component of the RGB LED (red, green, and blue, respectively).

Loop Function:

void loop() {
  int soundData = digitalRead(8); // Read the sound sensor
  Serial.println(soundData);      // Send sound data to serial for debugging

  if (soundData == LOW) {
    // Sound not detected - change LED to one color 
    changeLEDColor(0, 0, 255); // Blue
  } else {
    // Sound detected - change LED to another color
    changeLEDColor(255, 0, 0); // Red
    delay(50);
  }
}

  • Continuously reads the state of the digital sound sensor.
  • If sound is detected, the LED changes to red by calling changeLEDColor with (255, 0, 0), the RGB values for red.
  • If no sound is detected (soundData is LOW), the RGB LED is set to blue by calling changeLEDColor with (0, 0, 255), the RGB values for blue.
  • There is a short delay (delay(50)) after sound is detected, for stability and to control the rate at which the sensor is read.

changeLEDColor Function:

void changeLEDColor(int redValue, int greenValue, int blueValue) {
  analogWrite(9, redValue);   // Red channel
  analogWrite(10, greenValue); // Green channel
  analogWrite(11, blueValue);  // Blue channel
}
  • A helper function that takes three parameters: redValue, greenValue, and blueValue, each representing the intensity of the respective color channel of the RGB LED.
  • The analogWrite function is used to set the brightness of each color channel. For example, analogWrite(9, redValue); sets the brightness of the red channel.

Description of the p5.js Sketch

p5.js Sketch:
let serial;
let latestData = "waiting for data";
let particles = [];
let cols, rows;
let particleCount = 100; // Adjust for more/less particles
function setup() {
  createCanvas(windowWidth, windowHeight);

  // Create randomly positioned particles
  for (let i = 0; i < particleCount; i++) {
    let x = random(width);
    let y = random(height);
    particles.push(new Particle(x, y));
  }
}

function readSerial(data) {
  console.log(data);
  latestData = data.trim();
}

function draw() {
  background('#00003f');
  textSize(30);
  textFont('Courier New');
  textAlign(CENTER, CENTER);

  if (!serialActive) {
    fill(0, 102, 153);
    text("Press Space Bar to select Serial Port", width / 2, height / 2);
  } else {
    text("Connected", 20, 30);

    let sensorValue = parseInt(latestData);
    particles.forEach(p => {
      p.update(sensorValue);
      p.display();
    });
  }
}

function keyPressed() {
  if (key === ' ') {
    setUpSerial();
  }
}
class Particle {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.baseSize = 10; // Base size of the circle
    this.size = this.baseSize;
    this.color = color(random(255), random(255), random(255));
    this.xSpeed = random(-1, 1);
    this.ySpeed = random(-1, 1);
  }

  update(sensorValue) {
    // Resize based on sensor value
    this.size = sensorValue === 1 ? 30 : 10;

    // Update position for smooth floating
    this.x += this.xSpeed;
    this.y += this.ySpeed;

    // Bounce off edges
    if (this.x > width || this.x < 0) {
      this.xSpeed *= -1;
    }
    if (this.y > height || this.y < 0) {
      this.ySpeed *= -1;
    }
  }

  display() {
    fill(this.color);
    noStroke();
    ellipse(this.x, this.y, this.size, this.size);
  }
}

// Resize canvas when the window is resized
function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
}
  1. Setup and Particle Creation:
    • The setup() function initializes the canvas to cover the entire window. Within this function, I create multiple particles, each represented by an instance of the Particle class. The number of particles is determined by particleCount.
    • Each particle is randomly positioned across the canvas. This is done by assigning random x and y coordinates within the canvas’s dimensions.
  2. Serial Data Handling:
    • The readSerial(data) function is responsible for processing incoming serial data from the Arduino. This data represents the state of the sound sensor. The function trims any whitespace from the received data and stores it in latestData for further processing.
  3. Drawing and Animation:
    • In the draw() function, the background is set to a dark blue color ('#00003f').
    • The sketch checks the serialActive flag to determine if the serial connection is established. If not, it prompts the user to activate the serial port. Once connected, it displays “Connected” on the canvas.
    • The particle behavior is updated based on the parsed sensor value (sensorValue). Each particle’s size and position are adjusted accordingly.
  4. Particle Class:
    • The Particle class defines the properties and behaviors of individual particles. Each particle has its position (x, y), base size, color, and speed (xSpeed, ySpeed).
    • The update(sensorValue) method adjusts the particle’s size based on the sound sensor input. It also updates the particle’s position to create a floating effect. If a particle reaches the edge of the canvas, it bounces back, creating a dynamic, contained animation within the canvas boundaries.
    • The display() method draws each particle as an ellipse with its respective properties.
  5. Interactivity:
    • The keyPressed() function listens for a spacebar press to initiate the serial connection setup, a key part of the interaction between the Arduino and the p5.js sketch.
  6. Responsive Design:
    • The windowResized() function ensures that the canvas size adjusts appropriately when the browser window is resized, maintaining the integrity of the visual display.
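The per-frame update logic in steps 3 and 4 can be sketched as two plain functions, a simplification of the Particle class above with no p5 dependencies:

```javascript
// sizeFor maps the binary sound-sensor value to a particle diameter,
// and step advances a particle one frame, bouncing at the edges.
// This mirrors Particle.update() above in standalone form.
function sizeFor(sensorValue) {
  return sensorValue === 1 ? 30 : 10; // sound detected -> big, quiet -> base size
}

function step(p, width, height) {
  p.x += p.xSpeed;
  p.y += p.ySpeed;
  if (p.x > width || p.x < 0) p.xSpeed *= -1; // bounce horizontally
  if (p.y > height || p.y < 0) p.ySpeed *= -1; // bounce vertically
  return p;
}
```

Because the sensor is binary, `sizeFor` only ever produces two sizes; a sensor with an analog range would let the size be mapped continuously instead.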

Description of Interaction Design

  • Engaging Invitation:
    • Users are greeted with an inviting message on the project box, clearly stating: “Sing for the Party People.” This message sets the tone and clearly communicates what is expected from the user.
  • Sound Trigger:
    • As soon as the user starts singing or making noise, the embedded digital sound sensor within the box detects this audio input. The sensor is finely tuned to respond to a range of sounds from soft humming to loud singing.
  • Responsive Light Display:
    • Upon detecting sound, the sensor triggers a colorful light show from the RGB LED. The LED cycles through colors, creating a mini disco effect that transforms the space with vibrant hues.
    • The intensity, frequency, and duration of the user’s singing directly influence the light patterns, making each experience unique and personal.
  • Visual Feedback:
    • The LED serves as immediate visual feedback for the user’s actions. This feedback loop encourages continued interaction and exploration of different volumes of sound.
    • The changing colors of the LED create a playful and immersive environment, enhancing the joyous atmosphere of a disco.

Description of communication between Arduino and p5.js

Arduino to p5.js Communication:
  1. Serial Communication Setup:
    • On the Arduino side, I initialize serial communication in the setup() function using Serial.begin(9600);. This sets up the Arduino to send data over the serial port at a baud rate of 9600 bits per second.
    • In the main loop (void loop()), the Arduino reads data from the digital sound sensor connected to pin 8 using digitalRead(8);. This sensor detects the presence or absence of sound, returning either a HIGH or LOW signal.
  2. Sending Data from Arduino:
    • Depending on the state of the sound sensor, the Arduino sends this information to the connected computer via the serial port using Serial.println(soundData);. The data sent is a simple numerical value (0 or 1) representing the absence or presence of sound.
  3. Receiving Data in p5.js:
    • On the p5.js side, the sketch establishes a serial connection to receive data from the Arduino. This is handled by the serial helper functions used in class (setUpSerial() and readSerial()), which wrap the browser’s serial communication.
    • The readSerial(data) function in the p5.js sketch is responsible for reading incoming serial data. It processes the data received from the Arduino, trims any whitespace, and stores it in the latestData variable.
p5.js Processing and Visualization:
  1. Data Interpretation:
    • The p5.js sketch interprets the received data (latestData) as the state of the sound sensor. This data is then used to influence the behavior of visual elements within the sketch, such as the size and movement of particles.
    • The draw() function continuously updates the canvas, where each particle’s appearance and behavior are adjusted based on the sensor data. For instance, the presence of sound might cause the particles to increase in size or change position, creating a dynamic and responsive visual effect.
  2. Feedback Loop:
    • The seamless exchange of data between the Arduino and the p5.js sketch creates an interactive feedback loop. Physical input from the sound sensor directly influences the digital visualization, making the experience responsive to real-world interactions.
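The receive-and-store step described above can be sketched as follows. The `parseSensor` helper is added here for illustration only; the actual sketch simply trims the incoming string and stores it in `latestData`:

```javascript
// Minimal sketch of the p5-side handler: trim the incoming serial line
// and keep only a valid binary reading.
let latestData = "0";

function parseSensor(data) {
  const trimmed = data.trim();
  return trimmed === "1" ? 1 : 0; // anything other than "1" counts as silence
}

function readSerial(data) {
  if (data != null) {
    // make sure there is actually a message before storing it
    latestData = String(parseSensor(data));
  }
}
```

Coercing everything except `"1"` to `0` means a garbled line fails safe: the particles simply stay at their base size for that frame.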

What are some aspects of the project that you’re particularly proud of?

Reflecting on my project, I feel a deep sense of pride, particularly in the creation of the physical component – the mini sound-activated disco club. This aspect of the project was not only a challenge but a testament to my creativity and technical skills. The process of bringing a conceptual idea to life, blending interactive technology with artistic design, was immensely fulfilling. Another aspect I’m especially proud of is my adaptability and problem-solving skills. When faced with the unexpected challenge of the original sensor breaking, I quickly adapted, demonstrating resilience and quick thinking, hallmarks of a true interactive media student. Utilizing a different sensor and modifying my project accordingly, I managed to preserve the essence of my initial concept. This ability to think on my feet and craft a functional and engaging project with the available resources, even though it diverged from my original plan, is something I take great pride in. It underscores my capacity to innovate and create meaningful interactive experiences, regardless of the obstacles encountered.

What are some areas for future improvement?

Reflecting on my project, I recognize several areas for future improvement, particularly influenced by the challenges and lessons learned during its development. One key area is the need for contingency planning in hardware-based projects. The unexpected malfunction of my original sensor forced me to significantly simplify my original idea, mainly due to time constraints and the limitations of the replacement sensor. This experience taught me the importance of having spare parts and tools readily available. It’s a lesson that will influence my approach to future projects, ensuring I’m better prepared for unforeseen setbacks.

Additionally, the limitations imposed by the replacement sensor, which could only read binary values (0s and 1s), restricted my ability to create a more complex and visually appealing p5.js sketch. This constraint became particularly evident in my efforts to craft a visually aesthetic sound visualizer. The binary input didn’t allow for the nuanced interpretation of sound that I had initially envisioned. Moving forward, I aim to explore more advanced sensors and input methods that offer a wider range of data. This will enable me to create more intricate and engaging visualizations in my p5.js sketches, aligning more closely with my original vision of an interactive and visually rich experience.

IM SHOWCASE

Final Project – Walking Buddy

CONCEPT:

For the final project I was inspired to create an assistive device called “Walking Buddy” – a voice-controlled walking guide. Based on voice commands from the user, the robot changes direction and continues to move until the next command. If it encounters an obstacle, the robot stops and warns the user by playing a tune through the speaker. All interaction happens through voice and sound, making the design inclusive for all users.

IMPLEMENTATION:

The communication begins in p5, where the speech library is used to interpret the voice command and send the corresponding data to the Arduino. After receiving the data, the Arduino checks the input from the ultrasonic sensor: if no obstacle is detected, it drives the motors in the direction given by the command; if an obstacle is present, it plays a warning tune and stops. The ultrasonic sensor is mounted on a servo motor, which allows it to scan the surroundings before the robot turns left or right. As an additional feature, an ellipse in the p5 sketch moves according to the robot’s direction of movement.
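The command interpretation can be summarized as a single mapping from the recognized text to the numeric code sent to the Arduino (1 = right, 2 = left, 3 = forward, 4 = backward, 0 = stop). `commandToDir` is a hypothetical helper written for illustration; the actual sketch does the same checks inline in `gotSpeech()`:

```javascript
// Scan the recognized speech for a direction keyword and return the
// numeric code the sketch writes to the serial port.
function commandToDir(resultString) {
  const s = resultString.toLowerCase();
  if (s.includes("right")) return 1;
  if (s.includes("left")) return 2;
  if (s.includes("forward")) return 3;
  if (s.includes("backward")) return 4;
  if (s.includes("stop")) return 0;
  return -1; // unrecognized command: leave direction unchanged
}
```

Checking for substrings rather than exact matches makes the robot tolerant of longer phrases like “turn right please”.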

 

ARDUINO CODE:

#include <Servo.h>
#include "pitches.h"

#define Echo A0
#define Trig A1
#define motor 10
#define Speed 170
#define spoint 103

const int ain1Pin = 3;
const int ain2Pin = 4;
const int pwmAPin = 5;

const int bin1Pin = 8;
const int bin2Pin = 7;
const int pwmBPin = 6;

// notes in the melody:
int melody[] = {
  NOTE_A3, NOTE_A3, NOTE_A3, NOTE_A3, NOTE_A3, NOTE_A3, NOTE_A3, NOTE_A3,
};

// note durations: 4 = quarter note, 8 = eighth note, etc.:
int noteDurations[] = {
  4, 4, 4, 4, 4, 4, 4, 4
};

char value;
int distance;
int Left;
int Right;
int L = 0;
int R = 0;
Servo servo;

void setup() {
  Serial.begin(9600);
  pinMode(Trig, OUTPUT);
  pinMode(Echo, INPUT);
  servo.attach(motor);
  pinMode(ain1Pin, OUTPUT);
  pinMode(ain2Pin, OUTPUT);
  pinMode(pwmAPin, OUTPUT);
  pinMode(bin1Pin, OUTPUT);
  pinMode(bin2Pin, OUTPUT);
  pinMode(pwmBPin, OUTPUT); 
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}
void loop() {
  VoiceControl();
}

void moveBackward() {
  analogWrite(pwmAPin, Speed);
  digitalWrite(ain1Pin, HIGH);
  digitalWrite(ain2Pin, LOW);
  analogWrite(pwmBPin, Speed);
  digitalWrite(bin1Pin, HIGH);
  digitalWrite(bin2Pin, LOW);
}

void moveForward() {
  analogWrite(pwmAPin, Speed);
  digitalWrite(ain1Pin, LOW);
  digitalWrite(ain2Pin, HIGH);
  analogWrite(pwmBPin, Speed);
  digitalWrite(bin1Pin, LOW);
  digitalWrite(bin2Pin, HIGH);
}

void turnRight() {
  analogWrite(pwmAPin, Speed);
  digitalWrite(ain1Pin, HIGH);
  digitalWrite(ain2Pin, LOW);
  analogWrite(pwmBPin, Speed);
  digitalWrite(bin1Pin, LOW);
  digitalWrite(bin2Pin, HIGH);
}

void turnLeft() {
  analogWrite(pwmAPin, Speed);
  digitalWrite(ain1Pin, LOW);
  digitalWrite(ain2Pin, HIGH);
  analogWrite(pwmBPin, Speed);
  digitalWrite(bin1Pin, HIGH);
  digitalWrite(bin2Pin, LOW);
}

void stopMotors() {
  analogWrite(pwmAPin, Speed);
  digitalWrite(ain1Pin, LOW);
  digitalWrite(ain2Pin, LOW);
  analogWrite(pwmBPin, Speed);
  digitalWrite(bin1Pin, LOW);
  digitalWrite(bin2Pin, LOW);
}


int ultrasonic() {
  digitalWrite(Trig, LOW);
  delayMicroseconds(2);
  digitalWrite(Trig, HIGH);
  delayMicroseconds(10);
  digitalWrite(Trig, LOW);
  long t = pulseIn(Echo, HIGH);
  long cm = t * 0.034 / 2; // convert echo time (microseconds) to distance (cm)
  return cm;
}

int rightsee() {
  servo.write(20);
  delay(800);
  Right = ultrasonic();
  return Right;
}
int leftsee() {
  servo.write(180);
  delay(800);
  Left = ultrasonic();
  return Left;
}

void VoiceControl() {
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN,HIGH);
    int value = Serial.parseInt();
    if (Serial.read() == '\n') {
      Serial.println(value);
      distance=ultrasonic();
      //Serial.println(distance);
      if (distance <= 12) {
        stopMotors();
        value=0;
        for (int thisNote = 0; thisNote < 8; thisNote++) {

          int noteDuration = 1000 / noteDurations[thisNote];
          tone(12, melody[thisNote], noteDuration);

          int pauseBetweenNotes = noteDuration * 1.30;
          delay(pauseBetweenNotes);
          // stop the tone playing:
          noTone(12);
        }
        
      }
      if (value == 3) {
        moveForward();
      } else if (value == 4) {
        moveBackward();
      } else if (value == 2) {
        L = leftsee();
        servo.write(spoint);
        if (L >= 10 ) {
          turnLeft();
        } else if (L < 10) {
          stopMotors();
        }
      } else if (value == 1) {
        R = rightsee();
        servo.write(spoint);
        if (R >= 10 ) {
          turnRight();
        } else if (R < 10) {
          stopMotors();
        }
      } else if (value == 0) {
        stopMotors();
      }
    }
  }
  digitalWrite(LED_BUILTIN,LOW);
}

 

P5 SKETCH AND CODE:

let dir = 0;
let robotX, robotY;
let img1;
let title;
let speakImg;
let speechRec;
let isSpeaking = false;

function preload(){
  img1 = loadImage("img.png");
  title = loadImage("title.png");
  speakImg = loadImage("speak.png");
}

function setup() {
  createCanvas(600, 400);
  robotX = 460;
  robotY = 195;
  
  let lang = navigator.language || "en-US";
  speechRec = new p5.SpeechRec(lang, gotSpeech);
  let speech = new p5.Speech();
  
  //Checks for the end of text-to-speech conversion
  speech.onEnd = () => {
    isSpeaking = false;
    let continuous = true;
    let interim = false;
    speechRec.start(continuous, interim);
  };
  isSpeaking = true;
  speech.speak('Hi there! This is your walking buddy. Join me to explore the world on foot. Use the commands Right, Left, Forward, Backward and Stop to navigate the directions. Finally, remember to stop when you hear the siren.');
  

  function gotSpeech() {
    console.log("Speech");
    if (speechRec.resultValue) {
      createP(speechRec.resultString);
      //Conditions to detect the direction command
      if (speechRec.resultString.toLowerCase().includes("right")) {
        dir = 1;
      } else if (speechRec.resultString.toLowerCase().includes("left")) {
        dir = 2;
      } else if (speechRec.resultString.toLowerCase().includes("forward")) {
        dir = 3;
      } else if (speechRec.resultString.toLowerCase().includes("backward")) {
        dir = 4;
      } else if (speechRec.resultString.toLowerCase().includes("stop")) {
        dir = 0;
      }
    }
  }
}

function draw() {
  stroke(0);
  background("rgb(244,227,68)");
  image(img1, 30, 140, 170, 260);
  image(title, 30, 20, 300, 180);
  
  fill(146, 196, 248);
  rect(340, 40, 240, 310);
  fill(0);
  ellipse(robotX, robotY, 20, 20);
  fill(255);
  textSize(15);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 340, 380);
  } else {
    text("Connected", 340, 380);
  }
  
  if (dir == 1) {
    stroke(255, 0, 0); // Red stroke for right
  } else if (dir == 2) {
    stroke(0, 255, 0); // Green stroke for left
  } else if (dir == 3) {
    stroke(0, 0, 255); // Blue stroke for forward
  } else if (dir == 4) {
    stroke(255, 255, 0); // Yellow stroke for backward
  } else {
    noStroke(); // No stroke for stop
  }
  noFill();
  strokeWeight(2);
  ellipse(robotX, robotY, 30, 30);
  
  if (dir == 1 && robotX < width - 40) {
    robotX += 0.5;
  } else if (dir == 2 && robotX > 360) {
    robotX -= 0.5;
  } else if (dir == 3 && robotY > 60) {
    robotY -= 0.5;
  } else if (dir == 4 && robotY < height - 70) {
    robotY += 0.5;
  }
  if (isSpeaking) {
    image(speakImg, 180, 210, 100, 70);
  }
    
}

function keyPressed() {
  if (key == " ") {
    setUpSerial();
  }
}

function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 1) {
      console.log(fromArduino[0]);
    }
    
    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    
    console.log(dir);
    let sendToArduino= dir + "\n";
    writeSerial(sendToArduino);
  }
}

 

The part of this project that I am proud of is the voice control feature. I used ‘The Coding Train’ YouTube tutorials to explore the p5 speech library and implemented both text-to-speech and speech-to-text conversion.

Coding Train Tutorial: https://youtu.be/q_bXBcmfTJM

CHALLENGES FACED:

Understanding the direction of movement of the motors was difficult at first, but after a few trials I figured out the right code for each direction. Apart from this, sending the text obtained from the user’s commands to the Arduino did not always work as expected, which made it challenging to understand what was wrong.

PROJECT TESTING VIDEOS:

The videos below show the robot following voice commands, detecting an obstacle, and the p5 screen.

https://drive.google.com/drive/folders/17U0GwIh4A0-HnNlQ5Dn-SARxTF9sP-2-?usp=drive_link

IM SHOW:

The showcase was a wonderful experience: seeing the creative work of others also served as a source of inspiration for future projects. People seemed to find the concept of my project interesting. However, I encountered an unanticipated situation where the robot was unable to work because it detected multiple sounds at once.

FURTHER IMPROVEMENT:

One area for future improvement would be to design a more efficient communication system, with a precise description of the surroundings conveyed to the user.

After the IM show, I felt that for such a robotic assistive device to be practical, it would need more robust sound filtering so it can work even in crowded environments where multiple sounds are detected.

Final Project: Human Following Robot

Concept

For my final project, I decided to create a human-following robot that I like to think of as a non-human pet, inspired by none other than Wall-E – that lovable robot from the movies. Just like Wall-E, my creation is meant to tag along beside you, sensing your presence and movement with its built-in sensors. It’s a robot that follows you around, imitating the way a curious pet might trail after its owner.

But there’s a twist – it’s not just an automatic follower. With P5JS, a programming tool, you get the reins, too. You can control it like you’re playing a video game, guiding it around with your keyboard and mouse. The idea struck me while watching Wall-E’s adventures, and I thought, why not blend that inspiration into something real? Something you can interact with, just like a pet that’s eager for your attention, whether it’s autonomously roaming or being directed by your commands.

Hardware Image

User Testing Videos

Key Components

  • Arduino Uno: The Arduino Uno acts as the brain of our robot. It’s a microcontroller responsible for processing data from sensors, making decisions, and controlling the motors and servo. The best part? It’s beginner-friendly, making it an ideal choice for those new to robotics.
  • Motor Driver: the powerhouse behind the robot’s movement. It precisely controls the motors that drive the wheels, ensuring our robot gracefully follows its human companion.
  • Ultrasonic Sensor: The ultrasonic sensor serves as the robot’s eyes, allowing it to measure distances. This is crucial for avoiding collisions and maintaining a safe following distance.
  • IR Sensor: Our robot needs to be smart enough to navigate around obstacles. That’s where the IR sensor comes in, allowing the robot to turn. By emitting and detecting infrared radiation, it enhances obstacle detection.
  • Servo Motor: It helps move the ultrasonic sensor, giving the robot flexibility.
  • Motors and Wheels: For our robot to follow, it needs reliable motors and wheels. The motor driver ensures these components work seamlessly, making our robot mobile and ready for adventure.
  • Piezo Speaker: Communication is key, even for robots. The piezo speaker provides audible feedback, alerting users that the robot is ready to operate.

Schematic and Circuit Diagram

Implementation details

  • Interaction Design: The interaction design of my project centers on a user-friendly and intuitive experience. The robot operates in two modes: autonomous, where it uses sensors to follow the user around, and manual, where the user can control its movements through a P5JS interface. Switching between modes is seamless, catering to moments when you want a companionable presence without the effort or times when you prefer direct control.
  • Arduino Description: The Arduino code for my project serves as the brain of my pet-like robot. It integrates motor control with sensor inputs to enable the robot to follow a person autonomously or be controlled manually via P5JS. The code dictates how the robot moves in response to what the sensors detect, like proximity to objects or a person’s movements. It manages the logic for when the robot should move forward, turn, or stop to ensure smooth operation. Additionally, the code includes functions for playing melodies and controlling servo movements, giving the robot a lively and interactive character.

Code Snippet:

#include <SparkFun_TB6612.h>
#include "pitches.h"
#include <Servo.h>
//Motor Driver Pins
#define AIN1 3
#define BIN1 7
#define AIN2 4
#define BIN2 8
#define PWMA 5
#define PWMB 6
#define STBY 9

// Motor speed and control variables
const int offsetA = 1;
const int offsetB = 1;
int speed = 100;
int brightness = 0; // Variable to receive serial data for control

// Initialize motor objects with defined pins and offsets
Motor motor1 = Motor(AIN1, AIN2, PWMA, offsetA, STBY);
Motor motor2 = Motor(BIN1, BIN2, PWMB, offsetB, STBY);


//Ultrasonic Sensor
int distance;
long timetaken;
double feet, inch;

// Define ultrasonic sensor pins
#define echoPin 13
#define trigPin 12

// Define IR sensor pins
#define IRR A0  //pin for right sensor
#define IRL A1  //pin for left sensor


//Define Buzzer pins
int speaker = 11;
int melody[] = {
  NOTE_C4, NOTE_G3, NOTE_G3, NOTE_A3, NOTE_G3, 0, NOTE_B3, NOTE_C4
};

// Melody and note durations arrays for the buzzer
int noteDurations[] = {
  4, 8, 8, 4, 4, 4, 4, 4
};

//Servo Motor initialization
Servo myservo;
int pos = 0; // Variable to store the servo position

void setup() {
    // Setup for ultrasonic sensor
  pinMode(trigPin, OUTPUT);  //ultrasonic sensor
  pinMode(echoPin, INPUT);

  // Setup for IR sensors
  pinMode(IRL, INPUT);  //left ir sensor
  pinMode(IRR, INPUT);  //right ir sensor

  //plays instrumental tones
  for (int thisNote = 0; thisNote < 8; thisNote++) {

    // to calculate the note duration, take one second divided by the note type.
    //e.g. quarter note = 1000 / 4, eighth note = 1000/8, etc.
    int noteDuration = 1000 / noteDurations[thisNote];
    tone(11, melody[thisNote], noteDuration);

    int pauseBetweenNotes = noteDuration * 1.30;
    delay(pauseBetweenNotes);
    // stop the tone playing:
    noTone(11); 
  }

  // Setup for servo motor
  myservo.attach(10);
  for (pos = 0; pos <= 180; pos += 1) {  // goes from 0 degrees to 180 degrees
    // in steps of 1 degree
    myservo.write(pos);  // tell servo to go to position in variable 'pos'
    delay(15);           // waits 15ms for the servo to reach the position
  }
  for (pos = 180; pos >= 0; pos -= 1) {  // goes from 180 degrees to 0 degrees
    myservo.write(pos);                  // tell servo to go to position in variable 'pos'
    delay(15);                           // waits 15ms for the servo to reach the position
  }
  pinMode(A3, OUTPUT);

  // Initialize Serial communication
  Serial.begin(9600);
}

void loop() {
    // Main loop for sensor reading and motor control
  int distance, readLeft, readRight;

  // Read ultrasonic sensor distance
  distance = ultra();
  Serial.println(distance);

    // Read IR sensor states
  readRight = digitalRead(IRR);
  readLeft = digitalRead(IRL);

  // Movement and control logic based on sensor readings
  if (readLeft == 1 && distance > 10 && distance < 25 && readRight == 1) {
    forward(motor1, motor2, speed); // Move forward
  } else if (readLeft == 1 && readRight == 0) {  //turn right
    left(motor1, motor2, speed);
  } else if (readLeft == 0 && readRight == 1) {  //turn left
    right(motor1, motor2, speed);
  } else if (readLeft == 1 && readRight == 1) {
    brake(motor1, motor2); // Brake the motors
  } else if (distance > 5 && distance < 10) {
    brake(motor1, motor2); // Brake if within a specific distance range
  } else if (distance < 5) {
    back(motor1, motor2, speed); // Move backward
  }

  // Remote control logic via Serial communication
  if (Serial.available() > 0) { // Check if there is any Serial data available
    // read the most recent byte (which will be from 0 to 255):
    brightness = Serial.read();

        // Conditional statements to control the robot based on the received byte
    if (brightness == 0) {
       // If the received byte is 0, move the robot forward
        // The function 'forward' is called with motors and speed as arguments
      forward(motor1, motor2, 200);
    } else if (brightness == 1) {
       // If the received byte is 1, move the robot backward
        // The function 'back' is called with motors and speed as arguments
      back(motor1, motor2, 200);
    }
  }
}

 

  •  Description of P5: In the P5.js code, there’s a dual-feature interface for my robot. It visually represents the sensor data, showing a value that decreases as you get closer to the robot and increases as you move away, mirroring the robot’s perception in real-time. Simultaneously, this interface allows you to control the robot’s movement. With simple commands, you can guide the robot to move forward or backward, offering a straightforward and interactive way to both visualize and manipulate the robot’s position and actions.

Code Snippet:

let serial; // variable for the serial object
let latestData = "wait"; // variable to hold the latest incoming serial data
let val = 0; // Variable to store a value for serial communication
let colorValue = 0;

function setup() {
  createCanvas(1000, 800);
  textSize(18);
  // serial constructor
  serial = new p5.SerialPort();

  // serial port to use - you'll need to change this
  serial.open("/dev/tty.usbmodem141101");

  // what to do when we get serial data
  serial.on("data", gotData);
}

// Callback function for processing received serial data
function gotData() {
  let currentString = serial.readLine(); // Read the incoming data as a string
  currentString = trim(currentString); // Remove any leading/trailing whitespace (trim returns a new string)
  if (!currentString) return; // If the string is empty, do nothing
  console.log(currentString); // Log the data to the console for debugging
  latestData = currentString; // Update the latestData variable
}

function draw() {
  background(211, 215, 255);
  fill(102, 11, 229);

  // Map the latestData to a rotational degree value
  let rotDeg = map(latestData, 0, 1000, 0, 10000);

  // Check for the space bar key press to start
  if (key != " ") {
    // Display the starting screen
    textSize(30);
    fill(0, 0, 0);
    rect(0, 0, 1000, 800); // Draw a black rectangle covering the canvas
    fill(200, 200, 200); // Set text color
    text("PRESS SPACE BAR TO START THE HFR", width / 4, height / 2);
  } else {
    // Main interaction screen
    // Display forward and backward areas and instructions
    textSize(18);

    // Forward area
    fill(102, 11, 229);
    rect(890, 0, 110, 1000); // Draw the forward area
    fill(255, 245, 224);
    text("FORWARD", 900, 450); // Label for the forward area

        // Backward area
    fill(102, 11, 229);
    rect(0, 0, 110, 1000); // Draw the backward area
    fill(255, 255, 255);
    text("BACKWARD", 0, 450); // Label for the backward area

        // Draw the robot representation
    fill(35, 45, 63);
    rect(500, -100, 100, 600);  // Draw the robot's body
    fill(180, 101, 229);
    rect(500, 500, 100, -rotDeg); // Draw the robot's moving part

        // Additional robot features
    fill(200, 120, 157);
    rect(500, 500, 100, 80); // Base of the moving part
    fill(0, 0, 0);
    rect(460, 580, 40, -30); // Left wheel
    rect(600, 580, 40, -30); // Right wheel
    fill(255, 255, 255);
    text(latestData, 540, 560); // Display the latest data
    
        // Display control instructions
    fill("black");
    text("Control the Robot:\n\n", 470, 600);
    text(
      "Forward Movement:\n" +
        "- 'Forward' area on right\n" +
        "- Click to move forward\n" +
        "- Click again to stop\n\n",
      670,
      650
    );
    text(
      "Backward Movement:\n" +
        "- 'Backward' area on left\n" +
        "- Click to move backward\n" +
        "- Click again to stop\n\n",
      150,
      650
    );
    text("Move mouse to desired side and click to control movement!", 300, 770);
    textStyle(BOLD);


        // Serial communication based on mouse position
    if (!colorValue) {
      if (mouseX <= width / 2) {
        val = 1; // Set val to 1 if mouse is on the left half
        serial.write(val); // Send val to the serial port
        console.log("Left"); // Log the action
      } else {
        val = 0; // Set val to 0 if mouse is on the right half
        serial.write(val); // Send val to the serial port
        console.log("Right"); // Log the action
      }
    }
  }
    // Draw a circle at the mouse position
  fill(255, 255, 255);
  ellipse(mouseX, mouseY, 10, 10);
}
// Function to handle mouse click events
function mouseClicked() {
  if (colorValue === 0) {
    colorValue = 255;
  } else {
    colorValue = 0;
  }
}

  • Communication between Arduino and p5.js:

In this project, the Arduino sends sensor data to P5.js, allowing for a visual representation of proximity; the closer you are to the sensor, the lower the number, and vice versa. P5.js then sends back control commands to Arduino, enabling the user to maneuver the robot forward and backward. This bidirectional communication between Arduino and P5.js is streamlined through serial communication, using an application called SerialControl to effectively connect ports in P5.js. This setup ensures efficient data transfer and responsive control of the robot’s movements.
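The p5-to-Arduino half of this protocol is a single byte whose value depends only on which half of the canvas the mouse is in, matching the logic in `draw()` above (1 = left half = backward, 0 = right half = forward). As a standalone helper:

```javascript
// Choose the control byte to write to the serial port based on the
// mouse position, mirroring the mouse check in draw().
function byteForMouse(mouseX, width) {
  return mouseX <= width / 2 ? 1 : 0;
}
```

On the Arduino side, `Serial.read()` then receives exactly this byte and dispatches to `back()` or `forward()` accordingly.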

Something I am Proud of

I’m particularly proud of the hardware implementation aspect of my robot project. It was a journey that demanded considerable time, effort, and a variety of materials to reach the final outcome. The process of assembling and fine-tuning the hardware, from selecting the right sensors and motors to designing and building the physical structure, was both challenging and rewarding. Seeing the components come together into a functioning robot was a testament to the hard work and dedication put into this project. This aspect of the project stands out for me as a significant achievement.

Challenges Faced

One of the challenges I faced was with the P5.js control interface. When trying to remotely control the robot, it moved extremely slowly, and at times, it would completely stop responding, even though I was actively trying to move it. I spent a significant amount of time troubleshooting this issue, delving into various aspects of the code and communication protocols. Eventually, I came to realize that this might be a limitation within the system, possibly related to lag or processing delays, which seem to occur quite frequently.

Another challenge I encountered involved the power supply for the robot. Initially, I had a battery pack with four cells, totaling 6V, but my robot only required 4.5V. To adjust the voltage, I removed one cell and connected a wire to bridge the gap. However, this setup proved problematic; as the robot moved, the wire would shift its position, causing intermittent power loss and loss of control. The robot would continue moving uncontrollably until I reconnected the wire. To resolve this, I came up with a creative solution. I crafted a connector using aluminum foil, shaping it to fit securely on both ends of the battery compartment. This improvised connector ensured a stable connection, eliminating the issue of the wire shifting during movement. With this fix, the robot now operates smoothly without any control issues.

Future Improvements

In terms of future improvements for my project, one key area I’d like to focus on is enhancing the P5.js sketch to make it more interactive and engaging. I plan to introduce multiple pages within the sketch, each offering different functionalities or information, to create a more comprehensive and user-friendly interface. Additionally, I’m considering integrating sound into the P5.js environment. This could include audio feedback for certain actions or ambient sounds to make the interaction with the robot more immersive and enjoyable. These improvements aim to not only enrich the user experience but also add layers of complexity and sophistication to the project.

IM Show!

It was an honor to showcase my human-following robot at the IM show. Seeing the enthusiasm and curiosity of students and faculty members as they passed by to test my robot was a heartwarming experience. I was particularly thrilled to witness the interest of professors who have been integral to my learning journey. Among them was Evi Mansor, who taught me in the communications lab; her impressed reaction was a significant moment for me. Additionally, Professor Michael Shiloh, a well-known figure in the IM department, showed keen interest in my project. A special and heartfelt thanks goes to Professor Aya Riad, whose guidance and teaching were pivotal in developing the skills necessary to create such an innovative and successful outcome. The support and the lively interest of the audience made the event a memorable highlight of my academic journey.

Resources

  • https://youtu.be/yAV5aZ0unag?si=ZzwIOrRLBYmrv34C
  • https://youtu.be/F02hrB09yg0?si=d40SgnSfkBMgduA8
  • https://youtu.be/suLQpNPLzDo?si=G2rJR6YrsynycGoK
  • https://circuitdigest.com/microcontroller-projects/human-following-robot-using-arduino-and-ultrasonic-sensor
  • https://projecthub.arduino.cc/mohammadsohail0008/human-following-bot-f139db

 

Final Project

BEATHOVEN

CONCEPT

The concept is inspired by Incredibox, a website first introduced in 2009 that offered a unique experience for creating music. No musical talent or knowledge of music composition is required. It’s designed so that any combination of audio seamlessly works with each other, allowing individuals to create their own music pieces. I found playing music on this website very satisfying and rewarding.

This inspiration drove the concept behind “Beat-Hoven.” Instead of different musical instruments, the music is mixed from beatboxing. For the visuals, initially colorless avatars stand idle; once assigned a beatbox, their outfits change accordingly. Additionally, everything is kept in sync with an 8-second cycle.

IMPLEMENTATION

First, I built the basic mechanism of the project. I researched how push buttons work for the first time and made a small model of a single button that plays a beat when clicked. Even this first step brought many issues. The first button I used was faulty, which took me a long time to realize. Then I discovered that a single button press registered as multiple true values instead of the single value I was expecting. This caused a lot of problems in the button handling, but I eventually figured out how to fix it.
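The usual cause of that "multiple true values" behavior is that the loop reads the pin on every pass, so one held press reads HIGH many times. A minimal sketch of the standard fix, rising-edge detection, is below; the function and variable names are illustrative, not the project's actual code.

```javascript
// Sketch of rising-edge detection: count a press only when the reading
// changes from 0 (released) to 1 (pressed), not on every 1 reading.
function isNewPress(reading, lastReading) {
  return reading === 1 && lastReading === 0;
}

// Simulated readings from one long press: only the first HIGH counts.
const readings = [0, 1, 1, 1, 0];
let last = 0;
let presses = 0;
for (const r of readings) {
  if (isNewPress(r, last)) presses++;
  last = r; // remember the previous reading for the next comparison
}
// presses is 1: one press despite three consecutive HIGH readings
```

The same idea works on either side of the serial link: the edge detection can live in the Arduino loop or in the p5 sketch.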

Next, I built the foundation of the project by connecting a button press to start the intended player's audio. The first issue I faced was how to send an instruction to stop the audio. Since the button only sends a high signal when pressed, I had to interpret that signal and decide whether the press was meant to start or stop playback. I managed this with a few checks, if statements, and flags.
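The start/stop interpretation can be sketched as a toggle: each confirmed press flips a per-character flag, so the same high signal means "start" or "stop" depending on the current state. The names here are hypothetical, not taken from the project.

```javascript
// Hypothetical per-character flags: true = audio should be playing.
const active = new Array(8).fill(false);

// Each confirmed button press toggles its character's flag.
// Returns the new state: true = will start, false = will stop.
function onButtonPress(index) {
  active[index] = !active[index];
  return active[index];
}
```

With this shape, the button hardware never needs to "know" the playback state; the sketch carries it.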

After working out the start and stop of the audio, it was time to implement it at a larger scale. I soldered the rest of the buttons and got them ready, then imported each character's audio assets into p5 one by one. After that, I linked them to the audio and adjusted the Arduino code to accommodate the buttons.

The challenges were not over, though. The next problem was creating the cycle that decides when each audio starts and stops, to keep the beats in sync. Unfortunately, Google wasn't much help with any of the problems I faced, since everything worked differently in my code; this one took me about three hours to solve. Then I moved on to the next problem: images! I had 8 characters, each with its own audio, image, and logo, and the image stage brought many issues, especially because the background was redrawn with each loop. A character that appeared when its button was pressed would disappear, because the background drawn in the next frame painted over it. I had to redesign the whole page to accommodate this.
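One common way to build such a cycle (a sketch under the assumption of the 8-second loop mentioned earlier, not the actual implementation) is to derive a cycle number from the elapsed time, as p5's millis() reports it, and only start or stop audio at the moment that number changes:

```javascript
const CYCLE_MS = 8000; // the 8-second loop described above

// Which cycle a given millis() timestamp falls in.
function cycleIndex(ms) {
  return Math.floor(ms / CYCLE_MS);
}

// A new cycle has started whenever the index differs from the last
// one seen; that is the moment to apply any pending start/stop flags.
function isNewCycle(ms, lastIndex) {
  return cycleIndex(ms) !== lastIndex;
}
```

Since every character's audio is started at the same cycle boundary, any combination of beats stays aligned.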

After sorting that out and adjusting things so that characters change when they're not playing, I needed a way to tell the user that the button-press signal had been received. The problem was that a character does not start playing until the new cycle begins, so a user who selected a character between cycles couldn't tell whether the press was successful and might press again, which would deactivate the character. I decided to use the characters' logos: colored means the character is selected, monochrome means it is not. After implementing this and linking it with a new function, the project started to take its form.
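That split between immediate feedback and deferred playback can be sketched as two small rules: the logo reflects the flag right away, while audio only changes at the cycle boundary. Names here are hypothetical.

```javascript
// Logo shown for a character: colored as soon as it is selected,
// monochrome as soon as it is deselected, with no waiting for the cycle.
function logoFor(selected) {
  return selected ? "colored" : "monochrome";
}

// Audio, by contrast, only changes state at the start of a cycle.
function audioAction(selected, atCycleStart) {
  if (!atCycleStart) return "no change";
  return selected ? "start" : "stop";
}
```

This way a press between cycles is visibly acknowledged the same frame, which is what removes the temptation to press twice.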

The next step was to work on the aesthetics. I designed a logo for the page, decided on the iconic name “Beat-Hoven,” and designed the homepage. However, the homepage was missing something: music! I found a mix created from beatboxing and used it as the background music. Next, I created the info page that explains the game. With the code almost done, it was time to work on the container design. I designed the box using a template from the web and laser-cut it in acrylic. Then I assembled everything into the container and drilled a hole for the Arduino data wire. Finally, I printed an image of the characters and attached it to the box to make it stand out. And with this, the project was complete.

For the Arduino code:

I used 8 buttons; the Arduino reads their signals and sends them to p5. A short delay ensures only one input is registered per press, to avoid spamming inputs.

For the p5 code:

There are many checks to ensure the program runs smoothly. The sketch:
  • receives the button input
  • toggles the character's flag: on if it was off, off if it was on
  • checks whether a new cycle has started
    • if the activation flag is on: start the character's audio and display its image
    • if the flag is off: deactivate the character, stop the audio, and change the image
  • runs a parallel check
    • if the activation flag is on: display the colored logo immediately, without waiting for the new cycle
    • if the flag is off: display the monochrome logo for the character
  • if the page title is pressed:
    • deactivate all characters and turn all flags off
    • return to the main page
    • start the background music
  • if the info page is pressed:
    • switch the page without turning off the background music
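The page-level checks at the end of that list can be sketched as two small handlers; the state names are illustrative, not the project's actual code.

```javascript
// Hypothetical app state: flags for 8 characters plus the current page.
const state = { flags: new Array(8).fill(false), page: "main", bgMusic: false };

// Title press: reset everything and return to the main page with music.
function onTitlePress(s) {
  s.flags.fill(false); // deactivate all characters
  s.page = "main";
  s.bgMusic = true;    // start the background music
}

// Info press: switch pages but leave the background music untouched.
function onInfoPress(s) {
  s.page = "info";
}
```

Keeping all of this in one plain state object makes each check a one-line condition in the draw loop.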

Aspects I am particularly proud of

There are so many to mention, but here are a few:

  • creating, in just a few days and nights, a fully functioning game of the kind usually built by a team of professionals
  • figuring out ways around the obstacles I faced
  • the project turning out better than I imagined, not just meeting its goals
  • learning how to use a laser cutter and how to solder for the first time

Areas for improvement

I would love to add animation to the page, kept in sync with the audio, and maybe another version with different characters.

BEATHOVEN:

Showcase: showcase