Assignment 14: Create Your Own Cat (Final Project Documentation)

Concept

My final project is a pet cat experience. As someone who loves cats but was never fortunate enough to have one because her mother dislikes the idea of having a cat in the house, I want to take this opportunity to give those who are in the same situation as myself the chance to experience the joy of having a cat, without upsetting their mothers.

I hope this cat will provide comfort for cat lovers like me, or even turn non-cat lovers into one!

Pictures and Videos

Some pictures of the physical project:

User testing video from the first version:

Demo of final version:

Implementation

Schematic:

The cat itself is built using cardboard. The Arduino and breadboard are placed inside the body of the cat, accessible by opening the back panel of the body.

The arm of the cat is connected to a servo motor protruding from one side of the cat, and the ultrasonic sensor protrudes from the bottom of this same side.

The head of the cat is also attached to the main body. On the bottom of the head, there is a circular hole that allows the wires from the two force sensors on the head to reach the breadboard and Arduino in the main body.

Over on the p5 side, I used a class to switch easily between the different states of the experience (start, play, end). I also made an input box for users to enter a name of their choosing for the cat and a submit button to confirm the name and trigger sending it over to the Arduino to be displayed on the LCD.

For the gifts, I illustrated each of them in PowerPoint. The gifts are a bird, a mouse, a yarn ball, a toy fish, and a donut.

I uploaded a meow sound to p5 that plays when the cat is petted (based on the readings of the force sensors from the Arduino).

Interaction Design

The cat is built from cardboard and has a collar on which a name of the user's choosing is displayed. Cats love being petted, and users are able to pet the cat and have it meow in contentment as a response.

When users extend their hand towards the cat, the cat lowers its paw and hands them a “gift”. This act of gift giving from a cat is “an expression of affection and trust” which “shows they value the user's companionship and consider them part of their social group.” (Cited from here)

However, the “gifts” cats bring can range from random toys to animals. This is reflected in the p5.js sketch, where every time a user extends their hand, the cat gives them a new “gift”.

Arduino

GitHub link to code

The Arduino was used mainly to get sensor readings and send them to p5. It calculates the distance based on the readings from the ultrasonic sensor, changes the servo motor position based on those distance readings, and outputs the name received from p5 onto the LCD.
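For reference, the distance calculation follows the standard approach for an HC-SR04-style ultrasonic sensor. A minimal sketch of it could look like this (the pin numbers are placeholders, not necessarily the ones in my actual wiring):

const int trigPin = 9;   // placeholder trigger pin
const int echoPin = 10;  // placeholder echo pin

long readDistanceCm() {
  // send a short trigger pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // measure how long the echo pin stays HIGH (in microseconds)
  long duration = pulseIn(echoPin, HIGH);

  // sound travels roughly 0.034 cm per microsecond; divide by 2 for the round trip
  return duration * 0.034 / 2;
}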

Here is a snippet of the code that triggers the movement of the cat's arm depending on the distance reading. It initially used delay(), but since delay() stops the whole program, I switched to millis() for timing, set the interval between each increment or decrement of the servo position, and used the following code:

int servInt = 5; // interval (ms) between servo position updates

// check if it's time to update the servo position

if (time - servoTime >= servInt) {
  // update the last move time
  servoTime = time; 

  if (distance <= 10 && increasing) {
    // increment position
    pos++; 
    if (pos >= 90) {
      // change direction
      increasing = false; 
    }
  } else if (distance > 10 && !increasing){
    // decrement position
    pos--; 
    if (pos <= 0) {
      // change direction
      increasing = true; 
    }
  } 
  // move the servo to the new position
  myservo.write(pos); 
}

And here is the code for writing onto the LCD based on the name sent from the p5 sketch:

if (Serial.available()){

  name = Serial.readStringUntil('\n');
  name.trim();
  // only update LCD if the name has changed
  if (name != previousName) {
    // update the last displayed name
    previousName = name;
    // center-align the name
    int offset = (16 - name.length()) / 2; 
    // reset to the first row
    lcd.setCursor(0, 0);
    // clear the row without lcd.clear()
    lcd.print("                "); 
    // move the cursor to the centered starting position
    lcd.setCursor(offset, 0);
    // display the new name
    lcd.print(name);                     
  }
  
}

p5.js

Fullscreen sketch

On p5, I developed a class called Screen with 3 methods that display the different states of the experience (start, play, and end). An input box and a submit button were created so the cat's name can be read from user input and then sent to the Arduino. I also used p5 to play the cat's meow when it is petted.

Here is a snippet of the code that randomizes the gift and displays it when the user's hand is under the cat's arm and the cat's hand is lowered.

// in the main sketch

function randomizeGift(){
  // pick a new random gift only while the cat's hand is still up
  if (!isDown){
    gift = gifts[int(random(0, gifts.length))];
  }
}

// in the Screen class, under the play() method

    // displaying gift when cat hand is lowered
    if (isDown){
      image(gift, width/2, height/2+h3)
      gift.resize(0,300)
    }

Communication between Arduino and p5.js

From Arduino:

Ultrasonic Sensor and Servo Motor
The ultrasonic sensor is placed on the side of the cat, under one of the cat's paws that is raised. The reading from the ultrasonic sensor is used to calculate the distance of the user's hand from the cat. If the distance is smaller than a certain threshold, this triggers the servo motor to turn, which causes the cat's paw to lower and touch the user's hand. Once the paw is lowered, this triggers the display of the randomized gift the cat hands to the user in the p5 sketch. So Arduino sends p5 the value of the calculated distance and a boolean that states whether the paw is down or not.
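The exact serial message format isn't shown here, but as a minimal sketch, these two values could be sent as one comma-separated line per reading (the variable and function names below are placeholders):

// placeholders for the calculated distance (cm) and the paw state
int distance = 0;
bool isDown = false;

void sendHandDataToP5() {
  // one comma-separated line: distance, then 1/0 for whether the paw is down
  Serial.print(distance);
  Serial.print(",");
  Serial.println(isDown ? 1 : 0);
}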

Force Sensors
Two force sensors are placed on the cat's head, so that when a user pets the cat, this force is recorded. Both force readings are sent to p5, where they trigger the sound of the cat's meow.
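Following the same pattern, a minimal sketch for reading and sending the two force values could look like this (the analog pin numbers are placeholders):

const int forcePin1 = A0;  // placeholder pin for the first force sensor
const int forcePin2 = A1;  // placeholder pin for the second force sensor

void sendForceDataToP5() {
  // read both raw force values and send them on one comma-separated line
  int force1 = analogRead(forcePin1);
  int force2 = analogRead(forcePin2);
  Serial.print(force1);
  Serial.print(",");
  Serial.println(force2);
}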

From p5:

LCD Screen
The P5 sketch prompts the user to fill in an input box to give the cat a name of their choosing. This name is sent to the Arduino to be displayed on the LCD screen, which is placed on the cat's neck like a collar.

Highlights

I'm really proud of how everything came together, but especially of how I managed the communication between Arduino and p5 for the cat-hand interaction. On the Arduino side, I calculated the distance, used that distance value as the condition that triggers the change in the servo motor's position so the cat's hand lowers, and used a boolean value to state whether the cat's hand is lowered or not; pulling this off is one thing I'm proud of. Then on the p5 side, I used that boolean to write a function that randomizes the gift, and within a method of the class I created, I called this function and displayed the gift whenever the cat's arm is down. Getting this whole process and communication to work well is something I'm very proud of.

Resources

Calculation of distance from ultrasonic sensor

Wiring of LCD

LCD code from Arduino example

Challenges

I initially had issues with the delay of the servo motor, since delay() would stop all of the code; switching to millis() (as shown above) solved this.

One of the greatest challenges I had with this project was finding the right pressure/force sensor for the petting and meowing experience. I initially made a prototype of a pressure sensor using velostat and copper tape, which worked, but when I tried implementing it on a bigger scale, it did not give me a varying range of readings. I then turned to a piezo sensor, but the results were similar. It was then that I found a small force sensor that was very sensitive and exactly what I was looking for; it was just a tad small, so I ended up using two of them to cover a larger area of the cat's head to trigger the meow sound.

Future Improvements

For the future, as I was advised by a user I had the pleasure of meeting at the IM Showcase, I would like to take a more technical approach to the project. I would like to try using a sensor to detect the user's facial expression (and thus emotions), and based on this reading, the cat would have certain reactions (or perhaps facial expressions as well, displayed on a larger LCD screen) in response to the user's emotions. I think with this approach, the experience would create a deeper relationship between the user and the cat, just like a real owner-and-pet relationship.

Fall 2024 Interactive Media Showcase Documentation
Tuesday 10 December 2024

Assignment 13: Final Project User Testing

Progress

I finally finished the hardware aspect of my project, in terms of making the head, body, and arm of the cat (the ears are still missing though..). During the process, I encountered some issues, mainly with the pressure sensor. First, I made my own pressure sensor using velostat, and that works as shown in the final project proposal for Assignment 12. However, this was too small for what I actually need for my final piece, so I tried making it on a larger scale, but the larger pressure sensor did not give me a variety of readings; it only gave high values (in the 1000s) and rarely varied. I tried using a piezo sensor, but it did not vary either. So at the moment, the pressure sensor is not a functioning part of my project.

Other than that, I have made progress with the interface on the p5 side, finished building the arm and tested that it moves with the servo motor, and implemented the LCD to reflect what is written in the input box in the p5 sketch.

User Testing Video

Feedback and Reflections

For this user testing, I asked one of my friends to test my project. He was able to figure it out because I was debugging right in front of him, so he saw what actions I made and knew what to do. But if he hadn't seen me do it, he said he would have been confused, because I didn't give instructions for how to trigger the cat's hand to move.

There were also parts I couldn't finish implementing in time, namely illustrating the different images I want to pop up in the p5 sketch whenever the cat's hand moves. So he said the cat's hand moving right now seems a bit pointless (and I had to explain what was meant to happen).

After this user testing, I realized there are still a lot of improvements and edits that I need to make to my project. One is to give clear instructions in the p5 sketch so that users know exactly what to do. Another is to find a way to get the pressure sensor working. And finally, I need to finish the illustrations so that I can implement them in my p5 sketch.

Assignment 12: Final Project Proposal

Concept

The plan for my final project is to create a pet cat experience. As someone who loves cats but was never fortunate enough to have one because her mother dislikes the idea of having a cat in the house, I want to take this opportunity to give those who are in the same situation as myself the chance to experience the joy of having a cat, without upsetting their mothers.

The cat will be built from cardboard, and it will have a collar on which a name of the user's choosing will be displayed. Cats love being petted, and users will be able to pet the cat and have it meow or purr in contentment.

When you extend your hand towards the cat, the cat will give you its paw and hand you a “gift”. This act of gift giving from a cat is “an expression of affection and trust” which “shows they value your companionship and consider you part of their social group.” (Cited from here)

However, the “gifts” cats bring can range from random toys to dead animals. This will be reflected in the P5.js sketch, wherein every time a user extends their hand, the cat will give them a new “gift”.

I hope this cat will provide comfort for cat lovers like me, or even turn non-cat lovers into one!

Arduino

Ultrasonic Sensor and Servo Motor
The ultrasonic sensor will be placed on the side of the cat, under one of the cat's paws that is raised. The reading from the ultrasonic sensor is used to calculate the distance of the hand from the cat. If the distance is smaller than a certain threshold, this triggers the servo motor to turn, which causes the cat's paw to lower and touch the user's hand. Once the paw is lowered, it triggers the display of the randomized object the cat will hand to the user on the P5 sketch.

Pressure Sensor
The pressure sensor will be placed on the cat's head, so that when a user pets the cat, this pressure is recorded. This will then trigger the sound of a cat's meow to play from P5.

P5

LCD Screen
The P5 sketch will prompt the user to fill in an input box to give the cat a name of their choosing. This name will be sent to the Arduino to be displayed on the LCD screen which will be placed on the cat’s neck, like a collar.

Progress

So far, I have developed test code for the communication between Arduino and P5, covering the pressure sensor, ultrasonic sensor, servo motor, and LCD screen. In this testing sketch, the pressure reading affects the background color of the sketch.

Arduino code

P5.js code of testing sketch

I have also started developing the interface of the final sketch on a separate P5 sketch. So my next step is to further develop the final P5 sketch and then incorporate the code I developed in the testing sketch into the final one. I will also have to build the actual cat from cardboard.

 

Assignment 11: In Class Exercises (Serial Communication)

Exercise 1 – Moving Ellipse

Prompt: Make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on Arduino is controlled by p5.

Arduino Code

P5.js Sketch

Exercise 2 – LED Brightness

Prompt: Make something that controls the LED brightness from p5

Arduino Code

P5.js Sketch

Exercise 3 – Bouncing Ball

Prompt: Take the gravity wind example (https://editor.p5js.org/aaronsherwood/sketches/I7iQrNCul) and make it so every time the ball bounces one led lights up and then turns off, and you can control the wind from one analog sensor

Arduino Code

P5.js Sketch

Assignment 11: Preliminary Concept for Final Project

I want to create a mini pet cat experience! The idea is to build a physical figure of a cat and have it connect to a p5.js sketch which shows the cat's current mood. Users can pet the cat, and using some sensor (a pressure sensor? or a flex sensor?), the Arduino will send this reading to p5.js, which displays the cat's mood on the screen.

Another way I could implement the cat is to have it illustrated in p5.js. This way, users can also play with yarn with the cat and watch the cat on the screen move with the user's hand movements (perhaps tracked by an ultrasonic sensor).

A third implementation, which I think is on the harder side, would be to make a physical figure of a cat with wheels so that it is able to move. Perhaps I could make the user control this movement using an ultrasonic sensor as well.

Reading Reflection 11: Design Meets Disability

As someone who has been wearing glasses for almost all her life, I never considered them an assistive device (even though they are, since I would be blind without them). It really just proves how the use of eyewear has become so normalized, and somehow turned into fashion wear. I'm all for it honestly; I love when people get creative with self expression, and wearing a variety of glasses is one of the ways people do so. I think the same can be said about contact lenses. They are technically an assistive device too, yet they are also commonly used to enhance one's look; another form of self expression.

I really appreciate the comment on simplicity; it reminds me of the saying “less is more.” Sometimes it's so easy to go overboard with features, always wanting to improve or update a work in progress, but it only ends up messy and overwhelming to work with. So this was a great reminder to keep it simple, because not everything with extensive features is good; as long as it does what it's intended to do, I think that's a success. And sometimes, if we focus too much on adding features, we tend to lose track of the original intention, so it's definitely better to keep it simple and not lose sight of what's intended.

Assignment 10: If fairies played the piano 🧚‍♀️ (Musical Instrument)

Concept

Aysha and I designed a musical instrument that merges digital and analog elements to create an interactive, light-based piano. Our idea centers around a glowing wand that functions as a control interface. When moved over the keys, the wand activates notes from the C major scale (C, D, E, F, and G) like magic, and the brightness of its glow determines the octave. We integrated a potentiometer to adjust brightness, allowing users to easily shift the octave up or down. Additionally, we added a switch to toggle the instrument off, which prevents accidental note activation—particularly useful to avoid unintended sounds from ambient light sources like flashlights.

Details

To bring our vision to life, we used five photoresistors to detect light from the wand and mapped each sensor’s range to specific notes (C, D, E, F, and G) and their octave scales. By setting sensor thresholds from a default minimum to a maximum value that a flashlight might produce, we could dynamically adjust the octave based on the brightness the photoresistor detects. Essentially, the brighter the wand, the higher the octave, allowing for an expressive range in tone.
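As a rough illustration of this mapping (the pin numbers, thresholds, and base frequency below are placeholders rather than our exact values), one key of the keyboard could be handled like this:

const int ldrPin = A0;      // placeholder photoresistor pin for the C key
const int speakerPin = 8;   // placeholder speaker pin
const int minLight = 300;   // reading below which the key is considered untouched
const int maxLight = 900;   // reading with the wand at full brightness
const int baseNoteC = 262;  // middle C in Hz

void playKeyC() {
  int reading = analogRead(ldrPin);
  if (reading > minLight) {
    // brighter wand -> higher octave: map the reading to an octave shift of 0-2
    int octave = map(constrain(reading, minLight, maxLight), minLight, maxLight, 0, 2);
    // each octave doubles the frequency
    tone(speakerPin, baseNoteC << octave);
  } else {
    noTone(speakerPin);
  }
}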

For the wand itself, we created a purple glow using an RGB LED, giving the instrument an ethereal fairy-like visual quality. A potentiometer is attached to the LED to control brightness, making it easy for users to adjust the octaves.

Demo

(peep Aysha’s amazing vocals)

 

Code snippet

To vary the brightness of the RGB LED, we used a potentiometer. We converted its sensor reading into a percentage and fed it into a setColor function (from here), where it acts similarly to the alpha parameter in an RGBA function. This way we can vary the intensity of each of the red, green, and blue channels of the RGB LED to control the overall brightness of the purple color.

void loop() {
  // reading sensor value from potentiometer
  sensorVal = analogRead(potPin);
  brightness = (double)sensorVal / 1023;
  Serial.println(brightness);

  setColor(170, 0, 255, brightness); // purple color
}

// function from (https://projecthub.arduino.cc/semsemharaz/interfacing-rgb-led-with-arduino-b59902)
void setColor(int redValue, int greenValue, int blueValue, double brightValue) {
  analogWrite(redPin, (double) redValue * brightValue);
  analogWrite(greenPin, (double) greenValue * brightValue);
  analogWrite(bluePin, (double) blueValue * brightValue);
}

Code

Wand Code

Keyboard Code

Circuit illustration

Schematics

Keyboard:

Wand:

Reflection

This was a really fun project to make! I think in the future, if we had more space, we could add more photoresistors to act as additional keyboard keys, and thus have more playable notes, or add a second wand to allow two fairies to play the piano!

Reading Reflection 10: A Brief Rant on the Future of Interaction Design

After reading Bret Victor's rant on how technology is growing and how the future is illustrated as being filled with fingers swiping on screens, a phenomenon he calls “Pictures Under Glass”, I found myself thinking about how current trends could shape the future.

Yes, there is the image of a developed future with advanced technologies everywhere. But seeing how the internet and its trend culture affect the development of people's preferences, maybe the future isn't set to be just about advanced technologies.

Take the digicam trend right now: people are going back to using digicams over their iPhone cameras to get that aesthetic retro look in their photos. Then there is also the Y2K fashion trend. Again and again, people are finding things from ‘back then’ interesting and bringing them back to their peak.

I personally like this part about society; finding the joy in things that other people used to enjoy in another era of time, that may have been long forgotten.

Assignment 9: Watch out for cars! (Analog I/O)

Concept

I wanted to do some kind of distance sensing, so I tried to use the ultrasonic sensor; however, I didn't fully understand how it works yet, so I opted to use a photoresistor as my sensor instead. The reading from the photoresistor affects the brightness of the yellow LED: the less light there is (the closer the car is to the sensor), the brighter the yellow LED, lighting up as a warning. As for my switch, I attached one connection to a strip of copper tape and the other to the car: whenever the car is in contact with the copper tape, the red LED lights up.

Photos and Videos

Code Snippet

// constraining and mapping sensor reading to range 0 - 255
sensorVal = constrain(sensorVal, 700, 725);
brightness = map(sensorVal, 725, 700, 0, 255);

This is a snippet of code that I'm proud of: I used the map() function to convert the sensor readings into analog output values, but I had to reverse the mapping since I want the LED to get brighter when the sensor readings are low (an inverse relationship).
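For context, a minimal version of the full loop could look like this (the pin numbers are placeholders; the 700 to 725 range matches the snippet above):

const int sensorPin = A0;     // placeholder photoresistor pin
const int yellowLedPin = 9;   // placeholder PWM pin for the yellow LED

void setup() {
  pinMode(yellowLedPin, OUTPUT);
}

void loop() {
  int sensorVal = analogRead(sensorPin);

  // constrain and reverse-map: lower light reading -> brighter LED
  sensorVal = constrain(sensorVal, 700, 725);
  int brightness = map(sensorVal, 725, 700, 0, 255);

  analogWrite(yellowLedPin, brightness);
}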

Circuit Illustration

Arduino Setup

This was made using Tinkercad.

Code

GitHub

Reflection

I want to fully understand the ultrasonic sensor and then use it next time for projects involving distance ranges.

Reading Reflection 9

Physical Computing’s Greatest Hits (and Misses)

This reading was very insightful and interesting! I really enjoyed learning about all the different kinds of physical computing and how it has developed. I love the reference to the Scooby-Doo painting; I never thought it could be a type of physical computing.

There's this idea of physical computing being used as a horror factor. With ‘interactive dolls’, to me, the thought of them moving or making a sound on their own is slightly disturbing and unsettling. Imagine seeing one move in a dark room and then hearing it make a sound; I'd be very freaked out. But I guess my freaked-out reaction is what makes ‘interactive dolls’ thrive in the market; the fact that they can incite such a strong reaction out of someone (though it's not a positive reaction).

Making Interactive Art: Set the Stage, Then Shut Up and Listen

Tom Igoe highlights the importance of not interfering with the audience's experience of an interactive piece. I think this is a crucial note for artists because the main idea behind interactive art is to let the audience interact with the piece; it's part of the art experience. Their confusion, excitement, all these reactions (or the lack thereof) are what make a piece interactive.

In his words, artists are supposed to ‘listen’ to the audience. This reminds me of the three components of interactivity (according to Chris Crawford): listening, thinking, and speaking. In the same way an interactive piece is supposed to listen to the audience's actions, an artist should listen to the audience's reactions to their piece; only then will the artist truly see the effect their piece has on the public.