Final Project Documentation

Concept / Description

My project was inspired by the simple robots and AI toys of the ’90s and early 2000s (like Tamagotchi pets) that were made just to be fun toys for kids. In our current age, we’re so used to advanced AIs that can carry out complex reasoning, but I wanted this robot to inspire a sense of nostalgia and comfort. It also serves as an anchor point to see how far we’ve come in the past two decades as more “intelligent” AI develops. The main interaction and premise of this robot center around its hunger and feeding it. It starts off neutral, but as you feed the robot, it gets happier. However, if you overfeed it, it’ll get nauseous. If you don’t feed it at all, over time, it’ll get incredibly sad. You need to watch out for its needs and make sure it’s in a Goldilocks state of happiness and being well-fed. The robot loves attention, so if you hold its hand, it’ll get happy regardless of its hunger levels. However, if you hold its hand with too much force, it’ll feel pain and get sad.
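The feeding and hand-holding behavior above can be sketched as a tiny state function. This is a hypothetical reconstruction in plain C++, not the project's actual code: the hunger scale (0–10), the thresholds, and the FSR values are all my assumptions.

```cpp
// Hypothetical sketch of the mood logic; the hunger scale (0..10) and
// all thresholds are my assumptions, not the project's actual values.
enum Emotion { NEUTRAL, HAPPY, SAD, NAUSEOUS };

// Hunger alone: starving -> sad, overfed -> nauseous, well-fed -> happy.
Emotion moodFromHunger(int hunger) {
  if (hunger <= 2) return SAD;       // neglected for too long
  if (hunger >= 9) return NAUSEOUS;  // overfed
  if (hunger >= 5) return HAPPY;     // the Goldilocks zone
  return NEUTRAL;
}

// Hand-holding overrides hunger, unless squeezed past a pain threshold.
Emotion moodWithHand(int hunger, int fsrValue) {
  const int PAIN = 800;   // assumed FSR reading for "too much force"
  const int TOUCH = 100;  // assumed FSR reading for a gentle hold
  if (fsrValue > PAIN) return SAD;
  if (fsrValue > TOUCH) return HAPPY;
  return moodFromHunger(hunger);
}
```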

The music and sounds from the p5 sketch use 8-bit audio to tie into the retro feel of the robot. The limited pixels of the LCD display also give a sense of limited technology to take you back a few decades.

Video Demonstration:

Cleaner version: https://drive.google.com/file/d/15zkLTwSH97eqe1FHWSYq188_5F6aHUkX/view?usp=sharing

Messier version:

https://drive.google.com/file/d/1rzX4EbBVYXzRDgda-7Dk08BkqQ0m9Qx8/view?usp=sharing

Media (photos)

Implementation

Link to sketch: https://editor.p5js.org/bobbybobbb/full/yeMCC3H4B

p5 and Arduino communicate with each other by exchanging values like the emotional state of the robot and the force-sensitive resistor (FSR) readings. p5 controls the emotional value (each number represents a different emotion) and sends it to the Arduino so that the LCD screen displays the correct facial expression and the LED lights show the corresponding colors. The emotional state also controls the servo motors that act as the legs. The FSR values get sent to p5 to control sadness and happiness, since the sensors act as hands being held. Interactions also correspond with specific sounds, which I’m particularly proud of as it adds a lot more atmosphere to the experience. For example, holding hands triggers one sound, holding the hands too hard triggers another, feeding the robot triggers another, the hunger bar going down triggers a sound, and feeding on a full stomach triggers yet another.
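As a rough illustration of the "each number represents a different emotion" scheme, the Arduino side can turn a received code into a color and a leg behavior with a small lookup. The codes follow the numbering listed later in this document (1 happy, 2 sad, 4 full/barf, 5 neutral), but the specific colors and "legs swing" flags here are my guesses, not the actual sketch:

```cpp
#include <cstdint>

// Illustrative mapping from an emotion code (received over serial from p5)
// to a NeoPixel color and servo behavior. Colors and flags are assumptions.
struct Display {
  uint8_t r, g, b;  // NeoPixel color
  bool legsSwing;   // whether the servo legs move
};

Display displayForEmotion(int code) {
  switch (code) {
    case 1:  return {255, 220, 0, true};    // happy: warm yellow, legs swing
    case 2:  return {0, 0, 255, false};     // sad: blue, legs still
    case 4:  return {0, 255, 0, false};     // nauseous: green
    default: return {255, 255, 255, false}; // neutral: white
  }
}
```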

Once I had all my functionality implemented (the code and the circuit), I moved on to beautifying the robot by building a casing for it. The wires and circuit make it hard to build a simple box around the robot, so I did a lot of paper prototyping at first to get the shape and dimensions of the casing. By using paper, I could easily cut and paste pieces together to fit around the robot, and even when I made mistakes, the adaptability of paper made them simple to fix. Once I found the right dimensions, I created Illustrator files to laser-cut the pieces out of acrylic. From there, I needed to drill the sides together to create a 3-dimensional box shape.

Early prototype:

Video of early prototype (Had to make sure all the functionality worked before the visuals came in):

https://drive.google.com/file/d/1RJzqBWGN9Tan1qQ-CqXS2n1jlQ580AKP/view?usp=sharing

User Testing

When user testing, peers commented on the user interface of the p5 sketch and mentioned how it’d be nice if the sketch matched the physical body of the robot better. They also mentioned the awkwardness of holding the robot (before it was encased). I was at a loss for how to build the casing for the body, so I asked some of my peers who are more experienced with these kinds of things for suggestions. Under the advice of Sumeed and David, and with the help of the IM lab assistants, I ended up using L-shaped brackets to assemble the box and laser cutting it out of acrylic.

Difficulties

Communication between p5 and Arduino was difficult to implement because my computer crashed at some point from the code. I wasn’t sure what I did wrong, so I referred to the example from class, replicated it and changed some values to test out simple functionality at first. Once I made sure Arduino and p5 were communicating in real time, I started building my project from there.

Most of my difficulties came from hardware and building the physical robot since I’m most unfamiliar with hardware compared to software. For example, I wanted the FSR to resemble hands poking out the robot, but upon taping down the FSR, I realized that depending on where you tape the FSR, this’ll affect the sensor readings. There’s also very limited room on the base plate I’m using to hold the Arduino and breadboard for all the wiring involved. For example, I wanted everything to be contained in a neat box, but the Neopixel wires stick out quite a bit. I ended up just making a bigger box to counteract this.

Using Neopixels was a huge part of my project and a must. To use them, I needed to solder wires to the Neopixels, which took a really long time because instead of soldering into a hole, I was soldering onto a flat surface and had to make sure the wires stuck to that flat copper pad. Sometimes the wires would fall off, or it’d just be really difficult to get the wire to stick to the solder on the copper surface. After soldering came the software; I tested using Adafruit’s strandtest example, and although the lights turned on perfectly, they weren’t displaying the right colors. Mind you, I grabbed these at random from the IM lab, so I had no idea what type of Neopixels they were. It simply came down to testing the settings for the different Neopixel types that exist until I hit the right one.
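The wrong-colors symptom comes down to how different NeoPixel variants consume their 24 bits: many strips expect green first (GRB) rather than RGB, so an unknown strip can light up fine while showing swapped colors. With the Adafruit library this is the color-order flag passed to the strip constructor (e.g. `NEO_GRB` vs `NEO_RGB`). A minimal model of the symptom, in plain C++:

```cpp
#include <cstdint>

struct RGB { uint8_t r, g, b; };

// What you actually see if RGB-ordered bytes are sent to a strip that
// consumes them in GRB order: the first byte lands on the green channel,
// the second on red -- so red and green swap.
RGB seenOnGRBStrip(RGB sentAsRGB) {
  return {sentAsRGB.g, sentAsRGB.r, sentAsRGB.b};
}
```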

The LCD screen is also technically upside-down on the robot body, because that’s the only way to leave maximum room on the breadboard for wires. Since I had no other option but to mount the screen upside down, I had to draw and display all the pixels and custom bytes upside down. This required a lot of coordination and rewiring my brain’s perspective.
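Mounting the screen upside down means every 5×8 custom character has to be rotated 180°. Rather than redrawing each glyph by hand, the rotation can be done in code: reverse the order of the 8 rows, then reverse the 5 bits within each row. This is a sketch of that idea, not the project's actual code:

```cpp
#include <cstdint>

// Reverse the low 5 bits of one glyph row (LCD custom chars are 5 px wide).
uint8_t reverse5(uint8_t row) {
  uint8_t out = 0;
  for (int b = 0; b < 5; b++)
    if (row & (1 << b)) out |= 1 << (4 - b);
  return out;
}

// Rotate a 5x8 glyph 180 degrees: flip the row order, then mirror each row.
void flip180(const uint8_t in[8], uint8_t out[8]) {
  for (int i = 0; i < 8; i++)
    out[i] = reverse5(in[7 - i]);
}
```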

Future Improvements

In the future, I want to use a separate power source for the servo motors and Neopixels, because every time the servo motors run, the LCD screen blinks and dims since the motors draw a lot of power. Every time the Neopixels switch from one color to another, the LCD screen is also affected. I think hooking up a battery to the circuit would solve this problem. More sensors and ways to interact with the robot would also be nice.

Week 11 – Reading Response

Design Meets Disability

When reading about how eyewear went from medical necessity to iconic fashion accessory, I thought about how headphones could go down that route as well. Hearing aids are designed to be small and discreet, which can even hinder their abilities. Nowadays in popular fashion, however, big chunky headphones are worn as part of the outfit. For example, people purchase overpriced AirPods Max despite cheaper options with better noise cancelling, because the Maxes are a status symbol with a clean, slick look that helps complete the outward appearance someone’s trying to uphold. Ear jewelry is a huge part of the fashion industry and part of everyday accessorizing for some people, yet at the same time, hearing aids are treated as shameful. If accessibility and fashion designers could work together to create something people with hearing impairments can be proud of, it’d be like the difference between swallowing bitter medicine and taking gummy vitamins.

I really liked reading about Hugh Herr and his prosthetic legs that do more than biological legs can do. They actually help him perform climbing feats that able-bodied people cannot. Perhaps prosthetics shouldn’t aim to replicate limbs but to achieve beyond what’s possible. There’s so much freedom to create and add; prosthetic limbs shouldn’t be boxed into replicating what society deems normal. This ties back to the idea of designing “invisible” and discreet products, which implies an underlying shame attached to them and to having a disability. Confidence, however, stems from being proud of oneself, and that’s hard to achieve if these products inherently disregard their user or wearer. Having more interface designers in this field could alleviate such problems.

Week 11 – Serial Communication

Features:

P5 sketch:

Arduino Code:

// Inputs:
// - A0 - sensor connected as voltage divider (e.g. potentiometer or light sensor)
// - A1 - sensor connected as voltage divider 
//
// Outputs:
// - 2 - LED
// - 5 - LED

int leftLedPin = 2;
int rightLedPin = 5; // PWM-capable pin, driven with analogWrite below

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  // We can't use the serial monitor since the serial connection is
  // used to communicate to p5js and only one application on the computer
  // can use a serial port at once.
  pinMode(LED_BUILTIN, OUTPUT);

  // Outputs on these pins
  pinMode(leftLedPin, OUTPUT);
  pinMode(rightLedPin, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(leftLedPin, HIGH);
  digitalWrite(rightLedPin, HIGH);
  delay(200);
  digitalWrite(leftLedPin, LOW);
  digitalWrite(rightLedPin, LOW);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int left = Serial.parseInt();
    int right = Serial.parseInt();
    if (Serial.read() == '\n') {
      digitalWrite(leftLedPin, left);
      analogWrite(rightLedPin, right);
      int sensor = analogRead(A0);
      delay(5);
      int sensor2 = analogRead(A1);
      delay(5);
      Serial.print(sensor);
      Serial.print(',');
      Serial.println(sensor2);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}


Final Project IDEA

Final project idea:

  • Create something similar to a Tamagotchi – 3D print casing – spherical with a space for an LCD screen
  • LCD Screen will emote emotions – happy, sad, angry, barf
  • You can feed the vivagachi via p5 – this will in turn make it happy in the physical world – also play sounds from computer
  • If you feed it too much – it will make a barf face
  • https://www.thingiverse.com/thing:6455210
  • How it works:
    • P5 will be in charge of all the code; it’ll just send Arduino all the expressions. For example, [1,2,3,4] – each represents a diff emotion
      • 1 – happy – after being fed
      • 2 – sad – after being hit 
      • 3 – scared (light sensor for darkness)
      • 4 – full/barf
      • 5 – neutral
    • Can also have a sensor ..? – if it’s too dark it’ll make a scared face
    • Servo motor – control arms/hands – when happy, arms swing back and forth
    • Colors! – use neopixels – depending on the emotional transition – so each emotion has a color associated with it – depending on the transition, it’ll lerp one color to another. 1 to 2 – yellow to blue, for example
    • Force sensor on its hand…? If you hold it, it’ll be happy, but if you press too hard, it’ll be sad (threshold)
    • Tilt sensor – if it’s upside down, say it’s dizzy
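The color-transition bullet above can be sketched as a per-channel linear interpolation, where t runs from 0 (the old emotion's color) to 1 (the new one). The yellow-to-blue pair follows the "1 to 2" example; everything else here is illustrative:

```cpp
#include <cstdint>

struct RGB { uint8_t r, g, b; };

// Linearly interpolate each channel between two emotion colors.
// t = 0 gives color a, t = 1 gives color b.
RGB lerpColor(RGB a, RGB b, float t) {
  return {
    (uint8_t)(a.r + (b.r - a.r) * t),
    (uint8_t)(a.g + (b.g - a.g) * t),
    (uint8_t)(a.b + (b.b - a.b) * t),
  };
}
```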

Week 10 – Reading Response

A Brief Rant on the Future of Interactive Design + Follow-up

The first minute of the Microsoft video envisioning the future seemed really cool to me, but as it went on, the video kept repeating the same ideas over and over again. It felt like our future was limited to one or two movements. The rant definitely opened up my eyes to the subconscious abilities our hands possess. Maneuvers and motions we’ve been doing since young have become so natural we don’t realize just how powerful this sense is. The rant and response to comments made about the rant reminded me of the movie Wall-E where in the distant future, all the humans become so reliant on screens, they become unable to use their body. Living life through a screen when we’re blessed with so many degrees of motion immobilizes you; we see it even now as people get more glued to screens and are constantly sitting or lying down. I do wonder though what some potential solutions to this “picture under glass” future would be. I’m thinking about somehow incorporating textures, weight, and 3D objects because the main problems mentioned were how our hands have the ability to sense and manipulate things from touch, but a 2D glass screen avoids all of that. Or maybe centering designs around actions we can perform like flipping pages, pinching things, twisting, squishing, etc. Maybe even taking inspiration from bigger actions like planting flowers,  steering and feeling the torque of the wheel, or feeling water and how it sways under the force of your hands.

Week 10 – Instrument

Description:

Oyun and I decided to create a musical instrument using force-sensitive resistors (FSRs) that reminded us of drum pads. There are 3 resistors in total, and each represents a different octave. Moreover, if you touch multiple at the same time, you get yet more octaves; across all the combinations there are 7 altogether. The FSRs are used as digital inputs in this case, but we have a potentiometer that moves you along the scale within each octave. By turning the potentiometer, you get the full range of an octave, and by touching the FSRs, you switch between octaves. We also use an LCD screen to visually represent what you’re playing. Each octave has its own symbol/emoji on the screen, and depending on the note you’re playing, the emoji slides across the screen: the higher the note, the farther it slides.
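One compact way to see the seven combinations is as a 3-bit mask: each pressed FSR contributes one bit, giving the values 1–7. This is just an alternative sketch, not our actual code (which uses explicit if-statements with its own index order), though the threshold of 100 matches the touch detection we settled on:

```cpp
// Alternative sketch: treat the three FSRs as a 3-bit mask. Each pressed
// sensor sets one bit, so the 7 non-empty combinations map to 0..6.
// (The real sketch uses explicit if-chains with its own index order.)
int octaveIndex(int fsr1, int fsr2, int fsr3) {
  const int TOUCH = 100;  // readings above this count as a touch
  int mask = (fsr1 > TOUCH ? 1 : 0)
           | (fsr2 > TOUCH ? 2 : 0)
           | (fsr3 > TOUCH ? 4 : 0);
  return mask - 1;  // -1 means no touch; otherwise an index into melody[7][12]
}
```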

Video Demonstration:

video demonstration

Image of circuit:

Difficulties:

At first, we wanted to use the FSRs as analog inputs and tried many ways of doing so, but we ended up using them as digital inputs because we liked that you can make different combinations. The FSRs reminded us of drum pads, which is why we chose them. We also tried soldering the FSRs to wires to make them easier to connect to the breadboard, but that ended up taking way too long, so we just stuck them directly into the breadboard.

We ended up using many wires and at some point, our breadboard became a bit overcrowded, and there was one time when we didn’t realize that one of the resistors was connected to 5V instead of GND.

To detect whether or not someone touched the FSR at first, we checked whether or not the reading was above 0, but realized quickly that even without touching the FSR, it was rarely ever 0. It teetered on values just above 0, so we instead used a threshold of 100.

Another huge problem we encountered in one of our past ideas was using multiple speakers at the same time. We were going to have each FSR correspond to their own speaker, but found out that the tone function doesn’t allow for multiple outputs.

Code Snippets:

Printing the custom emojis:

// make some custom characters:
byte Heart[8] = {
  0b00000,
  0b01010,
  0b11111,
  0b11111,
  0b01110,
  0b00100,
  0b00000,
  0b00000
};
// LCD screen
if (lastPotInd != potInd) {
  lcd.clear();
}
lcd.setCursor(potInd, 0);
lcd.write(byte(melodyInd));

2D array of notes:

int melody[7][12] = {
  {NOTE_C1, NOTE_CS1, NOTE_D1, NOTE_DS1, NOTE_E1, NOTE_F1, NOTE_FS1, NOTE_G1, NOTE_GS1, NOTE_A1, NOTE_AS1, NOTE_B1},
  {NOTE_C2, NOTE_CS2, NOTE_D2, NOTE_DS2, NOTE_E2, NOTE_F2, NOTE_FS2, NOTE_G2, NOTE_GS2, NOTE_A2, NOTE_AS2, NOTE_B2},
  {NOTE_C3, NOTE_CS3, NOTE_D3, NOTE_DS3, NOTE_E3, NOTE_F3, NOTE_FS3, NOTE_G3, NOTE_GS3, NOTE_A3, NOTE_AS3, NOTE_B3},
  {NOTE_C4, NOTE_CS4, NOTE_D4, NOTE_DS4, NOTE_E4, NOTE_F4, NOTE_FS4, NOTE_G4, NOTE_GS4, NOTE_A4, NOTE_AS4, NOTE_B4},
  {NOTE_C5, NOTE_CS5, NOTE_D5, NOTE_DS5, NOTE_E5, NOTE_F5, NOTE_FS5, NOTE_G5, NOTE_GS5, NOTE_A5, NOTE_AS5, NOTE_B5},
  {NOTE_C6, NOTE_CS6, NOTE_D6, NOTE_DS6, NOTE_E6, NOTE_F6, NOTE_FS6, NOTE_G6, NOTE_GS6, NOTE_A6, NOTE_AS6, NOTE_B6},
  {NOTE_C7, NOTE_CS7, NOTE_D7, NOTE_DS7, NOTE_E7, NOTE_F7, NOTE_FS7, NOTE_G7, NOTE_GS7, NOTE_A7, NOTE_AS7, NOTE_B7}
};

Mapping potentiometer value to an array index:

// analogRead spans 0..1023, and each octave has 12 notes (indices 0..11),
// so constrain keeps the index in range even at the pot's extreme
potInd = constrain(map(potReading, 0, 1023, 0, 11), 0, 11);

Logic for changing the octaves using FSR combos and selecting correct array index:

// all 3
if (fsrReading1 > 100 && fsrReading2 > 100 && fsrReading3 > 100) {
  melodyInd = 5;
}

Future Improvements:

Next time, we want to find a way to make the FSR lie flat on a surface so touching them feels more intuitive and more like a drumpad. We can also try controlling different factors about the sounds like the volume. We can also make the note progressions more predictable so that users can go through different notes more easily and play an actual song.

Week 9 – Reading Response

Physical Computing’s Greatest Hits (and misses):

The first category that popped up was a theremin-esque instrument, which is exactly how I described my week 9 assignment: sound produced based on distance/motion. It’s interesting how motifs get repeated constantly in physical computing. Core functionalities repeat, but how you make a project different, with your own spin on the functionality, is what makes physical computing worth pursuing. This blog brought up an important point about interactions that have meaning. Rather than simply mimicking a user’s actions or mapping one-to-one interactions, thinking about why certain interactions are valuable can fuel a physical computing project even more. For example, I really liked the accessibility examples provided, like the one where someone uses their wheelchair as a painting tool as opposed to their limbs.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

Interactive art is a mix of design and art: your piece should evoke the right emotions without having to tell users how to feel, just like how a well-designed door doesn’t need a sign saying push or pull; it should be intuitive (design principles). An interactive piece is a conversation, not a one-sided story where the artist tells the user how to feel or approach the piece. The piece can grow and evolve with the feedback and reactions users provide. By listening to their side of the conversation, you can learn a lot about how people behave with your piece and ways to change it. This brings me back to a previous reading response about interactivity as not just a response to users’ actions but a tool for users to create with. By giving users a sense of agency, they’re able to create the art with you. We as artists might feel inclined to spell out our intentions for audiences and explain how complex our projects are, but if a project is truly complex and thoughtful, it shouldn’t need explanation.

Week 9 – Analog and Digital Sensors

Description:

I really wanted to try using the ultrasonic motion sensor for this week’s assignment and control both audio and visuals with it. Being able to have your hand control something without touching it seemed cool to me, kind of like the theremin instrument that allows musicians to wave their hand in the air to produce different notes. My assignment is much simpler, but has a similar idea in that the closer your hand gets to the sensor, the lower the pitch. I also have LEDs hooked up that get dimmer the closer your hand gets and brighter the farther away. For the digital sensor, I have a button controlling my 4 LEDs; the button allows you to toggle between the LEDs individually. Once you’ve gone through all the LEDs and press the button again, all of them turn on. 

Video demonstration:

Schematic:

Difficulties:

I had a hard time with the button because it wasn’t as simple as pressing it to toggle between the LEDs. When you press the button, you’re technically holding it and giving it HIGH voltage for multiple frames. That means within one button press, I’m toggling between who knows how many LEDs. However, I only want one button press to change one thing. I did some scouring online and found this state-changing example that made me realize I can just detect when the button goes from LOW to HIGH (off to on). By keeping track of the last button state and current button state, I can detect when the user clicks the button. That instance determines whether or not to toggle between the LEDs: https://docs.arduino.cc/built-in-examples/digital/StateChangeDetection/

Button state change:

// compare current and previous button states to see if there's a change
if (buttonState != lastButtonState) {
  // if HIGH, the button went from off to on: count this as one press
  if (buttonState == HIGH) {
    // ...toggle to the next LED here...
  }
}
lastButtonState = buttonState; // remember the state for the next loop

Mapping distance to LED brightness:

analogWrite(LEDarr[currentLED], map(dist, 0, 20, 0, 255));
if (dist < 3) {
  analogWrite(LEDarr[currentLED], 0);
}

It was also difficult making the different pitches noticeable, since the motion sensing isn’t the most accurate and there isn’t a huge range to work with. To counteract this, I chose notes that are very far apart from each other so that the change in pitch with distance is more apparent. I also printed out the readings to see how far a person would comfortably stand from the sensor and used that distance as my max.
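Picking far-apart notes can be pushed one step further by snapping the distance into a few buckets an octave apart, so small sensor jitter never changes the pitch. The distances and frequencies here are illustrative (C4–C7 in Hz, with an assumed 20 cm max range), not my actual values:

```cpp
// Snap an ultrasonic distance (cm) to one of four notes an octave apart,
// so sensor jitter doesn't cause audible pitch wobble. Frequencies are
// C4, C5, C6, C7 in Hz; the 20 cm max range is an assumption.
int noteForDistance(int cm) {
  const int notes[4] = {262, 523, 1047, 2093};
  if (cm < 0)  cm = 0;
  if (cm > 20) cm = 20;   // clamp to the comfortable sensing range
  int idx = cm * 4 / 21;  // 0..20 cm -> bucket 0..3
  return notes[idx];
}
```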

Looking forward:

In the future, it would be nice if maybe I map a specific LED/color to a specific note or melody so that when I play songs from the speaker, the lights match accordingly. It would also be cool if I can play a song from the motion sensor and choose what notes to play given the distance/motion.

Week 8 – Reading Reflection

Attractive things work better:

I think the most important point for me in this essay was the idea of context being a vital part of design. Depending on the time of day, mood, and situation, certain objects might be preferred over others. For example, I sometimes switch the type of bag/purse I bring outside depending on where I’m going. The aesthetics and functionality of the bag both matter when choosing for the right occasion. Taking into account negative moods and stress when designing or using an object also rings true in this example. Let’s say I’m going somewhere with a lot of walking or physical activity. I don’t want a clunky bag with unnecessary pockets and components; I want something easy to carry even if it looks a little more plain. When I’m doing a lot of physical activity, I’m going to feel more stressed, so choosing a simpler bag avoids unnecessary panic when digging for items. People think differently when they’re under pressure or anxious.

Her Code Got Humans On The Moon

I had known that women were a crucial part of the start of computer programming from a previous media class that discussed computers and the necessity of a female workforce during WWII. However, even when women were working on the same things as men, they were portrayed in the media as doing clerical or busy work. They weren’t acknowledged for how much they contributed. These were women with degrees in mathematics and various STEM fields, looked down upon and dismissed despite their qualifications. Going into the ’60s and space exploration, it’s still evident how women’s opinions weren’t as valued despite how critical they were. Hamilton’s expertise was crucial for Apollo’s success and for software engineering as a whole. It’s important to remember that these people were our predecessors and the backbone of the knowledge we have available today. Even though this field has been historically male-dominated, we have to remember that women were a very important part of its emergence.

Week 8 – Unusual Switch

Brainstorming:

I had a hard time coming up with a unique switch idea, but my process entailed thinking of different body parts that could be used to bring pieces of metal together. I thought back to a design class I took in the past where the professor talked about accessible design; she mentioned doors with handles that turn like this, which don’t require you to use your hands. You can use your elbow, for example, to rotate the handle and, from there, pull with your elbow as well. Sometimes, when my hands are dirty from eating food, I’ll use my elbow to open doors like this to access the bathroom.

I decided to use my elbow to trigger a switch. I taped a wire to my elbow and had a strip of copper tape on the table. This copper strip had the other part of the circuit taped to it, so as soon as the wire on my elbow comes into contact with the copper strip, a bunch of LEDs should turn on.

My first attempt was a bust. I just took the two wires connected by the buttons and put them together, but I realized that it took a while for the LEDs to turn off. I looked at the serial monitor and saw that the button was technically “on” (1) even after the circuit was broken. I realized after a while that it was because I didn’t ground the circuit: without a path to GND, the input pin floats and keeps reading HIGH. With the button and breadboard, a pull-down resistor connects the wire to GND so the pin reads LOW as soon as the circuit is broken.

Failed attempt:

After realizing this, I taped the resistor and second wire together so that the electrons can go to GND and no longer light up the LEDs. Now, the LED immediately turns off after my elbow lifts up.

F I N A L    V I D E O:

Picture of the setup:

Code:

Similar to our in-class code, but this time, I have 4 LEDs. The Serial.println line is very helpful for debugging!

Serial.println(buttonState);
// if button is pressed, turn LED on
if (buttonState) {
  digitalWrite(10, HIGH);  // turn the LED on (HIGH is the voltage level)
  digitalWrite(11, HIGH);  // turn the LED on (HIGH is the voltage level)
  digitalWrite(12, HIGH);  // turn the LED on (HIGH is the voltage level)
  digitalWrite(13, HIGH);  // turn the LED on (HIGH is the voltage level)
}
// if button is not pressed, turn them off
else {
  digitalWrite(10, LOW);  // turn the LED off (LOW is the voltage level)
  digitalWrite(11, LOW);  // turn the LED off
  digitalWrite(12, LOW);  // turn the LED off
  digitalWrite(13, LOW);  // turn the LED off
}

Looking forwards:

I was thinking since I’m using a strip of copper, maybe I can somehow use the distance between the wires on the copper strip to determine which color LED to turn on. It’d be cool if I can measure how far away the wires are from each other on the copper strip using analog read instead of digital.