Final Project | Interactive Mr Octopus

Introduction

Interactive Mr Octopus is a wireless robot designed to simulate a pet. It can be driven using the arrow keys on a laptop. It has emotional states that depend on its surrounding conditions, and the digital version on the laptop responds according to the current emotional state of the physical robot. It also has sound interactions tied to its current emotional state. The project was inspired by a quest to create something fun and challenging: a moving robot that has emotions.

Github | p5.js Sketch

Cover Photo

 

I also had an instruction sheet for the IM Showcase in addition to the onscreen instructions in p5. This was because I observed that very few people would actually read the instructions at the beginning. A good way to get them to read was to make the instructions tangible by putting them on a piece of paper, which somehow enhanced their ‘influence’. Here is an image of the instruction sheet:

Instruction text Key Bind visuals

Here’s a Demonstration video from the IM Showcase:

Inspiration

This started off as a joke. Literally. My friends and I had won a plushy at a game store in Dubai, and we joked about how cool it would be if I could make it move and give it character. It sounded like a crazy and cute idea. So, I said – why not? I wanted to make something ambitious for my final project that would move wirelessly. I had initially considered a giant plushy of a Pokemon character – Jigglypuff – but realized that it was too big and I would need something smaller. So, I changed my concept to an Octopus plushy and decided to give it character and emotions.

I borrowed, tried, and tested lots of different sensors and modules from the IM Lab before deciding what to do and which ones to use.

19 BOOKINGS!! However, these gave me a very good idea of the resources available and how I could use them.

Most of the sensors were unreliable on a moving object, so I ended up using just two types of sensors. Experimenting with the modules gave me an idea of how to set up wireless communication.

Concept and Interaction Design

The project consists of two main interfaces:

The Mr Octopus Robot is capable of moving forward, left, right, and backward. It houses the main Octopus plushy and the movement system (using motors). A piezosensor on top of its head detects touch and classifies it as (1) no touch, (2) light pat, or (3) hit/pinch/squeeze. A photosensor senses the lighting conditions and classifies them as (1) low light (dark), (2) medium (normal) light, or (3) bright light. The emotional state of the robot is decided by the combination of inputs from these two sensors. Since each sensor has three levels, the Octopus has a total of 3 × 3 = 9 unique emotional states.
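The combination logic can be sketched as a tiny lookup: each sensor contributes one of three levels, and the pair maps to one of nine states. This is a minimal illustration with hypothetical names, not the actual code from my sketch:

```cpp
// Three touch levels from the piezosensor and three light levels from the
// photosensor (names are illustrative, not from the real sketch).
enum Touch { NO_TOUCH = 0, LIGHT_PAT = 1, HIT = 2 };
enum Light { DARK = 0, NORMAL_LIGHT = 1, BRIGHT = 2 };

// Each (touch, light) pair maps to a distinct state index 0..8,
// giving the 3 x 3 = 9 unique emotional states.
int emotionalState(Touch t, Light l) {
    return t * 3 + l;
}
```

Because the two levels are simply composed into one index, adding a third sensor later would just extend the multiplication.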

The project also has a p5 interface that allows the user to control the Mr Octopus Robot. The user can also play audio through the p5 interface depending on the environment that the robot is in. The p5 interface receives the emotional state from the robot and displays a sketch / plays audio accordingly.

The bidirectional communication between the p5 interface and the Octopus Robot is enabled by 2 Arduinos and 2 XBee modules. The overall project is a unique and fun-to-interact-with robot that simulates emotions, making it an exciting project for anyone interested in interactive robots with emotions.

Implementation

Hardware

Construction

I wanted to build my robot using 2 layers of wood. The bottom layer of wood holds the Arduino for the robot, the motors, the power supply, and a breadboard with the motor drivers.

The top layer of wood holds the Octopus plushy, the piezosensor, the photosensor, a Bluetooth speaker, and a breadboard. The breadboard is connected to the main Arduino Uno through jumper wires that pass through holes drilled into the top layer.

The whole structure – that is, the two layers – is held together by means of velcro and folded cardboard on top of the motors.

I used the method of testing each small part of the project separately and perfecting it before integrating everything together.

Below, I have detailed the steps involved in the construction of the robot:

Choosing Chassis material

I used thin plywood that I was able to get from the scene shop. I drew an outline of my Octopus plushy on the plywood to get a sense of scale, leaving extra space for the breadboard and any additional components I might need to add later. Then I cut two rectangles of equal size using a handsaw (each one would form a layer of the chassis). The scale of the framework was much bigger than the usual size of a remote-controlled robot because I needed space for the Octopus plushy on top.

Making the Lower level

I unit-tested the motors first. Since I needed four wheels, I connected two motors in parallel to each side of the motor driver. Then I wrote test code for movement and taped the motors temporarily in their correct places to see if they worked properly.
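Because the two motors on each side are wired in parallel, the movement logic only ever distinguishes a left channel and a right channel. A rough sketch of that direction logic (hypothetical names and key codes, not my actual test code):

```cpp
// Per-channel command for the motor driver: +1 forward, -1 reverse, 0 stop.
struct Drive {
    int left;
    int right;
};

// Map a movement command to the two channels. Turning is done
// skid-steer style: the two sides spin in opposite directions.
Drive driveFor(char key) {
    switch (key) {
        case 'F': return { +1, +1 };  // forward
        case 'B': return { -1, -1 };  // backward
        case 'L': return { -1, +1 };  // rotate left
        case 'R': return { +1, -1 };  // rotate right
        default:  return {  0,  0 };  // stop
    }
}
```

On the real robot, each channel's sign would be translated into the IN1/IN2 pin states of the motor driver.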

Connections

 

Taped Motors

Testing the movement

Once everything worked, I glued the motors to the lower level using a glue gun. However, after some time I noticed that the hot glue didn’t hold well on some areas of the wood and the motors would easily come off. So, I used super glue to attach two of my 4 motors to the lower level.

Making the Upper Level

The upper level would just contain the plushy and a breadboard for the photosensor and piezosensor connections (the photosensor would have to be at the top to detect light).

Since the Arduino Uno was on the lower level, I needed some way of connecting the wires on the upper level with the lower level. I measured the diameter of the jumper wires and drilled 6 holes into the upper layer, above the place where the Arduino sat on the lower level, to allow the wires to pass through. The diameter of the holes was just smaller than the jumper wires so that the ends would be held in place: the wire bodies didn’t pass through, just the metal ends did.

Holes for the wires Demonstration of how wires from lower layer were eventually connected

There was another issue. The Arduino in the lower layer would be taller than usual because of the XBee shield needed for wireless communication. There also had to be enough space between the two layers for the wires to connect. This meant that I could not just stick the top layer directly on top of the motors; rather, I needed some sort of padding in between to increase the height between the top and bottom layers.

For this purpose, I cut strips of cardboard, folded them, and stuck each fold with a glue gun to use as padding. One end of the cardboard padding attaches to the motor via velcro and the other end attaches to the top layer via velcro. Below is an image showing this: Velcro on top layer

How the layers are joined and the amount of spacing between them

The use of velcro was vital as this allowed me to take the top layer off whenever I wanted.

Testing Sensors

Once the motors were working, it was time to test the sensors. For this, I used a separate breadboard and Arduino and wrote some test code to read values and print them to the Serial Monitor, to check whether the sensors were working correctly and to test their range as well as the circuit connections.

Testing sensors separately

The above image shows me testing a sound sensor and a photosensor. Although I spent a lot of time testing and figuring out how to use the sound sensor, I decided not to include it in my final project, as it was very unreliable and picked up small vibrations as sound.

Eventually, I decided to use a piezosensor for sensing pressure on the plushy and referred to this circuit diagram – Piezosensor Circuit Diagram

They recommended a 1 MegaOhm resistor, but in my case I found that a 330 Ohm resistor worked well and gave decently spaced-out values. I had some confusion while referring to the above diagram, so I wired the piezosensor as a classic voltage divider circuit like the one we studied in class. I wrote test code to log the sensor values, to see how sensitive it was and what values it gave when pressed with various degrees of force.

NOTE (for anyone trying this in the future): the hardest part of using the piezosensor (surprisingly) was soldering a wire onto the outer ring. (Please try to find an already-soldered sensor.) What happens is that you have to wait for the whole sensor to get hot enough to melt the solder, and this takes a very long time. I successfully soldered one sensor, but the solder came off after some time. Then I tried another one, and the sensor broke when I applied excessive force. Eventually, everything worked fine with the third one, and I was able to use that in my final project.

Integrating Power

Since everything on the robot runs wirelessly, I needed to integrate power. Initially, I supplied just 6V to the Arduino on the robot, thinking it would be enough to power both the Arduino and the motors. However, it turned out that 6V was not enough to run the whole thing. So, I used a separate 6V supply for the motors by powering the Vm pin of the motor driver with it. Even then, the motors were not running at full speed. I checked the documentation for the motors and was surprised to learn that EACH MOTOR required 4.5 Volts to run properly, which put the total requirement at 18 Volts to run the entire system at full performance.

Motors that came with the kit

 

However, I was supplying only 6 Volts, a third of what was required. Despite this, at the maximum speed setting the motors would work and the whole robot would move at a decent pace. Also, since I did not want the robot to go too fast, this was fine as long as I ensured the batteries were replaced frequently.

Each 6V supply consisted of 4 AA batteries connected in series. The Arduino was powered through the barrel jack. One thing I noticed was that the motors consumed a lot of power, and I had to replace their batteries quite frequently. I realized that if the batteries drained even a little, the voltage they supplied would drop below 5 Volts and would not be enough to run the motors properly. I tried replacing the 6V supply with a 9V battery, but after consulting with Professor Shiloh, I learned that the internal resistance of a single 9V battery means it supplies LESS current than 4 AA batteries connected in series. So, I stuck with the 6V power supply.

Figuring out power

I arranged the cells in a battery connector from consumables .

Putting everything together (Final Integration)

This step involved attaching everything to the wooden frame in the appropriate place. For the lower level, I kept the breadboard in the center, the Arduino close to one of the edges (so that I could access its port and change code whenever I wanted), and the power supply close to the Arduino (so that I could easily plug and unplug it). For the upper layer, I arranged the plushy at the front of the board and kept the wireless speaker and breadboard at the back. To attach the breadboards, I just peeled the sticker on the back. The upper layer was easily joined using the velcro-with-cardboard padding described earlier.

I placed the plushy on top, soldered some wires, and put the piezosensor on top of it. The wires are held in place by tape.

Initial placement  Placement of piezo sensor and photosensor Final placement on top layer

Wireless Communication

I am using XBee modules for wireless communication. An XBee module uses radio communication. The special thing about these modules is that, once configured, they can send and receive data AT THE SAME TIME. The AT THE SAME TIME part is very important, as a standard radio module like the NRF24L01 is not capable of this: with such modules, you have to write code to receive and send data at different times. XBees save us from this hassle. Here is the link to the wireless kit by SparkFun – https://www.sparkfun.com/products/15936 (all components of this kit are available in the IM Lab booking system as of May 2024). The board along with the shield looks like this: Arduino Board with XBee Shield and XBee Transmitter

I had one Arduino connected to my laptop with an XBee module – say XBee1 – mounted on top with an XBee shield. I had another Arduino on the main robot with a second XBee module – say XBee2 – mounted on top with another XBee shield.

I downloaded XCTU on my PC and configured each XBee according to the instructions in the SparkFun tutorial, using the below table as reference. This configuration is important to have everything running.

What this does is basically allow the two XBees to communicate with one another via radio.
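For reference, the SparkFun tutorial’s pairing scheme looks roughly like the fragment below. These values are illustrative placeholders, not my actual settings (those were in the table above): both radios share a PAN ID, and each radio’s destination address is the other’s own address.

```text
Setting          XBee1   XBee2
ID (PAN ID)      1111    1111   <- same network ID on both radios
MY (own addr)    0       1
DL (dest addr)   1       0      <- each points at the other's MY
BD (baud rate)   9600    9600   <- must match the Arduino sketch
```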

My communication network is then as follows:

p5 <-> Arduino_for_comms <-> XBee1 <-> XBee2 <-> Arduino_main_robot

The communication is bidirectional at every step – a 5-step bidirectional chain.

XBee1 takes information from Arduino_for_comms and forwards it to XBee2. At the same time, it listens for data from XBee2.

Similarly, XBee2 takes information from Arduino_main_robot and forwards it to XBee1. At the same time, it listens for data from XBee1.

I included the <SoftwareSerial.h> library for interfacing with the XBees. Initially, I tested their communication and configuration by sending messages through XCTU, which worked quite well. XCTU gave me a way to debug by showing which messages were sent and received by each XBee.

Using XCTU to debug

 

Software

Github Link – Github_ Mr Octopus

The software consists of the p5 sketch and two sketches, one for each of the Arduinos. The Arduino connected to the computer is called Arduino_for_comms and the one on the main robot is called Arduino_main_robot.

Arduino_for_comms reads data from p5 and forwards it to its XBee module. The XBee module on Arduino_main_robot reads this and forwards it to the Arduino on the main robot. At the same time, Arduino_for_comms reads data from its XBee module and forwards it to the p5 sketch.

Arduino_main_robot reads data from its XBee module and carries out movement actions accordingly. At the same time, it also sends data to the XBee module, which forwards it to Arduino_for_comms.

One key thing to note is that Arduino-to-p5 communication and Arduino-to-sensor communication rely on integers, whereas XBee-to-XBee communication relies on characters. I thus needed an effective way to switch between these two data types. Neglecting this initially caused a lot of complications that are too lengthy to explain here and took a lot of time to debug, but it all boils down to using the right data type.
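The simplest shape such a conversion can take is to keep every radio message a single ASCII character, e.g. encoding a state index 0–8 as the digit '0'–'8' and converting back on the receiving side. A sketch with illustrative helper names (not the actual functions from my code):

```cpp
// Encode an integer state (0..8) as a single ASCII digit,
// so each XBee message is one character.
char encodeState(int state) {
    return static_cast<char>('0' + state);
}

// Decode the received character back into an integer on the other side.
int decodeState(char c) {
    return c - '0';
}
```

Round-tripping through these two functions is lossless for any state 0–8, which is exactly the property the 5-step chain needs.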

Software side for Arduino

As mentioned, there were two Arduinos and thus two Arduino codes. The link to both of the .ino files is here :

(1) Arduino_for_comms

Link to Code- Arduino_for_comms.ino

Summary of the Code:

Listens for data from the p5 sketch and sends anything it hears to the XBee module, which then forwards it to the XBee on Arduino_main_robot. At the same time, it listens for data from the XBee module mounted on top and sends it out to the p5 sketch.

(2) Arduino_main_robot

Link to code  – Arduino_Main_Robot.ino

Summary of the Code:

Listens for data from the sensors, combines the readings into a single character, and sends it to the XBee module, which then forwards it to the XBee on Arduino_for_comms. At the same time, it listens for data from the XBee module mounted on top and moves the motors accordingly.

In addition, I also used several Arduino sketches for unit-testing the motors, sensors, and communication. Here is a link to these sketches:

Test Code for Mr Octopus

Software Side for p5

Here is  the p5 sketch:

To check and test it: run it in Chrome and set up the serial connection. You can change Val1 in the code to switch the sketch, press Option to play the introduction sound when the sketch is happy, and press Shift to play a sound according to the emotional state.

Description of emotional states of the robot

The p5 sketch changes according to the environment or emotional state of the robot. The robot has the following emotional states:

  1. Peaceful (Normal light , no touch)
  2. Vibing in the light (High light, no touch)
  3. Sadness in the dark (Low light, no touch)
  4. Loved normal (Normal light, Gentle touch)
  5. Loved Bright (Bright light, Gentle touch)
  6. Loved Dark (Low light, Gentle touch)
  7. Hurt normal (Normal light, Hit/pinch)
  8. Hurt dark (Low light, Hit/pinch)
  9. Hurt Bright (Bright light, Hit/pinch)

These states are determined by the following readings from the piezosensor and photosensor (I tweaked them a little to adjust for the lighting in the Arts Center).

For Piezosensor :

  • >= 70 is counted as a hit
  • >20 and <70 is counted as a pat
  • <20 is counted as no touch

For Photosensor :

  • >900 is counted as bright
  • <900 and >550 is counted as normal light
  • <550 is counted as dark.
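The thresholds above can be written as two small classifier functions. This is only a sketch: the boundary handling below is my reading of the ranges, and the real sketch may treat the exact boundary values differently.

```cpp
// Classify a raw piezo reading:
// 0 = no touch, 1 = light pat, 2 = hit/pinch/squeeze
int touchClass(int piezoReading) {
    if (piezoReading >= 70) return 2;  // hit
    if (piezoReading > 20)  return 1;  // pat
    return 0;                          // no touch
}

// Classify a raw photosensor reading:
// 0 = dark, 1 = normal light, 2 = bright
int lightClass(int photoReading) {
    if (photoReading > 900) return 2;  // bright
    if (photoReading > 550) return 1;  // normal light
    return 0;                          // dark
}
```

Keeping the thresholds in one place like this also makes showcase-day recalibration a two-constant change.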

 

Design elements

Visuals

I have a different image for each of the emotional states. The images are listed below :

Octopus hit in dark
Octopus hit in normal light
Octopus hit in bright light
Octopus normal in bright light
Octopus loved in dark
Octopus loved in bright light
Octopus loved in normal light
Octopus peaceful – normal light and no touch
Octopus sad in darkness – no light and no touch

Every visual above was generated using a mixture of Snapchat filters and the LUMI app. Some features were drawn with the S Pen on my mobile phone. Snapchat filters were very useful throughout the visual-generation process.

Audio

I used the following audio files in my project :

  1. Octopus Trimmed.mp3 – which is an audio that introduces the Octopus, it is a snippet of the following YouTube video : PinkFog Octopus hello
  2. Peace_Trimmed.mp3 – This is the peaceful sound effect that I adapted from Kung Fu Panda.
  3. Sad_Piano.mp3 – Sad sound sourced from Pixabay
  4. Loved_music.mp3 – Calm soothing sound effect from Pixabay
  5. Vibe_music.mp3 – Groovy sound effect from Pixabay.

The sounds are played according to the emotional state of the Octopus. The introduction sound can be played by pressing the Option key when the Octopus is in an emotional state that is NOT SAD.

Fonts

I have used the following Fonts in my project:

  1. ArtFully Regular.ttf
  2. Hartford.ttf
  3. NoteToSelf – Regular.ttf
  4. NoteWorthy Bold.ttf
  5. Sportfield Varsity Stacked.ttf

Summary

Here’s a summary of all assets I have used (code snippet from p5):

Asset Summary

User Testing

Users reported that the sideways movement was a bit slow. I suspect this is due to the 6V supplied to the motors instead of the roughly 18V they would need at full performance (each motor wants around 4.5V, but all four share a single 6V supply). This is fine, as I do not want the robot to move too fast.

Initially, I was using a click mechanism for playing sounds, but a user said that using the Option and Shift keys would be much better, so I decided to support these keys in addition to the click mechanism. This is also reflected in the instruction sheet.

I don’t have any specific videos of user testing, but I asked for advice from lab assistants and other people working at the IM Lab. Here’s a video of a user testing the movement and the introduction interaction.

I realized the movement should be faster; the reason it wasn’t was that the batteries had drained and were supplying less voltage than usual. So I just took the top off (easy, since it is attached by velcro) and replaced the batteries.

IM Showcase

The showcase went amazingly!! People gave really good reviews, and a lot of them enjoyed playing with the robot and interacting with it. At times, they would drive it over to random people and say hello. Several people said that the project was technically impressive and cute.

A lot of people took photos with and of my project. There is a slight issue though – I barely took any videos 🙁 – However, I do have some recordings of people trying it out, attached below:

I felt very happy watching people interact with it, and I really enjoyed watching them get surprised by the different interactions, such as the introduction sound, or pressing hard on the piezosensor (on top of the Octopus), which triggers the hurt sound and hurt image.

Potential Future Developments

This project could be potentially improved by :

  • Making a stronger frame or chassis that would enable the whole structure to move faster without risk of damage.
  • Integrating more sensors, such as a sound sensor and a human-presence sensor.
  • Playing audio through a Music Maker shield rather than a Bluetooth speaker.
  • An easier braking system – right now, because of the way the code and character conversion work, the only way to brake is to press the down key followed by either the left or right arrow. With a little more code editing and the correct power for the motors, I could fix this and change the brake key binding to something simpler, such as the spacebar.
  • A system for autonomous movement.

 

Challenges and Things learned

Heads up to anyone making something similar in the future – you have chosen an ambitious project!! The following are some challenges I faced (the ones I can recall) and how I fixed them. Hopefully, this can be of some help to you –

  1. Integrating power – The DAGU motors in the SparkFun kit need 4.5 V EACH for optimal running. This sounds like a lot, and it is, but be mindful that with anything less the motors may not function properly or may turn slower than expected. This is the issue I faced when my motors were not working correctly.
  2. Wireless communication – use XBees. People have used NRF24L01 modules previously because they are cheaper and smaller, but if XBees are available in the booking system, use them. The difference is that XBees can SEND and RECEIVE data AT THE SAME TIME. Check out the SparkFun tutorial on setting them up and you should be fine. This is not possible with the NRF modules, and it is a hassle to achieve bidirectional wireless communication with them (people have done it in the past, but it’s more difficult than just using XBees). NOTE: the XBee shield uses pins 2 and 3 for RX-TX on the Arduino Uno, so do NOT connect anything to these pins. I missed that and spent a lot of time debugging.
  3. Use VELCRO: lifesaver!! I could dismantle my whole project to replace batteries, upload code to the Arduino, or rewire connections because I had connected the layers with velcro. Velcro is super, super useful.
  4. Soldering on a piezosensor – very difficult!! Try to use a sensor that already has wires soldered on. If not, check the construction section of this documentation. I faced a lot of difficulty soldering them; some are very sensitive and break if you apply too much pressure.
  5. Playing audio in p5 – always set the play mode to ‘restart’ if you are calling play() in a loop. You can use the setVolume() function to adjust the audio in your sketch (you have to include the p5.sound library, but it is very useful).
  6. Make room for recalibration – I ended up gluing my Arduino in a hard-to-reach place on the lower level. This was a serious issue, as I faced difficulty trying to reprogram it. Eventually, I was able to somehow sneak the connector in. If you are using light or infrared sensors, you will HAVE TO RECALIBRATE them while setting up, as the lighting during the showcase is different from the lighting in the IM Lab. Be mindful of this and make sure you can recalibrate easily.
  7. If using XBees, be mindful of the data types they can send and receive, and use XCTU for debugging. I spent a lot of time debugging because I used the wrong data type.
  8. Try to reuse the starter code given by the professor and adapt it accordingly – it’s way easier than writing from scratch, which is what I tried doing initially.

Reflections (Proud Of !!)

When I selected this topic, I knew it was a fun, challenging project to work on. Looking at past documentation, I realized that very few students had implemented bidirectional wireless communication before, and that it is generally difficult to implement. I spent several days configuring my XBees and setting up power for my project correctly. Then, I spent several hours figuring out how to convert between the appropriate data types for the 5-step communication chain. At one point, I thought I wouldn’t be able to finish on time.

Despite that, I was not only able to set up bidirectional wireless communication, but also to create a design for the p5 sketch that I am really proud of. In the end, the project turned out better than my expectations, and the positive reviews and appreciation from professors and students at the IM Showcase made me very happy!!

There were lots of things I had to learn on my own for this project – from setting up XBees to integrating power, soldering wires the right way, testing several sensors, making a chassis, and using a piezosensor. It was a great experience. In the end, I was able to deliver on the high expectations I had for myself for the final project, and I am very proud of that.

 

Special Thanks to ……

I would like to thank the following people. This project wouldn’t have been possible without their help, support, and guidance –

  • Professor Aya Riad for teaching the course, following through with my project, encouraging me to make innovative projects, and helping me with ideas.
  • Professor Michael Shiloh for help with debugging and testing the motors, plus help with soldering the piezosensors.
  • Stefania and Ume for their help with using IM equipment and their support.
  • All the Lab Assistants – Basil, Khadijah, Ramsha, Raya, Aadhar, Moeez, Dania, Arslan, and Aya – for helping and assisting me in my project as well as dealing with all of my check-ins and check-outs.
  • Sanansh Garg for allowing me to kidnap his Octopus plushy and for user testing.
  • Swostik Pati and Sri Pranav Srivatsavai for guidance on how to set up bidirectional communication, for their amazing documentation, and for starting the joke of putting a Jigglypuff on top of a car.
  • Nikhil Mundra for the mini JBL speaker that made wireless audio possible.
  • All of my classmates across all sections, especially mine.
  • Everyone who came to the IM Showcase.
  • Everyone else who helped me, provided support, and kept me company. It was a pleasure working with you all!!

 

Design meets Disability | Creative Response | Week 12

This week’s reading talks about how design is important, and sets trends, even in the medical field. The compare-and-contrast between different approaches to design in the cases of eyewear, prosthetics, hearing aids, etc. was very interesting to me. I thought – is it just marketing that causes these differences, or is it something at a much deeper level? The heading “good design on any terms” compelled me to think about why this phrase was worth a thought. The example of Charles and Ray Eames making a leg splint that was well ‘designed’ illustrated this concept.

The discussion on fashion versus discretion was intriguing too. The author shows how these two are not necessarily mutually exclusive, as in the case of eyewear. He talks about a balance between simplicity and overly complex/colorful/designed things. One good example he gave that stuck in my mind is that of AirPods, along with the quote “if I had more time, I would have written a shorter letter”, which means that just because something is simple doesn’t mean a lot of thought wasn’t required to make it. In fact, genius can be found in simplicity.

I hope to embrace these concepts in my final project and design something user-friendly and simple .

Assignment | Week 12

 EXERCISE 01: ARDUINO TO P5 COMMUNICATION

The task was to make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, with nothing on Arduino controlled by p5.

We utilized a simple set-up consisting of a potentiometer. We mapped its values to the x-position of the ellipse on p5. The ellipse moves across the x-axis as the potentiometer is turned.

Arduino Code:
void setup() {
  Serial.begin(9600); // Initialize serial communication at 9600 baud rate
}

void loop() {
  int sensorValue = analogRead(A0); // Read the value from the potentiometer
  Serial.println(sensorValue);      // Send the value to the serial port followed by a newline character
  delay(50);                        // Delay to prevent overwhelming the serial buffer
}
P5 Sketch:
let rVal = 0;
let alpha = 255;


function setup() {
  createCanvas(640, 480);
  textSize(18);
}

function draw() {

  background(255);
  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    // Print the current values
    text('Potentiometer Value = ' + str(rVal), 20, 50);
    //text('alpha = ' + str(alpha), 20, 70);
  }
  let xpos = map(rVal, 0, 1023, 0, width);  // Map the sensor value to the canvas width
  ellipse(xpos, height / 2, 50, 50);  // Draw an ellipse at the mapped position
  
}

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
   
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 1) {
      
      rVal = int(fromArduino[0]);
      
    }

   
  }
}

EXERCISE 02: P5 TO ARDUINO COMMUNICATION

Make something that controls the LED brightness from p5.

We used a slider in p5 and connected the led to a PWM pin. The slider controls the brightness level of the LED.

Arduino Code:
//Arduino Code

// Week 11.2 Example of bidirectional serial communication

// Inputs:
// - A0 - sensor connected as voltage divider (e.g. potentiometer or light sensor)
// - A1 - sensor connected as voltage divider 
//
// Outputs:
// - 10 - LED (PWM, brightness controlled by the p5 slider)
// - 5 - LED

int leftLedPin = 10;
int rightLedPin = 5;

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  // We can't use the serial monitor since the serial connection is
  // used to communicate to p5js and only one application on the computer
  // can use a serial port at once.
  pinMode(LED_BUILTIN, OUTPUT);

  // Outputs on these pins
  pinMode(leftLedPin, OUTPUT);
  pinMode(rightLedPin, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(leftLedPin, HIGH);
  digitalWrite(rightLedPin, HIGH);
  delay(200);
  digitalWrite(leftLedPin, LOW);
  digitalWrite(rightLedPin, LOW);



  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int left = Serial.parseInt();
    int right = Serial.parseInt();
    if (Serial.read() == '\n') {
      analogWrite(leftLedPin, left);
      digitalWrite(rightLedPin, right);
      int sensor = analogRead(A0);
      delay(5);
      int sensor2 = analogRead(A1);
      delay(5);
      Serial.print(sensor);
      Serial.print(',');
      Serial.println(sensor2);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}
P5 Sketch:

Code:
let rVal = 0;
let alpha = 255;
let left = 0; // True (1) if mouse is being clicked on left side of screen
let right = 0; // True (1) if mouse is being clicked on right side of screen
let ledSlider; // slider that sets the LED brightness

function setup() {
  createCanvas(640, 480);
  textSize(18);
  ledSlider = createSlider(0, 255, 0);
  ledSlider.position(10, 40);
  ledSlider.style('width', '200px');
}

function draw() {
  // one value from Arduino controls the background's red color
  //background(map(rVal, 0, 1023, 0, 255), 255, 200);
  background('white');

  // the other value controls the text's transparency value
  fill('black');

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
    // Print the current values
    //text('rVal = ' + str(rVal), 20, 50);
    //text('alpha = ' + str(alpha), 20, 70);
  }

  left = ledSlider.value();
  console.log(left);
  right = 0; // the right LED is not used in this exercise

}

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 2) {
      // only store values here
      // do everything with those values in the main draw loop
      
      // We take the string we get from Arduino and explicitly
      // convert it to a number by using int()
      // e.g. "103" becomes 103
      rVal = int(fromArduino[0]);
      alpha = int(fromArduino[1]);
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = left + "," + right + "\n";
    writeSerial(sendToArduino);
  }
}



EXERCISE 03: BI-DIRECTIONAL COMMUNICATION

Take the gravity wind example and make it so that every time the ball bounces, one LED lights up and then turns off, and you can control the wind from one analog sensor.

Arduino Code:
//Arduino Code

// Week 11.2 Example of bidirectional serial communication

// Inputs:
// - A0 - sensor connected as voltage divider (e.g. potentiometer or light sensor)
// - A1 - sensor connected as voltage divider 
//
// Outputs:
// - 10 - LED
// - 5 - LED

int leftLedPin = 10;
int rightLedPin = 5;

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  // We can't use the serial monitor since the serial connection is
  // used to communicate to p5js and only one application on the computer
  // can use a serial port at once.
  pinMode(LED_BUILTIN, OUTPUT);

  // Outputs on these pins
  pinMode(leftLedPin, OUTPUT);
  pinMode(rightLedPin, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(leftLedPin, HIGH);
  digitalWrite(rightLedPin, HIGH);
  delay(200);
  digitalWrite(leftLedPin, LOW);
  digitalWrite(rightLedPin, LOW);



  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    //int left = Serial.parseInt();
    int right = Serial.parseInt();
    int left = abs(right-1);
    if (Serial.read() == '\n') {
      digitalWrite(leftLedPin,left);
      digitalWrite(rightLedPin, right);
      int sensor = analogRead(A0);
      delay(5);
      int sensor2 = analogRead(A1);
      delay(5);
      Serial.println(sensor);
      //Serial.print(',');
      //Serial.println(sensor2);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}
p5 sketch:

Please click on the sketch and open it in Chrome to view:
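Since the full sketch is linked rather than embedded, here is a minimal sketch of the p5-side logic, with illustrative names (`sensorToWind`, `bounceFlag`) that are assumptions rather than the exact ones used in the sketch:

```javascript
// Map the Arduino's analog reading (0-1023) to a wind force in [-1, 1].
// A centered potentiometer gives roughly zero wind.
function sensorToWind(sensorValue) {
  return (sensorValue / 1023) * 2 - 1;
}

// Return 1 on a frame where the ball is at or below the floor while still
// moving downward (i.e., the frame it bounces), otherwise 0. This value is
// written back to the Arduino, which lights the LED for that instant.
function bounceFlag(ballBottom, floorY, velocityY) {
  return ballBottom >= floorY && velocityY > 0 ? 1 : 0;
}
```

In `readSerial()`, the incoming sensor value would be passed through `sensorToWind()` and applied as the wind vector, and the current `bounceFlag()` value would be sent back on each line, mirroring the handshake used in the exercises above.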

Video Demonstration:

 

Reflections

This assignment taught us bidirectional communication and was a useful stepping stone toward the code for the final project. We had some issues in Exercise 3 with communication between p5 and Arduino, but we were eventually able to solve them.

Overall, there was a lot to learn, and it was good practice for coding the final project.

 

Final Project Idea | Moving Jigglypuff ?

Concept

This started off as a joke. A friend won a Jigglypuff at a fair, and we joked about how amazing it would be to attach wheels and a speaker to it, and that I should do it for my IM final project.

So, I have decided I will do it for my IM Final Project.

I want to build a moving robot with a Jigglypuff (a Pokémon) plushy on top. It uses a gesture sensor to detect gestures and respond accordingly. The robot can be controlled through the laptop via p5. A speaker will be used to produce sounds depending on the ‘mood’ the Jigglypuff is currently in.

 

Some sensors that I plan on experimenting with and seeing what works :

  1. Infrared distance sensors (to check for collisions)
  2. Gesture sensors
  3. Speakers apart from the buzzer
  4. Some kind of camera module (or Go Pro that will stream video to P5)
  5. Maybe a makey makey that would allow sound response to certain touches. (such as if you pat the head of JigglyPuff)
  6. Maybe a mini-projector to project something on command

Initial Things to do

  1. Think of a container for the Jigglypuff. What kind of container will look good? A large 3-D printed bowl? A decorated wooden cart? The whole module needs a reliable design, and I would like to avoid sewing things into the plush. Since it is a rounded plushy, placing components on top of it is not very practical.
  2. Code for the movement of the container.
  3. Check out the gesture sensor and code the if-else statements for it.
  4. Attach a camera module to transfer input to p5.
  5. Get different sound clips from the Pokémon series for the different emotions of Jigglypuff, to be played for different gestures.
  6. Use PoseNet to detect poses?
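For the movement control, a simple starting point in p5 would be mapping arrow keys to one-letter drive commands sent over serial; the command letters and key names below are hypothetical placeholders, not a finalized protocol:

```javascript
// Hypothetical mapping from arrow keys to one-letter drive commands that
// would be written to the robot over serial. The letters are placeholders.
const KEY_COMMANDS = {
  ArrowUp: "F",    // forward
  ArrowDown: "B",  // backward
  ArrowLeft: "L",  // turn left
  ArrowRight: "R", // turn right
};

// Return the drive command for a pressed key, or "S" (stop) for any other key.
function keyToCommand(keyName) {
  return KEY_COMMANDS[keyName] || "S";
}
```

In p5, `keyPressed()` could then send `keyToCommand(key) + "\n"` through the serial connection, assuming the browser's `event.key` names for the arrow keys.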

Expected challenges

  1. Designing the container
  2. Connecting video feed to P5
  3. Coding for movement in P5
  4. May need multiple microcontrollers for the video feed if connecting to a GoPro does not work
  5. Finding the sound clips and a speaker that can play them

 

 

Week 11 | Creative Reading response

“A Brief Rant on the Future of Interaction Design” by Bret Victor is a rant about the lack of innovative vision for future technologies. Watching the video at the beginning, I first thought it depicted a great future. However, as I read the article, I increasingly came to the realization that the future shown in the video could be far better and more visionary. Victor’s criticism of interfaces that use only the fingertips got me thinking: if we can do so many things with just a finger, how many possibilities could be unlocked if we used our entire hands? He talks about this in detail and heavily criticizes visions of the future that focus on just screens. This made me realize that I should look to make something that involves interaction from the entire hand and as much touch interaction as possible, using tactile interfaces such as buttons.

Although Victor does not offer a solution to his rant, his way of thinking inspires me. He states the problem very clearly, and I agree with him. I would love to work on something that uses tactile input from more than just a fingertip, both in my future projects and in my final project for this course.

Week 11 | Paper piano

Concept

While looking for ideas for the musical instrument, I came across something called HID (Human Interface Devices): devices used to interact with the computer, such as a keyboard or a mouse. Unfortunately, the Arduino UNO board that we use is not HID compatible. Why did we need this? Because we wanted to do something unique and use something unconventional for the digital-input part of the assignment. This is when we came across the Makey Makey board by Sparkfun. With this board, anything that is even weakly conductive can act as a button that sends an input to the computer. We can then use the computer as an interface between the Makey Makey and the Arduino, feeding the Makey Makey's output into the Arduino via the serial connection.

Once we had this figured out, it opened up great possibilities. For this assignment, we decided to draw a piano on a piece of paper and use each drawn key as a button, so that a sound plays whenever you touch the paper.

Technical Plan (Abstract overview)

For Arduino UNO
  1. Takes input from the serial and reads it
  2. Converts the letter input to a corresponding frequency (the digital input)
  3. Offsets the frequency by reading the value of the potentiometer (the analog input)
  4. Uses the tone() function to play this frequency on the buzzer
For Makey Makey
  1. Takes input from the paper buttons and sends a keystroke to the computer corresponding to the button pressed.
  2. Presses Enter automatically so that the keystroke is entered into the serial monitor of the Arduino board.

Other Technicalities 

  • We need to make sure that the piano sounds good, so we use notes from an actual scale. The default scale we are using here is the C major scale on the fourth octave, consisting of the notes C, D, E, F, G, A, and B.
  • While drawing the piano, we need to make sure that the keys are separated and that each key conducts electricity on its own.
  • Use the alligator clips properly and make sure the connections work.
  • Since the Makey Makey's default firmware can only output six letters (w, a, s, d, f, g), we need to reconfigure it to output additional characters. Here, we have reprogrammed it to output ‘h’ instead of the Up arrow, and we use ‘ ’ (blank space) as an input as well.
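As a side note, the note constants in pitches.h follow equal temperament, so they can be sanity-checked against the formula f = 440 · 2^((n − 69)/12), where n is the MIDI note number and A4 = 69. This small helper is an aside for checking values, not part of the project code:

```javascript
// Equal-temperament frequency (Hz) for a MIDI note number, tuning A4
// (MIDI 69) to 440 Hz. For example, MIDI 72 is C5, which rounds to the
// 523 used for NOTE_C5 in pitches.h.
function midiToFreq(n) {
  return 440 * Math.pow(2, (n - 69) / 12);
}
```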

To use 

  • Connect both Arduino UNO and Makey Makey to the computer
  • Upload Code on Makey Makey
  • Open new Window of Arduino IDE and upload code on Arduino UNO
  • Open the Serial monitor and click on it to place the computer cursor there.
  • Start taking input from Makey Makey

Code

Code for the Arduino UNO board :

#include "pitches.h"
#include <Keyboard.h>

const int BUZZER_PIN = 8;          // Pin for the buzzer
const int potentiometerPin = A0;   // Pin for the potentiometer
const int duration = 200;          // Duration of the tone

unsigned long lastPotReadTime = 0; // Variable to store the last time potentiometer was read
int offset = 0; // frequency offset from the potentiometer reading

void setup() {
  Serial.begin(9600);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  // Read the potentiometer value only once every second
  unsigned long currentTime = millis();
  if (currentTime - lastPotReadTime >= 1000) {
    offset = analogRead(potentiometerPin) / 3; // Offset the frequency based on potentiometer reading
    lastPotReadTime = currentTime;
    
    // Print the potentiometer value
    Serial.println(offset);
  }

  // Check if there's any character available in the serial buffer
  if (Serial.available() > 0) {
    char receivedChar = Serial.read();
    int frequency = 0; // Frequency value for the tone

    // Map received character to frequency
    switch (receivedChar) {
      case 'w': frequency = NOTE_C5; break;
      case 'a': frequency = NOTE_D5; break;
      case 's': frequency = NOTE_E5; break;
      case 'd': frequency = NOTE_F5; break;
      case 'f': frequency = NOTE_G5; break;
      case 'g': frequency = NOTE_A5; break;
      case 'h': frequency = NOTE_C6; break;
      case ' ': frequency = NOTE_B5; break;
      default: break; // Do nothing if character is not recognized
    }

    if (frequency != 0) {
      tone(BUZZER_PIN, frequency + offset, duration); // Play the tone
      delay(duration); // Wait for the tone to complete
      noTone(BUZZER_PIN); // Stop the tone
    }
  }
}
//CCDCFE
//CCDCGF
// for happy Birthday first two lines

Relevant Code for the Makey Makey board

  1. To press enter (write newline) along with the character
void updateInputStates() {
  inputChanged = false;
  for (int i=0; i<NUM_INPUTS; i++) {
    inputs[i].prevPressed = inputs[i].pressed; // store previous pressed state (only used for mouse buttons)
    if (inputs[i].pressed) {
      if (inputs[i].bufferSum < releaseThreshold) {  
        inputChanged = true;
        inputs[i].pressed = false;
        if (inputs[i].isKey) {
          Keyboard.release(inputs[i].keyCode);
        }
        if (inputs[i].isMouseMotion) {  
          mouseHoldCount[i] = 0;  // input becomes released, reset mouse hold
        }
      }
      else if (inputs[i].isMouseMotion) {  
        mouseHoldCount[i]++; // input remains pressed, increment mouse hold
      }
    } 
    else if (!inputs[i].pressed) {
      if (inputs[i].bufferSum > pressThreshold) {  // input becomes pressed
        inputChanged = true;
        inputs[i].pressed = true; 
        if (inputs[i].isKey) {
          Keyboard.press(inputs[i].keyCode);
          // Print the key code before pressing Enter
          Keyboard.write('\n');
        }
      }
    }
  }
#ifdef DEBUG3
  if (inputChanged) {
    Serial.println("change");
  }
#endif
}

2. To configure the outputs of the Makey Makey

#include "Arduino.h"

/*
/////////////////////////////////////////////////////////////////////////
// KEY MAPPINGS: WHICH KEY MAPS TO WHICH PIN ON THE MAKEY MAKEY BOARD? //
/////////////////////////////////////////////////////////////////////////
  
  - edit the keyCodes array below to change the keys sent by the MaKey MaKey for each input
  - the comments tell you which input sends that key (for example, by default 'w' is sent by pin D5)
  - change the keys by replacing them. for example, you can replace 'w' with any other individual letter,
    number, or symbol on your keyboard
  - you can also use codes for other keys such as modifier and function keys (see the
    the list of additional key codes at the bottom of this file)

*/

int keyCodes[NUM_INPUTS] = {
  // top side of the makey makey board
 
  KEY_UP_ARROW,      // up arrow pad
  KEY_DOWN_ARROW,    // down arrow pad
  KEY_LEFT_ARROW,    // left arrow pad
  KEY_RIGHT_ARROW,   // right arrow pad
  ' ',               // space button pad
  MOUSE_LEFT,        // click button pad
  
  // female header on the back left side
  
  'w',                // pin D5
  'a',                // pin D4
  's',                // pin D3
  'd',                // pin D2
  'f',                // pin D1
  'g',                // pin D0
  
  // female header on the back right side
  
  'h',      // pin A5
  MOUSE_MOVE_DOWN,    // pin A4
  MOUSE_MOVE_LEFT,    // pin A3
  MOUSE_MOVE_RIGHT,   // pin A2
  MOUSE_LEFT,         // pin A1
  MOUSE_RIGHT         // pin A0
};

///////////////////////////
// NOISE CANCELLATION /////
///////////////////////////
#define SWITCH_THRESHOLD_OFFSET_PERC  5    // number between 1 and 49
                                           // larger value protects better against noise oscillations, but makes it harder to press and release
                                           // recommended values are between 2 and 20
                                           // default value is 5

#define SWITCH_THRESHOLD_CENTER_BIAS 55   // number between 1 and 99
                                          // larger value makes it easier to "release" keys, but harder to "press"
                                          // smaller value makes it easier to "press" keys, but harder to "release"
                                          // recommended values are between 30 and 70
                                          // 50 is "middle" 2.5 volt center
                                          // default value is 55
                                          // 100 = 5V (never use this high)
                                          // 0 = 0 V (never use this low)
                                          

/////////////////////////
// MOUSE MOTION /////////
/////////////////////////
#define MOUSE_MOTION_UPDATE_INTERVAL  35   // how many loops to wait between 
                                           // sending mouse motion updates
                                           
#define PIXELS_PER_MOUSE_STEP         4     // a larger number will make the mouse
                                           // move faster

#define MOUSE_RAMP_SCALE              150  // Scaling factor for mouse movement ramping
                                           // Lower = more sensitive mouse movement
                                           // Higher = slower ramping of speed
                                           // 0 = Ramping off
                                            
#define MOUSE_MAX_PIXELS              10   // Max pixels per step for mouse movement

/*

///////////////////////////
// ADDITIONAL KEY CODES ///
///////////////////////////

- you can use these codes in the keyCodes array above
- to get modifier keys, function keys, etc 

KEY_LEFT_CTRL
KEY_LEFT_SHIFT		
KEY_LEFT_ALT		
KEY_LEFT_GUI		
KEY_RIGHT_CTRL		
KEY_RIGHT_SHIFT		
KEY_RIGHT_ALT	
KEY_RIGHT_GUI		

KEY_BACKSPACE		
KEY_TAB				
KEY_RETURN			
KEY_ESC				
KEY_INSERT			
KEY_DELETE			
KEY_PAGE_UP			
KEY_PAGE_DOWN		
KEY_HOME
KEY_END				
KEY_CAPS_LOCK	
    
KEY_F1				
KEY_F2				
KEY_F3				
KEY_F4				
KEY_F5				
KEY_F6				
KEY_F7				
KEY_F8				
KEY_F9				
KEY_F10
KEY_F11				
KEY_F12			

*/	
                                           
                                           

 

Connections and Schematics

 

Images and Video

 

Challenges

  • Could not use direct serial communication between the Arduino and the Makey Makey because the Makey Makey has only 2 digital output pins. The workaround was using the PC as an interface.
  • Had to find the right notes so that the piano would sound good. After a bit of experimentation, the fourth octave of the C scale sounded very good, so we decided to go with it.
  • Had to find a correct value for the potentiometer offset, which we found through some experimentation.
  • Had to find a way to enter the output of the Makey Makey into the serial of the Arduino UNO for communication. Used the laptop keyboard as an interface for this.
  • Had trouble using analogRead() and tone() together, so we used the BlinkWithoutDelay pattern with the millis() function to take the analogRead() input only once every second.

Reflections and Further Improvements

This could be improved by :

  • Adding more notes
  • A better range
  • Improving the aesthetics
  • Finding a way to ground without using one hand, so that both hands are free
  • Working around the limitation that, since we are using tone(), we cannot press multiple keys at the same time

Overall, the project turned out very well. Exploring a different board and using the Arduino to make a project that is interactive in a creative way was a great experience. It inspires us to explore further and discover new types of boards and sensors. We hope to use what we have learned in our final projects too.

 

 

 

 

Week 10 | Proximity-activated LEDs

Concept

For this week’s assignment, we were required to take an analog and a digital input from two sensors to control two LEDs. I decided to use a slide switch for the digital input and the ultrasonic sensor for the analog input (the distance of an object from the sensor). The slide switch is used to switch between the two LEDs, which are of different colors. The ultrasonic sensor detects the distance of an object from it: as the object gets closer, the LED gets brighter. This could serve as a kind of alert switch that grows bright when an object gets close to it, useful for automatic proximity lighting or in security systems.
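The closer-is-brighter behavior is just an inverted linear map from distance to PWM value. As a quick illustration of the arithmetic (a sketch of the same mapping the Arduino code performs with map(); the function name is illustrative):

```javascript
// Invert a distance reading (0-50 cm) into an LED brightness (255 down
// to 0), returning 0 outside that range, mirroring Arduino's map() math.
function distanceToBrightness(distanceCm) {
  if (distanceCm < 0 || distanceCm > 50) return 0;
  return Math.round(255 - (distanceCm / 50) * 255);
}
```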

Schematics

I used tinkerCAD to simulate my project and generate the schematics below:

Schematic Diagram

View on TinkerCAD

 

List of Components

 

 

Code

I am using a library called NewPing that handles input from the HC-SR04 ultrasonic distance sensor. The code is given below:

#include <NewPing.h> // Include the NewPing library

//Created by Aadil Chasmawala

const int LEDPin1 = 3; // Define LEDPin1 as a constant integer
const int LEDPin2 = 5; // Define LEDPin2 as a constant integer
const int switch_PIN= 10; //switch PIN
int brightness1;
int brightness2;
NewPing Mysensor(8, 7, 200); // Trigger pin = 8, echo pin = 7, max_distance = 200cm

void setup() {
  pinMode(LEDPin1, OUTPUT); // Set LEDPin1 pin mode to output
  pinMode(LEDPin2, OUTPUT); // Set LEDPin2 pin mode to output
  pinMode(switch_PIN,INPUT); 
  Serial.begin(9600); // Initialize serial communication
}

void loop() {
  int distance = Mysensor.ping_cm(); // Get the distance in centimeters
  Serial.println(distance); // Print the distance to the serial monitor
  if(distance > 50){ //if distance is more than 50 , switch the LED off
    brightness1=0;
    brightness2=0;
  }
  else{
  // Map the distance to the brightness of the LEDs
  brightness1 = map(distance, 0, 50, 255,0); // Adjust the range as needed
  brightness2 = map(distance, 0, 50, 255,0); // Adjust the range as needed
  }

  // Set the brightness of the LEDs
  bool state = digitalRead(switch_PIN);
  Serial.println(state);

  if(state == HIGH){ // if switch is in one state (towards 5V)
  analogWrite(LEDPin1, brightness1);
  analogWrite(LEDPin2,0);
  }
  else if (state == LOW){ //if the slide switch is towards ground
  analogWrite(LEDPin2, brightness2);
   analogWrite(LEDPin1,0);
  }

  delay(100); // Wait for a short time before taking another measurement
}

 

Video Demonstration/ Images

Link to the video- https://youtu.be/Unpllzq1EK8

Challenges and Reflections

I had some issues in using push buttons for the digital input as there would be a noticeable delay between when the button was released and the detection of that release. However, this was resolved by using a slide switch instead of a button switch.

Since I am using the NewPing library, I don’t have to worry about sending and receiving pulses on the trigger and echo pins; the ping_cm() function handles that. At some point the sensor was not working as expected, so I printed the value of distance in the serial monitor for debugging and was able to resolve the issue.

Overall, I am happy with the way the brightness of the LED changes with distance, and with the quick switching between LEDs using the slide switch. For future projects, I hope to continue experimenting with different inputs and use them in creative ways.

Week 10 | Creative Reading Response

“Physical Computing’s Greatest Hits” was a very interesting and thought-provoking article. It discusses different forms of interactive projects related to physical computing, and the ideas it presents got me thinking about ideas for my IM final project. Ideas such as hands-as-cursor and the glove are really interesting to me, and I hope I can build something inspired by them for my final project. Ideas like these, which turn the human hand into a method for controlling an object or a screen, are something I would love to work on.

“Making Interactive Art” focuses more on our role as designers in creating an experience of discovery for the user. The idea that the main goal is not to script an entire conversation but rather to start one is something I would like to keep in mind while designing something interactive. This ties in well with Don Norman’s idea of signifiers from the previous readings. The idea of art that inspires rather than conveys has always appealed to me. This notion of taking the user on a journey, rather than specifying a fixed way of interacting with the work, reminds me of the phrase “the journey is more beautiful than the destination”. I hope to keep this in mind in my upcoming projects.

Screen Distance Switch | Creative Switches – Week 9

Concept

For this week, we had to make a creative switch. I wanted to use one of the sensors that came with the Arduino kit to build something simple that could be useful if improved upon later. I noticed the HC-SR04 ultrasonic sensor in the kit and checked its documentation to see how it works. Although we weren’t required to use code for this, I realized that all I needed to do to create a switch with this sensor was to install a library and connect it to an input pin.

Thinking about applications, I recalled reading about the screen-distance feature on some mobile phones, where users who keep their phones too close to their face for extended periods get a warning.

I set 0-50 cm as the range where the LED lights up to indicate that you are too close to the screen, and placed the breadboard and the sensor on my table just above where I usually use my laptop.

Images 

Video Demonstration

Video link – https://youtu.be/iuf76Mcibu8 

Code

The Arduino code is below : 

#include <NewPing.h> //include the header file 
// Aadil Chasmawala - 31st March 2024
const int LEDPin = 11; // Define LEDPin as a constant integer
NewPing Mysensor(8, 10, 200); // trigger, echo, max_distance = 200cm

void setup() {
  pinMode(LEDPin, OUTPUT); //set LED pin mode to output 
}

void loop() {
  int distance = Mysensor.ping_cm(); //get the distance 
  if(distance > 0 && distance < 50){ //if distance is too close 
    digitalWrite(LEDPin, HIGH); //light up LED
  }
  else{
    digitalWrite(LEDPin, LOW); //switch off LED
  }
  delay(100); // to ensure that the LED doesn't blink too quickly
}

I used the NewPing library to easily create a NewPing object that takes input from the HC-SR04 ultrasonic sensor. The trigger pin is 8, the echo pin is 10, the maximum operating distance is 200 cm, and the LED output pin is 11.

This code, along with the connections, creates the effect of a switch that lights up when one moves close to it.

Challenges and Reflections 

One of the resistors in my circuit wasn’t working, and it took me quite some time to figure out what was wrong.

I had initially thought of using a specific sound as a trigger (like a particular song, or something simpler like two claps that light up the LED), but realized that I would need a sound sensor for this. Hopefully, I will try building something with a sound sensor for one of the assignments, as it is very interesting to me.

In addition, the ultrasonic sensor is very useful too; however, it operates only in one direction, and to accurately get a sense of direction you would have to use multiple ultrasonic sensors, something I could explore in future projects.

 

Week 8a Reading Reflection

This week’s reading focused on the importance of aesthetics and presented the idea that aesthetics and utility are equally important. Don Norman’s example of the three teapots is pretty interesting and illustrates how personal preference depends on mood.

The idea of ‘affect’ that Norman discusses was new to me. He talks about positive and negative ‘affect’ and how each can be useful depending on the situation. From a design perspective, it makes sense to make designs with high utility when the intended use is in highly stressful situations. On the other hand, if the intended use is in a more relaxed environment, giving weight to aesthetics can have a highly positive impact on the user.

I was thinking about the question of which is more important: usability or beauty? After this reading, I think the answer depends on the situation, and as designers it is our responsibility to think about what exactly we should emphasize.

The article ‘Her Code Got Humans On The Moon’ taught me the importance of looking at and dealing with edge cases, even when they might seem trivial. Despite Margaret Hamilton’s seniors telling her that:

“(We had been told many times that) astronauts would not make any mistakes,” (she says.) “They were trained to be perfect.”

she still added notes to the code, which turned out to be very useful when an astronaut did indeed make a mistake.

Both of these articles gave me two important lessons that I wish to keep in mind when designing something in the future:

1) Usability and aesthetics are, broadly speaking, equally important; however, their relative weight depends on the specific use case, which must be considered.

2) It is always useful to look at edge cases and prevent unintended errors – even if they are unlikely to occur.