Final Project – Sense of Music

Concept

It took me some time to come up with the idea for my final project. I could not settle on one because, while most of my peers were either expanding their midterm project or creating a game that used Arduino as a user-control input, I wanted to create something that matched my interests and would be a more unique experience for me as a developer. Thanks to the assignment where we had to find a way to control music with an Arduino sensor, and to my love for music, I decided to do something associated with music, particularly enhancing the experience of listening to it.

Since childhood, I have liked playing with music controls. At some point, I even wanted to become a DJ. While that dream never came true, a couple of weeks ago I realized that I could use Arduino to create something similar to a DJ panel (a simplified version of it). Moreover, I could make images on the computer interact with the music and visualize it. After thorough research, I found that p5.js has a built-in sound library that allows working with sound. This was exactly what I needed.

My project now consists of a physical part – a box powered by Arduino, with controls and sensors that let a user interact with the music and with p5.js – and a digital visualization that is based both on the user's actions and on the characteristics of the music (which the user can also change to a certain extent).

Project Demonstration

Implementation

Interaction Design:

A user interacts with the Arduino through a box that has a number of input controls:
– 3 buttons on the right side start 3 different songs
– 3 buttons on the top control the style and effects of the music visualization on the digital screen
– The red button in the top left corner is a play/pause button
– The white and black buttons on the left increase and decrease the speed/pitch of the music
– A potentiometer controls the volume
– An ultrasonic distance sensor controls the High Pass and Low Pass filters

The input from Arduino, sent over serial communication, influences both the music and the image generated inside p5.js.
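
To give a sense of how this works on the p5.js side, here is a minimal, illustrative sketch of how such serial input could be parsed and applied. The message format (comma-separated values), the variable names, and the value ranges are placeholders for illustration, not my exact code:

let song;      // a p5.SoundFile loaded in preload()
let lowPass;   // a p5.LowPass filter that the song is routed through in setup()

function readSerial(data) {
  if (data == null) return;               // ignore empty reads
  let values = split(trim(data), ",");    // e.g. "512,34"
  if (values.length >= 2) {
    let pot = int(values[0]);             // potentiometer reading, 0-1023
    let dist = int(values[1]);            // distance in cm (assumed 0-60 range)

    // potentiometer -> volume (0-1)
    song.setVolume(map(pot, 0, 1023, 0, 1));

    // distance -> low-pass cutoff frequency (assumed 100 Hz - 10 kHz)
    lowPass.freq(map(constrain(dist, 0, 60), 0, 60, 100, 10000));
  }
}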

Code:

p5.js: Using p5.FFT, the characteristics of the music are converted into an image. My code analyzes the music, e.g. its frequencies, and creates a dynamic visualization. There are 3 main music visualizers that I created – Circle (on by default), Sunburst (can be turned on/off with a button), and Particles (on by default, can be switched off). All of these patterns are based on the same idea of synchronizing with the music's amplitude. The function fft.getEnergy() measures the energy in the bass frequency band. That value is then passed on and used to control the waveform of the Circle, the length of the Sunburst lines, and the speed of the Particles.
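
As an illustration, a stripped-down version of the Circle visualizer could look something like the sketch below (the file name, sizes, and mapping ranges are placeholders, not the exact values from my project):

let song, fft;

function preload() {
  song = loadSound("song.mp3");         // placeholder file name
}

function setup() {
  createCanvas(600, 600);
  fft = new p5.FFT();
  song.loop();
}

function draw() {
  background(0);
  fft.analyze();                        // update the frequency spectrum
  let bass = fft.getEnergy("bass");     // energy of the bass band, 0-255

  // map bass energy to the circle's diameter
  let d = map(bass, 0, 255, 100, width);
  stroke(255);
  noFill();
  circle(width / 2, height / 2, d);
}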

Additionally, I leveraged knowledge from previous projects to equip Sense of Music with features like a fullscreen toggle and a Rotation effect, which can also be activated by pressing the corresponding buttons. Moreover, as a bonus, I decided to incorporate one of my favorite projects of this semester into Sense of Music. The Lines visual effect, although not influenced by the music, can be activated with one of the buttons as a substitute for Particles. Personally, I enjoy combining Lines with the Rotation effect.
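
For reference, the fullscreen toggle itself only takes a couple of lines in p5.js. In the illustrative snippet below it is bound to a key press; in the project it is triggered by a serial message from the corresponding Arduino button instead:

function keyPressed() {
  if (key === "f") {
    fullscreen(!fullscreen());   // flip the current fullscreen state
  }
}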

Arduino: The Arduino code is quite straightforward, as are the connections on the board (see the schematic below). I am really glad I learned how to use and connect potentiometers and ultrasonic distance sensors in class, so I had no problem using them intuitively in my project.

Link to the p5.js code, with the Arduino code as a comment at the bottom:
https://editor.p5js.org/rbashokov/sketches/SwAigPMVa

Thoughts and Conclusion

I am extremely happy with the final project I have created. More than that, I am happy about and thankful for the Intro to IM class and Professor Mang for helping and assisting me throughout the whole semester. I learned a lot in this class. More specifically, I stepped out of the environment I was used to by constantly making myself think outside the box and create something unique, something that can be called interactive art. Other than that, the most difficult part for me was building the physical box and mounting the buttons and sensors on it. The last time I did something similar was in middle school, so I had practically zero knowledge of how to approach it. However, I managed to figure out how to create schematics for the box, cut the pieces with the laser cutter, and attach the wires and buttons using glue. I am still far from the results I would ultimately like to achieve, but for now, I feel that I made progress and reached the goals I set for this class at the beginning of the semester.

Speaking of the final project itself, Sense of Music was a great idea, and its implementation was good but not perfect. While the code was not a problem, there is still a lot to build on top of what I have now to further enhance the user experience. The same goes for the physical parts – the box and the DJ panel. I could make them more aesthetically pleasing by using different styles of buttons, adding more colors to the box, and so on. Unfortunately, time was very limited, especially considering my trip to Shanghai, which took up the whole National Holiday break. Nevertheless, I am satisfied with the results, and as I see it, the fact that there is so much room for improvement shows that the idea itself is actually great.

Thank you!

 

Sense of Music – User Testing

Today I conducted user testing of the alpha version of my final project, and I was more than happy with the reaction and feedback. The tester did not know anything about my project except that it was connected to music. The reaction and a small part of the feedback can be found in the video below.

First of all, I want to note that I have not finalized the project yet, as some laser cutting and connection fixes still need to be done. Nevertheless, the core functions that represent my overall idea work as intended. My friend emphasized that the controls are quite intuitive, and he enjoyed exploring what each control does. Moreover, he managed to combine the effects in a way that even I had never tried, which is certainly a great sign, because it shows that the project encourages the user's curiosity and interest.

The thing I will work on over the next couple of days is the labels on the panel that hint at the purpose of the controls, because my friend, although he explored most of the functions, did not notice one of them. I will try to find a balance so that the explanations are not too heavy and still leave room for exploration. I have also decided, thanks to my friend's recommendation, to borrow a set of headphones from the equipment center and connect them to my laptop during the IM showcase, creating a more 'ambient' experience around my project by fully immersing users in the music and the visualizations on the screen.

Overall, I loved hearing that my project is especially well suited for people who like music, because that is exactly what I had in mind when I came up with the idea for my final project. Music lovers are my target audience, so I will do my best to finish the project as I envision it and, hopefully, give many people a chance to test it during the IM showcase.

Final Project Concept

For my final project, I decided to stick with the idea of controlling music using Arduino sensors. I absolutely love listening to music, so this project is, in a way, an implementation of a personal interest. At some point in my childhood I thought of becoming a DJ, so here it is: my final project will be a set of effects and filters for a song, controlled by a self-made DJ panel that incorporates sensors, switches, and buttons from the Arduino kit.

What will Arduino do?

I will turn the objects and controls from the kit into a DJ toolbox.

1) I will use buttons and switches to control the songs: pause, play, repeat, etc. Moreover, apart from using real songs (I am thinking of using 4 of them), I am planning to incorporate controls for the sounds that DJs use to enhance the listener's experience, e.g. to make transitions between songs smoother.

2) I will use sensors like ultrasonic distance sensors or potentiometers to control the music. A user will be able to apply filters that isolate the bass or the high notes, speed the track up or slow it down, switch songs, etc., using hand motions and movements. I am not sure yet how diverse my DJ toolset will be, because it will depend heavily on what the p5.js sound functions allow (a rough sketch of what this could look like in p5.js follows this list).

3) Potentially, if I have time, I will also incorporate control of the music visualization using buttons, switches, or sensors.
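
As promised above, here is a rough sketch of the p5.sound calls I expect to rely on. The keyboard bindings, file name, and rate steps are placeholders – in the final version these inputs will come from the Arduino over serial instead of the keyboard:

let song, highPass;
let speed = 1;                      // current playback rate

function preload() {
  song = loadSound("track1.mp3");   // placeholder file name
}

function setup() {
  createCanvas(400, 400);
  highPass = new p5.HighPass();     // filter that isolates the high frequencies
  song.disconnect();                // detach the song from the master output...
  song.connect(highPass);           // ...and route it through the filter instead
}

function keyPressed() {
  if (key === "p") {                          // play / pause
    if (song.isPlaying()) song.pause();
    else song.play();
  } else if (key === "+") {
    speed *= 1.1;                             // speed up (also raises the pitch)
    song.rate(speed);
  } else if (key === "-") {
    speed *= 0.9;                             // slow down
    song.rate(speed);
  }
}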

What will p5.js do?

Music visualization is the part that I want to implement in p5.js. In my opinion, if I find the right guidance on how exactly to accomplish that, it will significantly improve the project by adding nice visual effects. It is cool to hear the music, but it is even cooler to see it. I want the canvas to change according to the rhythm, tone, and other features of the song being played, corresponding to the filters applied by the Arduino DJ panel.


Final Project Concept

While thinking about my Final Project, I decided that I want to do either something useful or something connected to art/music. Thus, I so far have two ideas that I will be deciding between during the upcoming week.

1) Useful – a Morse code (or similar) translator. I will create a physical device that takes input as Morse code (dashes and dots, entered by pressing a button for a long or short period of time) and then translates it into words on the p5.js screen. At the same time, the person will be able to type text on the laptop and have it played back as sound through p5.js (a rough timing sketch follows this list).

2) Art/Music – I want to combine these two by expanding on the use of sensors. I want to connect a set of distance/light sensors and let the user control both the music and the drawings on the p5.js canvas. Just like in the assignment where we played the Jingle Bells song, I will use the sensors to speed up the music, change its tone, and add sounds on top of the song that is playing, allowing the user to create music. At the same time, this will be reflected on the canvas as a unique kind of art. This is a very ambitious project that I am not 100% sure how to accomplish, but I will do my best to think it through during the week.
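
As mentioned under the first idea, here is a rough sketch of how the press-duration decoding could work in p5.js. The timing threshold, the spacebar standing in for the physical button, and the tiny lookup table are placeholders just to test the logic:

let pressStart = 0;
let pressing = false;
let symbols = "";        // dot/dash buffer for the current letter
const MORSE = { ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E" }; // partial table, demo only

function setup() {
  createCanvas(400, 200);
  textSize(32);
}

function draw() {
  background(255);
  fill(0);
  text(symbols, 20, 100);                    // show the current dot/dash buffer
}

function keyPressed() {
  if (key === " " && !pressing) {            // ignore the browser's key auto-repeat
    pressing = true;
    pressStart = millis();                   // remember when the press began
  }
}

function keyReleased() {
  if (key === " ") {
    pressing = false;
    let held = millis() - pressStart;
    symbols += held < 250 ? "." : "-";       // short press = dot, long press = dash
  } else if (keyCode === ENTER) {
    console.log(MORSE[symbols] || "?");      // translate the buffered letter
    symbols = "";
  }
}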

Production – Week 11

Exercise 1:

let circleX = 0; // horizontal position of the circle, driven by the potentiometer reading

function setup() {
    createCanvas(500, 500);
    noFill();
}

function draw() {
    background('white');
    if (!serialActive) {
        console.log('ARDUINO IS NOT CONNECTED'); //output to check if Arduino connected or not
    }
  
    if (serialActive) {
        fill('violet')
        ellipse(map(circleX, 0, 1023, 0, width), height / 2, 100, 100); // using map to make the circle move
        console.log(circleX) //output position to check
    }
}

function keyPressed() {
    if (key == " ")
        setUpSerial();
}

function readSerial(data) {
    if (data != null) // only update when a full reading has arrived
        circleX = int(data);
}
 


// ARDUINO CODE

/* 


void setup() {
    Serial.begin(9600);
    pinMode(A0, INPUT);
}

void loop() {
    Serial.println(analogRead(A0));
    delay(5);
}
 
*/

 

Exercise 2:

let brightnessLVL = 0;

function setup() {
    createCanvas(500, 500);
}

function draw() {
    if (!serialActive) {
        console.log('ARDUINO IS NOT CONNECTED')
    }
  
    if (keyIsDown(UP_ARROW) && brightnessLVL < 255) { // cap at 255, the maximum for analogWrite()
      brightnessLVL += 1; 
    } 
    
    else if (keyIsDown(DOWN_ARROW) && brightnessLVL > 0) {
      brightnessLVL -= 1;
    }
    
    console.log(brightnessLVL);
}

function keyPressed() {
    if (key == " ")
        setUpSerial(); // Start the serial connection
}

function readSerial(data) {
    writeSerial(brightnessLVL + "\n"); // send the brightness followed by a newline so parseInt() on the Arduino can find the end of the number
}
 

// ARDUINO CODE

/*
 
void setup() {
    Serial.begin(9600);
    pinMode(10, OUTPUT);
}

void loop() {
    analogWrite(10, Serial.parseInt());
    Serial.println();
    delay(1);
}
 
*/

 

Exercise 3:

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;
let checkBounce;
let outputBounce;

function setup() {
  createCanvas(640, 360);
  noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  wind = createVector(0,0);
}

function draw() {
  background(255);
  applyForce(wind);
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x,position.y,mass,mass);
    
  if (!serialActive) {
        console.log('ARDUINO IS NOT CONNECTED')
        fill('red')
    }
  
  if (serialActive) {
    textAlign(CENTER, TOP);
    textSize(24); 
    fill('green'); 
    text('ARDUINO IS CONNECTED', width / 2, 10);
}
  
  if (position.y > height-mass/2) {
      checkBounce = 1;
      velocity.y *= -0.9;  // A little dampening when hitting the bottom
      position.y = height-mass/2;
      }
  else {
      checkBounce = 0;
  }
}

function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed(){
  if (key==' '){
    mass=random(15,80);
    position.y=-mass;
    velocity.mult(0);
  }
    
  if (key == "h") {
    setUpSerial();
  }
}

function readSerial(data) {
    if (data != null) {
        wind.x = map(int(data), 0, 1023, -2, 2);
        
        let sendToArduino = checkBounce + '\n';
        writeSerial(sendToArduino);
    }
}

// ARDUINO CODE

/* 

 
const int LED_PIN = 10;
const int POT_PIN = A0;

void setup() {
    Serial.begin(9600);
    pinMode(LED_PIN, OUTPUT);
    pinMode(POT_PIN, INPUT);

    while (Serial.available() <= 0) {
        Serial.println("0");
        delay(300);
    }
}

void loop() {
    while (Serial.available()) {
        int LEDtrigger = Serial.parseInt();
        
        if (Serial.read() == '\n') {
            digitalWrite(LED_PIN, LEDtrigger);
            Serial.println(analogRead(POT_PIN));
        }
    }
}
 
*/

 

Reading Reflection – Week 11

As technologies rapidly develop, more and more disabilities can be treated, helping disabled people live without limitations. However, these technologies are often perceived by disabled people as "embarrassing" because they explicitly display the disability to society. Disabled people are often treated differently: they are seen as "sick", shown compassion and a willingness to help, which is good and ethically right, but some of them do not wish to be treated that way. Some of them do not want to be pitied. This is exactly why I found "Design meets disability" a very interesting reading and enjoyed the author's perspective on this matter. I really liked the description of how glasses transformed from a plain medical device into a stylish accessory. It is a remarkable achievement – becoming an essential object of fashion – and of course it would be ideal for other devices created for people with disabilities to do the same. However, I think that is still hard to achieve.

First of all, glasses are a small object that does not substitute for a part of the human body but rather complements it. This is why they can be called an accessory – something people add as an extra element of their outfit. Sunglasses, for example, are worn even by people with perfect vision, simply because they look good. The optimal goal for other devices should be exactly the same – to look more like a complementary object and less like an artificial substitute. Scientific progress pushes this idea forward, and artificial arms and legs now look more and more "human", which positively affects how other people perceive them. Yes, the idea of embracing these artificial devices by bringing them into fashion, as described by the author (e.g. Aimee Mullins), contributes positively as well, but I believe this is a temporary step toward treating people with disabilities equally and making them feel less uncomfortable in society, rather than the ideal solution. The ideal solution would be to build an artificial arm that looks the same as a natural arm. As an example, let's refer to the Star Wars movies:

(Embedded post from r/StarWars: "Why did Luke get an almost real replacement hand while Anakin got a robotics metal hand?")

The first prosthesis belongs to Luke Skywalker and reflects the in-universe technological advancement between the prequel movies and the original trilogy. The second prosthesis belongs to Luke's father, Anakin Skywalker. Obviously, the first looks much better than the second. (By the way, Anakin wore a glove to cover his mechanical arm.)

Overall, technological progress in the field of prostheses is very important for making people with disabilities around the world feel included and equal to other members of society. For now, while we wait for future advancements, it is a great idea to treat these devices as elements of fashion, just as the author suggests.

Jingle Bells – Speed Variation

Concept 

In this assignment, I collaborated with @Nelson, and we both love Christmas. The famous song Jingle Bells brings back memories of those times, so we explored various possibilities and decided to create a speed variation of the Jingle Bells melody that responds to distance.

Here is the demonstration Video:

Schematic 

Here is the schematic for our Arduino connections:

Code:

To implement our idea, we looked up the combinations of notes and durations that match the Jingle Bells melody and stored them in arrays. We then wrote code that maps distance to note durations. The variation in duration for each note makes the melody seem to play faster or slower. Here is the code:

#include "pitches.h"
#define ARRAY_LENGTH(array) (sizeof(array) / sizeof(array[0]))

// Notes and Durations to match the Jingle Bells 
int JingleBells[] = 
{
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_G4,
  NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4,
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_D4, NOTE_D4, NOTE_E4,
  NOTE_D4, NOTE_G4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_G4,
  NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4, NOTE_F4,
  NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_E4, NOTE_D4, NOTE_D4, NOTE_E4,
  NOTE_D4, NOTE_G4,
};

int JingleBellsDurations[] = {
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4, 4, 4, 4, 4, 4, 4,
  4, 4
};

const int echoPin = 7;
const int trigPin = 8;
const int Speaker1 = 2;
const int Speaker2 = 3;
int volume;

void setup() 
{
// Initialize serial communication:
  Serial.begin(9600);
  pinMode(echoPin, INPUT);
  pinMode(trigPin, OUTPUT);
  pinMode(Speaker1,OUTPUT);
}

void loop() 
{
  long duration,Distance;
  
// Distance Sensor reading
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  Distance = microsecondsToCentimeters(duration);

// Map Distance to a 0-255 value (stored in 'volume' and used below to scale note durations)
  volume = map(Distance, 0, 100, 0, 255);  
  volume = constrain(volume, 0, 255); 

// Play melody with note durations scaled by the mapped value
  playMelody(Speaker1, JingleBells, JingleBellsDurations, ARRAY_LENGTH(JingleBells), volume);
  
// Debug output to Serial Monitor
  Serial.print("Distance: ");
  Serial.print(Distance);
  Serial.print("    Volume: ");
  Serial.print(volume);
  Serial.println();
}
// Get Centimeters from microseconds of Sensor
long microsecondsToCentimeters(long microseconds) 
{
  return microseconds / 29 / 2;
}
// PlayMelody function to accept volume and adjust note duration
void playMelody(int pin, int notes[], int durations[], int length, int volume) 
{
  for (int i = 0; i < length; i++) 
  {
// Adjust the note Duration based on the volume
    int noteDuration = (1000 / durations[i]) * (volume / 255.0);  

// Play the note with adjusted Durations
    tone(pin, notes[i], noteDuration);
// Delay to separate the notes
    delay(noteDuration * 1.3);  
    noTone(pin); 
  }
}

Reflection

This week's assignment was especially interesting because we had a chance to collaborate on a project and combine our imagination and skills to create something original. I really enjoyed working with @Nelson. We worked great as a team, first coming up with an idea and then adjusting the concept to strike the right balance between randomness and the ordinary to reach the final result.

I believe that this project has a lot of potential for future improvements, and perhaps I will use some parts of this week’s assignment for my future ideas and projects. Looking forward to the next weeks of the course!

Reading Reflection – Week 10

I really enjoyed reading this article and the Q&A based on it. Watching the video from Microsoft reminded me of watching the Iron Man movies as a kid. The scenes of Tony Stark working on his armor using super advanced 3D technologies always impressed me, and I imagined how, perhaps, I could do the same when I grew up. It is quite common for widely published fiction – movies, books, articles, etc. – to inspire future technologies. The other day I was watching a video about how some things depicted by authors or artists in previous centuries were actually implemented in real life. Moreover, many of them are used every single day – take the plane, for example. At the same time, other products of our predecessors' imagination have not yet become feasible: time machines, teleports, and other things we still consider beyond our current scientific reach. I made this long introduction to get to the idea that not everything we foresee and, in many cases, 'make up' based on our perception of the future actually comes true.

Going back to the video from Microsoft and Bret Victor's article on it, I personally agree with most of the author's main points. First, while I would have been highly impressed if someone had shown me this video when I was 10 years old, watching it now, from the perspective of an experienced technology user, I can say for sure that I would not like to live in such a world. It feels incredibly over-interactive. An excess of a technology can often hurt the technology itself, and this is exactly the case. As Bret Victor mentioned, we already spend too much time using electronic devices (and his article was published in 2011 – we use gadgets much more now). If we build the whole world around us on augmented reality, it will distort our basic perception of life, as I see it. I am currently rather skeptical about augmented reality as a whole, but who knows, maybe in 20 years I will change my mind. In any case, I agree with the author that the future described in the video is far from ideal. We should not abandon our core activities and 'natural' interactions; otherwise, it will not only distort our lifestyle but, as research suggests, may also leave our brains less developed.

The only remark I would make is that, as of now, I feel that neither voice control nor, as mentioned above, augmented reality can be seen as an adequate substitute for finger and mouse control. Yes, it is a bit sad to think of the future of interactive technologies as depending solely on finger clicks and primitive hand movements, but I guess we are so used to them that any quick transition would be extremely difficult and inconvenient. At the same time, innovation is happening extremely fast, and who knows – maybe a couple of decades from now we will adopt another method and use it every 5-10 minutes of our lives, as we do with our phones.

Reading Reflection – Week 9

Physical Computing’s Greatest Hits (and misses)

I really liked the idea that you should not give up on your idea even if it has already been implemented. Instead, if you actually have thoughts on how that particular idea can be improved, it is definitely worth trying to implement it. We are all different in our own ways, so one person can come up with something new that others did not even think about. There is always room for uniqueness, and if we stop fixating on what has already been done by other people and instead focus on our imagination and unconventional perspective, we can achieve extraordinary things. This applies not only to the art field but also to life in general.

Beyond the author's main idea, I quite enjoyed reading about the examples he gave. I have seen, or at least thought of, many of the devices mentioned, but I am 100% sure each of them can be enhanced in one way or another. Looking through some of them in particular, I noticed how people's imagination can turn into something interesting and original. Some of the works I liked most are Troll Mirror and Hourglass Mirror by Daniel Rozin.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

I fully agree with the author's ideas in this reading. First of all, I really liked the line drawn between interactive artwork and 'conventional' art like sculpture and painting. It brings me back to the previous readings about interactivity and what is important to keep in mind when designing it: a user should be able to intuitively grasp what to do without needing explanations. Secondly, I liked the comparison between the director and the artist. You do not want to 'label' your interactive artwork by imposing your interpretation on people. They need to reflect on your artwork from their own perspective, which can bring deeper understanding and open unique angles of interpretation, helping you work on future improvements of the artwork.

I now understand why the professor did not allow us to describe or explain our midterm projects during the showcase. It was actually quite useful to see and hear ('shut up and listen') authentic reactions and feedback from my classmates. It helped me understand which aspects of my work I need to improve in the future to ease interaction and perception.

Analog input & output – Antiprocrastination iPhone Case 2.0

For this assignment, I decided to release an update to the Antiprocrastination iPhone Case, as I promised in last week's blog post.

Concept

A shortage of conductive fabric (I could not find it in the IM Lab where it was supposed to be) made me find a new way to track the presence of the phone on the surface. This time, I used conductive tape, which was not as convenient to use, but I experimented with normal tape to attach it to the phone case and hold the wires together.

For my analog input, I decided to use the potentiometer the professor showed us in class, but in an unusual way – to track movements in front of the tracking surface and enhance the phone's protection. Whenever an object (a hand, in our case) tries to touch the phone, the red LED turns on to signal the extraneous movement. In the code, I set a potentiometer value below which the LED turns on. Moreover, I added an analog output using the map() function, which I have used multiple times in p5.js to project the range of values of one object onto another. Thus, the closer the hand is to the phone, the brighter the red LED. To achieve this effect, I needed to play with the values inside map() (as described in the comments in my code) and switch the order of the values for the LED.

Code

// pin definitions
const int potentiometerPin = A3;     // potentiometer
const int ledPin = 3;                // LED connected to PWM-capable pin 3

// the setup routine runs once when you press reset:
void setup() {
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
  
  // initialize 
  pinMode(ledPin, OUTPUT);
}

// the loop routine runs over and over again forever:
void loop() {
  // read the input on analog pin
  int sensorValue = analogRead(potentiometerPin);
  
  // setting up the mapping so that would work as intended - 
  // the closer the hand, the brighter the led. Putting the values 0 and 255 vice-versa for this
  // and limiting the values of the sensor from 500 to 825 (I put 825 intentionally, so that red led would be bright if the phone is removed from the wires)
  int ledBrightness = map(sensorValue, 500, 825, 255, 0);

  // print out the values we read to give more intuition: 1st for potentiometer and 2nd for led
  Serial.print("Potentiometer: ");
  Serial.print(sensorValue);
  Serial.print("   LED: ");
  Serial.println(ledBrightness);
  
  // setting the LED brightness
  analogWrite(ledPin, ledBrightness);
  
  delay(1);  // delay in between reads for stability
}

Schematic

Video

Reflection

For my first assignment, I did not use any code. Here, I needed to figure out the syntax and commands required to make things work. So far I find it pretty challenging to think in both dimensions – physical and digital – but it is also extremely interesting. Although I have not yet implemented the sound effect I wanted for when the phone is removed from the surface, I am happy with the result I achieved, especially with how quickly I managed to find a substitute for the conductive fabric.

I am looking forward to next week’s classes and assignments to further enhance my Arduino skills.