Final Project Proposal: JJR (Jump Jump Revolution)

JJR (Jump Jump Revolution) is a remake of the famous video game series DDR (Dance Dance Revolution), but with a twist. This time, the user won’t dance on the platform but will play a game instead. They won’t control a character in the game; they will be the character!

I want to create an interactive game utilizing a physical platform reminiscent of Dance Dance Revolution (DDR) for user control.

This project integrates Arduino for the physical interface and P5JS for the game itself. The user will navigate through the game by stepping on different tiles of a wooden platform, triggering corresponding actions in the game.

Disability Meets Design

Reflecting on the insights I gained from this reading, it is evident that achieving the right balance in designing for individuals with disabilities is crucial. Leckey’s approach to creating visually appealing furniture for kids with disabilities, without making it overly conspicuous, resonates as a prime example. The cautionary note about radios incorporating screens, hindering accessibility for visually impaired individuals, serves as a reminder that simplicity often triumphs in making things universally functional. Exploring multimodal interfaces, such as beeping buttons and flashing lights, emerges as a potential game-changer for those with sensory issues, providing diverse interaction avenues. The reading emphasizes the diversity among individuals, illustrated by the varying preferences of two people with visual impairments in their devices, debunking the notion of a one-size-fits-all solution. It prompts questions about the delicate balance designers must strike in making things accessible without introducing unnecessary complexity. Additionally, it raises inquiries about the role of designers in the accessibility landscape and underscores the importance of prosthetics aligning with both functionality and personal style. The reading broadens the perspective on design and accessibility, challenging conventional checkboxes and encouraging a more profound examination.

Pullin’s “Design Meets Disability” offers a comprehensive exploration of the intersection between disability, fashion, and design. Pullin adeptly goes beyond mere usability, delving into core concepts of inclusion and empowerment, particularly when viewed through the lenses of inclusivity and disability. The book showcases how assistive technology, such as glasses, hearing aids, and prosthetic limbs, has evolved into fashion statements, transforming basic needs into unique expressions of identity and personal style. Pullin emphasizes the significance of designs being both functional and aesthetically pleasing, challenging the perception that functionality must compromise visual appeal. This alignment of good design and utility has the potential to alter public perceptions of disability aids, fostering social acceptance and appreciation. The discussion highlights the importance of maintaining functionality without sacrificing simplicity, benefiting individuals with disabilities and contributing to universally usable designs that enhance overall quality of life. The reading underscores the need to pay meticulous attention to the preferences and needs of individuals with disabilities during the design process, challenging assumptions and societal stigmas. Ultimately, it encourages the creation of interactive designs that are not just functional but also user-friendly, simple, and fashionable, promoting inclusivity and thoughtfulness in the world.

Physical Computing’s Greatest Hits (and misses)

Considering the position of interactive artwork, this reading has prompted me to contemplate my perspective on interactivity. Tigoe’s perspective resonates with me, viewing interactive artworks as akin to performances. The artist constructs a stage for interactors, transforming them into the performers within this theatrical space.  Andrew Schneider’s piece, while appearing as a fixed narrative from a distance, offers diverse interactions in group settings, providing a more rewarding experience than a singular interpretation of museum paintings. Exploring the greatest hits and misses further complicates this perspective. Even seemingly straightforward interactions, like an LED lighting up upon approach, possess untapped potential for development. The originality of an idea lies not in the interaction itself but in the contextualization and open interpretation it offers. This nuanced approach to context and interpretation is particularly appealing. It leads me to contemplate the possibility of creating a more contextualized theremin, building on the potential for exploration within a defined setting.

Emotion & Design: Attractive Things Work Better + Margaret Hamilton

Donald A. Norman’s “Emotion & Design: Attractive Things Work Better” and “Her Code Got Humans on the Moon—And Invented Software Itself” provided me with valuable insights that resonate deeply. One standout story is that of Margaret Hamilton, an inspiring example for young women aspiring to make their mark in a predominantly male-dominated field. Her journey, which started without a clear goal, ultimately led to the creation of software crucial to the Apollo missions at NASA, laying the foundation for modern computer software. It embodies the timeless principle of “just do it.”

The remarkable journey of Hamilton, from initiating something without a clear plan to creating an industry worth $400 billion, illustrates the incredible results that can emerge from hard work and dedication. This narrative connects with Norman’s readings by highlighting that Hamilton didn’t create something just for its own sake; her work was both functional and aesthetically appealing, leaving an indelible mark on history. It emphasizes the importance of striking a balance between usability, aesthetics, intention, and purpose.

Margaret Hamilton didn’t chance upon the code we now recognize as software; she meticulously crafted it with usability and a form of beauty appreciated by programmers of her time. Her intentional approach propelled her to the position she holds today, serving as a genuine source of inspiration for my own career aspirations. Her story encourages me to pursue a path that combines functionality and artistry, mirroring her groundbreaking work.

Musical Instrument (Nourhane & AYA)


This assignment was complicated in terms of finding a concept. We had a couple of cool ideas at the beginning (e.g. soda-can drums), but their implementation failed. We decided to use our favorite part of the kit, the ultrasonic sensor, and build a concept around it.

Both our previous assignments used ultrasonic sensors as distance detectors. So we thought to base this assignment on the same concept.

This musical instrument is inspired by the accordion (with a twist):

Using a plastic spiral, we mimic the movement of an accordion to produce different notes. The distance between the hand and the sensor is what produces the different notes (each distance range was assigned a different note). The circuit has a switch to turn the instrument on and off.

Here is a video of what I’m talking about:

IMG_8161 2


Again, assembling the circuit was the hard part, but it worked! The code is pretty simple: it reads an ultrasonic sensor and a force sensor (FSR). The `loop` function repeatedly triggers the ultrasonic sensor, measures the echo time to an object, and converts it to centimeters based on the speed of sound. Depending on these values, it determines whether to play a musical note. If the distance is outside a defined range or the force is below a certain threshold, it stops playing the note. Otherwise, if sufficient force is applied, it maps the distance onto an array of musical notes and plays the corresponding note on a piezo buzzer.

int trig = 10;
int echo = 11;
long duration;
long distance;
int force;

void setup() {
  pinMode(echo, INPUT);
  pinMode(trig, OUTPUT);
}

void loop() {
  digitalWrite(trig, LOW);   // reset the trigger pin
  delayMicroseconds(2);
  digitalWrite(trig, HIGH);  // send a 10-microsecond trigger pulse
  delayMicroseconds(10);
  digitalWrite(trig, LOW);
  duration = pulseIn(echo, HIGH);      // echo travel time in microseconds
  distance = (duration / 2) * 0.0344;  // 344 m/s = speed of sound; converting to cm

  int notes[7] = {261, 294, 329, 349, 392, 440, 494}; // middle C, D, E, F, G, A, B
  force = analogRead(A0); // FSR (force sensor) reading

  if (distance < 0 || distance > 50 || force < 100) { // not pressed or hand out of range
    noTone(12); // don't play music
  }
  else { // pressed and in range
    int sound = map(distance, 0, 50, 0, 6); // map distance to an index into the notes array
    tone(12, notes[sound]);                 // play the note for this distance
  }
}


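To double-check which note each hand position triggers, the same arithmetic can be sketched in JavaScript (re-implementing Arduino’s integer `map()` by hand; this is just a sanity check of the math, not part of the sketch itself):

```javascript
// Arduino's map(): linear rescale, truncated to an integer for positive inputs
function mapRange(x, inMin, inMax, outMin, outMax) {
  return Math.floor(((x - inMin) * (outMax - outMin)) / (inMax - inMin)) + outMin;
}

// C major scale frequencies used by the instrument (middle C through B)
const notes = [261, 294, 329, 349, 392, 440, 494];

// Which note plays for a hand held at a given distance (cm)?
for (const d of [0, 10, 25, 40, 50]) {
  console.log(`${d} cm -> ${notes[mapRange(d, 0, 50, 0, 6)]} Hz`);
}
// e.g. 25 cm lands on index 3, which is 349 Hz (the F)
```

This also shows why the instrument jumps between notes rather than gliding: the 0–50 cm range is cut into just seven discrete bins.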

Though I found this assignment difficult, I enjoyed the process of implementation and finding a concept. If I could improve the instrument, I would work on the notes and their frequencies to sound more natural. Right now it sounds like an EKG machine with notes rather than an instrument, but I think it fulfills the requirement of a fictional instrument.

The Future of Interaction Design and Responses: A Brief Rant on the Future of Interaction Design

In this reading, Bret Victor’s critique of the mainstream vision for future technological advancements in interactive design sheds light on the limitations of current technologies in fostering genuine interaction. Victor challenges the prevailing emphasis on touch-screen efficiency and advocates for a more hands-on approach, rooted in his perspective shaped by a different technological era. He questions the seamless integration of physical and digital experiences, emphasizing the importance of tactile engagement. Victor also expresses concerns about children’s overreliance on digital devices, foreseeing potential risks to their healthy development. Taken together with the responses to it, these perspectives highlight a collective call for a more thoughtful and inclusive approach to shaping the future landscape of interaction design.

The author underscores the duty of interactive designers to prioritize accessibility, especially for those lacking specific capabilities. While admiring the remarkable potential of human abilities, the author confronts the difficulty of finding equilibrium between harnessing these capabilities and rectifying inherent inequalities. The imperative for continuous research, particularly in domains such as tangible interfaces, is highlighted.

Week 9: PAY HERE!


This week, I chose to work with an ultrasonic distance sensor to create a quirky barcode scanner. Since we had not yet covered the sensor in class, I referred to the SparkFun reference list and this video to learn more about it.

The LEDs light up and the piezo beeps once anything close (within a specific distance) is detected, mimicking the scanning of product barcodes. There is also a switch button in case the LEDs need to be toggled manually (and to meet the requirements). I handmade a funny setup of a cash register that you can check out below.


I used the following elements: breadboard, LEDs, ultrasonic sensor, piezo buzzer, switch button, resistors, wires, and an Arduino Uno, to make the following circuit:

The result was this funny-looking cash register:


Here is a video of how it works: cashier

//pin numbers
const int trigPin = 9;
const int echoPin = 10;
const int ledPin1 = 11;
const int buzzer = 3;
const int ledPin2 = 6;

long duration;
int distance;
bool buzzerOn = false;

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(buzzer, OUTPUT);
  pinMode(ledPin1, OUTPUT);
  pinMode(ledPin2, OUTPUT);
}

void loop() {
  //clear trigPin
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);

  //set trigPin HIGH for 10 microseconds
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  //read echoPin: travel time of the sound wave in microseconds
  duration = pulseIn(echoPin, HIGH);

  //calculate the distance in cm
  distance = duration * 0.034 / 2;

  if (distance <= 50) {
    tone(buzzer, 2000); //frequency chosen to make the buzzer louder
    digitalWrite(ledPin1, HIGH);
    digitalWrite(ledPin2, HIGH);
    buzzerOn = true;  //buzzer is on
  } else {
    if (buzzerOn) {
      noTone(buzzer);    //buzzer off
      buzzerOn = false;
    }
    digitalWrite(ledPin1, LOW);
    digitalWrite(ledPin2, LOW);
  }

  //prints the distance on the Serial Monitor (for debugging)
  //Serial.print("Distance: ");
}
Basically, what’s happening here is that the code uses an ultrasonic sensor to measure distance based on the time it takes for a sound wave to bounce off an object and return. The trigger pin (trigPin) sends out a short ultrasonic pulse, and the echo pin (echoPin) detects the reflected pulse. The pulse’s travel time is measured, and the distance is calculated from the speed of sound. If the calculated distance is 50 cm or less, the code sounds the piezo buzzer at 2000 Hz and turns on two LEDs to indicate that an object is detected. The `buzzerOn` flag tracks whether the buzzer is currently sounding, so it only needs to be switched off once when the object leaves the detection range. When the distance exceeds 50 cm, the code turns off the buzzer and the LEDs.
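The duration-to-distance conversion and the 50 cm decision can be modelled in a few lines of JavaScript (a sketch for checking the numbers, not the actual firmware running on the board):

```javascript
// HC-SR04: echo duration (microseconds) -> distance (cm), using 0.034 cm/us for sound
function echoToCm(durationMicros) {
  return (durationMicros * 0.034) / 2; // halve it: the pulse travels out and back
}

// The scanner's decision: within 50 cm, buzz and light both LEDs
function scan(durationMicros) {
  const distance = echoToCm(durationMicros);
  return { distance, detected: distance <= 50 };
}

console.log(scan(1000)); // ~17 cm: detected, "barcode" scanned
console.log(scan(5000)); // ~85 cm: out of range, nothing happens
```

Seeing the numbers this way makes it easy to tune the threshold: an echo of about 2940 µs corresponds to the 50 cm cutoff.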


The code was the most challenging part of this assignment. I watched a few tutorials (including the one referenced) to figure out how to work with the HC-SR04 and how to code the piezo buzzer so that it responds to the distance. I also juggled many hardware issues, including a short circuit that cost me a sensor (and almost my laptop) and a lot of damaged wires. Looking at the result now, I think it was worth taking the risk of working with something I had never used before.


Sparkfun Kit:

UltraSonic Sensor:




WEEK 8: Unusual Soulmates


When two soulmates meet, both of their worlds light up. That’s exactly what’s happening in this assignment. I chose to use my best friend’s and my matching bracelets to complete the circuit and make the LED light up. If the bracelets don’t touch, the LED won’t turn on.


I used the following elements to make the circuit work:

-The matching bracelets (the gold rods are the conductive part)

-The following elements from the Sparkfun Kit:

-The circuit:


Here’s the video (click to download): IMG_7984


I had so much fun working on this assignment. The process was straightforward, and I didn’t face any challenges because the idea is simple. If I were to do this assignment again, I would try to make it more aesthetically pleasing. For now, I like how it turned out because it meets the requirements of the assignment efficiently.


Midterm project – Merhba, Alf Merhba


This midterm project was designed to be a simple learning experience where users get to learn new household-related words in the Moroccan dialect through sound and interactivity. It is a project close to my heart because it represents a part of my culture and my dialect. I couldn’t be happier to share it, now that it’s done!

The idea turned out exactly how it was imagined and planned, which makes me very proud of how far I’ve come since the first assignments, which were purely trial and error. After this assignment, I feel like I have a better grasp of what should be done, how, and when. From brainstorming to design to implementation, it went smoothly.

I present to you the outcome of my midterm: Merhba, Alf Merhba (Welcome, a Thousand Welcome!):

Link to the full screen sketch :


-Home Screen: Click on the audio graphic for the pronunciation of the title (this sentence is used to welcome people to your space). Click on PLAY to start the experience.

-Instructions: You’ll be presented with a screen containing the idea of the experience, as well as how to proceed. Click the left arrow to go back to the home screen, or the right arrow to proceed.

-Inside the house: Now you have six spaces to choose from (room 1, room 2, bathroom, kitchen, dining room and lounge). Click on any space to discover the pronunciation of Moroccan words for objects related to that space.

-In any space: Click on any object and discover how it sounds in this dialect! Use the arrow (at the top or the bottom) to go back to the house.


  1. Brainstorming: I didn’t spend a lot of time on this step because as soon as I read “midterm … experience”, I knew I wanted to do something related to something I identify with, but at the same time fun and useful for users spending a few minutes of their time interacting with it. It took a few days to choose this concept among three other interesting concepts.
  2. Design: I was slowed down a bit by this step because I had many ideas for my concept that I was struggling to imagine in a visual format. I knew from the beginning that I wanted the setting to be a house, but didn’t know if I should have a single space hand drawn with interactive elements or something else. Many thanks to my friend Nourhane for helping me decide on a vector based image with multiple spaces, to be made interactive. Then, I designed the home screen and instructions screen by myself to align with the color palette of the house.
  3. Sound Recordings: My favorite part! I had so much fun recording the pronunciation of the words. I had to redo many of the recordings multiple times because they either sounded weird, were mispronounced, or were noisy. This part took about half a day.
  4. Implementation: Implementing all the fun ideas was always my least favorite part! Luckily enough, this time I enjoyed it, and I got to try different ways to make the project work. I was first confused by the layering because I had 9 layers to include, but later managed it by creating a SceneManager class.
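The SceneManager idea above can be sketched as a class holding one boolean flag per layer, with a single helper that switches scenes by resetting every flag first. This is a simplified illustration of the approach, not my exact class; the `goTo` helper is an assumption on my part, though the flag names mirror the ones in my code:

```javascript
class SceneManager {
  constructor() {
    // one flag per layer; the home (welcome) screen is on by default
    this.scenes = ['onWScreen', 'onInstructions', 'onHouse', 'onRoom', 'onRoom2',
                   'onKitchen', 'onBathroom', 'onDiningRoom', 'onLounge'];
    for (const s of this.scenes) this[s] = false;
    this.onWScreen = true;
  }

  // turn every flag off, then the requested one on
  goTo(scene) {
    for (const s of this.scenes) this[s] = false;
    this[scene] = true;
  }
}

const sceneManager = new SceneManager();
sceneManager.goTo('onRoom2'); // e.g. the user clicks room 2 in the house view
```

Centralizing the reset in one helper avoids repeating the nine `= false` assignments at every click handler.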

The hardest part of implementing this project was the clickable objects. Since my objects were part of the image itself, I could not figure out any other way than using conditional statements for each and every object (about 60 objects). I measured the position of each object in order to make it clickable and attach sound to it. The process was not complicated, just very monotonous, but I’m glad it works. The only downside of this method is that only a specific part of each object is clickable. As a result, if part of the sofa is hidden by a table, only the visible part is clickable. Depending on where the user decides to click, some parts may not respond. Now that I think about it, it would have been a better idea to have vectors of the objects as separate images, plus a separate empty background. This way I could have created a ClickedObject class, had an instance for each object image, and made them clickable with a conditional whenever object X is selected.
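A rough sketch of what that ClickedObject class could look like (hypothetical code; the coordinates come from my mirror hitbox, and the sound object stands in for a p5.SoundFile loaded in `preload()`):

```javascript
// A clickable rectangular region tied to one sound recording
class ClickedObject {
  constructor(x, y, w, h, sound) {
    this.x = x; this.y = y; this.w = w; this.h = h;
    this.sound = sound; // e.g. a p5.SoundFile from loadSound()
  }

  // is the point (mx, my) inside this object's hitbox?
  contains(mx, my) {
    return mx >= this.x && mx <= this.x + this.w &&
           my >= this.y && my <= this.y + this.h;
  }

  // play the recording if the click landed on this object
  clicked(mx, my) {
    if (this.contains(mx, my)) {
      this.sound.play();
      return true;
    }
    return false;
  }
}

// In mousePressed(): room2Objects.some(o => o.clicked(mouseX, mouseY));
```

Each room would then just be an array of ClickedObject instances instead of a ladder of hand-written conditionals.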



There isn’t a part I’m specifically proud of this time, because I’m proud that the whole code worked without errors. I would say I appreciate all the efforts and time I put into measuring the position of each and every object.

if (sceneManager.onRoom2) {
  // back arrow: go back to the house
  if (mouseX >= 0 && mouseX <= 107 && mouseY >= 0 && mouseY <= 49) {
    sceneManager.onWScreen = false;
    sceneManager.onHouse = true;
    sceneManager.onRoom = false;
    sceneManager.onRoom2 = false;
    sceneManager.onKitchen = false;
    sceneManager.onBathroom = false;
    sceneManager.onDiningRoom = false;
    sceneManager.onInstructions = false;
    sceneManager.onLounge = false;
  }
  // mirror object
  if (mouseX >= 0 && mouseX <= 206 && mouseY >= 61 && mouseY <= 274) {
    // play the mirror recording
  }
  // picture object
  if (mouseX >= 259 && mouseX <= 368 && mouseY >= 128 && mouseY <= 217) {
    // play the picture recording
  }
  // painting object
  if (mouseX >= 421 && mouseX <= 757 && mouseY >= 87 && mouseY <= 251) {
    // play the painting recording
  }
  // bed object
  if (mouseX >= 390 && mouseX <= 795 && mouseY >= 341 && mouseY <= 500) {
    // play the bed recording
  }
  // desk object
  if (mouseX >= 0 && mouseX <= 205 && mouseY >= 390 && mouseY <= 600) {
    // play the desk recording
  }
  // lamp object
  if (mouseX >= 273 && mouseX <= 324 && mouseY >= 347 && mouseY <= 465) {
    // play the lamp recording
  }
  // lamp 2 object
  if (mouseX >= 872 && mouseX <= 921 && mouseY >= 345 && mouseY <= 467) {
    // play the lamp 2 recording
  }
  // table object
  if (mouseX >= 223 && mouseX <= 377 && mouseY >= 472 && mouseY <= 600) {
    // play the table recording
  }
  // table 2 object
  if (mouseX >= 810 && mouseX <= 945 && mouseY >= 493 && mouseY <= 600) {
    // play the table 2 recording
  }
  // plant object
  if (mouseX >= 0 && mouseX <= 86 && mouseY >= 287 && mouseY <= 383) {
    // play the plant recording
  }
  // vase object
  if (mouseX >= 815 && mouseX <= 861 && mouseY >= 336 && mouseY <= 465) {
    // play the vase recording
  }
  // clock object
  if (mouseX >= 833 && mouseX <= 938 && mouseY >= 124 && mouseY <= 224) {
    // play the clock recording
  }
}
One element of the project that is missing is text. I could not figure out how to make the text for each object pop up in an aesthetic way without creating a mess, so I decided to give up on it. However, I still want to improve the project even after turning it in, so I will work on finding an efficient and visually pleasing way to incorporate text.

Overall, I’m happy with the result of this project, and I hope users will have a fun (and bug-free) experience with it!

Midterm Progress- Merhba, Alf Merhba


Title translation: Welcome, a Thousand Welcome.

When I saw the coffee shop experience example, I was inspired to create an educational and cultural experience with a fun aspect to it. The concept of this midterm project is a dialect-learning experience through which the user will be able to learn a few words in Darija, the Moroccan dialect, in a fun way.

Here is a preliminary user guide for what I am referring to:



Press PLAY to start


Press arrow for next screen


Hover the mouse and choose one of the six spaces

FRAME n: 1/6 space

Click on an object to hear the word for the object in Darija

Click on arrow to go back



-SOUND RECORDINGS: I’ll be recording all the audio files for the objects. My goal is to have at least 10 recordings per space (6 spaces makes 60 recordings in total). I’m anxious about managing all the files, as well as about how I will incorporate them into the project without making it too heavy and laggy. I’ll test it, and in case it does not work, I’ll bring the number down to 4 recordings per space.

-SCENES MANAGEMENT: I have about 9 scenes in this project, and the scene-switching routine will be a lot of work. I started looking into libraries to help with this and found p5 SceneManager, a library that may make my life much easier. I’m still looking into how it works.

-HOW TO INCLUDE TEXT: I want to display the English word for each object along with the object. I’m still brainstorming ways to make this possible.
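To keep the 60 recordings from the sound plan above manageable, the file paths could be generated programmatically instead of listed by hand. This assumes a naming convention like `sounds/<space>/<n>.mp3`, which is purely hypothetical at this stage:

```javascript
// The six spaces of the house (from the project plan)
const spaces = ['room1', 'room2', 'bathroom', 'kitchen', 'diningRoom', 'lounge'];

// Build the full list of audio paths, given recordings per space
// (hypothetical naming scheme: sounds/<space>/1.mp3 ... sounds/<space>/n.mp3)
function soundPaths(perSpace) {
  const paths = [];
  for (const space of spaces) {
    for (let i = 1; i <= perSpace; i++) {
      paths.push(`sounds/${space}/${i}.mp3`);
    }
  }
  return paths;
}

console.log(soundPaths(10).length); // 60 files at 10 recordings per space
// In p5, preload() could then loop over these paths and call loadSound() on each
```

Dropping from 10 to 4 recordings per space is then a one-argument change rather than a rewrite of the asset list.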

Looking forward to seeing the result!