Final project concept

Concept

For my final project, I plan to create a game using Arduino and p5.js. This game will allow the user to use physical components (boxes, figures) in real life to control the game happening on the computer.

Specifically, I want to create a platformer game where the player can alter the platforms or the obstacles. The physical components will include cardboard boxes that look like the platforms on the computer. When the player moves these boxes or obstacles, the corresponding platforms on the computer will also move. Based on this mechanism, I will design a few puzzle levels for the player. The goal of the game is to move the boxes around so that the game character on the computer can get from one point to another.

An introductory example of this game could be moving two boxes (two different platforms) so that the game character can jump across.
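The core mechanism (physical box position driving an on-screen platform) boils down to mapping a sensor reading to a canvas coordinate. Below is a minimal sketch of that mapping, written as a plain C++ helper; the function name, the 0–1023 sensor range, and the canvas width are my assumptions for illustration, since the project has not chosen its sensors yet.

```cpp
// Hypothetical helper: map a raw analog reading (0-1023) from a
// position sensor tracking a physical box to a platform x-coordinate
// in the p5.js canvas (0..canvasWidth). Same integer math as
// Arduino's map() for this special case.
long mapReadingToX(long reading, long canvasWidth) {
    return reading * canvasWidth / 1023;
}
```

On the Arduino side this value would be sent over serial each frame, and the p5.js sketch would read it and redraw the platform at the new x-position.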

Week 11 Reading Reflection

In this week’s reading, the author studies and discusses the relationship between style, fashion, and disability. One of the clearest examples given is eyewear, which strikes the balance perfectly between showing off your style and being a tool for addressing a disability. The author believes that even for larger assistive products, such as legwear and armwear, close attention should be paid to aesthetics. The author emphasizes the importance of product design being inclusive, involving both designer and user. The aesthetics and design of a product should not receive any less attention to detail just because it is an assistive device. The author introduces the idea of universal design, which tackles the problems of accessibility alongside the needs and wants of consumers. This reading helped me reflect on the ideology behind designing things: thinking about the intended audience and allowing more than just a small population to be able to utilize a product.

FINAL PROJECT IDEA

For my final project, I am still unsure of what I want to do. However, something came to mind that I think is a cool concept: an interactive mood lamp with music integration. The lamp changes colors and plays music based on the user’s actions and mood. The project will integrate Arduino for sensor-based interaction and p5.js for a visual interface. For now, I’m thinking of having three modes:

    • Mode 1: Calm (ambient sounds and soft colors).
    • Mode 2: Party (dynamic colors and upbeat music).
    • Mode 3: Focus (single-color light with lo-fi beats).

For the p5.js visualization, which I’m still unsure if I’ll keep, I might do a real-time visualization of soundwaves and color patterns that mirrors the lamp’s output on a screen.
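The three modes could be represented as a simple lookup from mode to an RGB color for the lamp. The following is only a rough sketch of that idea: the enum names and the RGB values are placeholders I chose, not final design decisions.

```cpp
#include <array>

// Hypothetical sketch of the three planned modes.
enum Mode { CALM = 0, PARTY = 1, FOCUS = 2 };

// Return an {R, G, B} color (0-255 per channel) for a mode.
// Example placeholder colors: soft blue for Calm, vivid pink
// for Party, and a single white light for Focus.
std::array<int, 3> modeColor(Mode m) {
    switch (m) {
        case CALM:  return {80, 120, 200};
        case PARTY: return {255, 40, 160};
        case FOCUS: return {255, 255, 255};
    }
    return {0, 0, 0};  // off / unknown mode
}
```

On the Arduino, the returned channels would be written to the RGB LED with analogWrite(), while p5.js would use the same color for the on-screen visualization and pick the matching audio track.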

A mood lamp for your reference:

mood lamp

For the design, I’m planning to do something similar. I want to put the LEDs from the Arduino inside something like a lamp jar to give the aesthetic of the mood lamp. I still need to look into this more and see if there are other ways to do this.

Week 10 – Reading Response

For this reading, I agree with the author on the limitations of the types of interaction we use today. He criticizes today’s digital devices for being limited to “Pictures Under Glass.” This type of interaction only allows the hands to feel and receive limited signals, leaving out the potential of texture, weight, and many other factors. For example, playing piano on a screen does not give the same satisfying feeling as playing a real piano with tactile feedback.

However, I disagree with the author on two main points. First, the reading focuses too much on the capabilities of humans but does not consider how we can use those capabilities to fit our needs. Of course we could have a more dynamic medium, but would it be better than the functionality of the iPhone? iPhones are designed to be compact and multifunctional, which I think can only be achieved through the “Pictures Under Glass” style of interaction. Second, even though our hands are magnificent in many ways, other body parts can create interactions that are no less interesting. For example, the eyes can estimate the distance between objects and recognize colors, emotions, and more. “Pictures Under Glass” fits this capability of the eyes well. Hence, I think it’s not a bad medium; it’s just a medium that makes use of a different human capability.

Week 10 Assignment: instrument

Concept

For this assignment, we were inspired by the toy pianos that we’ve all played with at least a few times as children. These toys were often quite limited, as they only had enough space to accommodate the keys for a single octave. We decided to create a miniature piano that could play the notes from C to A, with a knob that can be turned to change octaves.

Setup and Code

We set up a row of buttons that play a note when pressed, a potentiometer that changes the octave, and a switch that switches the notes to their respective sharp notes.

We created arrays that store the frequencies of the natural notes across seven octaves, and did the same for the sharp notes. The range of the potentiometer is mapped to 0–6 and determines which octave the notes will be played in. When the switch is low, natural notes are played; when it is high, the sharp notes are played.

To prevent the notes from continuously playing, the noTone() function stops the buzzer playing any sound when no button is pressed.
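The octave selection described above can be sketched with two small helpers. This is a hypothetical simplification (the actual project indexes stored frequency arrays, and the function names here are mine): the potentiometer reading is mapped to an octave index 0–6, and since frequency doubles with each octave, a note's frequency in any octave can be derived from its lowest-octave value.

```cpp
// Map a raw potentiometer reading (0-1023) to an octave index 0..6,
// mirroring how the project selects the octave.
int octaveFromPot(int reading) {
    return reading * 7 / 1024;
}

// Frequency doubles per octave, so the note in octave k is the base
// (lowest-octave) frequency shifted left k times.
int noteFrequency(int baseFreq, int octave) {
    return baseFreq << octave;
}
```

For example, starting from roughly 33 Hz for a low C, octave 3 gives 33 << 3 = 264 Hz, close to middle C. The resulting frequency would be passed to tone() on the buzzer pin.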

Full code on Github

if (buttonState7 == LOW && buttonState6 == LOW && buttonState5 == LOW &&
    buttonState4 == LOW && buttonState3 == LOW && buttonState2 == LOW) {
  noTone(8);  // Stop any tone on pin 8
}

 

Photo demo

Vid link

Reflection

Overall, this group project was a success, as we managed to recreate our inspiration using Arduino. We were really satisfied with how we implemented the natural notes as well as the sharp notes in our project. One struggle we faced was that the wires and resistors were really crowded on our small breadboards, making it a little difficult to reach the buttons. Ideally, we would want to get bigger breadboards in the future so that there would be more room for interacting with the product. Additionally, the breadboard was just a little too small for us to add the 7th natural note, so we could only fit six, meaning we’re missing the B note.

In the future, we would like to add two different colored led lights that switch on and off along with the natural or sharp notes. This way, it would help users know if they are currently playing natural or sharp notes, as not everyone is familiar with musical notes in that sense. 

 

Week 10 reading reflection

A Brief rant on the Future of Interactive Design 

In this reading, the author voices his frustrations with current and future visions of input devices being centered on touch screens, or as he calls it, “Pictures Under Glass.” I agree with the author’s frustration with ordinary objects becoming less tactile and moving to resemble touch screens. For example, many new induction stoves use touch-capacitive buttons, which become quite unresponsive when you’re cooking and have greasy fingers. Many of today’s “innovations” for the future are fixing problems that were non-existent to begin with. As the author said, our hands and fingers can do much more than just touch and swipe, which is the only thing a touch screen allows us to do. What we need to innovate is a way to utilize more than this basic motion, to create technology that makes use of the various other capabilities of our hands.

Responses: 

The author acknowledges that no solution is possible with the current state of our technology. However, what he is trying to tell us is that our hands are capable of much more, and instead of “improving” everything by transitioning to touch-screen interfaces, we should aim to keep the tactile sensations that we are so used to.

 

Assignment 7: How fairies play the piano

Concept:

For this assignment, my partner Fasya and I designed a musical instrument that merges digital and analog elements to create an interactive, light-based piano. Our idea centers around a glowing wand that functions as a control interface. When moved over the keys, the wand activates notes from the C major scale (C, D, E, F, and G) like magic, and the brightness of its glow determines the octave. We integrated a potentiometer to adjust brightness, allowing users to easily shift the octave up or down. Additionally, we added a switch to toggle the instrument off, which prevents accidental note activation—particularly useful to avoid unintended sounds from ambient light sources like flashlights.

Highlight:

To bring our vision to life, we used five photoresistors to detect light from the wand and mapped each sensor’s range to specific notes (C, D, E, F, and G) and their octave scales. By setting sensor thresholds from a default minimum to a maximum value that a flashlight might produce, we could dynamically adjust the octave based on the brightness the photoresistor detects. Essentially, the brighter the wand, the higher the octave, allowing for an expressive range in tone.

For the wand itself, we created a purple glow using a tricolored LED, giving the instrument an ethereal fairy-like visual quality. A potentiometer is attached to the LED to control brightness, making it easy for users to adjust octaves on the fly. The setup includes separate circuits for the keyboard and wand, creating flexibility for future enhancements, such as adding multiple wands for collaborative play.

Keyboard Code:

#include "pitches.h"
bool buttonState = false;
// Define the piezo buzzer pin
const int buzzerPin = 8;

// Define frequencies for each note across multiple octaves
const int fNotes[] = {NOTE_F4, NOTE_F5, NOTE_F6, NOTE_F7};  // Octaves of F
const int gNotes[] = {NOTE_G4, NOTE_G5, NOTE_G6, NOTE_G7};  // Octaves of G
const int cNotes[] = {NOTE_C4, NOTE_C5, NOTE_C6, NOTE_C7};  // Octaves of C
const int dNotes[] = {NOTE_D4, NOTE_D5, NOTE_D6, NOTE_D7};  // Octaves of D
const int eNotes[] = {NOTE_E4, NOTE_E5, NOTE_E6, NOTE_E7};  // Octaves of E

void setup() {
  // Initialize serial communication at 9600 bps for debugging
  Serial.begin(9600);
  pinMode(7,INPUT);
}

void loop() {
  // Array to store sensor values
  int sensorValues[5];
  int switchValue = digitalRead(7);
  // Instrument is active only while the switch is on
  buttonState = (switchValue == HIGH);

 
  // Read each sensor value and store in the array
  sensorValues[0] = analogRead(A3);  // f note
  sensorValues[1] = analogRead(A4);  // g note
  sensorValues[2] = analogRead(A0);  // C note
  sensorValues[3] = analogRead(A1);  // D note
  sensorValues[4] = analogRead(A2);  // E note

  // Play a note based on each sensor value
  for (int i = 0; i < 5; i++) {
    int note;
    if (sensorValues[i] < 850 || !buttonState) {
      // Stop any sound if the sensor value is below 850 or the switch is off
      noTone(buzzerPin);
      continue;
    } else {
      // Map the sensor value (850 to 1100) to an index (0 to 3) for each note array
      int noteIndex = map(sensorValues[i], 850, 1100, 0, 3);

      // Assign the note based on the sensor index
      switch(i) {
        case 0: note = fNotes[noteIndex]; break;
        case 1: note = gNotes[noteIndex]; break;
        case 2: note = cNotes[noteIndex]; break;
        case 3: note = dNotes[noteIndex]; break;
        case 4: note = eNotes[noteIndex]; break;
      }
      
      // Play the mapped frequency on the piezo buzzer
      tone(buzzerPin, note);
    }
    
    // Delay to control the speed of tone change
    delay(100);
  }
}

Wand Code:

// Interfacing RGB LED with Arduino
// Author: Osama Ahmed

// Defining variables and the GPIO pins on Arduino
int redPin = 9;
int greenPin = 10;
int bluePin = 11;
int potPin = A2;
int sensorVal = 0;
double brightness = 0;

void setup() {
  Serial.begin(9600);
  // Defining the pins as OUTPUT
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  sensorVal = analogRead(potPin);
  brightness = (double)sensorVal / 1023;
  Serial.println(brightness);

  setColor(170, 0, 255, brightness);  // Purple color
  // delay(1000);
}

void setColor(int redValue, int greenValue, int blueValue, double brightValue) {
  analogWrite(redPin, (double)redValue * brightValue);
  analogWrite(greenPin, (double)greenValue * brightValue);
  analogWrite(bluePin, (double)blueValue * brightValue);
}

Demonstration:

Keyboard Circuit Schematic
Wand Circuit Schematic

 

WEEK 10 RADIO (Amna and Noura)

CONCEPT:

While brainstorming project ideas with Amna, we thought about how a radio works and decided it would be fun to make a simple version. Our goal was to create an Arduino “radio” that lets you switch between different songs, similar to tuning a real radio. We used a knob as the channel switch, allowing us to choose between three different songs that we got from GitHub and the exercises we did in class. Each channel has its own song, and turning the knob instantly switches to the next song, giving it a real radio-like feel. We also added a button that acts as a power switch: pressing it starts the radio, and pressing it again turns it off. We even added a feature so that any song stops playing immediately when the channel changes, so you don’t have to wait for a song to finish before switching to a new one.
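The channel-switching logic described above can be sketched as dividing the knob's analog range into three bands, one per song, and stopping the current song whenever the band changes. This is only an illustrative sketch; the band boundaries and function names are my assumptions, not the project's actual code.

```cpp
// Hypothetical sketch: split the knob reading (0-1023) into three
// roughly equal bands, one per "channel"/song.
int channelFromKnob(int reading) {
    if (reading < 341) return 0;  // song 1
    if (reading < 682) return 1;  // song 2
    return 2;                     // song 3
}

// True when the knob has moved into a different band than the song
// currently playing; on the Arduino this would trigger noTone()
// immediately before starting the new song.
bool channelChanged(int reading, int currentChannel) {
    return channelFromKnob(reading) != currentChannel;
}
```

Checking channelChanged() every loop iteration is what makes the switch feel instant: the song is cut off mid-melody rather than playing to the end.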

HIGHLIGHT:

The part Amna and I are most proud of is getting the button to work smoothly with the debounce feature. At first, the button would trigger multiple times with a single press, turning the radio on and off too quickly. By adding a debounce function, we made sure the button only registers one press at a time, making it much more reliable. A former student in IM (Shereena) helped us understand how debounce works and guided us in fixing this issue, explaining how it makes the button’s response stable and accurate.

Here’s a snippet of the debounce code we used:

// Variables for debounce
int buttonState;                     // Current state of the button
int lastButtonState = LOW;           // Previous state of the button
unsigned long lastDebounceTime = 0;  // Last time the button state changed
unsigned long debounceDelay = 50;    // Debounce time in milliseconds

void loop() {
  int reading = digitalRead(buttonPin);

  // Check if the button state has changed
  if (reading != lastButtonState) {
    lastDebounceTime = millis();  // Reset debounce timer
  }

  // If enough time has passed, check if the button is pressed
  if ((millis() - lastDebounceTime) > debounceDelay) {
    if (reading != buttonState) {
      buttonState = reading;
      if (buttonState == HIGH) {
        radioState = !radioState;  // Toggle radio on/off
      }
    }
  }

  lastButtonState = reading;
}

 

This debounce function prevents accidental multiple triggers, making the button interaction smoother. We’re also proud of how the radio switches songs instantly when we turn the knob, making it feel real.

REFLECTION:

Working on this project with Noura was a nice experience, as we got to share our thoughts and class experience by working together. One of our main struggles was making the button work without triggering multiple times, which led us to use debounce for stability. While adding debounce solved the problem, in the future we’d like to explore other ways to troubleshoot and fix issues like this.

For future improvements, we’d like to add more interactive features, such as volume control with an additional knob and possibly a small speaker for clearer sound. We could also include more “channels” with various songs or sound effects, giving users a richer experience. Another idea is to add an LED indicator that lights up when the radio is on and changes brightness with volume, making the design even more engaging. These changes would make our project feel more like a real radio and enhance the overall experience.

How it works:

CODE:

https://github.com/nouraalhosani/Intro-to-IM/blob/426c7d58639035c7822a4508f2e62dab34db0695/Radio.ino

WEEK 10 READING

This rant on “Pictures Under Glass” made me realize how strange it is that we’re so comfortable reducing our interactions to something as flat and lifeless as a touchscreen. We have our hands (the most useful body part), and yet we limit them to swiping and tapping like we’re settling for the simplest form of interaction possible. It’s almost like we’re choosing numbness over true engagement, relying on a screen instead of exploring the world with our hands.

To me, this flat, touch-only future ignores a huge part of being human. While reading, I thought about how much richer experiences are when you can actually feel things. Imagine the difference between seeing a piece of pottery on a screen versus shaping it yourself. It’s a completely immersive experience that teaches you in ways a screen never could. If tech continues down this path, I feel like we’re just training ourselves to be satisfied with empty interactions that don’t ask much of us.

Furthermore, I think relying so heavily on screens is what’s making us lazy; it feels like we’re choosing convenience over truly connecting with the world. Screens limit how we explore and interact, reducing everything to a simple swipe or tap. Technology should be enhancing our natural abilities, not shrinking them down to one basic motion.

Moreover, the push for brain-computer interfaces is even worse. It takes away the real, physical experiences that make life interesting. Instead of cutting out physical interaction, I believe tech should work with our bodies, such as the Wii console or the VR, helping us connect in more meaningful, hands-on ways.

READING WEEK – 10

Reading the rant made me realize how much we’ve settled for touchscreens as our main way of interacting with technology. Sure, they’re easy to use, but I think they’re holding us back from something much richer. Kids today are glued to screens, swiping and tapping all day, and I feel like it’s making them “finger-blind.” They’re losing the chance to understand the world through real touch. Instead of feeling textures and weights, they’re just seeing everything through glass. In my opinion, that’s robbing them of a huge part of learning and growing.

And it’s not just kids. Even for adults, I feel like we’re missing out on deeper connections with tech. Take virtual reality games. Everyone loves the visuals, but games like Beat Saber feel empty to me without any sense of touch or resistance. It’s a hollow experience, one that relies only on what we see, not what we feel. I think this disconnect leaves us with flashy visuals but no real engagement.

Beat Saber – VR

In addition, I also think that the obsession with brain-computer interfaces is just as disappointing. Sure, it sounds futuristic, but I think it’s actually a step back. The idea of controlling things with thoughts alone ignores the importance of our bodies. We’re already sitting at screens too much, and this just pushes us deeper into that immobility. I believe technology should encourage movement and real interaction, not trap us in a still, disconnected state.

“Pictures Under Glass” sums up the issue perfectly. It’s like we’re numbing one of our most powerful senses for convenience. When I pick up a book or a glass, my hands get all kinds of feedback from weight, texture, and balance. But with a screen? Nothing. Even something as simple as making a sandwich taps into a level of interaction that a screen can’t touch. In my view, designers need to stop treating screens as the final answer and start creating tech that actually respects our bodies’ abilities.