Project Proposal – Stefania Petre

Hey there!

For this final, I have imagined stepping into a world where art and technology intertwine to create something truly special.

Envision an interactive space where art and technology fuse, transforming physical gestures into a cascade of colors on a digital canvas. My project is an invitation to express creativity through motion, as participants use a paintbrush to navigate a symphony of shapes and hues. This is not just an artwork but an experience that captures the essence of movement and translates it into a personalized digital masterpiece.

Arduino Part:

At the interaction’s core lies the Arduino Uno, paired with a ZX Distance and Gesture Sensor. The sensor is adept at tracking the proximity of the paintbrush and the subtle hand gestures of the artist.

  • Input: Proximity data and gesture commands from the ZX Sensor.
  • Output: Serial communication to relay the sensor data to the computer running the P5 sketch.
  • Data to P5: Real-time Z-axis data for proximity (distance of the hand or brush from the sensor) and X-axis data for lateral movement, along with gesture detections (swipes, taps).
  • From P5: Instructions may be sent back to calibrate gesture sensitivity or toggle the sensor’s active state (a rough sketch of this exchange follows the list).
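
As a first sketch, the Arduino side could look something like the code below. This assumes the SparkFun ZX Sensor Arduino library; the serial message format (“P,x,z” and “G,id” lines, with ‘0’/‘1’ toggle commands coming back from P5) is a placeholder of mine, not a settled protocol.

#include <Wire.h>
#include <ZX_Sensor.h>  // SparkFun ZX Distance and Gesture Sensor library

ZX_Sensor zx = ZX_Sensor(0x10);  // 0x10 is the sensor's default I2C address
bool sensorActive = true;        // P5 can toggle this over serial

void setup() {
  Serial.begin(9600);
  zx.init();
}

void loop() {
  // From P5: '1' enables the sensor, '0' pauses it (placeholder commands)
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '0') sensorActive = false;
    if (c == '1') sensorActive = true;
  }
  if (!sensorActive) return;

  // To P5: X (lateral position) and Z (proximity) as a "P,x,z" line
  if (zx.positionAvailable()) {
    uint8_t x = zx.readX();
    uint8_t z = zx.readZ();
    Serial.print("P,");
    Serial.print(x);
    Serial.print(",");
    Serial.println(z);
  }

  // To P5: detected gestures (swipes, taps) as a "G,id" line
  if (zx.gestureAvailable()) {
    GestureType gesture = zx.readGesture();
    Serial.print("G,");
    Serial.println((int)gesture);
  }
}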

P5JS Part:

On the digital front, P5 will act as the canvas and palette, responsive and mutable. It will process the incoming data from the Arduino, converting it into an array of colors and movements on the screen.

  • Receiving Data: Interpretation of proximity and gesture data from the Arduino.
  • Processing Movements: Real-time mapping of hand movements to strokes and splashes of color, varying in intensity and spread on the digital canvas.
  • Visual Feedback: Dynamic visual changes to represent the flow and dance of the user’s movements.
  • To Arduino: Signals for adjusting the ZX Sensor settings based on real-time performance and user experience feedback.

So far, this is how the painting should look:

Development and User Testing:

The ZX Distance and Gesture Sensor is now integrated with the Arduino. The immediate objective is a smooth flow of data into the P5 program. By user testing next week, the system should respond to hand motions with corresponding visual changes on the screen. This first prototype will be central to assessing the interaction, specifically how natural and satisfying it feels to paint in midair. User testing will be documented through several techniques: gathering feedback from participants, recording interactions on video, and analysing the responsiveness of the system.

“Overengineered Guitar” by Pi Ko and Darko Skulikj

Introduction

This is the “Overengineered Guitar” by Pi Ko and Darko Skulikj.

We massacred an acoustic guitar down to a single string and turned it into an automated musical device that can be played akin to a theremin.

Concept – A Cringy Skit

Darko does not know how to play the Pirates of the Caribbean theme song on guitar, so he decided to turn to Arduino for help.

Demonstration and Media

The video demo of the instrument is shown below.

The sound sucks 😡. Moral of the story: Pi should play the guitar instead of the machine 🫵😠🎸.

Concept

In principle, the “Overengineered Guitar” is a one-stringed acoustic guitar played through an array of sensors and servo motors. It has a digital push button and an analog ultrasonic sensor. The main idea is to select the musical notes by holding a hand over the ultrasonic sensor, while a propeller does the mechanical plucking, controlled through the push button.

This gives the user the opportunity to play the predefined sequence from “He’s a Pirate” (from Pirates of the Caribbean) over and over again.

Components

  • Arduino Uno: Serves as the main controller for all the sensors and actuators used.
  • Servo Motors (5x): Five servo motors are attached along the fretboard, each pressing its respective fret to produce the desired note.
  • Ultrasonic Sensor: Used to select which fret the motors press, based on hand distance.
  • Digital Pushbutton: Pressed to start the propeller, which plucks the string.
  • Propeller on a DC Motor: Provides the mechanical pluck on the guitar string.
  • L293D Motor Driver IC: Handles the high current requirement of the propeller.
  • External Power Supply: Ensures that power is properly distributed among the various components without overloading the Arduino.

(Arduino, ultrasonic sensor, switch and L293D IC)

(Propeller on DC Motor)

The motors are attached to the Arduino as shown below.

Arduino Pin | Motor/Music Note ID | Open Servo Angle | Press Servo Angle
11          | 5                   | 180              | 0
10          | 4                   | 0                | 180
6           | 3                   | 180              | 0
5           | 2                   | 180              | 0
3           | 1                   | 180              | 0
N/A         | 0 (open string)     | N/A (no servo)   | N/A (no servo)

Challenges

Our main challenge was managing power to ensure that each component received the required current. The wiring was also a challenge, since there were a lot of wires.

(Wire management 😎)

Task Allocation

Darko took care of the wiring and the code for the ultrasonic sensor and the switch, using non-blocking code. Pi handled the rest.

Code

The code is below. It includes debouncing to guarantee reliable operation of the switch.

#include <Servo.h>

// Define a struct to hold servo data
struct ServoData {
  int pin;
  int openAngle;
  int pressAngle;
  Servo servo;
};

// Create an array of servos for each note
ServoData servos[] = {
  {3, 180, 0}, // Note 1
  {5, 180, 0}, // Note 2
  {6, 180, 0}, // Note 3
  {10, 0, 180}, // Note 4
  {11, 180, 0} // Note 5
};
const int numServos = sizeof(servos) / sizeof(ServoData);

// Note durations in milliseconds
int noteDurations[] = {500, 500, 2000, 500, 2000, 500, 1000, 500, 500, 1000};
int noteSequence[] = {0, 1, 2, 3, 4, 5, 3, 2, 1, 2};
const int numNotes = sizeof(noteSequence) / sizeof(int);

unsigned long previousMillis = 0;  // Stores last update time
int currentNoteIndex = 0;          // Index of the current note being played

// Push Button and Propeller control
const int buttonPin = 4; // Pushbutton pin
const int ledPin = 7; // LED pin (for debugging; moved off pin 13, which the ultrasonic trigPin below uses)
int enA = 9; // Enable pin for motor
int in1 = 8; // Motor control pin
int buttonState = 0; // Current button state
// Define the pins for the ultrasonic sensor
const int trigPin = 13;
const int echoPin = 12;

// Define variables for the duration and the distance
long duration;
int distance;


void setup() {
  // Setup for servos
  for (int i = 0; i < numServos; i++) {
    servos[i].servo.attach(servos[i].pin);
    servos[i].servo.write(servos[i].openAngle);
  }

 // Define pin modes for ultrasonic
  pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
  pinMode(echoPin, INPUT); // Sets the echoPin as an Input
  // Setup for button and propeller
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT);
  pinMode(enA, OUTPUT);
  pinMode(in1, OUTPUT);
  analogWrite(enA, 255); // Set propeller speed
  digitalWrite(in1, LOW); // Initially disable propeller
}

void loop() {
  unsigned long currentMillis = millis();
  // Darko - Switch
  // Improved button reading with debouncing
  int readButton = digitalRead(buttonPin);
  if (readButton != buttonState) {
    delay(50); // Debounce delay
    readButton = digitalRead(buttonPin);
    if (readButton == HIGH) {
      digitalWrite(ledPin, HIGH);
      digitalWrite(in1, HIGH); // Enable propeller
    } else {
      digitalWrite(ledPin, LOW);
      digitalWrite(in1, LOW); // Disable propeller
    }
    buttonState = readButton;
  }

  // Darko - Ultrasonic
  // Clear the trigPin condition
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  // Sets the trigPin HIGH (ACTIVE) for 10 microseconds
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
 
  // Reads the echoPin, returns the sound wave travel time in microseconds
  duration = pulseIn(echoPin, HIGH);
 
  // Calculating the distance
  distance = duration * 0.034 / 2; // Speed of sound wave divided by 2 (go and back)
 

  if (distance <= 12) {
    // Handling servo movements based on timing
    if (currentMillis - previousMillis >= noteDurations[currentNoteIndex]) {
      // Move to the next note
      if (noteSequence[currentNoteIndex] != 0) {
        // Release the previous servo, if any
        int prevNote = (currentNoteIndex == 0) ? -1 : noteSequence[currentNoteIndex - 1];
        if (prevNote != -1 && prevNote != 0) {
          servos[prevNote - 1].servo.write(servos[prevNote - 1].openAngle);
        }
        // Press the current servo
        int currentNote = noteSequence[currentNoteIndex];
        if (currentNote != 0) {
          servos[currentNote - 1].servo.write(servos[currentNote - 1].pressAngle);
        }
      } else {
        // Release all servos for the open string
        for (int i = 0; i < numServos; i++) {
          servos[i].servo.write(servos[i].openAngle);
        }
      }

      previousMillis = currentMillis; // Update the last actuated time
      currentNoteIndex++;
      if (currentNoteIndex >= numNotes) {
        currentNoteIndex = 0; // Restart the sequence
      }
    }
  }
}

Raya Tabassum: Reading Response 6

Bret Victor’s “A Brief Rant on the Future of Interaction Design” raises some provocative questions about the trajectory of technology design. Why are we limiting ourselves to interactions with technology through mere touches on glass, when the human body has such a wide range of expressive capabilities? Victor challenges us to consider: What if we could design interactions that fully utilize the intricate ballet of our everyday physical movements? How might we transform our interactions with technology if we thought beyond the glass to harness the full potential of our tactile and kinesthetic intelligence? This essay pushes us to rethink how future technologies might enhance rather than restrict our natural interactions with the world. He champions a more expansive view that utilizes the full range of human capabilities, suggesting that true innovation in interaction design should engage more than just our fingertips — it should involve our entire bodies.
In his follow-up he addresses feedback from his initial “rant.” He clarifies that his critique was meant to highlight the limitations of current interface designs, like “Pictures Under Glass,” and to inspire research into more dynamic, tactile interfaces. He defends the potential of iPads while advocating for more dynamic, tactile mediums for interaction. Victor emphasizes that voice, gesture, and brain interfaces still miss crucial tactile feedback, and points out the dangers of technology that ignores the body’s natural abilities. He calls for technology to adapt to human capabilities, rather than bypass them.

Raya Tabassum: Final Project Primary Concept

I’m thinking of building a “Gesture-Based Musical Interface” for the final, an interactive system that allows users to create music through hand movements. Combining Arduino’s sensing capabilities with p5.js’s graphical and sound processing, this project will transform physical gestures into a live audio-visual performance. It aims to provide an intuitive way for users to interact with technology and art. I’ll use two ultrasonic sensors to detect hand movements along the X and Y axes, speakers to output the generated musical notes, and a webcam to visualize the movements in p5.
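
To make the sensing side concrete, here is a minimal sketch of what the Arduino could send to p5.js. The pin assignments, the 30 ms pulseIn timeout, and the “x,y” line format are my assumptions at this stage, not final decisions.

const int trigX = 9, echoX = 10;   // ultrasonic sensor for the X axis (assumed pins)
const int trigY = 11, echoY = 12;  // ultrasonic sensor for the Y axis (assumed pins)

// Trigger one HC-SR04-style sensor and return the distance in centimeters
long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000);  // 30 ms timeout so a missed echo doesn't stall the loop
  return duration * 0.034 / 2;                    // microseconds to centimeters, there and back
}

void setup() {
  Serial.begin(9600);
  pinMode(trigX, OUTPUT); pinMode(echoX, INPUT);
  pinMode(trigY, OUTPUT); pinMode(echoY, INPUT);
}

void loop() {
  // Send "x,y" distance pairs for p5.js to map onto notes and visuals
  Serial.print(readDistanceCm(trigX, echoX));
  Serial.print(",");
  Serial.println(readDistanceCm(trigY, echoY));
  delay(50);
}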

Week 11: Final Project Concept

For my final project, I would like to revisit generative art – especially since I really enjoyed the immersive nature and the aesthetic satisfaction that comes with making such works. And to stay true to the original thread that ran through a lot of my work this semester, I am deciding to once again incorporate butterflies. Since my midterm project was essentially a game, I want to pivot toward making an artistic piece. Thanks to Professor Aaron, who helped me sift through a bunch of ideas and finalize an initial concept, I am currently set on building a miniature installation with butterflies whose movements or shapes are activated/manipulated through touch. Essentially, I would make (perhaps fabricate, if time allows) butterflies and disperse them on a physical canvas. Each butterfly on the physical canvas would be connected via conductive material to a capacitive touch sensor. Using p5.js, I would create and control the butterflies’ shapes, colors, and animations. The p5.js sketch would then be projected and mapped onto the physical canvas. Touch signals relayed by the Arduino would trigger a change in the animation/shape/color of the touched butterfly in the p5.js sketch and, by extension, on the physical canvas.
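
As a minimal sketch of the touch side, assuming Paul Badger’s CapacitiveSensor library and just two butterflies for now (the pin choices and threshold are placeholders to be tuned against the conductive material):

#include <CapacitiveSensor.h>  // Paul Badger's CapacitiveSensor library

// One sensor per butterfly: send pin 4 is shared, receive pins 2 and 6 are assumed
CapacitiveSensor butterfly1 = CapacitiveSensor(4, 2);
CapacitiveSensor butterfly2 = CapacitiveSensor(4, 6);
const long TOUCH_THRESHOLD = 1000;  // to be tuned to the wiring and material

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Report which butterfly is touched; the p5.js sketch maps the index to an animation change
  if (butterfly1.capacitiveSensor(30) > TOUCH_THRESHOLD) Serial.println(1);
  if (butterfly2.capacitiveSensor(30) > TOUCH_THRESHOLD) Serial.println(2);
  delay(50);
}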

Here’s a very rough visualization of what I have in mind:

Raya Tabassum: Digital Piano with Percussion Effects

Concept:

The Digital Piano with Percussion Effects is an innovative musical instrument that blends traditional piano elements with modern sensor technology to create a unique and interactive musical experience. This project uses an array of digital push buttons connected to an Arduino board to simulate a piano keyboard, where 8 different buttons trigger 8 distinct musical notes (‘C D E F G A B C’, with the last C an octave higher, or ‘Sa Re Ga Ma Pa Dha Ni Sa’). In addition to the keyboard, the instrument incorporates an ultrasonic distance sensor, which introduces a dynamic layer of percussion sounds. These sounds vary depending on the distance of the player’s hand. Furthermore, a potentiometer is integrated to alter the pitch of the notes dynamically, offering the ability to manipulate the sound palette expressively.

Components Used:

  • Arduino Uno
  • Breadboard (x2)
  • Jumper Wires
  • Piezo Buzzer (x2)
  • Push Buttons (x8)
  • Potentiometer
  • 10k ohm resistors (x8)
  • Ultrasonic Sensor

Video:

Code:

int buzzerPin = 12;
int buzzer2 = 13;
int potPin = A0;
int keys[] = {2, 3, 4, 5, 6, 7, 8, 9};
// Frequencies for notes (C4 to C5)
int notes[] = {262, 294, 330, 349, 392, 440, 494, 523}; 
int trigPin = 10;
int echoPin = 11;
// Tone frequencies in Hz that stand in for percussion sounds on the second buzzer
int bassDrum = 200;
int snare = 250;
int hiHat = 300;


void setup() {
  pinMode(buzzerPin, OUTPUT);
  pinMode(buzzer2,OUTPUT);
  for (int i = 0; i < 8; i++) {
    pinMode(keys[i], INPUT); // Button pins 2 through 9
  }
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  int potValue = analogRead(potPin); // Used below to rescale each note's pitch (tone() has no volume parameter, so no volume mapping)

  // Measure distance using the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2; // Calculate the distance

  Serial.print(distance);
  Serial.println(" cm");


  bool isAnyButtonPressed = false;
  for (int i = 0; i < 8; i++) {
    int modifiedNote = map(potValue, 0, 1023, notes[i] / 2, notes[i] * 2);
    if (digitalRead(keys[i]) == HIGH) {
      tone(buzzerPin, modifiedNote, 100);
      isAnyButtonPressed = true;
      break; // Stop the loop once a pressed button is found
    }
  }

  if (!isAnyButtonPressed) {
    noTone(buzzerPin);
  }
  if (distance < 10) {
    tone(buzzer2, bassDrum, 100);
  } else if (distance >= 10 && distance < 20) {
    tone(buzzer2, snare, 100);
  } else if (distance >= 20 && distance < 30) {
    tone(buzzer2, hiHat, 100);
  } else {
    noTone(buzzer2);
  }
  delay(100);
}

In the loop, the program first reads the potentiometer value and uses it to modify the frequency of the piano notes. Depending on the button pressed, it plays a modified note frequency. If no buttons are pressed, it stops any ongoing tone. Depending on the distance detected, it chooses a percussion sound to play, simulating a drum kit with different sounds for different ranges.

Work Process and Challenges: 

For this assignment I was paired up with Snehil. We worked together to brainstorm the primary idea. At first we only had the concept of building the digital piano with push buttons; then we added the ultrasonic sensor to measure distance and produce the percussion sounds, to level up the musical instrument. As a further improvement, Snehil made the potentiometer work to change the pitch of the notes. We initially had trouble making the for loop work for the push buttons to produce sound. We were also at first using only one piezo for the whole instrument, and the piano wasn’t working that way. So we had to troubleshoot many times to fix those problems.

Final instrument with the potentiometer working:

Reading response / Final proposal – Hamdah Alsuwaidi

Reading response:

The narrative woven through the author’s critique and subsequent responses to reader feedback forms a cohesive argument for a transformative approach to interaction design. At the core of this discussion is a pointed critique of the reliance on touchscreens, or what the author refers to as “Pictures Under Glass.” This approach, the author argues, falls short of engaging the full spectrum of human abilities, especially the tactile and manipulative capacities intrinsic to the human experience.

Bringing expertise from a background in designing future interfaces, the author offers a credible perspective on the limitations of current technological trends and makes a clarion call for a reevaluation of our design ethos. The focus is on creating interfaces that not only address problems but also resonate with the rich physical faculties innate to users, suggesting that tools should amplify human capabilities and serve as an extension of human experience rather than its reduction.

The response to critiques emphasizes that the author’s original “rant” was not a dismissal of current devices like iPhones or iPads but a push for acknowledging their current utility while recognizing the imperative for future developments to transcend these existing paradigms. The author envisages a future where technology could tangibly represent and manipulate the environment in ways yet to be discovered, leveraging long-term research.

Voice interfaces, while useful, are seen as insufficient for fostering the depth of creativity and understanding that comes from a more tactile and hands-on interaction. Similarly, gestural interfaces and brain interfaces that circumvent the full use of the body are met with skepticism. The author stresses the need for interfaces that don’t make us sedentary or immobile, advocating against creating a future where interaction is disembodied.

In a profound reflection on the accessibility and apparent simplicity of touchscreens, the author warns against mistaking ease of use for comprehensive engagement, noting that while a two-year-old can swipe a touchscreen, such interaction doesn’t fully capitalize on adult capabilities.

Combining these perspectives, the author’s message is a wake-up call for designers, technologists, and visionaries. It’s an urging to reimagine interaction with the digital world in a way that brings our physical interaction with technology into a harmonious synergy, maximizing rather than diluting our human potential. This combined narrative is not just a response to current technological trends but a hopeful blueprint for the future of interaction design that is as rich and complex as the capabilities it aims to serve.

Final proposal:

Concept:
An Interactive Digital Jukebox merges the nostalgic allure of classic jukeboxes with the functionalities of modern digital technology. Utilizing an Arduino Uno for hardware control and p5.js for interactive web-based visualizations, this project aims to deliver a user-friendly music player that offers both physical and web interfaces for song selection, volume control, and music visualization.

Project Description:
The Interactive Digital Jukebox allows users to select and play music tracks from a stored library, offering real-time audio visualizations and track information through a p5.js interface. The system combines simple physical components for user interaction with sophisticated web-based graphics, creating a multifaceted entertainment device.

Components:
– SD Card Module: For storage and retrieval of MP3 music files.
– LCD Screen: To display track information directly on the device.
– Push Buttons: For navigating the music library and controlling playback.
– Speakers/Headphone Jack: To output audio.
– Potentiometer: To manually adjust the volume.

Functionality:
1. Music Playback: Music files stored on the SD card will be accessed and controlled via Arduino, with support for basic functions like play, pause, skip, and back.
2. User Interface: The p5.js interface will display a list of tracks, currently playing song details, and interactive playback controls.
3. Visual Feedback: The web interface will include dynamic visualizations corresponding to the music’s audio features, such as volume and frequency spectrum.
4. Physical Controls: Physical interactions will be enabled through push buttons or a touch screen connected to the Arduino.
5. Volume Control: A potentiometer will allow users to adjust the sound volume, with changes reflected both physically and on the web interface.

Implementation Steps:
1. SD Card Reader Setup: Connect the SD card module to the Arduino and load it with MP3 files. Program the Arduino to use an MP3 decoder library for playback control.
2. Control Interface Development: Configure the input mechanisms (buttons, potentiometer) and ensure their functional integrity for user interaction.
3. Web Interface Creation: Develop the visual and interactive components of the p5.js interface, including song listings, playback controls, and visualization areas.
4. Audio Visualization Integration: Utilize the `p5.sound` library to analyze audio signals and generate corresponding visual effects on the web interface.
5. System Integration: Establish robust communication between the Arduino and the p5.js application, likely via serial communication (a rough sketch of the Arduino side follows below).
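
A rough sketch of the Arduino side of that serial link is below. The pin numbers and the one-word command names (“PLAYPAUSE”, “NEXT”, “PREV”, “VOL,n”) are placeholders, and the SD/MP3 playback logic is omitted here:

const int playPin = 2, nextPin = 3, prevPin = 4;  // assumed button pins
const int potPin = A0;                            // volume potentiometer
int lastVolume = -1;

void setup() {
  Serial.begin(9600);
  pinMode(playPin, INPUT_PULLUP);  // buttons wired from pin to ground
  pinMode(nextPin, INPUT_PULLUP);
  pinMode(prevPin, INPUT_PULLUP);
}

void loop() {
  // One-word commands for the p5.js interface to parse
  if (digitalRead(playPin) == LOW) { Serial.println("PLAYPAUSE"); delay(250); }
  if (digitalRead(nextPin) == LOW) { Serial.println("NEXT"); delay(250); }
  if (digitalRead(prevPin) == LOW) { Serial.println("PREV"); delay(250); }

  // Only report the volume when it actually changes
  int volume = map(analogRead(potPin), 0, 1023, 0, 100);
  if (abs(volume - lastVolume) > 1) {
    Serial.print("VOL,");
    Serial.println(volume);
    lastVolume = volume;
  }
}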

By bringing together the tactile satisfaction of traditional jukeboxes with the visual appeal and interactivity of modern interfaces, this project stands to offer not just a practical entertainment solution but also an educational tool that demonstrates the integration of different technologies. It promises to be an excellent demonstration of skills in both electronics and software development, appealing to a broad audience and fostering a deeper appreciation for the fusion of old and new media technologies.

Reading Reflection – Week 11 – Saeed Lootah

A Brief Rant on The Future of Interactive Design

Firstly, it’s clear that the author has strong feelings about the visions people are suggesting for the future of technology, but I felt he described what should change clearly. I personally really liked the example of a hammer, where the handle is made for the human and the other end is the actual tool. It’s a simple object, but it made me realize the importance of human-centered design. The author placed a large emphasis on hands and, really, the whole body. I liked the distinction he made between touchscreen devices and how they are just a “picture under glass”, meaning there is no tactile feedback based on what part of the screen you are pressing or what program you are using. A personal anecdote of mine: touch-typing on a laptop or any physical keyboard is much, much easier than on a phone, for example. It’s hard to tell which key you are pressing, and often the only way of getting it right is by sheer luck. While he did not give any specific solution (which was a common response to the article), it was still necessary to outline the problem no matter how obvious it seemed.

It’s a common trope that in the future we will interact with holograms, swiping or touching imaginary screens in the air that we can only see while wearing some kind of glasses, which is similar to what Apple is attempting with the Apple Vision Pro. While it makes a lot of sense, I’m not sure how it can be made more tactile. For one, there has to be a sense of weight when interacting with something to make it more intuitive, and some kind of feedback at our fingertips. I imagine tactile feedback at the fingertips could come from some kind of glove which magically gives feedback (maybe in the future the solution will be obvious, but right now it’s magic, at least for me). In any case, I found the reading interesting, and there is a lot to digest and consider despite its conciseness (I think that’s a word).

Final Project Proposal – ArSL to Arabic Translation System

Growing up, I always experienced a communication barrier with my grandfather’s brother, who is hard of hearing. At family gatherings, only a select few who understand Arabic Sign Language (ArSL) could successfully communicate with him. This situation has been frustrating, as he has many adventures and stories that remain unshared and misunderstood by most of our family.

While there are systems available that translate American Sign Language (ASL) into English, Arabic is underrepresented in the technological domain of sign language translation. This disparity not only limits communication within diverse linguistic communities but also shows the urgent need for inclusive technology that bridges linguistic and sensory gaps.

My goal is to develop a real-time translation system for Arabic Sign Language using pose estimation combined with proximity sensing, enabling ArSL users to communicate directly by having their signing translated into written Arabic. It would be nice to use machine learning models that specialize in pose estimation, but I would need to do more research.

Final Project Idea “Love Level” – Shaikha AlKaabi

Project Idea: Love Level Game

  

The Love Sensor game is an interactive experience designed for two players, each placing one hand on a heart rate monitor. This innovative game uses heart rate data to measure and display the level of affection or excitement between the participants. The faster the heartbeats, the higher the presumed love connection. 

How It Works: 

  1. Heart Rate Monitoring: Each player places a hand on the heart rate monitor, which captures their pulse in real time.
  2. Data Processing: An Arduino board processes the heart rate data, assessing the intensity and changes in each player’s heartbeat (a rough sketch of this step follows the list).
  3. Dynamic Feedback: The processed data is then sent to a computer where p5.js generates visual and auditory outputs. These outputs are designed to respond dynamically to the heart rate data. For instance, visuals will grow more vibrant as the players’ heart rates increase, symbolizing a stronger connection. Audio feedback may also escalate, creating a rich sensory environment that reflects the intensity of the game.
  4. Interactive Experience: The game aims to create a playful and engaging experience that not only entertains but also visually and audibly illustrates the emotional bond between the players, making it a fascinating exploration of human interactions.
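
As a first sketch of steps 1 and 2 for a single player, assuming an analog pulse sensor on A0 (the threshold and wiring are assumptions; a real build would use two sensors and calibrated beat detection):

const int pulsePin = A0;
const int THRESHOLD = 550;   // to be tuned to the sensor's resting signal
unsigned long lastBeat = 0;
bool aboveThreshold = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int signal = analogRead(pulsePin);

  // Count a beat on each upward crossing of the threshold
  if (signal > THRESHOLD && !aboveThreshold) {
    aboveThreshold = true;
    unsigned long now = millis();
    if (lastBeat > 0) {
      int bpm = 60000 / (now - lastBeat);  // ms between beats -> beats per minute
      Serial.println(bpm);                 // p5.js maps BPM to visuals and sound
    }
    lastBeat = now;
  } else if (signal < THRESHOLD) {
    aboveThreshold = false;
  }

  delay(10);
}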