Week 11: Reading Reflection

Drawing upon the insightful critique presented in Bret Victor’s “A Brief Rant on the Future of Interaction Design,” I find myself resonating deeply with the underlying message of reimagining our interaction paradigms. Victor’s compelling argument against the myopic vision of future interfaces primarily reliant on fingertip interactions challenges us to broaden our conceptual horizons. Reflecting on the assertions made, it becomes evident that while touchscreen technology represents a significant advancement, it barely scratches the surface of our interactive potential.

Victor’s emphasis on utilizing the entire hand—not just the fingertips—invites us to explore the rich possibilities of tactile and gestural inputs. This approach not only enhances the depth of human-computer interaction but also aligns with ergonomic principles that advocate for natural, strain-free movements. The focus on whole-hand interaction could lead to more intuitive and physically engaging interfaces that leverage the full spectrum of human dexterity and sensory feedback.

Moreover, the notion of universal design emerges as a crucial consideration. By aspiring to create interfaces that are accessible to all, including those with different abilities, designers can cultivate inclusivity. This idea challenges us to think beyond the conventional target audience and design products that cater to a diverse user base without compromising on functionality or aesthetic appeal. Such an approach not only broadens the market reach but also reinforces the social responsibility of design.

In envisioning the future of interaction design, we should indeed consider a return to the basics, as suggested in the readings. The fundamental act of hands manipulating physical objects has shaped human tool use throughout history; thus, incorporating this intrinsic aspect of human behavior into modern technology could revolutionize how we interact with digital environments. It is not merely about enhancing existing technology but redefining what interaction entails in the digital age, moving from passive touchscreen gestures to dynamic, multi-dimensional engagements.

In summary, while advanced technologies like 3D cameras and AI-driven interfaces continue to push the boundaries, the essence of interaction design should remain grounded in the natural human experience. Emphasizing the full potential of our hands not only respects our biological heritage but also opens up a panorama of possibilities that could redefine the future landscape of technology, making it more intuitive, inclusive, and fundamentally human-centric. This perspective not only aligns with Victor’s critique but also propels it forward, suggesting a paradigm where technology complements rather than constrains human capabilities.

Reading response / Final proposal – Hamdah Alsuwaidi

Reading response:

The narrative woven through the author’s critique and subsequent responses to reader feedback forms a cohesive argument for a transformative approach to interaction design. At the core of this discussion is a pointed critique of the reliance on touchscreens, or what the author refers to as “Pictures Under Glass.” This approach, the author argues, falls short of engaging the full spectrum of human abilities, especially the tactile and manipulative capacities intrinsic to the human experience.

Bringing expertise from a background in designing future interfaces, the author offers a credible perspective on the limitations of current technological trends and makes a clarion call for a reevaluation of our design ethos. The focus is on creating interfaces that not only address problems but also resonate with the rich physical faculties innate to users, suggesting that tools should amplify human capabilities and serve as an extension of human experience rather than its reduction.

The response to critiques emphasizes that the author’s original “rant” was not a dismissal of current devices like iPhones or iPads but a push for acknowledging their current utility while recognizing the imperative for future developments to transcend these existing paradigms. The author envisages a future where technology could tangibly represent and manipulate the environment in ways yet to be discovered, leveraging long-term research.

Voice interfaces, while useful, are seen as insufficient for fostering the depth of creativity and understanding that comes from a more tactile and hands-on interaction. Similarly, gestural interfaces and brain interfaces that circumvent the full use of the body are met with skepticism. The author stresses the need for interfaces that don’t make us sedentary or immobile, advocating against creating a future where interaction is disembodied.

In a profound reflection on the accessibility and apparent simplicity of touchscreens, the author warns against mistaking ease of use for comprehensive engagement, noting that while a two-year-old can swipe a touchscreen, such interaction doesn’t fully capitalize on adult capabilities.

Combining these perspectives, the author’s message is a wake-up call for designers, technologists, and visionaries. It’s an urging to reimagine interaction with the digital world in a way that brings our physical interaction with technology into a harmonious synergy, maximizing rather than diluting our human potential. This combined narrative is not just a response to current technological trends but a hopeful blueprint for the future of interaction design that is as rich and complex as the capabilities it aims to serve.

Final proposal:

Concept:
The Interactive Digital Jukebox merges the nostalgic allure of classic jukeboxes with the functionality of modern digital technology. Using an Arduino Uno for hardware control and p5.js for interactive web-based visualizations, this project aims to deliver a user-friendly music player that offers both physical and web interfaces for song selection, volume control, and music visualization.

Project Description:
The Interactive Digital Jukebox allows users to select and play music tracks from a stored library, offering real-time audio visualizations and track information through a p5.js interface. The system combines simple physical components for user interaction with sophisticated web-based graphics, creating a multifaceted entertainment device.

Components:
– SD Card Module: For storage and retrieval of MP3 music files.
– LCD Screen: To display track information directly on the device.
– Push Buttons: For navigating the music library and controlling playback.
– Speakers/Headphone Jack: To output audio.
– Potentiometer: To manually adjust the volume.

Functionality:
1. Music Playback: Music files stored on the SD card will be accessed and controlled via Arduino, with support for basic functions like play, pause, skip, and back.
2. User Interface: The p5.js interface will display a list of tracks, currently playing song details, and interactive playback controls.
3. Visual Feedback: The web interface will include dynamic visualizations corresponding to the music’s audio features, such as volume and frequency spectrum.
4. Physical Controls: Physical interactions will be enabled through push buttons or a touch screen connected to the Arduino.
5. Volume Control: A potentiometer will allow users to adjust the sound volume, with changes reflected both physically and on the web interface.
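To make item 5 concrete, here is a minimal Arduino sketch of the potentiometer side; the pin assignment (A0), the 0-100 volume scale, and the "VOL:" serial message are illustrative assumptions rather than final design decisions.

// Minimal sketch: read a potentiometer and report a 0-100 volume level
// over serial so the p5.js interface could mirror it. Pin and scale are assumptions.
const int potPin = A0;   // potentiometer wiper (assumed wiring)
int lastVolume = -1;     // last value sent, to avoid flooding the serial port

void setup() {
  Serial.begin(9600);    // same baud rate the p5.js side would open
}

void loop() {
  int raw = analogRead(potPin);            // 0..1023
  int volume = map(raw, 0, 1023, 0, 100);  // scale to a 0..100 volume level
  if (volume != lastVolume) {              // only send when the value changes
    Serial.print("VOL:");
    Serial.println(volume);                // e.g. "VOL:42"
    lastVolume = volume;
  }
  delay(50);                               // modest update rate
}

Sending the value only when it changes keeps serial traffic low, so the web interface can stay responsive while still reflecting the physical knob.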

Implementation Steps:
1. SD Card Reader Setup: Connect the SD card module to the Arduino and load it with MP3 files. Program the Arduino to use an MP3 decoder library for playback control.
2. Control Interface Development: Configure the input mechanisms (buttons, potentiometer) and ensure their functional integrity for user interaction.
3. Web Interface Creation: Develop the visual and interactive components of the p5.js interface, including song listings, playback controls, and visualization areas.
4. Audio Visualization Integration: Utilize the `p5.sound` library to analyze audio signals and generate corresponding visual effects on the web interface.
5. System Integration: Establish robust communication between Arduino and the p5.js application, likely via serial communication.
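As a rough sketch of step 5, the Arduino side of the serial link could send one short labeled line per button event for the p5.js sketch to parse; the pin numbers and message words below are assumptions for illustration only.

// Sketch of the Arduino side of the serial link (step 5).
// Pin numbers and the message words are assumptions for illustration.
const int playButtonPin = 2;
const int skipButtonPin = 3;

void setup() {
  pinMode(playButtonPin, INPUT_PULLUP);  // buttons pull the pin LOW when pressed
  pinMode(skipButtonPin, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  // Send a labeled line whenever a button is pressed; the p5.js sketch
  // would read these lines over the serial connection and update the
  // interface (track list, playback state) accordingly.
  if (digitalRead(playButtonPin) == LOW) {
    Serial.println("PLAYPAUSE");
    delay(250);                          // crude debounce
  }
  if (digitalRead(skipButtonPin) == LOW) {
    Serial.println("SKIP");
    delay(250);
  }
}

A newline-delimited text protocol like this is easy to inspect in the serial monitor and straightforward to parse on the p5.js side.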

By bringing together the tactile satisfaction of traditional jukeboxes with the visual appeal and interactivity of modern interfaces, this project stands to offer not just a practical entertainment solution but also an educational tool that demonstrates the integration of different technologies. It promises to be an excellent demonstration of skills in both electronics and software development, appealing to a broad audience and fostering a deeper appreciation for the fusion of old and new media technologies.


Assignment #11 – Code – ☆Surprise Surprise☆

Concept

For this assignment, we were inspired by the popular “Rick Rolling” meme. Essentially, the meme tricks the viewer by unexpectedly playing the song “Never Gonna Give You Up” by Rick Astley. Therefore, we decided to create a music box that plays this song when it is opened.

How it works

For the music box, we had multiple components. First, we used a photoresistor to detect brightness (open box) and darkness (closed box), which determines whether the song plays. Then, we used a piezo buzzer to actually play the notes of the song. We also added an LED that blinks to the rhythm of the music, and a button which, when pressed, changes the speed at which the music plays. Finally, we used a potentiometer to control the volume of the music. In the code, we divided the song into the intro, verse, and chorus.

Components

  • 1 x Photoresistor
  • 1 x Piezo buzzer
  • 1 x LED Light
  • 1 x Button
  • 1 x Potentiometer
  • 3 x 330 ohm resistors
  • 1 x 10K ohm resistor
  • Wires
  • Arduino and Breadboard

Demo

Code Snippets

Our code is quite long, so here are some snippets:

This is an example of how we created arrays for each part of the song. This one is specifically for the chorus, but we also have arrays for the intro and the verse. One array holds the melody, determined by the frequencies we have defined, and the other holds the rhythm, which determines the duration of each note when later multiplied by the beat length.

int song1_chorus_melody[] = {
  b4f, b4f, a4f, a4f,
  f5, f5, e5f, b4f, b4f, a4f, a4f, e5f, e5f, c5s, c5, b4f,
  c5s, c5s, c5s, c5s,
  c5s, e5f, c5, b4f, a4f, a4f, a4f, e5f, c5s,
  b4f, b4f, a4f, a4f,
  f5, f5, e5f, b4f, b4f, a4f, a4f, a5f, c5, c5s, c5, b4f,
  c5s, c5s, c5s, c5s,
  c5s, e5f, c5, b4f, a4f, rest, a4f, e5f, c5s, rest
};

int song1_chorus_rhythmn[] = {
  1, 1, 1, 1,
  3, 3, 6, 1, 1, 1, 1, 3, 3, 3, 1, 2,
  1, 1, 1, 1,
  3, 3, 3, 1, 2, 2, 2, 4, 8,
  1, 1, 1, 1,
  3, 3, 6, 1, 1, 1, 1, 3, 3, 3, 1, 2,
  1, 1, 1, 1,
  3, 3, 3, 1, 2, 2, 2, 4, 8, 4
};

This is our setup function, which initializes the pins and the serial communication, and attaches an interrupt to the button (which then triggers our getFaster function).

void setup()
{
  pinMode(piezo, OUTPUT);
  pinMode(led, OUTPUT);
  pinMode(button, INPUT_PULLUP); // high voltage when button is not pressed; low voltage when pressed
  pinMode(sensor, INPUT);
  attachInterrupt(digitalPinToInterrupt(button), getFaster, FALLING); // interrupt activates when the button is pressed
  digitalWrite(led, LOW);
  Serial.begin(9600);
  flag = false;
  a = 4;
  b = 0;
  threshold = analogRead(sensor) + 200; // adding a value to the sensor reading to control how much darkness/brightness is necessary for the music to start playing
}

This is our loop function, which continuously reads the sensor value so that the song plays when it is bright enough and pauses when it is dark.

void loop()
{
  int sensorreading = analogRead(sensor);
  if (sensorreading < threshold) { // play when there is brightness
    flag = true;
  }
  else if (sensorreading > threshold) { // pause when there is darkness
    flag = false;
  }

  // play next step in song when flag is true, meaning it is bright enough
  if (flag == true) {
    play();
  }
}

This is part of our play function, which determines which part of the song plays and the corresponding melody/rhythm.

void play() {
  int notelength;
  if (a == 1 || a == 2) {
    // intro
    notelength = beatlength * song1_intro_rhythmn[b];
    if (song1_intro_melody[b] > 0) {
      digitalWrite(led, HIGH); // LED on
      tone(piezo, song1_intro_melody[b], notelength);
    }
    b++;
    if (b >= sizeof(song1_intro_melody) / sizeof(int)) {
      a++;
      b = 0;
      c = 0;
    }

And finally, our getFaster function, which increases the tempo by decreasing the beat length whenever the button is pressed.

void getFaster() { // decrease beat length in order to increase tempo
  beatlength = beatlength / 2;
  if (beatlength < 20) { // loop back to original tempo
    beatlength = 100;
  }
}

Circuit

Lastly, here is a link to the tutorial we followed:
https://projecthub.arduino.cc/slagestee/rickroll-box-d94733

Reading Reflection – Week 11 – Saeed Lootah

A Brief Rant on the Future of Interaction Design

Firstly, it’s clear that the author has strong feelings about the visions people are proposing for the future of technology, but I felt he described what he thinks should change clearly. I really liked the example of a hammer, where the handle is made for the human and the other end is the actual tool. It’s a simple object, but it made me realize the importance of human-centered design. The author placed a large emphasis on the hands and, basically, the rest of the body. I liked the distinction he made about touchscreen devices being just “pictures under glass,” meaning there is no tactile feedback based on what part of the screen you are pressing or what program you are using. A personal anecdote of mine is that touch-typing on a laptop, or any physical keyboard, is much, much easier than on a phone: it’s hard to tell which key you are pressing, and often the only way to get it right is by sheer luck. While he did not give any specific solution (which was a common response to the article), it was still necessary to outline the problem, no matter how obvious it seemed.

It’s a common trope that in the future we will interact with holograms, swiping or touching imaginary screens in the air that we can only see while wearing some kind of glasses, similar to what Apple is attempting with the Apple Vision Pro. While that makes a lot of sense, I’m not sure how it can be made more tactile. For one, there has to be a sense of weight when interacting with something to make it intuitive, and some kind of feedback at our fingertips. I imagine one way of providing tactile feedback at the fingertips could be a glove that somehow delivers it (maybe in the future the solution will be obvious, but right now it’s magic, at least to me). In any case, I found the reading interesting, and there is a lot to digest and consider despite its conciseness (I think that’s a word).

Final Project Concept: Fruit Ninja with Gesture Recognition

Concept: This project aims to create a physically interactive version of the popular game Fruit Ninja using Arduino and P5.js. Instead of swiping on a touchscreen, players will use a glove equipped with flex sensors to control the game through hand gestures.

Components:
Arduino: An Arduino board will be used to read data from the flex sensors on the glove.
Flex Sensors: These sensors will be attached to the glove’s fingers, detecting the bending motion when the player performs slicing gestures.
P5.js: P5.js will be used to create the game environment, display the fruits, and detect collisions between the virtual sword (player’s hand) and the fruits.

Gameplay:
Calibration: The player will initially calibrate the system by performing various hand gestures, allowing the Arduino to establish a baseline for each sensor and determine the range of motion.
Fruit Slicing: Fruits will appear on the screen, similar to the original game. The player will slice through the air with their hand, and the flex sensors will detect the bending of the fingers.
Data Transmission: The Arduino will send data from the flex sensors to P5.js via serial communication (a rough sketch of this is shown after this list).
Collision Detection: P5.js will analyze the sensor data and determine if the player’s “virtual sword” (hand movement) intersects with a fruit on the screen.

Feedback: Successful slices will result in visual and auditory feedback, such as the fruit splitting apart and a slicing sound effect.
Scoring: The game will keep track of the player’s score based on the number of fruits sliced and the accuracy of their movements.
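As a rough illustration of the calibration and data-transmission steps above, here is a minimal Arduino-side sketch; the number of sensors, the analog pins, and the comma-separated message format are assumptions made for the example, not finalized design choices.

// Rough sketch of the glove's Arduino side: calibrate a resting baseline for
// each flex sensor, then stream readings over serial for p5.js to interpret.
// Pin choices, sensor count, and the message format are assumptions.
const int NUM_SENSORS = 2;
const int sensorPins[NUM_SENSORS] = {A0, A1}; // flex sensors in voltage dividers (assumed wiring)
int baseline[NUM_SENSORS];

void setup() {
  Serial.begin(9600);
  // Simple calibration: average a short burst of readings with the hand relaxed.
  for (int s = 0; s < NUM_SENSORS; s++) {
    long sum = 0;
    for (int i = 0; i < 50; i++) {
      sum += analogRead(sensorPins[s]);
      delay(5);
    }
    baseline[s] = sum / 50;
  }
}

void loop() {
  // Send one comma-separated line per frame, e.g. "12,87",
  // where each number is how far that sensor has bent past its baseline.
  for (int s = 0; s < NUM_SENSORS; s++) {
    int bend = analogRead(sensorPins[s]) - baseline[s];
    Serial.print(bend);
    if (s < NUM_SENSORS - 1) Serial.print(",");
  }
  Serial.println();
  delay(20); // roughly 50 updates per second to keep latency low
}

Keeping each message small and the update rate around 50 Hz should also help with the latency concern noted under Challenges below.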

Challenges:
Sensor Calibration: Ensuring accurate and consistent readings from the flex sensors will be crucial for responsive gameplay.
Gesture Recognition: Developing an algorithm in P5.js to reliably translate sensor data into slicing actions will require careful design and testing.
Latency: Minimizing lag between the player’s movements and the game’s response is essential for a smooth and enjoyable experience.

Potential Enhancements:
Haptic Feedback: Integrating vibration motors into the glove could provide physical feedback when the player slices a fruit, further enhancing immersion.

Reading response – Interaction Design

Reading Bret Victor’s passionate critique of interaction design felt like he was voicing my own frustrations. Why do so many interfaces feel clunky and unnatural, forcing us to think like computers instead of them adapting to us? His call for a future where technology interacts with us as seamlessly as human speech resonated deeply.
The comparison he draws between current interfaces and the early days of written language is particularly powerful. It highlights the vast potential for improvement and reminds us that we’re still in the infancy of designing truly intuitive and human-centered interactions.
Victor’s vision connects with projects like the Tangible Media Group’s shape-shifting displays, where the digital and physical blend seamlessly. It’s exciting to imagine a future where interacting with technology is as natural as speaking or gesturing.
This rant is a powerful call to action. It challenges us to break free from limitations and create technology that empowers and inspires. It’s a future I’m eager to contribute to, where technology enhances our lives instead of holding us back.

Final Project Proposal – ArSL to Arabic Translation System

Growing up, I always experienced a communication barrier with my grandfather’s brother, who is hard of hearing. At family gatherings, only a select few who understand Arabic Sign Language (ArSL) could successfully communicate with him. This situation has been frustrating, as he has many adventures and stories that remain unshared and misunderstood by most of our family.

While there are systems available that translate American Sign Language (ASL) into English, the representation of Arabic in the technological domain of sign language translation is lacking. This disparity not only limits communication within diverse linguistic communities but also shows the urgent need for inclusive technology that bridges linguistic and sensory gaps.

My goal is to develop a real-time translation system for Arabic Sign Language using pose estimation combined with proximity sensing, enabling ArSL users to communicate directly by translating their signing into written Arabic. It would be ideal to use machine learning models that specialize in pose estimation, but I would need to do more research.

Final Project Idea “Love Level” – Shaikha AlKaabi

Project Idea: Love Level Game

  

The Love Level game is an interactive experience designed for two players, each placing one hand on a heart rate monitor. This innovative game uses heart rate data to measure and display the level of affection or excitement between the participants: the faster the heartbeats, the higher the presumed love connection.

How It Works: 

  1. Heart Rate Monitoring: Each player places a hand on the heart rate monitor, which captures their pulse in real time.
  2. Data Processing: An Arduino board processes the heart rate data, assessing the intensity and changes in each player’s heartbeat (a rough sketch of this step follows the list below).
  3. Dynamic Feedback: The processed data is then sent to a computer where p5.js generates visual and auditory output, designed to respond dynamically to the heart rate data. For instance, visuals will grow more vibrant as the players’ heart rates increase, symbolizing a stronger connection. Audio feedback may also escalate, creating a rich sensory environment that reflects the intensity of the game.
  4. Interactive Experience: The game aims to create a playful and engaging experience that not only entertains but also visually and audibly illustrates the emotional bond between the players, making it a fascinating exploration of human interactions.
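As a rough sketch of steps 1 and 2, assuming each heart rate monitor exposes an analog pulse signal on an Arduino analog pin, the Arduino side could simply stream both players' readings to the computer; the pins and message format below are illustrative assumptions, and actual beat detection would still need to be worked out.

// Rough sketch of steps 1-2: sample two analog pulse sensors (one per player)
// and stream the raw values over serial, where p5.js would derive beats per
// minute and drive the visuals. Pins, sensor type, and message format are
// assumptions for illustration.
const int playerOnePin = A0; // assumed analog pulse sensor for player 1
const int playerTwoPin = A1; // assumed analog pulse sensor for player 2

void setup() {
  Serial.begin(9600);
}

void loop() {
  int p1 = analogRead(playerOnePin); // raw pulse waveform sample, 0..1023
  int p2 = analogRead(playerTwoPin);

  // One comma-separated line per sample, e.g. "512,498".
  // Beat detection (finding peaks and timing the gaps between them)
  // would happen either here or on the p5.js side.
  Serial.print(p1);
  Serial.print(",");
  Serial.println(p2);

  delay(20); // roughly 50 samples per second
}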

  

Musical Instrument

This project provided valuable insights into the potential of technology in musical expression and exploration. Despite its seemingly simple design, utilizing two push buttons for sound generation and an ultrasound sensor for frequency modulation, the project unveiled a range of creative possibilities and highlighted areas for further development.

The incorporation of the ultrasound sensor was particularly intriguing. By translating physical distance into audible frequencies, the sensor effectively transformed space into a controllable musical parameter. This interaction proved to be a captivating demonstration of how technology can facilitate new forms of musical interaction and expression. The concept invites further exploration, prompting questions about the potential for incorporating additional sensors to create a multi-dimensional instrument responsive to a wider range of environmental stimuli.

// Check if distance is within range for playing sound
if (distance < 200 && distance > 2) {
  // Map distance to frequency
  int frequency = map(distance, 2, 200, 200, 2000);
  tone(buzzerPin, frequency); // Play sound at calculated frequency
} else {
  noTone(buzzerPin); // Stop sound if distance is out of range
}

 

While the project successfully demonstrated the feasibility of generating sound through Arduino, the limitations of pre-programmed sounds became evident. The lack of nuance and complexity inherent in such sounds presents a challenge for creating truly expressive and dynamic musical experiences. This observation underscores the need for further exploration into sound generation techniques, potentially involving machine learning or other advanced algorithms, to expand the sonic palette and introduce more organic and evolving soundscapes.

Week 11 Reading – Jihad Jammal

Reflecting on the author’s critique of what is dubbed “Pictures Under Glass,” I find a compelling challenge to our current acceptance of interface norms. The disparity highlighted between the tactile richness of our everyday physical interactions and the simplified, touch-based interfaces that dominate our devices urges us to question whether we are curbing our innovative potential. By adhering so strictly to touchscreen technology, we risk neglecting developments in interfaces that could better harness human physicality and sensory feedback.

This critique serves as a springboard for considering alternative interaction paradigms that prioritize enhancing our sensory and physical engagement with technology. While touchscreens undoubtedly provide accessibility and ease, there’s a significant opportunity to explore more immersive and intuitive interfaces. Such technologies would not only extend our capabilities but also deepen our interaction with the digital world, suggesting a future where technology complements rather than constrains our interactions with our environment. This shift could pave the way for more meaningful and natural user experiences that align closely with human behavior and physicality.